Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Now that's a super-peculiar idea.
I may not respond any more after this post. I understand that you're expressing an objection to the Peano axioms or to mathematics in general, but I am not the lord high defender of the realm. I'm just trying to explain a few things. I supplied a number of links you might find of interest.
If you think math is circular, you're probably not the only one. And if you don't think the Peano axioms are ultimately sensible, there are finitists and ultra-finitists who agree with you. I could never talk you out of those opinions, nor do I even want to.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
I have no problem reasoning about and reversing objects of finite size, but I have no idea what it means to "reverse" something with a start and no end.
Reversing a binary relation is easy. A binary relation is a set of ordered pairs (a,b) indicating that a and b stand in a given relation. The reverse relation is the set of swapped pairs: if (a,b) is an ordered pair in the relation, then (b,a) is an ordered pair in the inverse relation. So if a < b, then in the reverse order, b < a. This is extremely basic.
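For a finite relation the idea is trivial to sketch in Python (a toy illustration of my own; the infinite case uses the same definition, it just can't be materialized as a data structure):

```python
# A binary relation as a set of ordered pairs: here, < on {0, 1, 2, 3}.
less_than = {(a, b) for a in range(4) for b in range(4) if a < b}

# The inverse relation swaps every pair: (a, b) becomes (b, a).
greater_than = {(b, a) for (a, b) in less_than}

assert (1, 2) in less_than
assert (2, 1) in greater_than
```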
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Because after all "first n elements of X" means "last n elements of reversed X". Is just that "last element" is a rather strange notion in an inductive sequence, where there is always another element. And another. And another.
ω* is not an inductive relation, obviously.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
I am super curious what the first two pairs of the following function are...
Since ω* is not a Python list, this expression is undefined and would throw a runtime error. Surely you can see that ω* is not defined inductively, and is not a Python list.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
(0, ???), (S0, ???)
You've already gotten a Python error, since the second argument to zip() is not a legal Python list. Nor are we even talking about programming languages. You want to force a mathematical discussion into the context of programming languages, but it's a poor fit. This effort is the source of the confusions you're experiencing, and the frustration I'm having trying to respond.
In any event the < relation is a binary relation, and a binary relation is a set of ordered pairs, and you can reverse any binary relation by reversing all the ordered pairs. And of course it's perfectly clear that if you reverse an inductive relation, the reversed relation is not going to be inductive.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
I notice that in addition to the Peano numbers you've also brought the set-theoretic natural numbers in to the discussion - this doesn't really help anything because they are recursively/inductively defined also. You know, that pesky "...".
The dots are only an informal shorthand. They're not a part of the formal Peano axioms.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Or in general:
n = {0, 1, …, n − 1}
n + 1 = n ∪ {n}
So the set-theoretic definition of "the natural numbers" is appealing to arithmetic in order to define the natural numbers!
I see no arithmetic here, only the set-theoretic operations of going from n to {n}, and forming a union.
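In fact you can build the first few von Neumann naturals in Python using nothing but set operations (a sketch of my own, using frozenset since Python sets must contain hashable elements):

```python
# Von Neumann naturals: 0 = {}, and n+1 = n U {n}.
# Only set formation and union are used -- no arithmetic.
zero = frozenset()

def successor(n):
    return frozenset(n | {n})

one = successor(zero)     # {0}
two = successor(one)      # {0, 1}
three = successor(two)    # {0, 1, 2}

assert len(three) == 3    # the set n has exactly n elements
assert zero in three and one in three and two in three
```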
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Is arithmetic a priori numbers, or isn't it? And I won't even get stuck into the (lacking) definition of ∪.
It's perfectly well defined by the axiom of union.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
I mean, if set theoreticians are happy to use the expressions "n", "n-1", "n+1" and "..." in defining "the numbers" then surely I can write down: ..., (n + 1) + 1 < n + 1 < n < n - 1 < (n - 1) - 1, ....
What does that make it? ℤ*?
What? I'm not sure what you're getting at. Given the usual order on Z you can certainly define Z*, the reverse order. What's the problem?
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
And, of course - I just have to ask the question: What are the first and last elements of zip(ℤ,ℤ*)?
In this case neither Z nor Z* are Python lists, so you'd get two runtime errors here. Z does not have a first element so it is not a Python list. Likewise for Z*.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
All of the above aside it's your "..." that's definitionally vague -
And yours isn't. As I've noted.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Difference being that I am openly admitting my usage of "..." is circular. It's recursive/inductive.
What? Recursive/inductive constructions are not circular. You've made this error a couple of times. The whole point of recursion/induction is the base case, which grounds the construction and rules out circularity and non-well-foundedness.
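A minimal sketch of the point (my own example): Peano-style addition recurses, but every call chain bottoms out at the base case, so nothing is circular.

```python
# Peano-style addition: m + 0 = m (base case), m + S(n) = S(m + n).
# The recursion strictly decreases n, so it always reaches the base
# case -- the definition is well-founded, not circular.
def add(m, n):
    if n == 0:
        return m
    return add(m, n - 1) + 1

assert add(3, 4) == 7
assert add(0, 5) == 5
```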
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
In ordered theories with a minimum element I use "..." to indicate recursion: n, n+1, n+2, ...
In ordered theories with a maximum element I use "..." to indicate corecursion: n, n-1, n-2, ...
And in ordered theories without a minimum/maximum elements I am using it corecursively and recursively respectively: ..., -3, -2, -1, 0, 1, 2, 3, ...
Something which you seem unwilling to admit.
I don't recall your asking me to admit it. Doesn't seem connected to the rest of the conversation, you lost me.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
This function does.
Code: Select all
def generator():
    return "0010101010111000101..."
Really? Where are you going to store those infinitely many bits? Are you going to keep running down to Best Buy for more memory? Crucial has better prices.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
For any sequence, number, set; or any entity whatsoever in the Mathematical universe that you claim "exists" I can trivially assume the unit type.
Actually not. If a sequence is algorithmically compressible -- that is, if it can be generated by a program or Turing machine -- then you can store it in a computer and generate as many bits of it as you like, subject to time/space constraints. However very few bitstrings are algorithmically compressible. After all there are only countably many Turing machines and uncountably many bitstrings.
Give this a read.
https://en.wikipedia.org/wiki/Algorithm ... ion_theory
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
I don't understand what you mean by "computable" here.
Really?
Computable real numbers were defined by Turing in 1936. A real number is computable if its digits can be cranked out by a Turing machine; that is, if there is a TM that, when the positive integer n is input, halts with the n-th digit of the decimal (or binary, if you like) expansion of the number.
Since there are only countably many TMs, and uncountably many real numbers, "almost all," in the sense of all but countably many real numbers, are not computable.
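To make Turing's definition concrete, here is a toy sketch of my own: a real is computable when some program returns its n-th digit on demand.

```python
# A real is computable if a program returns its n-th digit on demand.
# Trivial example: every decimal digit of 1/3 = 0.333... is 3.
def third_decimal_digit(n):
    return 3

# Slightly less trivial: 1/3 in binary is 0.010101..., so the n-th
# binary digit (n = 1, 2, 3, ...) alternates, starting with 0.
def third_binary_digit(n):
    return 0 if n % 2 == 1 else 1

assert third_decimal_digit(100) == 3
assert [third_binary_digit(n) for n in range(1, 5)] == [0, 1, 0, 1]
```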
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Perhaps you mean "deterministic"? Reproducible? A pure function?
A function that returns a static mapping every time?
When I said computable, I meant computable in the sense of Turing 1936. What other definition is there? It's the only one.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Your nomenclature is confusing from the get go.
Turing 1936. As the poet said, "That is all ye know, and all ye need to know."
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
There can be no such thing as a randomness in pure mathematics.
Oh my gosh. Ever hear of random variables? But in this context, algorithmic information theory is the operative framework. A bitstring is random if it's incompressible. That's from Kolmogorov. Please give this a read.
The idea is that if I have 10101010101010... I can say, "Start with 1 and alternate 1's and 0's." That's a short string that generates an infinite sequence. So the sequence is compressible, and not random. But a random sequence can't be compressed. That's literally the definition.
https://en.wikipedia.org/wiki/Kolmogorov_complexity
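The compressibility idea in the paragraph above can be sketched in a few lines of Python (a toy illustration of my own, not a formal definition):

```python
from itertools import islice

# "Start with 1 and alternate 1's and 0's" -- a few lines of code that
# generate arbitrarily many bits of 101010..., so the sequence is
# highly compressible and hence not Kolmogorov-random.
def alternating():
    bit = 1
    while True:
        yield bit
        bit = 1 - bit

first_20 = list(islice(alternating(), 20))
assert first_20 == [1, 0] * 10
```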
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
A mathematical function must return the same result for the same input - which literally defeats the purpose of random number generators.
The bits of Chaitin's constant are determined (in the sense of being definable) but noncomputable. It's in the Wiki page on Chaitin's constant.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
However, outside of the domain of "purity" there is a function which is computable in the sense that it runs on a computer which generates an infinite-length random bitstring.
In a lazy-evaluated language such as Haskell you can define such function, you can bind that function to a variable and you can access its n-th element of such a function which is as close to Turing's notion of "computable function" as you can get.
Here's what gets me. You toss out Haskell and Python and generators etc. but you seem to have no familiarity with binary relations or algorithmic randomness or computability. From my point of view there's a mismatch. But I hope some of my links are helpful.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
The caveat being you can only get the n-th element of a random bitstring just once.
Well how many times do you need to get it? It's an arbitrary bitstring.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
If you want to do it twice then you have to record the random string and replay it. Bit by bit.
Yes ok. I have a lot of pencils, I'll write down the bitstring then play it back. I think you're kind of hung up on the wrong thing here. There's some function that inputs a natural number n and outputs a bit, one or zero. That's a function, because it's deterministic. Give me n and I'll give you back the n-th bit. But there is no PROGRAM that can generate the bits. It's a function, but it's not a computable function. In other words God (or an oracle, if you prefer) can tell you the n-th bit; but there is no program that can tell you the n-th bit. Is that a helpful analogy?
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Definable real number. Lol. We've had that discussion.
I don't recall that, but it's a perfectly sensible notion. You should read the Wiki page on Chaitin's constant. It turns out that there are numbers that are first-order definable in logic, yet not computable. Chaitin's constant is one such.
The distinction between definability and computability is quite subtle, and I'm not an expert. But there are numbers that are definable yet not computable.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
How could I possibly be wrong?
LOL. Thanks for the chuckle.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Given any sequence where did the sequence originate from?
Coin flip, cosmic ray, who cares? There are uncountably many bitstrings, and only countably many Turing machines or Python programs. So all but countably many bitstrings are uncomputable; that is, Kolmogorov-random.
But you know there's an important distinction here. You want every bitstring to be generated or to "come from" somewhere. In math, we are not so much concerned about that. We have a set, N, the set of natural numbers. We have the set of functions N -> {0,1}, the set of all functions that input a natural number and output a bit. The entire set of all possible such functions has mathematical existence in set theory. There are uncountably many such functions. We are not required to give an account of where these functions "come from." There's nobody in the basement hammering them out. They exist because the set N exists, and the set {0,1} exists, and the set of all functions from N to {0,1} exists.
Here "exists" means, "exists according to the rules of set theory." That's of course a very nebulous form of existence. But if you don't like it or don't believe in it, I hope you can at least accept that mathematicians believe in it and, for the sake of discussion, accept it on its own terms.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Even if that something is a Mathematician the sequence has a generator.
No, because a generator is a thing in programming-language theory and must be computable. But most sequences are not computable, hence cannot have a generator.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Sequences don't appear from thin air.
The set of bitstrings exists as an abstract set. It's the set of functions from the natural numbers to the set {0,1}. It's an uncountable set.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Now THAT is an extraordinary claim and I am going to demand extraordinary evidence.
I've already given the proof several times. There are uncountably many bitstrings but only countably many Turing machines, hence "almost all" -- defined in this context as "all but countably many" -- are noncomputable. They can't have generators, they can't be cranked out by any algorithm.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
1. What is the origin of the sequences you are speaking about?
Just pick any noncomputable sequence. But for concreteness, use the very special case of Chaitin's constant, which is definable but not computable.
But again, you raise the question of the "origin" of these sequences. Their origin is set-theoretic. We have a set, which is commonly denoted 2^N, where ^ denotes exponentiation, and which represents the set of all functions from N to the set {0,1}. There are uncountably many such functions, via the standard diagonal argument. I agree with you that mathematicians posit existence with breathless grandiosity. Where do they "come from?" It's not a bad question. They come from the axioms of set theory.
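The diagonal argument mentioned above can be illustrated on a finite list (a sketch of my own; the real argument applies to any countable list of infinite sequences):

```python
# Diagonalization in miniature: given any list of bit sequences,
# flipping the n-th bit of the n-th sequence produces a sequence that
# differs from every sequence on the list at some position. Applied to
# a countable list, this shows the set of all bitstrings is uncountable.
listed = [
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 1],
]
diagonal = [1 - listed[n][n] for n in range(len(listed))]
assert all(diagonal != row for row in listed)
```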
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
2. Even if we ignore the origin and assume that all of those sequences "just exist"... how do you know they are "random"?
Because there are uncountably many sequences of 0's and 1's, and only countably many Turing machines, Python programs, Haskell programs, or algorithms. That's the proof.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
In particular I would love to see your random(X) predicate which returns a Boolean for any given sequence X.
After all this, not a bad question. Given a bitstring, I'm actually not sure at the moment how one would decide whether it corresponds to some algorithm. But why does it matter? I've already shown several times that these sequences exist in great abundance.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Because from my limited knowledge, I am pretty damn sure randomness is not provable. In the usual, technical sense of "provable".
That would be news to Kolmogorov and Chaitin. And Turing. And everyone else who's taken the trouble to understand their work, or even glance at the Wiki pages describing it. I don't want to belabor the point; I've already explained it several times.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Probably because I don't know that.
Ok. Well I hope I've been able to shed some light. Turing, Kolmogorov, Chaitin. Uncountably many bitstrings, only countably many Turing machines. Hence "almost all" bitstrings have no generator, can't have their digits cranked out by a TM.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
There is literally no way for me know that any particular sequence is random.
This is true. We'd have to see infinitely many of its bits, and we can't do that. This is an epistemological problem, but not an ontological one. Even given an infinite sequence I'm not sure how to determine whether it's computable or not.
There are analogous problems in math. For example we don't know whether pi + e (where e = 2.71etc) is rational or not. An amazing thing not to be able to know, but it's an open problem.
So I agree with you that given an infinite bitstring, we can't look at it and know whether it's computable or not. But that does not matter! Why do you think it does?
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Not without understanding how the sequence was produced.
Well if it's produced by an algorithm, then it's not random!
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Lest I (eventually) find a computable function that happens to generate that particular sequence.
I agree, but it doesn't matter. A sequence is random or not independently of our ability to know that fact. Just like pi + e is rational or not even though it's an impossibly difficult problem to solve.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
But, of course, there is one particular function that can generate ANY particular sequence it has already seen!
What?
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
You have infinite memory, right? Variable assignment (data copying) is free, right?
Of course, made several trips to Best Buy. In any event this seems to be the identity function. I am not following your point.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
I am not saying that at all. I am saying ALL of Mathematics is circular/recursive.
But circular and recursive are not the same thing at all. What do you mean?
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
I explained my point as precisely as possible.
"0, S0, SS0" is not circular.
"0, S, SS0, ..." is circular.
"n, n+1, n+2 , ..." is circular.
Well, there are philosophers who consider the Peano axioms circular, in the sense that if I ask for the Peano representation of 11 trillion, one would have to write SSSSS...SSSS0 - but how do I know how many S's to write? So there's perhaps a valid point to be made, or at least some finitist or ultra-finitist philosophers would so argue.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Because infinite induction is the same thing as non-halting recursion. Computer scientists call them
generators.
Generators are by definition algorithms, and there aren't enough algorithms to generate all the possible bitstrings. Do you take this point?
Productions (for example in formal language or automata theory) are algorithms. There are only countably many of them. There are uncountably many binary sequences or bitstrings.
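The counting point can be made concrete (my own sketch): a generator is a finite piece of program text, and there are only countably many finite texts over a finite alphabet.

```python
# A generator is a finite program text that produces successive values.
# Since there are only countably many finite texts, only countably many
# sequences can have a generator at all.
def naturals():
    n = 0
    while True:
        yield n
        n += 1

gen = naturals()
assert [next(gen) for _ in range(5)] == [0, 1, 2, 3, 4]
```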
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Indeed, I know what you mean. Too bad you can't define it without falling into circularity.
The dots are never part of the formal descriptions, either in set theory or the Peano axioms.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
wtf wrote: ↑Wed Mar 30, 2022 2:57 am
Now the order 0, 1, 3, 2, 4, ... is a different order. But if you think about it, it's order-isomorphic to the usual order 0, 1, 2, 3, 4, ... That is, if we simply call 2 by the name '3', and 3 by the name '2', it's exactly the same order. So it's a different order, but the same order type.
Well. Why can't we say the exact same thing about ALL numbers then?
There is only one order type equivalent to ω, even though there are many distinct representations.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Any order is order-isomorphic to any random shuffling of itself. Is just permutations/combinatorics!
No! I already gave you the counterexample. 0, 1, 2, 3, ... is not the same order type as 1, 2, 3, 4, ..., 0. There is no order-isomorphism between those two ordered sets, even though there is a bijection aka permutation.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
Any and all permutations of representations of ℕ, ℤ, ℝ, ℂ will be order-isomorphic if you remember the new names of everything.
No, I just showed the counterexample. Besides, there is no order relation on the complex numbers that respects the algebraic operations.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
An infinity with a largest AND a smallest element makes no sense to me. And your "..." really is not helping.
You should study some order theory and the ordinal numbers. But here is a definition of the order type 0, 1, 2, 4, 5, 6, ..., 3, which we'll call <' for "funny less than."
* If neither n nor m is 3, then n <' m iff n < m.
* If m = 3 and n isn't 3, then n <' m.
[See below for a Python example that makes this more clear.]
You could program this and it would compute <' perfectly. Everything is funny-less-than 3 (except 3 itself), and otherwise everything is as it usually is. This simply defines an alternate order relation on the natural numbers. And in particular, it's NOT the same order type as the usual order, even though it's in bijection with the usual order.
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
If your grammar allows you to say 7, ..., 3 then surely you also say 0,...,1? Then everything is as it usually is, except 0 is the smallest element and 1 is the largest element.
You lost me here. And perhaps I lost you.
0, 1, 2, 3, 4, 5, 6, 8, 9, ..., 7 is just a shorthand for the funny order that says everything is the same as it usually is, except that everything other than 7 is funny-less-than 7. You could easily code that up in Python, right? No problem at all.
Code: Select all
def funny_less_than(n, m):
    if n == 7:
        return False    # 7 is the largest element; nothing exceeds it
    if m == 7:
        return True     # everything other than 7 is funny-less-than 7
    return n < m        # otherwise the usual order
Is that more clear? Now this imposes an alternate order on the natural numbers with the property that there's a smallest element, namely 0, which is smaller than everything except itself; and a largest element, namely 7, which is larger than everything except itself. So we have permuted the natural numbers and not only changed the order, but also changed the order type.
Conclusion: All non-identity permutations change the order. But SOME of them change the order type, and others don't. For example 0, 1, 3, 2, 4, 5, 6, ... changes the order, but the order type is the same. There's an order-preserving bijection between this and the usual order.
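Here's a small Python sketch of my own making that point explicit: the renaming that swaps 2 and 3 carries the usual order to the order 0, 1, 3, 2, 4, ... and back, so the order type is unchanged.

```python
# The renaming that swaps 2 and 3 turns the usual order into the order
# 0, 1, 3, 2, 4, 5, ...: by construction a <' b iff rename(a) < rename(b),
# so rename is an order isomorphism and the order type is the same.
def rename(n):
    return {2: 3, 3: 2}.get(n, n)

def funny_lt(a, b):
    return rename(a) < rename(b)

assert funny_lt(3, 2) and not funny_lt(2, 3)   # 3 now comes before 2
assert funny_lt(1, 3) and funny_lt(2, 4)       # everything else as usual
```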
Skepdick wrote: ↑Wed Mar 30, 2022 10:40 am
And since 1 is the successor to 0, ω+1 is the successor to ω then how about this ordering: ω, ..., ω + 1 ?
That pesky "..."
Well, that's not how ordinals work. ω + 1 is the immediate successor of ω; that's how it's defined. There's nothing between them.
And, in case you are concerned that I am using the symbol +, which suggests that arithmetic is prior, this too can be fixed. ω + 1 is actually defined as the set ω ∪ {ω}, the set-theoretic successor of ω. Without going into detail, you can take my word for it that the set theory is logically prior to the arithmetic, without any logical circularity.