Language Negation and Randomness


Justintruth
Posts: 187
Joined: Sun Aug 21, 2016 4:10 pm

Language Negation and Randomness

Post by Justintruth »

I keep running into phrases like this in the study of probability:

"Clearly nothing can be said about (some set)...."

...implying it is random.

I wonder if randomness can be defined somehow as a set expressing an arrangement about which nothing can be said.

So "these numbers are consecutive" therefore they are not random. Or "these numbers are the digits obtained when one calculates pi in decimal" therefore they are not random.

So then saying some set is random means you can't say something about it?

Saying a set is random says you can't say anything about it other than it is random?

This seems to transcend physical/epistemological interpretations of probability or works with both.

The reason 7 has a higher probability of occurring than 2 when throwing two dice is that you can say of 1+6, 2+5, and 3+4 that they are all 7? But even if you say they are 6, they are still 7, so you have to be saying the truth? So forming a set can be done by stating a property?

Property -> expression -> set.

3+4=7 is true whether someone says so or not, so it's not language in one sense. Nor is it necessarily meaning. The set {(1,1), (1,6), (5,6)} has the same probability as rolling a 7, yet what can be said about it?

So why does saying "you can't say anything about it" imply it is random or say anything at all about its likelihood?

Being itself can be seen as the most random because about what it can be one could say nothing. But we can say a lot about what it is now and has been. Does the very fact of language mean that Being is non random? Would a situation about which nothing could be said be random? Does that mean our universe is not random? Is what can be said about the universe what Being says? If nature is what Being says and about it we can speak / repeat why does it seem that the universe is random?

In general, what is the relation between being, nature, language, and what we mean by random?

Hasn't Being created a being saying something about possible beings that one can say nothing about?

Very confusing.
wtf
Posts: 1178
Joined: Tue Sep 08, 2015 11:36 pm

Re: Language Negation and Randomness

Post by wtf »

https://en.wikipedia.org/wiki/Kolmogorov_complexity

The idea is that we can measure the randomness of a collection of numbers by seeing how much we can compress it. For example, if we have the infinite bitstring 1010101010101010101010... we can describe it as "alternating 1s and 0s, starting with 1." So even though the length of the string is infinite, it's not very random.

Likewise even irrational numbers like pi or sqrt(2) are not random. Their digits are generated by simple algorithms.

On the other hand a truly random bitstring is one that can't be compressed at all. There is no rule, algorithm, description, procedure, etc. that can generate its digits.
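
To make the contrast concrete, here's a minimal Python sketch (illustrative only; true Kolmogorov complexity is uncomputable, and the function names are just made up for this example). A few lines of code describe the alternating string at any length, while a coin-flip string is expected to have no description shorter than itself:

    import random

    def alternating_bits(n):
        # A short "program" that reproduces the first n bits of 101010...
        return "".join("1" if i % 2 == 0 else "0" for i in range(n))

    def coin_flip_bits(n):
        # n bits from (pseudo)random flips; no short rule is expected to regenerate them
        return "".join(random.choice("01") for _ in range(n))

    print(alternating_bits(20))  # 10101010101010101010
    print(coin_flip_bits(20))    # e.g. 01101001110010100011 (different every run)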

If you're studying probability, that's the answer to your question. If this is about the philosophy of natural language, that's a different question but probably not the one they mean in probability class.
Hobbes' Choice
Posts: 8364
Joined: Fri Oct 25, 2013 11:45 am

Re: Language Negation and Randomness

Post by Hobbes' Choice »

Justintruth wrote:I keep running into phrases like this in the study of probability:

"Clearly nothing can be said about (some set)...."

...implying it is random.

I wonder if randomness can be defined somehow as a set expressing an arrangement about which nothing can be said.
There is always something to be said about anything that is definable. Of a set of random numbers, there is at least "they are random" to be said of them. The other thing you can say about them, at least, is that they are NOT random, since some method has been used to select them. Not even a computer can make a set of truly random numbers; it relies on a long list of numbers.

Who is saying that stuff?

Show me a set of numbers about which nothing can be said please!
Justintruth
Posts: 187
Joined: Sun Aug 21, 2016 4:10 pm

Re: Language Negation and Randomness

Post by Justintruth »

Hobbes' Choice wrote:
Show me a set of numbers about which nothing can be said please!
Not sure I can, but as I said, I keep running into the phrase being used in ways that seem to imply that being random has something to do with not being able to say something about something. Here is the example that triggered my post.

https://projecteuclid.org/euclid.ss/1177011444

He says: "There is clearly essentially no meaningful statements about the relationship between x_2000 and x_0, even though x_2000 is a well-defined polynomial function (of admittedly high order) ..."
Justintruth
Posts: 187
Joined: Sun Aug 21, 2016 4:10 pm

Re: Language Negation and Randomness

Post by Justintruth »

Wow, thanks. I had heard of this a while back but totally forgot about it.

So... entropy, chaos, and Kolmogorov complexity... hmm. Now you have really deepened the problem. So THAT'S why they call them programming languages!

You have really helped me with this. Now if I can just see fundamentally how a Turing machine compresses data...

I guess one way of describing random data is that it is incompressible.

I taught technical material in Japan for a while and was amazed to find that I could watch a conversation in Japanese and know what they were asking before the translator spoke ...and I didn't know Japanese. After watching a while I realized I already knew what they would ask before they said it, based on the material. So the Shannon entropy of a source, or the information in a signal, depends on what you already know: if you already know what I am about to say, there is 0 Shannon information in what I say.
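
As a rough illustration of that zero-information point, here is a hypothetical Python sketch (nothing from Shannon's actual papers; the function name is made up):

    import math
    from collections import Counter

    def entropy_bits_per_symbol(message):
        # Empirical Shannon entropy of a string, in bits per symbol,
        # using only the observed symbol frequencies.
        counts = Counter(message)
        n = len(message)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    print(entropy_bits_per_symbol("aaaaaaaa"))  # 0.0 -- the next symbol is already known
    print(entropy_bits_per_symbol("abababab"))  # 1.0 bit/symbol under this frequency model
    print(entropy_bits_per_symbol("abcdefgh"))  # 3.0 -- eight equally likely symbols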

Now, for me to know maybe requires me to be able to say what I know, and the Kolmogorov complexity measures how succinctly I can say it in a programming language.

But because of your post you can see it does not require semantics - it's purely a function of the Shannon information of the Turing machine program that reproduces the set.

So now I need a better understanding of the subclass of Turing machine programs that compress data and the requirements for storing it. Wow.

Non-computability... hmmm

wtf wrote: If you're studying probability, that's the answer to your question. If this is about the philosophy of natural language, that's a different question but probably not the one they mean in probability class.
Actually I started trying to understand lift in an airplane wing. That led me to heat and entropy, and the Boltzmann distribution. Under that is this notion of distinguishability, which underlies the differences between the types of physical statistics: Bose-Einstein, Fermi-Dirac, and Boltzmann. Distinguishability led me to ideas about language and also to issues dealing with distinguishing two identical things... sort of a contradiction, but it underlies Medieval notions of matter as well as some aspects of modern physics.

But language gets tied up with it. I wonder if Kolmogorov complexity implies semantics in the syntax of the program. So if I take Searle's Chinese room and give it 100 digits of pi and ask for the next digit, all in Chinese, does it mean that it knows something about pi? Probably not. But then randomness is a purely syntactical property!

Wow. You certainly have helped me. Now if I can just figure out how.
Impenitent
Posts: 4329
Joined: Wed Feb 10, 2010 2:04 pm

Re: Language Negation and Randomness

Post by Impenitent »

Hobbes' Choice wrote: Show me a set of numbers about which nothing can be said please!
zero

-Imp
wtf
Posts: 1178
Joined: Tue Sep 08, 2015 11:36 pm

Re: Language Negation and Randomness

Post by wtf »

Justintruth wrote: Wow. You certainly have helped me. Now if I can just figure out how.
LOL. I'm glad I was able to help.

Of course "random" means a lot of different things in different contexts. If you are trying to create some kind of unified theory of Medieval philosophy, the lift of airplane wings, thermodynamics, and probability theory, that sounds like either a lot of work or ... pardon my suspecting this ... too much Wiki surfing. We're all a victim of that these days. You want to look something up, and you start reading and clicking, and you start to imagine you understand things.

Your example of pi was interesting. Suppose I give you, or a computer, or a mathematician, the first 100 digits of pi. They have no idea what's next unless you tell them it's the decimal representation of pi. It could be the decimal representation of some other number that shares pi's first hundred digits.

But if you program a computer to evaluate the famous Leibniz series pi/4 = 1 - 1/3 + 1/5 - 1/7 ..., it's clear that the computer doesn't "know" what it's doing any more than my laundry dryer "knows" when my clothes are dry, even though it has a moisture sensor that beeps when my clothes are dry. It's just a machine executing according to an algorithm. That's Searle's point, with which I agree. Others believe otherwise. I'm not mentioning this to argue that point today. I'm just pointing out that the digits of pi aren't random, they're deterministic and compressible to a short algorithm. https://en.wikipedia.org/wiki/Leibniz_f ... for_%CF%80
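
For what it's worth, here's a minimal sketch of that series in Python (just an illustration; the function name is mine):

    def leibniz_pi(terms):
        # Sum the first `terms` terms of pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
        total = 0.0
        for k in range(terms):
            total += (-1) ** k / (2 * k + 1)
        return 4 * total

    print(leibniz_pi(1_000_000))  # roughly 3.141592..., converging slowly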

What does an uncomputable real number look like? Consider (as a thought experiment, we can't do this in real life) a perfectly fair coin that you flip infinitely many times, one flip for each natural number 1, 2, 3, ...

If we write 1 for heads and 0 for tails and put a decimal point in front, we get the binary representation of some real number in the unit interval. Such a number is highly unlikely to be generated by an algorithm. In fact we say in probability theory that the probability that a randomly chosen real number is computable is 0; and the probability that it's not computable is 1. That's a provable mathematical fact.
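
You can't simulate an infinite sequence of flips, of course, but a finite prefix shows the construction (a sketch, with a hypothetical helper name):

    import random

    def coin_flip_real(n_bits):
        # Interpret fair coin flips b_1, b_2, ... as binary digits of a
        # number in [0, 1]: x = sum of b_k / 2^k over the first n_bits flips.
        x = 0.0
        for k in range(1, n_bits + 1):
            if random.random() < 0.5:  # heads -> digit 1
                x += 2.0 ** -k
        return x

    print(coin_flip_real(53))  # e.g. 0.37165...; different every run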

It's interesting that the title of Turing's famous paper, in which he introduced the notion of Turing machines, is "On Computable Numbers, with an Application to the Entscheidungsproblem." We needn't worry about the Entscheidungsproblem here; it's the question of whether there's an automated procedure that can decide whether an arbitrary statement of first-order logic is provable.

The point I wish to make is that Turing was explicitly considering the nature of computable numbers. Turing was all over this stuff.

Well that's just some rambling ... some "random" rambling, in the natural language sense of the word.
Justintruth
Posts: 187
Joined: Sun Aug 21, 2016 4:10 pm

Re: Language Negation and Randomness

Post by Justintruth »

wtf wrote: LOL. I'm glad I was able to help.
Thanks again.

I think what I am trying to understand is how counting occurs in probability theory.

I know that it is wrong to count the number of ways you can roll snake eyes with two dice as two - two because either the first dot could be on the first dice and the second dot on the second dice, or the first dice could have the second dot on it and the second dot could be on the first dice. I know that is wrong. But it seems that if we have 8 energy levels and an overall system energy of 8 and we have 6 particles then we count as 6 the number of ways to have 5 particles with zero energy and one with 8. That's six because any of the six particles could be the one with an energy of 8. Snapshots of each of those 6 ways are indistinguishable. But they are considered distinguishable. But we don't ask which energy is in each particle. Are there two possibilities, one in which this particle has this energy and that particle has that energy and another where that energy is in this particle and this energy is in that particle? Haecceity and energy seem to confuse me when counting. I'll get it eventually. When is it like two dice and when is it like two spots? Something about distinguishability?
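
For what it's worth, here is a brute-force sketch of the distinguishable-particle (Maxwell-Boltzmann) count described above; it only illustrates that one counting convention, not which convention physics should use:

    from itertools import product
    from collections import Counter

    PARTICLES = 6
    LEVELS = range(9)   # allowed energies 0, 1, ..., 8
    TOTAL = 8

    # The occupation pattern in question: five particles at energy 0, one at energy 8.
    target = Counter({0: 5, 8: 1})

    count = 0
    for assignment in product(LEVELS, repeat=PARTICLES):  # energy of each labeled particle
        if sum(assignment) == TOTAL and Counter(assignment) == target:
            count += 1

    print(count)  # 6 -- any of the six labeled particles can be the energetic one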

I am wondering what the relationship of the axiom of choice is to all of this. Here is a quote from the wonderful wiki on the axiom of choice: "In many cases such a selection can be made without invoking the axiom of choice; this is in particular the case if the number of bins is finite, or if a selection rule is available". A selection rule is available? Again there is the notion of the infinite or a function which I take as an example of something that can be said. And the notion of binning just like that used to establish microstates in thermodynamics.

I actually don't think I understand all of this. Just that I have an intuition that there is a way to relate it all together. There is something(s) at the base of it all I don't understand.

Consider a ring of completely homogeneous material - so homogeneous that its nature is continuous, there being "nothing that can be said" to distinguish any piece of it from another. Now is it rotating relative to any frame? If this material is not that material then it can rotate. But if not, and "this and that" are like the dots not the dice, then when viewed from any two frames rotating with respect to each other the ring is stationary. So two frames rotating with respect to each other both see the ring as stationary? How could you distinguish a motion? (I don't mean in the current physics)
Of course "random" means a lot of different things in different contexts. If you are trying to create some kind of unified theory of Medieval philosophy, the lift of airplane wings, thermodynamics, and probability theory, that sounds like either a lot of work or ... pardon my suspecting this ... too much Wiki surfing. We're all a victim of that these days. You want to look something up, and you start reading and clicking, and you start to imagine you understand things.
Yeah. I do use the wiki. Love it. But it's no problem because I am the limit, not the source. I would be very glad to understand just a fraction of the wiki I read.

The only idea I was looking at from Medieval philosophy is the notion of matter as being that which stays the same when something is changed from one thing completely into another identity. Not just accidental but essential change. That has something to do with haecceity, as it is "this" matter which is changed not "that", which seems to be important in determining how to count possibilities, which underlies probability theory, which is the basis of statistical mechanics and hence thermodynamics and aerodynamics. But it's all the same thing my mind is trying to get its head around. How do we establish the spectrums when we calculate probabilities?

It always struck me that Searle's Chinese room was first deployed against machines that run on language. They needed the Rosetta stone because they were dealing with symbols. So does the Chinese room. But if you connect the room to sensors you can easily make a machine that records facts about the outside world. What is passed into the room is no longer symbolic. Now this does not mean that the room understands. That may happen but if it does...well...pixie dust... I get that. Still it means that you can encode semantics. You can show a machine something, not just tell it something. For example, imagine a 10-picket fence with the second, fourth, fifth, and eighth pickets missing. You can use ASCII to encode what I said but you could also go 1010011011. You wouldn't know that it was a picket fence but if I showed different fences with different pickets you could just compare the binary to the pickets to select the right image but could not use the ASCII code unless you knew English. Again it has nothing to do with conscious understanding. But there seems to be a second distinction between statements in a language where the symbols are more like pictograms and less like symbols. In a non-understanding machine the data recorded could be purely symbolic or a recording that has some intentionality to it - matches what it is recording in some way. Something that is like a picture of the house and the house. Something the picture says about the house that you don't need a dictionary for, because what is said is in the picture itself. I think that the point of the Chinese room kind of got expanded from the original use. A Turing machine can process both the binary and the ASCII. I do think it is possible that computers may, and even probably will, eventually actually understand. At that point they will not just be computers. If they do it will have nothing to do with the distinction I am making. It will be a feature of being in my opinion completely un-reducible. As for consciousness, I think you can create it independent of any intentionality with respect to the brain and what is external to it.

Here is a link you probably already know about:

http://www.drdobbs.com/architecture-and ... /240049914

Interesting also that you can compute without creating heat as long as you don't erase.

That was a great point about algorithmic complexity. Don't understand it, but it's got me thinking a whole different way. And that's why I'm in it.
Hobbes' Choice
Posts: 8364
Joined: Fri Oct 25, 2013 11:45 am

Re: Language Negation and Randomness

Post by Hobbes' Choice »

Justintruth wrote:
Hobbes' Choice wrote:
Show me a set of numbers about which nothing can be said please!
Not sure I can, but as I said, I keep running into the phrase being used in ways that seem to imply that being random has something to do with not being able to say something about something. Here is the example that triggered my post.

https://projecteuclid.org/euclid.ss/1177011444

He says: "There is clearly essentially no meaningful statements about the relationship between x_2000 and x_0, even though x_2000 is a well-defined polynomial function (of admittedly high order) ..."
Self-refuting statement.
Hobbes' Choice
Posts: 8364
Joined: Fri Oct 25, 2013 11:45 am

Re: Language Negation and Randomness

Post by Hobbes' Choice »

Impenitent wrote:
Hobbes' Choice wrote: Show me a set of numbers about which nothing can be said please!
zero

-Imp
Nice try. Zero is not a set, neither is it a number.
wtf
Posts: 1178
Joined: Tue Sep 08, 2015 11:36 pm

Re: Language Negation and Randomness

Post by wtf »

Justintruth, You are all over the map. I find your individual paragraphs interesting and thought provoking but perhaps you could narrow down your focus to one or two things and start separate threads for the others. If I occasionally harp on that point it's not personal, it's just that I find your posts so interesting that I'd like to respond, but so unfocussed that I find it hard to respond.
Justintruth wrote:
I think what I am trying to understand is how counting occurs in probability theory.
That's well understood. You could study finite probability theory and combinatorics. How many poker hands can be dealt and so forth. I don't know much about that.
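
For a flavor of that finite counting, one line of Python does the classic example (illustrative only):

    import math

    # Number of distinct 5-card hands from a 52-card deck: "52 choose 5".
    print(math.comb(52, 5))  # 2598960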

On the other hand if you are interested in infinitary probability theory, I do know a little about the underlying mathematical theory, which is called measure theory. Is that something you wanted to talk about? https://en.wikipedia.org/wiki/Measure_(mathematics) [the forum software messed up the link, you need to include the final closing paren].

I confess I don't see much relevance to probability theory of the rest of your post, which is a bit unfocussed.
Justintruth wrote: I know that it is wrong to count the number of ways you can roll snake eyes with two dice as two - two because either the first dot could be on the first dice and the second dot on the second dice, or the first dice could have the second dot on it and the second dot could be on the first dice. I know that is wrong.
Do you? That's a bit of a disingenuous remark. There is one way to roll snake eyes, that's with a 1-spot showing on each die (the singular of dice). Switching the spots wouldn't change that fact. Why are you confusing yourself about this?
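
If it helps, here is a tiny enumeration of the 36 equally likely ordered outcomes of two dice, which is all the counting involved (a sketch, nothing deep):

    from itertools import product
    from collections import Counter

    outcomes = list(product(range(1, 7), repeat=2))  # 36 ordered pairs (first die, second die)
    totals = Counter(a + b for a, b in outcomes)

    print(len(outcomes))           # 36
    print(outcomes.count((1, 1)))  # 1 -- exactly one way to roll snake eyes
    print(totals[7], totals[2])    # 6 1 -- why a total of 7 beats a total of 2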
Justintruth wrote: But it seems that if we have 8 energy levels and an overall system energy of 8 and we have 6 particles then we count as 6 the number of ways to have 5 particles with zero energy and one with 8. That's six because any of the six particles could be the one with an energy of 8. Snapshots of each of those 6 ways are indistinguishable. But they are considered distinguishable. But we don't ask which energy is in each particle. Are there two possibilities, one in which this particle has this energy and that particle has that energy and another where that energy is in this particle and this energy is in that particle? Haecceity and energy seem to confuse me when counting. I'll get it eventually. When is it like two dice and when is it like two spots? Something about distinguishability?
I'm afraid I don't know much about energy levels in physics. But if you are interested, why not study physics? I think you're confusing yourself again on some point I can't grasp.
Justintruth wrote: I am wondering what the relationship of the axiom of choice is to all of this.
AC is a principle in modern axiomatic set theory. It says that you can choose an element from each of a collection of nonempty sets, even if you don't have any specific rule to make the choice.

I think it was Bertrand Russell who explained that if you have infinitely many pairs of shoes, you can pick out all the left shoes. That's a rule, no AC needed. But if you have infinitely many pairs of socks, you need AC to choose one sock from each pair, because there's no rule to distinguish one sock from another.
Justintruth wrote: Here is a quote from the wonderful wiki on the axiom of choice: "In many cases such a selection can be made without invoking the axiom of choice; this is in particular the case if the number of bins is finite, or if a selection rule is available". A selection rule is available?
Like the shoes and socks. I can give you some mathematical examples if you are interested. This is a subject I do know something about. But again, I'm not sure you really care. AC is quite a bit of a red herring in this discussion. Although it does feature prominently in the foundation of measure theory. You can't get measure theory, or modern probability theory, off the ground without AC.
Justintruth wrote: Again there is the notion of the infinite or a function which I take as an example of something that can be said.
Yes. And in mathematics we have many examples of objects that exist about which nothing can be said beyond the fact that they exist. Happy to provide examples if you are interested.
Justintruth wrote:And the notion of binning just like that used to establish microstates in thermodynamics.
I doubt that AC has much to do with the physical universe. For one thing, there's no evidence that the physical universe obeys the axioms of set theory. And if so, which axioms? There are many alternative axiomatic systems. You can't do a physical experiment to determine the truth of AC.
Justintruth wrote: I actually don't think I understand all of this.
Perhaps if you picked one or two things and tried to understand them in greater depth? Probability theory, or 14th century philosophy, or thermodynamics. Divide and conquer. Reductionism, the great method of western civilization. Perhaps it's all wrong, but it's been effective for a long time. Then again many people no longer believe in western civilization. Or as Gandhi said when asked what he thought of western civilization, "I think it would be a good idea."

And note that I have no idea if Gandhi actually said that. I know that Ben Kingsley said that in the movie Gandhi. It's like Wikipedia. In the modern age we think we know everything but we know nothing at all. We're all postmodernists now.
Justintruth wrote:Just that I have an intuition that there is a way to relate it all together. There is something(s) at the base of it all I don't understand.
That's late night dorm room talk. To actually understand something you have to focus on that one thing. Most people spend a lifetime trying to get good at one single thing. I agree with you that that's frustrating.
Justintruth wrote: Consider a ring of completely homogeneous material - so homogeneous that its nature is continuous, there being "nothing that can be said" to distinguish any piece of it from another.
Like a perfect toroidal ring of dough, boiled then baked, and sprinkled with a perfectly homogeneous array of poppy seeds. God's bagel.
Justintruth wrote: Now is it rotating relative to any frame? If this material is not that material then it can rotate. But if not, and "this and that" are like the dots not the dice, then when viewed from any two frames rotating with respect to each other the ring is stationary. So two frames rotating with respect to each other both see the ring as stationary? How could you distinguish a motion?
Well I very much disagree with you here. Consider the plain old Cartesian 2-plane, the x-y coordinate system of high school math and freshman calculus. Every point looks exactly like every other point.

We can arbitrarily choose a point and call it the origin. Then we can rotate the plane about the origin through an angle of, say, 90 degrees. That brings the point (1,0) to the point (0,1); and the point (0,1) to the point (-1, 0). We can study trigonometry, complex numbers, linear algebra, and group theory to understand the rotations in the plane. It's a perfectly well understood theory.
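
If it helps to see it mechanically, here's a minimal sketch of that 90-degree rotation (the rounding is just to suppress floating-point noise):

    import math

    def rotate(point, angle_rad):
        # Rotate a point about the origin by angle_rad using the standard rotation matrix.
        x, y = point
        c, s = math.cos(angle_rad), math.sin(angle_rad)
        return (round(c * x - s * y, 10), round(s * x + c * y, 10))

    print(rotate((1, 0), math.pi / 2))  # (0.0, 1.0)
    print(rotate((0, 1), math.pi / 2))  # (-1.0, 0.0)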

When you consider a circle about the origin, it looks exactly the same after the rotation as it did before, even though all the points have moved. This does not confuse anyone. We have still rotated the plane. If you're standing at (1,0) before the rotation, you'll be standing at (0,1) afterwards.

Why do you think this is confusing? You walk from your house to the grocery store. Suppose you removed all the terrain so that you are walking on a (locally) flat plane. Even though when you get to your destination it looks the same as where you started, your legs are tired and you have moved. Most of physics is based on the mathematical understanding of rotations and translations and motion through a featureless space. Why are you pretending to be confused about this? I honestly don't get it. Maybe I'm missing your point entirely.
Justintruth wrote: (I don't mean in the current physics)
Ah ... ok. What physics then? Are you interested in the history of physics, starting with the idea of earth, wind, water, and fire and progressing to the phlogiston theory of heat?

Every time you say anything you radically change direction in the next paragraph.

Justintruth wrote: Yeah. I do use the wiki. Love it. But it's no problem because I am the limit, not the source. I would be very glad to understand just a fraction of the wiki I read.
Perhaps just pick one or two things that interest you and study them in detail. I agree with you that the Internet gives us so many distractions it's hard to focus on anything at all.

Justintruth wrote: The only idea I was looking at from Medieval philosophy is the notion of matter as being that which stays the same when something is changed from one thing completely into another identity.
I don't think I can parse that. And if you are interested in medieval philosophy that's all well and good, but what does that have to do with anything? Phlogiston theory of heat again. We know more than we did 500 years ago.

Justintruth wrote: Not just accidental but essential change. That has something to do with haecceity
Today I learned. "Thisness." Whatever, man. You are dangerously close to word salad. Everything you write is interesting in pieces, but there is no whole to it. But thanks for the word. I love learning new words.

https://en.wikipedia.org/wiki/Haecceity
Justintruth wrote: as it is "this" matter which is changed not "that", which seems to be important in determining how to count possibilities, which underlies probability theory, which is the basis of statistical mechanics and hence thermodynamics and aerodynamics.
Now that is word salad. No meaning there at all. Fourier studied heat dynamics and gave us Fourier series which underlie the modern theory of digital communications. Reductionism. Learning one thing well is the key to learning other things well. It's pretty clear that if you apply heat to one end of an iron bar, eventually the other end will get warm. That's thermodynamics. It's not fourteenth century philosophy.

Justintruth wrote: But it's all the same thing my mind is trying to get its head around. How do we establish the spectrums when we calculate probabilities?
Are you interested in philosophy or probability theory? To want to know everything at once without learning anything in particular is to know nothing at all. The curse of Wikipedia.

I don't mean to pile on you personally like this. I only mention it because I have the same problem.
Justintruth wrote: It always struck me that Searle's Chinese room was first deployed against machines that run on language. They needed the Rosetta stone because they were dealing with symbols. So does the Chinese room. But if you connect the room to sensors you can easily make a machine that records facts about the outside world.
I'm going to skip this because people argue endlessly about the Chinese room and perhaps you should make another thread for it. We're all over the map already.

Justintruth wrote: I do think it is possible that computers may, and even probably will, eventually actually understand. At that point they will not just be computers. If they do it will have nothing to do with the distinction I am making. It will be a feature of being in my opinion completely un-reducible.
Fascinating topic. Please start another thread in the Philosophy of Mind. I could personally go on all day about this but let's not do that here.

I will only mention that if consciousness is an emergent property, then it cannot possibly be computational. That's because of universality. A computation computes exactly the same function regardless of what medium it's implemented in. An algorithm running on a supercomputer does exactly the same thing as when it's run with pencil and paper. Any property that's emergent is not a computational property, because all the computational properties are already present in the pencil and paper implementation. All the supercomputer does is make it run faster. But it still does exactly the same thing.
Justintruth wrote: As for consciousness, I think you can create it independent of any intentionality with respect to the brain and what is external to it.
Please, start a thread for this. It has nothing to do with thermodynamics, medieval philosophy, the Axiom of Choice, measure and probability theory, or "thingness." You are hijacking your own discussion.
Justintruth wrote: Here is a link you probably already know about:
Thanks for the link.
Justintruth wrote: Interesting also that you can compute without creating heat as long as you don't erase.
I didn't read the article. I don't know what that means. You can't compute without an input of energy. That energy needs to go somewhere. Heat is the usual output. I don't believe that statement. Nor do I know what it means to compute without erasing. Erasing is one of the basic operations of a Turing machine.
Justintruth wrote: That was a great point about algorithmic complexity. Don't understand it, but it's got me thinking a whole different way. And that's why I'm in it.
Do you understand the difference between a bitstring generated by random coin flips and a bitstring generated by the algorithm for the digits of pi? If not please ask, it's the heart of the point I was originally trying to make.
wtf
Posts: 1178
Joined: Tue Sep 08, 2015 11:36 pm

Re: Language Negation and Randomness

Post by wtf »

Hobbes' Choice wrote: Nice try. Zero is not a set, neither is it a number.
In math zero is both a set and a number. It's the additive identity of the integers. It's the origin on the real number line. It's the cardinality of the set of flying pink elephants. It's modeled formally in set theory as the empty set.
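
For the curious, the standard von Neumann coding makes this concrete: each natural number is the set of the numbers below it, starting from the empty set. A tiny sketch (frozensets stand in for sets of sets; illustrative only):

    zero = frozenset()            # 0 is coded as {}
    one = frozenset({zero})       # 1 = {0}
    two = frozenset({zero, one})  # 2 = {0, 1}

    print(len(zero), len(one), len(two))  # 0 1 2 -- each number equals its own cardinality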

I agree with you that hundreds of years ago people were confused on this point. But today we're not confused anymore. Just like the phlogiston theory of heat. There are things people used to be confused about that we understand better today.
Hobbes' Choice
Posts: 8364
Joined: Fri Oct 25, 2013 11:45 am

Re: Language Negation and Randomness

Post by Hobbes' Choice »

wtf wrote:
Hobbes' Choice wrote: Nice try. Zero is not a set, neither is it a number.
In math zero is both a set and a number. It's the additive identity of the integers. It's the origin on the real number line. It's the cardinality of the set of flying pink elephants. It's modeled formally in set theory as the empty set.

I agree with you that hundreds of years ago people were confused on this point. But today we're not confused anymore. Just like the phlogiston theory of heat. There are things people used to be confused about that we understand better today.
If zero is a set, then what does it contain?

Besides that, everything you say about it refutes the original point, since you are attempting to say "something meaningful about it."

As you will recall, the question was: "Show me a set of numbers about which nothing can be said please!"

You fell into a nice elephant trap.

Case closed.
wtf
Posts: 1178
Joined: Tue Sep 08, 2015 11:36 pm

Re: Language Negation and Randomness

Post by wtf »

Hobbes' Choice wrote: If zero is a set, then what does it contain?
Note that I explicitly did NOT say that zero is a set. Zero is not a set, if we are structuralist philosophers. But in set theory, zero is modeled as the empty set. That's what I said so it would be fair if you quoted me accurately. Since zero is modeled as the empty set, it doesn't contain anything.
Hobbes' Choice wrote: Besides that, everything you say about it refutes the original point, since you are attempting to say "something meaningful about it."
The OP has introduced many interesting topics from medieval philosophy to thermodynamics and more, so I don't feel any personal obligation to stay on any particular topic.

Of course there are many interesting and specific things we can say about zero. This does not invalidate the fact that in math there are many objects that exist yet about which nothing specific can be said. Zero does not happen to fall into this category. I responded only to your claim that zero is not a number. Of course zero is a number.
Hobbes' Choice wrote: As you will recall the question was; "Show me a set of numbers about which nothing can be said please!"
But I already answered that. The set of noncomputable numbers is exactly a set of real numbers that individually have no distinguishing properties at all. They're random. The set of Turing machines is countable yet the set of bitstrings is uncountable. That's all you really need to establish that there are many real numbers that have no properties. Whether they can be said to "exist" is a question of philosophy. There's no question that they exist in standard mathematics. If there's a constructivist in the house, they'll dispute me on that point and I'm quite ready to dispute them back. In fact that's a very interesting discussion, the "existence," whatever that means, of the standard real numbers.
Hobbes' Choice wrote: You fell into a nice elephant trap.

Case closed.
Glad you feel better.
Hobbes' Choice
Posts: 8364
Joined: Fri Oct 25, 2013 11:45 am

Re: Language Negation and Randomness

Post by Hobbes' Choice »

wtf wrote:
Hobbes' Choice wrote: If zero is a set, then what does it contain?
Note that I explicitly did NOT say that zero is a set. Zero is not a set, if we are structuralist philosophers. But in set theory, zero is modeled as the empty set.
"Show me a set of numbers about which nothing can be said please!"

I suppose it's about clarity.