Is 0.9999... really the same as 1?
I've gone back and forth on this issue and am curious what the general public thinks about the answer given my reservations about it.
Firstly there's something called the Archimedean property, or Archimedean axiom, which states that there are no infinite or infinitesimal real numbers. In that sense there is no difference between 1 and 0.99999... If we take 1 - 0.99999... = 0.000...1, then since the 0's never end and there's no such thing as an infinitesimal real number, this is the same as 0.
Other proofs exist such as 1/3 + 2/3 = 3/3 or 0.33333... + 0.66666... = 0.99999... = 3/3 = 1
Let x = 0.9999999...
then 10x = 9.9999999...
Then subtract the x from 10x:
10x = 9.9999999...
x = 0.9999999...
=================
9x = 9
x = 1
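The subtraction argument above can be checked with exact rational arithmetic. Here's a minimal Python sketch (my own illustration, not part of the original argument): for the truncation with n nines, 10x - x leaves 9 minus a tail of 9/10^n, and that tail shrinks toward 0 as n grows.

```python
from fractions import Fraction

def x_n(n):
    """Exact value of 0.99...9 with n nines, as a rational number."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

# For any finite truncation, 10*x_n - x_n = 9*x_n = 9 - 9/10**n.
# The leftover 9/10**n shrinks toward 0 as n grows, which is why
# the infinite version gives exactly 9x = 9.
for n in (1, 5, 10):
    leftover = 9 - 9 * x_n(n)
    print(n, leftover)
```

The point of the exact fractions is that no floating-point rounding is involved: the leftover really is 9/10^n at every finite stage.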
However, I've come to notice the "=" sign is being used in a slightly different context than something like 1+1=2. It's behaving more like the equal sign in a limit.
Like the example lim (x->infinity) 1 - 10^(-x) = 1
We can see that f(x) = 1 - 10^(-x) approaches 1 here:
f(1) = 0.9
f(2) = 0.99
f(3) = 0.999
etc...
f(x) may approach 1 as x increases but still will never reach 1. But the "=" sign in this context means "approaches" and not "of the same value", or does it?
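What "the limit equals 1" actually asserts can be spelled out. A small Python sketch (my own illustration, assuming f(n) = 1 - 10^(-n) as above): for any tolerance eps > 0, however tiny, there is a cutoff N past which every f(n) is within eps of 1. That is the precise content of the "=" in the limit statement.

```python
from fractions import Fraction

def f(n):
    # The sequence 0.9, 0.99, 0.999, ... i.e. f(n) = 1 - 10**(-n)
    return 1 - Fraction(1, 10**n)

def cutoff(eps):
    """Smallest n such that |f(n) - 1| < eps (illustrative helper)."""
    n = 1
    while 1 - f(n) >= eps:
        n += 1
    return n

print(cutoff(Fraction(1, 1000)))    # 4: f(4) = 0.9999 is within 1/1000 of 1
print(cutoff(Fraction(1, 10**12)))  # 13
```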
Re: Is 0.9999... really the same as 1?
If you tell me exactly what you mean by each of the symbols in the expression .999... = 1 I'll tell you whether it's true or false. Some examples:
* If the symbols have their usual meaning as defined in standard math, then it's true. It's a theorem that can be derived step by step from first principles according to rules that could easily be programmed into a computer. .999... = 1 is a legal position in math the way a given configuration of pieces on a chessboard could be determined to be a legal position in a game of chess. A formal derivation from given rules.
* Likewise, and I'm getting ahead of something that's often mentioned in these discussions, .999... = 1 is also a theorem in the hyperreals, aka nonstandard analysis. In other words even in the most common system of math that includes infinitesimals, .999... = 1. It's very easy to prove this.
* On the other hand if you define ".999..." as 47, then since 47 isn't 1, the statement's false.
* Likewise if you define the symbols according to various personal philosophies or vague attempts to link the formal symbols to physical notions such as the Planck length and so forth, you can get yourself confused. But once you tell me exactly what the symbols mean, I'll tell you whether .999... = 1.
I hope this little summary can help to frame the topic. Again: If we interpret the symbols within standard math, then .999... = 1 and there's no question about it. And it really has no more to do with the true nature of spacetime than the way a knight moves in chess.
Now the mathematical reason is simply that the notation .999... stands for the geometric series 9/10 + 9/100 + ... which has the sum 1. The sum of an infinite series is defined in math as a certain limit, and the notion of limit has a particular technical definition, and we can drill all this right down to the definition of the real numbers within standard set theory.
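The geometric-series claim can be verified with exact fractions. A short Python sketch (my own, not from the post): the partial sums of 9/10 + 9/100 + ... match the geometric closed form a(1 - r^n)/(1 - r), which with a = 9/10 and r = 1/10 simplifies to 1 - 10^(-n).

```python
from fractions import Fraction

a, r = Fraction(9, 10), Fraction(1, 10)  # first term and common ratio

for n in (1, 3, 8):
    direct = sum(a * r**k for k in range(n))  # 9/10 + 9/100 + ... (n terms)
    closed = a * (1 - r**n) / (1 - r)         # geometric closed form
    assert direct == closed == 1 - Fraction(1, 10**n)
    print(n, float(direct))
```

Since |r| < 1, the r^n term vanishes in the limit and the sum is a/(1 - r) = (9/10)/(9/10) = 1 exactly.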
The reason the 3 x .333... proof is a little bit bogus is because it relies on the validity of multiplying an infinite series by a constant. It turns out to be true, but the proof is actually more sophisticated than the fact that .999... = 1. So the .333... story is more of a heuristic for the first time you see this, but it's a bit circular since the fact that .333... = 1/3 is every bit as mysterious as .999... = 1.
I hope I didn't get too far away from your question, I've seen too many of these .999... discussions and I'm a little jaded. Hope some of this is helpful.
marsh8472 wrote: But the "=" sign in this context means "approaches" and not "of the same value", or does it?
No, = means exactly equal. The reason is that we define the symbol .999... as a particular limit, and we define a limit to be an equality. In other words the limit of .9, .99, .999, .9999, ... is EXACTLY 1. It's part of the definition of a limit, which is that the sequence eventually gets as close as we want to 1. If that's true (which it is in this case) then the limit is exactly 1. It's because we define the limit to make this come out. Is that clear, or should I try to explain it better?
marsh8472 wrote: However, I've come to notice the "=" sign is being used in a slightly different context than something like 1+1=2. It's behaving more like the equal sign in a limit.
There is absolutely no difference in the two uses of =. Take 1 + 1 = 2 and lim(n->infinity) sum(k = 1 through n of 9/10^k) = 1: the two uses of = are identical. The = sign says that the two expressions on the left and right point to the same mathematical object.
Just as 1 + 1 and 2 are two different representations of the same abstract mathematical object, 1 and .999... are two different representations of the same mathematical object.
Re: Is 0.9999... really the same as 1?
0.999... would not equal one even if a supercomputer kept repeating the 9 until the end of the universe.
Re: Is 0.9999... really the same as 1?
Greta wrote: 0.999... would not equal one even if a supercomputer kept repeating the 9 until the end of the universe.
Then you are giving a nonstandard meaning to the notation. Shouldn't mathematical expressions be interpreted the way they are defined in math?
You are correct IF we interpret the symbols as representing a program in a physical computer. But an infinite series is not a FOR loop. A loop is a notation for a physical process. A mathematical infinite series is manipulated according to the formal rules of math. Two different things.
The age of the universe is irrelevant since this is a question of math and not physics.
Re: Is 0.9999... really the same as 1?
Okay, got it now. Thanks wtf. I don't think I can reverse my vote, so it must now stand as testimony to my mistake.
While at any given stop point in expanding the number there will be a difference between 0.999...9 and 1, the fact that it's infinite means that there is no stop.
Re: Is 0.9999... really the same as 1?
Greta wrote: While at any given stop point in expanding the number there will be a difference between 0.999...9 and 1, the fact that it's infinite means that there is no stop.
It's much more philosophical than that, I think. What is "infinite," really? Must we solve the riddle of the universe to know what this means?
No. Instead, in math we simply define the symbols and give formal definitions of the real numbers and limits in terms of the axioms of set theory; and then I can formally deduce .999... = 1.
In carrying out this purely formal exercise, we make no ontological claims whatsoever. We don't know what any of this means beyond its provability in a formal system. It's like asking if the knight really moves that way in the physical universe. The question is a category error. Chess is a formal game, it has no analog in the physical world.
I don't really know if the mathematics of the infinite bears on the universe even if the universe turned out to be infinite. What if the universe is infinite, but in a way that differs from the mathematical construction of the real numbers? There is no evidence that our mathematical theories of infinity describe the universe even in future infinitary physics, if such a thing ever comes to be.
Does that make sense? There's no "stop" at all. There's a very clever mathematical formalism that took over 200 years to work out from the time of Newton to the late 19th century that makes the rules of calculus entirely legitimate from a logical point of view. But there is no actual connection between the math and the real world. That part is still a mystery. There is no mapping between math and physics at this level.
Re: Is 0.9999... really the same as 1?
wtf wrote: It's like asking if the knight really moves that way in the physical universe. The question is a category error. Chess is a formal game, it has no analog in the physical world.
This clarifies it for me perfectly. Very helpful. Thanks again.
 Hobbes' Choice
Re: Is 0.9999... really the same as 1?
One is only a concept.
It's not real.
I think by 0.9999 you mean recurring to infinity?
This also does not exist.
But since it too is a concept it means tend to one, almost one, and has its uses like any other fiction.
Re: Is 0.9999... really the same as 1?
infinitesimal fractions...
infinitely halving the distance yet motion still occurs...
something lacking in the interpretation...
Imp
Re: Is 0.9999... really the same as 1?
wtf wrote: If you tell me exactly what you mean by each of the symbols in the expression .999... = 1 I'll tell you whether it's true or false.
It looks to me that something like 0.999... is shorthand for lim x->infinity of f(x) where f(x) = 1 - 10^(-x). Just like I wouldn't say 1/infinity = 0, because there is no such thing as a number called infinity in the set of real numbers. Instead the proper way to write it is lim x->infinity of f(x) where f(x) = 1/x. That's another way of asking "What does f(x) approach as x gets larger in f(x) = 1/x?" The answer is 0; however, since it will never reach 0, it's not the same question as asking what 1+1 equals. Just like if I were to derive the limit of 1/x, it would not be a formal proof that 1/infinity = 0, since using infinity as a variable like this is not a valid concept in formal math.
Re: Is 0.9999... really the same as 1?
0.999... is very similar to one but it's just a bit smaller.
Re: Is 0.9999... really the same as 1?
Harbal wrote: 0.999... is very similar to one but it's just a bit smaller.
The ellipsis means that 0.999... goes on indefinitely and is equivalent to 1 in that sense. Another way of expressing it is that the difference keeps getting smaller no matter how far you extend it.
PhilX
Re: Is 0.9999... really the same as 1?
An intuitive way of understanding why 0.99999999...... = 1 is that it is not possible to create a number between the two.. i.e. there is no way of having a number that is larger than 0.99999999.... and yet smaller than 1. So the two descriptions must be of the same number.
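That intuition can be sketched computationally. A small Python illustration (my own, not from the post): suppose some b with 0.999... < b < 1 existed. Then 1 - b would be a positive gap, but some finite truncation 0.99...9 already lands inside that gap, i.e. exceeds b, contradicting b being larger than 0.999...

```python
from fractions import Fraction

def truncation_beating(gap):
    """First partial sum 0.99...9 that lies within `gap` of 1."""
    n = 1
    while Fraction(1, 10**n) >= gap:
        n += 1
    return 1 - Fraction(1, 10**n)

# A hypothetical candidate "strictly between" 0.999... and 1:
b = Fraction(999_999, 1_000_000)
t = truncation_beating(1 - b)
print(t > b)  # True: a finite truncation already exceeds b,
              # so b cannot sit above every 0.99...9
```

Of course the code only checks one candidate; the real argument is that the same reasoning works for every positive gap, which is exactly the Archimedean property.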
Re: Is 0.9999... really the same as 1?
A_Seagull wrote: An intuitive way of understanding why 0.99999999... = 1 is that it is not possible to create a number between the two.
But should 0.99999... even be considered a real number? It's a series that has no end, so how can a value be obtained without an ending to the series? According to the definition, the real numbers are the set of all rational and irrational numbers. A rational number is one that can be written as a fraction. We cannot write 0.99999... as a fraction unless we want to just assume it equals 1, but this would be circular reasoning if we wanted to prove 0.9999... is a real number. If 0.999999... were irrational, this would contradict the idea that it equals 1, a rational number. If the Archimedean axiom states that there is no smallest number less than 1, would this not show that 0.99999... is not a number?
Re: Is 0.9999... really the same as 1?
marsh8472 wrote: But should 0.99999... even be considered a real number?
Those were once great questions. These exact problems bedevilled mathematicians for 200 years after Newton. Everyone knew that calculus worked but nobody knew how to make the idea of a limit logically rigorous. This is from 1687, say, the year of the publication of the Principia, to the 1880's, the era of Cauchy and Weierstrass and Cantor.
In fact it was during those two centuries that mathematicians were trying to advance and apply calculus but were increasingly concerned about the lack of a proper logical foundation. Nobody really knew what a limit was. People started paying attention to the problem.
What they eventually all did was finally make calculus legit. The way they did it was:
* To base everything on set theory and formal rules of logical derivation.
* To build the natural numbers out of the empty set and the rules of set theory, then to build the integers out of the naturals, the rationals out of the integers, and finally the reals out of the rationals.
* Now that we had a logically rigorous theory of the real numbers, we could finally, 200 years after Newton, provide a logically rigorous explanation of dy/dx and the limits of infinite series. We finally tamed Berkeley's "Ghosts of departed quantities." (George Berkeley, my favorite philosopher).
* As a byproduct of this work, infinitesimals were banished from math. It's true that there are logically consistent alternate models that contain infinitesimals, and they are of interest in their own right. But in standard math, there are no infinitesimals.
This entire intellectual project is known as the Arithmetization of Analysis. The Wiki article https://en.wikipedia.org/wiki/Arithmeti ... f_analysis is short but has some links. Analysis is the fancy word for calculus, and by arithmetization they mean basing the math of the infinite and the continuous, on the axioms of set theory and rules of deduction.
In short, we are finally able to base the continuous on the discrete.
The tl;dr is that you are right to ask those questions, but they all got solved around 1870-1930. Today when we say that .999... = 1 we can prove it directly from a set of axioms that everyone agrees are reasonable. But yeah, for two hundred years calculus was totally bogus. Today it's legit.