As a species, we tend to do things without question when they come from authoritative governing principles, much like in math, but I'll get to that point in a little bit. A lot of us get up and work 9 to 5 because that idea is universally accepted as part of society, never questioned. We read words from the dictionary and almost never question the origins of those words, or whether some of the words we occasionally or rarely use had any logical basis for existing in the first place.
The only real question is,
Why don't most people want to question these things themselves, or try to understand why they do them without question? I hope it isn't out of fear of thinking for themselves and drawing their own logical conclusions. The same can be said for mathematics, to a varying extent.
Most people on any topic automatically want to think they are right because they are afraid of possibly being wrong. I used to be that kind of person, but I am not anymore, and it has allowed me to keep an open mind and question everything in the world in which we reside. It has also allowed me to be more accepting of people's differing opinions, provided they are actually thinking. That's just a way of life.
But this all leads up to my main topic which is about math,
Math is mostly done without questioning why we accepted such things as true when there is no distinct evidence pointing in "that" direction, hence "axiomatically true." We accept that (a+b=c) is true, no questions or proof needed. But the problem isn't what we universally accept as true today; was this "obviously" true 5,000 years ago, before the invention of most languages? Or before the invention of numbers, notations, and letters, so to speak? I will return to postulates/axioms much later.
Before I work on that point,
Let me just state the obvious here. If you believe math was purely discovered, you had better have a good reason for where the numbers representing the quantities of things in nature came from, if not from humans. Not to mention you would probably get a Nobel prize if you could prove otherwise. I'd be surprised to see clouds raining the number "5" when there are 5 rain clouds hovering above, or to see the number "10" growing from an apple tree once 10 apples grew on it. Yes, we discovered quantities of things, but certainly not numbers.
Which leads to my next point,
Abstract concepts. All numbers, and even mathematical letters taken initially from the Greek alphabet, such as "Sigma," are abstract concepts. Languages aside, you cannot hear, taste, or smell numbers. Plato is a great example of this very problem, AKA Plato's heaven theory. My counterargument to Plato's view: how could you ever become aware of numbers/notations in the universe if numbers/notations never existed in the universe until humans came along and invented them to mean what they wanted them to mean, just like human languages? Also, did anything other than humans write numbers and notations to try to explain the quantities of things? There is certainly no evidence for it, just as there is certainly no evidence of the English dictionary sitting on the ground in front of people's faces 3,000 years ago with everything already written down in it. And this leads straight back to my first points about the rain clouds and the apple tree. Let's pretend it's the year 1000 CE and sigma existed then, and I wanted to represent summation with another letter. Who's to say I'm wrong for representing "Sigma" with the old Scots letter ȝ, "Yogh"?
Along with that, let's make it even more interesting,
Remember, it's still 1000 CE and sigma already existed. "Assume ȝ is our sum of a series. Sum of x, from 0 to 2." Does the method actually change? No. All I did was change the notation to mean exactly the same thing as sigma, axiomatically. Hush, hush, no questions, remember? It has to be this way now.
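As an aside, the point that renaming the symbol changes nothing about the method can be sketched in a few lines of Python. The function name `yogh` is my own hypothetical label standing in for ȝ; any other name would compute the same thing:

```python
def yogh(f, lo, hi):
    """Sum f(x) for x from lo to hi inclusive; the same method sigma denotes."""
    return sum(f(x) for x in range(lo, hi + 1))

# "Sum of x, from 0 to 2" is 0 + 1 + 2:
print(yogh(lambda x: x, 0, 2))  # -> 3
```

Whatever the summation sign is called, the procedure it names is unchanged.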
My next point,
Since numbers are invented and not discovered, and you can't change something that is genuinely discovered for what it is, regardless of before-or-after scenarios, consider this. Let's pretend it's 7,000 BCE and I discovered two volcanoes. My language is also very limited. But something about the two volcanoes amuses me, and I want to remember it in some form or another. In front of me I have a stick and some mud. In front of the volcanoes I draw two "U"s, connected, coincidentally shaped very similarly to the "3" of the future English numerals. I then write it in my primitive language as "Yooyoo," because those are the words that come to this primitive mind when describing these quantities. This represents this primitive human's understanding of the quantity of something; it must be more than a single volcano, so he decides to draw the single volcano in the mud for memory as well... etc. Is he wrong? Is there an authoritative presence telling him he's wrong in the realm of abstract thinking? Not at all. Eventually his civilization will build up and start representing quantities this way.
Next point,
11 + 5 = 16... right!?!? Well, umm... no and yes. Why, you might ask? Because I didn't give any context for the meaning of 11 + 5 yet. Aha. It may be universally accepted, given the way our mathematical system is logically built up today, to say that 11 + 5 = 16, because axiomatically addition is "adding" and subtraction is "taking away," etc. But does this always work in any given context? I beg to differ. 11 P.M. + 5 hours = 4 A.M. Aha. In this case the math works like this: 11 + 5 = 4... A.M., because we are now referring to the clock. Something they do not teach you in school, but it is not wrong given the context. Remember that.
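The clock context above can be made precise with a small sketch of 12-hour arithmetic, assuming only the usual convention that hours wrap around in cycles of 12, with "12" written where "0" would be:

```python
def clock_add(hour, hours_later):
    # Hours on a 12-hour clock wrap around modulo 12; the shift by 1
    # maps the hours 1..12 onto 0..11 and back so "12" appears instead of "0".
    return (hour + hours_later - 1) % 12 + 1

print(clock_add(11, 5))  # -> 4   (11 P.M. + 5 hours = 4 A.M.)
```

So the "11 + 5 = 4" reading is ordinary addition followed by a wrap-around rule, which is exactly what supplying the clock context adds.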
This works the same way Einstein disagreed with 1+1=2 in some cases. What's the implication? Numbers work without any physical aspects applied to them, so 1+1=2 in the abstract; hence, abstract concepts, since we cannot touch, smell, or hear numbers. But when looking at physical, tangible objects, we see "2" of them, right? Again, it depends on the context. Sure, 2 objects; but unless they have equal mass, density, and volume, with exactly the same dimensions, atom for atom, they can only really equal a pair of 2 in a loose sense. For a basic example: we observe 2 tree logs with almost the same dimensions. One log is heavy and very thick. The other is skinnier, nowhere near as thick, lighter, and therefore weighs less. Okay, 1 tree log plus 1 tree log equals 2 tree logs, right? In the non-physical, abstract realm where math by itself rules, sure; but not in the physical sense. Since neither log has completely the same physical qualities as the other, they do not add up to 2 of the same thing physically. These axioms do not carry over to the physical world, because numbers aren't discovered and have no physical aspects to them. The answer would be more about how close their physical qualities are to each other. So if I let the heavier log equal 1 and then compared how closely the skinnier log's physical qualities match it, I might find they aren't that closely matched, so the mathematics could be "1 + 0.457 = 1.457" as a possible answer, which in turn shows the subjectiveness math has by itself. This is what Einstein meant when he said that 1+1 does not always equal 2.
Which leads to my next point,
In physics there are no mathematical axioms. Why? Because in physics we explain the physical world with physical applications. Math by itself has no inherent tie to the real world; physical applications are what bring math into the scientific worldview. We cannot assume anything exists without evidence, which is the opposite of an axiom, because physics is bound to the scientific method. We have proved a lot of things with physics over time, as I'm sure you know. Yes, using the tool of math to explain physical phenomena is possible, because math happens to be a great tool when used in conjunction with physics. Physics must be experimentally tested and observed to be fact; math need not. You cannot make up axioms in physics to promote a hypothesis that you cannot possibly experiment with or observe, or that you don't understand in the first place. You cannot accept something for what it is without the burden of proof in physics; that goes against the scientific method and is therefore antiscientific. Both subjects have logic instilled in them, but physics seems the more logically grounded, considering my whole argument, and I can't really find fault with how math is handled in physics. Gödel, in his incompleteness theorems, acknowledged the limitations math has by itself, and he wasn't a physicist.
Another point,
Since numbers by themselves sit in their own world within the human imagination, you can always make up your own rules as you go with postulates, right? Go ahead; that's pure mathematics research right there. Although applied math still falls into the trap of the numbers and notations having no meaning in the first place, as I show here by showing that starting at 0 on a ruler makes no difference compared to starting at 2. The unsettling part is that this is supposed to be applied math. Where's the proof? Okay. Go get a ruler and find an object that is about 6 inches long. Let's pretend you want to build the same object again. Apply your ruler to it, but instead of lining up the 0-inch mark with the edge of the object, line up the 2-inch mark; that way you read 8 inches at the other end. "The right way," starting at 0, you obviously read 6 inches. Notice that it doesn't matter whether you started at 0, 2, or 4 inches? That's because the object never changes, and you could still draw up a schematic or blueprint that way if rulers were made to start at 2 inches. The object will still be built the same way; the only things that change are the numbers, of course. Those numbers once again prove to have no meaning in this application, even in applied math. Now you just have to convince the world to start making rulers that begin at 2 inches instead of 0, and end at 14 inches instead of 12 in that case, of course. And if you really cared about the smaller increments, you could erase the first batch of thirty 1/16-inch increments, start at 2 1/16 inches, build the ruler all the way up again, and make sure you added that extra thirty increments to the end of the ruler, which should get you to that new 14-inch mark, with the 16/16 and 1 16/16 increments already established as the whole numbers, as per my contention. Voila.
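The ruler experiment above can be sketched in a few lines: the measured length is the difference between the two marks you read off, so shifting where the ruler "starts" shifts both readings by the same amount and cancels out.

```python
def measured_length(start_mark, end_mark):
    # The object's length is the difference between the two ruler readings,
    # so any constant offset in the ruler's labeling cancels.
    return end_mark - start_mark

print(measured_length(0, 6))  # -> 6  (conventional ruler starting at 0)
print(measured_length(2, 8))  # -> 6  (same object, ruler relabeled to start at 2)
```

Both rulers describe the same object; only the labels on the marks differ.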
Last point,
Math does indeed lead to discoveries, due to the invention of numbers and notations. Math leads to lots of discoveries, which in turn cause other people to invent many other fields of mathematics, where numbers and notations take on whole different meanings. Even those invented numbers and postulates lead to some worthwhile hypotheses to look into. Let's be honest, hypotheses you can't use 99.9% of the time, but some do have that very small 0.1% chance of enabling a profound discovery when used in an application of real-world use, such as physics, chemistry, computer science, etc. But remember, it isn't just the math; it's the foundation of the methodologies within the fields of science that really validates those 0.1% of math formulas.
A dumb illogical mathematical axiom that should be abolished, RANT,
Due to the sheer non-simplicity of this logic and the "almost" inexplicable nature of the answer: a problem such as 8^0 should seem rather easy to understand just by looking at it, but the fact that it isn't, once an explanation is demanded, is somewhat ludicrous. So to this day I never really understood, logically (need I say logically again? Oh, there), why anything to the power 0 equals 1, when raising to a power means a number multiplying itself. All the math teachers I asked were stumped as to why I would even ask such a question. Funny thing is, they had no answer, surprisingly, probably because it isn't worth explaining. The logical fallacy behind this: 0^0 is undefined, not 0, even though 0×0 = 0. Are you kidding me? Any finite number such as 99 gives 99^0 = 1. But wait, 99×0 = 0. Why not!? Meanwhile 1^0 = 1 and 1^1 = 1... yet all other finite numbers, such as 98^1 = 98, equal themselves. Anything to the power 1 makes a lot of sense, since it is just the number representing itself as an exponent. But for the power 0, did I just find an illogical double standard in math pertaining to "1," and possibly even "0," since 0 is a number also? Go figure. This is another problem. Are we making up rules clandestinely in the background as well? No, seriously? If that's the case, this is another great point in my argument. I disagree with the explanation because, as I said, it is non-simplistic for what the answer is worth compared to all the other easily understood exponents.
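For reference, here is a sketch of the pattern usually offered as the justification (not a defense of the convention, just the standard argument laid out): each step down in the exponent divides by the base once, and continuing the pattern one step past 8^1 lands on 1. For base 0 that last division would be 0/0, which is why 0^0 is left undefined.

```python
def descending_powers(base, n):
    # Walk base^n, base^(n-1), ..., base^0: each step divides by the base once.
    # This stepping-down pattern is the usual textbook argument for base^0 == 1.
    values = [base ** n]
    for _ in range(n):
        values.append(values[-1] // base)
    return values

print(descending_powers(8, 3))  # -> [512, 64, 8, 1]
```

Whether one finds that pattern a satisfying "why" is, of course, the question raised above.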
Conclusion,
Math by itself is just abstract thinking in the realm of math, which has no physical scope. There's no other way to put it. It has no real physical purpose until it reaches physics, or any of the branches of science for that matter. And most of math by itself is subjective if you are using numbers in different contexts beyond the non-physical operations of addition, subtraction, etc. Remember? The clock, Einstein's issues, the ruler, Plato's issues, Gödel's contention about mathematical limits, etc. I wrote an equation on a piece of paper and solved it; congratulations. Did it explain anything physically? No? Well, then it was useless, unless you like doing it for fun. Michio Kaku was even more critical of math without science, in a case similar to my statement above. Do you like doing math as a subordinate hobby? Do you like it because of the cool symbols? Do you use other symbols for the same methods because it looks artistic? Why not? Does it help increase your mental capacity as a whole for other things in life? If so, then that's good. Although this is my opinion, I hope to spark some thoughts in some people out there. And just like my first question, I could have argued for days about the flaws of the English language and about working 9 to 5 for 50 years, the same way I did with the subject of math. Moral of the story: question everything in life. You'd be amazed how many things start to seem rather peculiar once you stop being adamant about your initial assumptions.
~ AD/Matrix
Mathematics is 75% Invented and 25% Discovered
 RCSaunders
Re: Mathematics is 75% Invented and 25% Discovered
5 out of 4 people don't understand fractions
Imp
Re: Mathematics is 75% Invented and 25% Discovered
As they say, there are 10 kinds of people, those who understand binary notation, and those who don't.
Re: Mathematics is 75% Invented and 25% Discovered
Broadly, I agree.
In the spirit of "screw axiomatics": if the conclusion is correct but the premises are wrong, then the premises don't matter.
If this is similar to your view, consider having a look at reverse mathematics, though it is still a theoretical field.
Where mathematics hits the tarmac is computer science/physics, so it's not so much partisanship as friendly rivalry. The three disciplines have really been conjoined at the hip for a long time.
If I am to wave my hands up in the air and point at a definite distinction between the two camps: Mathematicians still believe in infinities. Quantum physicists/computer scientists don't. To this end you will see the philosophy of finitism/ultrafinitism emerging.
Here is a paper by an ultrafinitist who seems to share your views.