Immanuel Can wrote: Thu Jun 23, 2022 2:36 am
That's again only half the story. The idea of a totally inapplicable mathematics is futile. Mathematics is vindicated not merely by its formal elegance, but also by its efficacy and accuracy in representing empirical situations.
Yes, but this efficacy is a concern only for scientists, engineers, and anybody else who desires to apply it to real-world situations. It is not a concern for mathematics.
Godfrey Hardy wrote:
We have concluded that the trivial mathematics is, on the whole, useful, and that the real mathematics, on the whole, is not.
Being divorced completely from the physical universe, mathematics is indeed meaningless, and therefore also futile, i.e. useless. Still, by successfully deriving new meaningless statements from previously accepted ones, mathematics retains one redeeming quality: it is rigorous.
Science and engineering seek to be meaningful and even seek to be useful. Mathematics does not.
The fact that mathematics has turned out to be an effective enabler for science and engineering does not detract from the fact that mathematics itself is completely abstract and strives to be meaningless and useless. Nothing expresses this better than the formalist ontology of mathematics:
Wikipedia on "mathematical formalism" wrote:
In the philosophy of mathematics, formalism is the view that holds that statements of mathematics and logic can be considered to be statements about the consequences of the manipulation of strings (alphanumeric sequences of symbols, usually as equations) using established manipulation rules. A central idea of formalism "is that mathematics is not a body of propositions representing an abstract sector of reality, but is much more akin to a game, bringing with it no more commitment to an ontology of objects or properties than ludo or chess."[1] According to formalism, the truths expressed in logic and mathematics are not about numbers, sets, or triangles or any other coextensive subject matter — in fact, they aren't "about" anything at all.
Since mathematics is not "about" anything at all, it is purposely meaningless and useless.
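To make the "game" metaphor concrete, here is a toy of my own (a simplified variant of Hofstadter's MIU puzzle, not something from the quote): "theorems" are just the strings reachable from an axiom by fixed rewriting rules, and neither the strings nor the rules mean anything.

Code:
AXIOM = "MI"

def step(s):
    # Apply each (meaningless) rewriting rule once, where it fits.
    out = set()
    if s.endswith("I"):
        out.add(s + "U")                    # rule 1: xI  -> xIU
    if s.startswith("M"):
        out.add(s + s[1:])                  # rule 2: Mx  -> Mxx
    out.add(s.replace("III", "U", 1))       # rule 3: first III -> U
    return out - {s}                        # drop no-op rewrites

theorems = {AXIOM}
for _ in range(3):                          # a few rounds of "proving"
    theorems |= {t for s in theorems for t in step(s)}
print(sorted(theorems))

Nothing here is "about" anything; being a theorem is simply membership in the reachable set.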
Immanuel Can wrote:
Mathematics was originally derived from the empirical. There were two sheep before there was the number "2". The number, remember, is only adjectival.
Not "derived" but "inspired". After proper axiomatization, this source of inspiration became completely irrelevant.
For example, the nonstandard natural numbers of arithmetic cannot be seen or used in the physical universe. It is physically impossible to exhibit or even detect such an infinitely large number. Yet nonstandard natural numbers certainly do exist in the abstract, Platonic world of mathematics, with no hope whatsoever that they could ever correspond to anything physical. The correspondence theory of truth is therefore inapplicable to mathematics.
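A hedged sketch of why such numbers must exist (the standard compactness argument from model theory, nothing specific to this thread): extend Peano arithmetic with a fresh constant c and infinitely many new axioms:

T = PA ∪ { c > 0, c > 1, c > 2, … }

Every finite subset of T has a model, because c can be interpreted as a large enough ordinary number; so by the compactness theorem T has a model as a whole. In that model, c is a natural number greater than every standard one, and no physical measurement could ever single it out.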
Immanuel Can wrote:
Another example might be BODMAS or BOMDAS (whichever you prefer). There is no inherent reason, no reason necessary to the mathematics themselves, that that is the right order of operations. It's a formal agreement made by mathematicians, so as to make the mathematics consistent and predictable. But it's pre-mathematical in origin. There is no formula that proves that must be the rule. It's purely conventional.
Operator precedence is an issue that exists only in infix notation, which is indeed ambiguous. It does not arise in postfix or prefix notation, because those notations are unambiguous.
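A minimal sketch (Python, purely illustrative; the helper names are my own) of why postfix needs no precedence table: the token order itself fixes the order of operations.

Code:
# Minimal postfix (RPN) evaluator: no precedence rules needed,
# because the token order already encodes the order of operations.
OPS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
    "/": lambda a, b: a / b,
}

def eval_postfix(tokens):
    stack = []
    for tok in tokens:
        if tok in OPS:
            b = stack.pop()           # right operand was pushed last
            a = stack.pop()
            stack.append(OPS[tok](a, b))
        else:
            stack.append(float(tok))  # numeric atom
    return stack.pop()

# "2 + 3 * 4" needs BODMAS to disambiguate in infix, but
# "2 3 4 * +" carries its own order: this prints 14.0.
print(eval_postfix("2 3 4 * +".split()))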
Infix notation, along with the Eulerian f(x) notation for function application, is a costly convention, because both are ambiguous. They tremendously complicate the construction of compiler front ends. If you switch to prefix notation, as in Lisp, the front end becomes much simpler. If you switch to postfix, as in Forth or the bytecode of a stack machine, there is hardly a need for a front end at all.
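And a hedged sketch of how small a prefix front end can be, again in illustrative Python: a tokenizer, a recursive reader, and an evaluator, with no precedence table and no grammar beyond balanced parentheses.

Code:
# Tiny front end for prefix (s-expression) syntax.
OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def tokenize(src):
    return src.replace("(", " ( ").replace(")", " ) ").split()

def read(tokens):
    tok = tokens.pop(0)
    if tok == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(read(tokens))  # recurse on sub-expressions
        tokens.pop(0)                  # consume ")"
        return expr
    try:
        return float(tok)              # numeric atom
    except ValueError:
        return tok                     # operator symbol

def eval_prefix(expr):
    if isinstance(expr, list):
        op, *args = expr
        return OPS[op](*(eval_prefix(a) for a in args))
    return expr

# (+ 2 (* 3 4)) needs no BODMAS: the nesting states the order. Prints 14.0.
print(eval_prefix(read(tokenize("(+ 2 (* 3 4))"))))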
Immanuel Can wrote:
So it turns out that there are suppositions prior to the symbols. And from where do those suppositions come, since they are not themselves products of mathematical work? They come from the fact that we have (empirically) discovered that they are necessary assumptions to getting our mathematics to perform in the ways we find useful, consistent and reliable. They came from us. They came from the empirical world.
We avoid using these conventions wherever we can. But because infix is what schools teach, we cannot confront people with the more efficient prefix or postfix notations. Instead, we build complicated front ends to accommodate an otherwise inefficient convention.
We often have to stay compatible with all the miracles and all the horrors of the past. One reason Lisp lost the programming-language wars is that it chose prefix notation, which allows source code to be treated as data: just more nested lists, which macros can then process. The Lisp community hoped that the benefits of treating code as data would compensate for programmers' unfamiliarity with prefix notation. That was a big mistake: most programmers prefer to drop macros and keep the familiar infix notation instead.
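To illustrate the code-as-data trade-off (mimicking the Lisp idea with Python nested lists; the "square" macro is invented for the example): a "macro" is just an ordinary function that rewrites one list into another before evaluation.

Code:
# Code-as-data sketch: prefix expressions are plain nested lists,
# so a "macro" is ordinary list surgery performed before evaluation.
def expand(expr):
    if isinstance(expr, list):
        expr = [expand(e) for e in expr]      # expand sub-expressions first
        if expr and expr[0] == "square":      # macro: (square x) -> (* x x)
            _, x = expr
            return ["*", x, x]
    return expr

print(expand(["+", 1, ["square", 5]]))  # ['+', 1, ['*', 5, 5]]

Nothing distinguishes code from data here; the expanded list could be fed straight to a prefix evaluator like the one sketched above. In an infix language, the same trick requires a full parser and an AST API.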
Ignoring the legacy of a gigantic installed base is almost always a mistake. Billions of people have invested massively in learning to use all kinds of conventional monsters.
For example, POSIX is a horror story. libc especially is a nightmare of highly inconsistent conventions, some of them contradictory and some outright buggy: most functions report failure by returning -1 and setting errno, for instance, while the pthread functions return the error code directly. Will we ever be able to replace these things? No, I don't think so.