Here is an explanation of why: "=" means EVERYTHING. Quite literally.

The Mathematical community currently draws a distinction between the notions of provability and decidability.

https://en.wikipedia.org/wiki/Provability_logic
https://en.wikipedia.org/wiki/Decidability_(logic)
All 'proofs' of **x = x** are axiomatic, following from the Classical law of identity. Even the theorem prover Coq behaves this way.
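The post names Coq; as a small sketch of the same behavior in Lean 4 (my choice of prover, not the author's), the proof of `x = x` is closed by `rfl`, the reflexivity constructor of equality, and never inspects `x` at all:

```lean
-- A proof of ∀ x, x = x. `rfl` is accepted by appeal to the
-- built-in reflexivity of equality -- it does no computation on x,
-- which is exactly the "axiomatic" behavior described above.
example : ∀ (x : Nat), x = x :=
  fun x => rfl
```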

By the Curry-Howard isomorphism, proofs are isomorphic to computer programs:

https://en.wikipedia.org/wiki/Curry%E2% ... espondence
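As a minimal illustration of the correspondence (again a Lean 4 sketch of my own, not from the post): the program that returns its input unchanged, read as a proof, proves the implication A → A. The program and the proof are the same object:

```lean
-- Under Curry-Howard, this identity function *is* a proof of A → A:
-- the λ-term is the program, and the program is the proof.
example {A : Prop} : A → A :=
  fun a => a
```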
In order to get to where I am, I have junked Set Theory and started with Type Theory as fundamental to all Mathematics.

https://en.wikipedia.org/wiki/Type_theory
In the universe of Type Theory, as in the Human universe, 1 is not anything in particular. 1 is just a symbol. It means whatever you want it to mean.

The string '1' represents the integer 1, but what does the integer 1 mean?

In the abstract, given an infinite amount of time and memory, a Turing machine will indeed determine that for all x: x = x.

The problem in physical reality is that computation requires a non-zero amount of energy, and a non-zero number of operations, in order to decide the truth-value of a proposition. Even a proposition as simple as 'x = x' needs to be computed/decided from first principles. To conclude it axiomatically from Classical identity simply assumes its truth. What does 1 mean?!? To a Mathematician - nothing. To a Physicist - everything!
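To make the "non-zero number of operations" point concrete, here is a minimal Python sketch (the `eq_with_cost` name is my own illustration, not from the text) that decides 'x = x' for strings symbol by symbol and counts the work it spends doing so:

```python
def eq_with_cost(a: str, b: str) -> tuple[bool, int]:
    """Decide a == b symbol by symbol, returning (verdict, operations used)."""
    ops = 1  # even the length check costs something
    if len(a) != len(b):
        return False, ops
    for ca, cb in zip(a, b):
        ops += 1
        if ca != cb:
            return False, ops
    return True, ops

x = "1" * 1000
equal, cost = eq_with_cost(x, x)
print(equal, cost)  # True 1001 -- deciding x = x took ~1000 operations, not zero
```

Even with both arguments denoting the "same" string, the decision procedure must still walk every symbol: the truth of 'x = x' is bought with operations, not assumed.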

It's information/meaning. It is the whole universe!

Classical logic conflates the notions of **identity** and **value**.

On a Turing machine, **Identity** means a unique memory address. Let's call it M.

**Value** means the contents of memory at location M. Let's call it VALUE(M).

### Drawing a distinction between identity and value

Identity := id(x)

Value := value(x)

P1. for all x: id(x) == value(x) => False
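Python happens to expose both notions directly, which makes P1 easy to demonstrate: `id(x)` (and the `is` operator) compares identities (memory addresses, in CPython), while `==` compares values. A sketch of the distinction, not the author's formalism:

```python
a = [1, 2, 3]
b = [1, 2, 3]

# Same value, different identity: value(a) == value(b), but id(a) != id(b).
print(a == b)          # True  -- value comparison
print(a is b)          # False -- identity comparison
print(id(a) == id(b))  # False -- two distinct memory locations

# P1 in miniature: an object's address is not its contents.
```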

The identity of an object is not the same as its value. We will use Big-O notation to quantify 'value'.

https://en.wikipedia.org/wiki/Big_O_notation
For those unfamiliar with algorithmic analysis and Big-O notation, suffice it to say that it is roughly synonymous with the English notion of complexity.

A simple thing is O(1). An infinitely complex thing is O(∞). Complexity is a function of value.

The more value (information, matter) an object contains - the more complex it is.

### The law of identity in a Digital Physics universe.

P2.A for all x: id(x) == id(x) => True

Each discernible object in physical reality has a universally unique identity.

The computational cost of the id(x) function is O(1).

Telling that two things are not the same is trivial! That is literally why we say 'two things'.
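One way to see that an identity check never traverses the object (a Python sketch; the counting class is my own illustration): `is` never invokes `__eq__`, no matter how complex the object's value is, while a value comparison must do real work:

```python
class Counting:
    """Counts how many times its value is actually inspected."""
    def __init__(self) -> None:
        self.eq_calls = 0

    def __eq__(self, other: object) -> bool:
        self.eq_calls += 1          # value comparison costs an operation
        return self is other

x = Counting()

print(x is x)      # True -- pure address comparison, O(1)
print(x.eq_calls)  # 0    -- the value was never examined
print(x == x)      # True -- value comparison...
print(x.eq_calls)  # 1    -- ...which had to inspect the object
```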

### The law of value in the Digital Physics universe (Classical law of identity)

The more information an object contains, the harder it is to decide its value - the harder it becomes to prove "x == x".

The cost of the decision is O(1) for small values of x (e.g. really simple objects) and O(∞) for infinite values of x (e.g. really complex objects).

P3. for all x: x == x => UNDECIDABLE. Best case: O(1), Worst case: O(∞)
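P3 can be sketched empirically in Python (the step-counting `value_eq` helper is my own, not from the text): deciding value-equality of a structure costs work proportional to the amount of information it contains, while the best case, a mismatch found at the first step, stays O(1):

```python
def value_eq(a, b, ops=None):
    """Structural equality that counts the comparison steps it performs."""
    if ops is None:
        ops = [0]
    ops[0] += 1
    if isinstance(a, list) and isinstance(b, list):
        if len(a) != len(b):
            return False, ops[0]
        for xa, xb in zip(a, b):
            eq, _ = value_eq(xa, xb, ops)
            if not eq:
                return False, ops[0]
        return True, ops[0]
    return a == b, ops[0]

small = [1]
big = list(range(10_000))

print(value_eq(small, small))          # (True, 2)     -- cheap for simple objects
print(value_eq(big, big))              # (True, 10001) -- cost grows with complexity
print(value_eq([0] + big, [1] + big))  # (False, 2)    -- best case: O(1)
```

As the object's value grows without bound, so does the cost of deciding "x == x" - which is the content of P3.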

Aristotelian Identity is the principle of explosion in disguise! For infinitely complex things, it is impossible to decide on their value.

Our universe may not be infinite, but it is incredibly complex.

### Implications

The Classical law of identity is an illusion. It causes us to mistake the complex for the simple. Behind a simple statement like "x = x" hides an infinite amount of complexity and meaning. This very fact prevents us from saying anything useful about the integers beyond phenomenology. If I can't recognize an integer for what it is, then I am stuck in a dream.

The solution: x = x must be proved. And it cannot be proven for infinitely complex objects.