Entropy (Information Theory)

Entropy is the expected value of surprise/self-information of a discrete random variable $X$:

$$H(X) = \mathbb{E}[-\log p(X)] = -\sum_x p(x) \log p(x)$$

Note: The continuous form, “differential entropy”, is not truly analogous and not a good measure of uncertainty or information, as it can be negative (even infinite) and is not invariant under continuous coordinate transformations. See Wikipedia.

Entropy is a measure of uncertainty/unpredictability. High entropy = high uncertainty → higher avg. surprise. Low entropy = low uncertainty → less avg. surprise.

Take on one extreme a uniform distribution. It will have high surprise on average, as every outcome is equally surprising; it’s the most unpredictable distribution → highest entropy. 1
With more possibilities, the mean surprise increases logarithmically: for a uniform distribution over $n$ outcomes, $H = \log n$.
On the other extreme, a certain outcome has no surprise, as we know what will happen → 0 entropy.
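
A minimal sketch in Python (my own illustration of the definition above, measuring in bits):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum p(x) * log p(x), in bits by default."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.25] * 4))         # uniform over 4 outcomes: 2.0 bits (= log2 4)
print(entropy([1 / 8] * 8))        # more possibilities: 3.0 bits (grows like log n)
print(entropy([1.0]))              # a certain outcome: 0.0 bits, no surprise
print(entropy([0.9, 0.05, 0.05]))  # skewed: ~0.57 bits, in between
```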

Entropy (Physics)

Entropy measures the number of microstates per macrostate:

$$S = k_B \ln \Omega$$

Many possible microstates to reach a certain macrostate → High entropy (likely macrostate)
Few microstates to reach a certain macrostate → Low entropy (unlikely macrostate)

where $\Omega$ is the number of microstates and $k_B$ is Boltzmann’s constant.

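A toy illustration of microstates vs. macrostates (my own sketch, using coin flips as a stand-in for two-state atoms):

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant in J/K

# Toy system: N two-state "atoms" (think coin flips). The macrostate is
# "how many heads"; the microstates are the individual sequences realizing it.
N = 100
for n_heads in (0, 10, 50):
    omega = comb(N, n_heads)   # multiplicity: microstates per macrostate
    S = k_B * log(omega)       # S = k_B * ln(Omega)
    print(f"{n_heads:>2} heads: Omega = {omega:.3e}, S = {S:.3e} J/K")

# The 50/50 macrostate has by far the most microstates -> highest entropy ->
# it is the macrostate you are overwhelmingly likely to observe.
```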

Additivity of entropy

We use the logarithm of the number of microscopic configurations / microstates (equivalent to $\Omega$, the number of ways a particular state can be achieved, i.e. the system’s multiplicity or configuration space) due to the following properties:
a) Multiplicities of independent systems multiply: $\Omega_{total} = \Omega_1 \cdot \Omega_2$
b) The logarithm turns that product into a sum: $\ln(\Omega_1 \Omega_2) = \ln \Omega_1 + \ln \Omega_2$, so $S_{total} = S_1 + S_2$
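
A quick worked example (numbers picked just for illustration): if system 1 has $\Omega_1 = 4$ configurations and system 2 has $\Omega_2 = 6$, the combined system has $\Omega = 4 \cdot 6 = 24$, and $S = k_B \ln 24 = k_B(\ln 4 + \ln 6) = S_1 + S_2$. The logarithm turns “multiplicities multiply” into “entropies add”.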


Summary and thoughts on I don’t believe the 2nd law of thermodynamics. - Sabine Hossenfelder

Life on earth exists because we receive low-entropy (compact, high-energy) visible light rays from the sun, and for each of those we dissipate roughly 20 infrared rays of ~20x lower energy each.
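
A rough back-of-the-envelope check of that ~20x figure (my own estimate, assuming a photon’s typical energy scales with the temperature of its emitter): $E_{vis}/E_{IR} \approx T_{sun}/T_{earth} \approx 5800\,\mathrm{K} / 290\,\mathrm{K} \approx 20$. So for every visible photon absorbed, Earth re-radiates the same total energy as roughly 20 infrared photons; ~20x the photons means many more ways to arrange the outgoing radiation, i.e. higher entropy.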

Life is accelerating entropy production.
(Which makes sense: increasing entropy moves the universe toward its most probable state.)

Shine light at some random clump of atoms long enough and it should not be surprising that you get a plant.

Life is efficient at using low entropy. But by making use of it in a non-perfect way, entropy increases. In fact, the only way to use energy is by creating more entropy: to harness the compactly stored energy of coal, we burn it and release it.

Entropy is why we have an arrow of time. Because we are going from unlikely to more likely states.
It’s just like the Central Limit Theorem: Add more and more atoms in the mix and the distribution shifts more and more towards an equilibrium. 50/50.
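
A toy simulation of that equilibration (my own sketch, an Ehrenfest-style urn model, not from the video): atoms hop at random between the two halves of a box; the more atoms, the more tightly the mix hugs 50/50.

```python
import random

def avg_deviation(n_atoms, steps=100_000, seed=0):
    """Atoms hop randomly between the two halves of a box. Returns the
    average |fraction_left - 0.5| over the second half of the run, i.e.
    the typical size of fluctuations around equilibrium."""
    rng = random.Random(seed)
    left = n_atoms          # start far from equilibrium: everything on one side
    dev, samples = 0.0, 0
    for step in range(steps):
        if rng.random() < left / n_atoms:
            left -= 1       # a randomly picked left-side atom hops right
        else:
            left += 1       # a right-side atom hops left
        if step >= steps // 2:   # measure only after equilibrating
            dev += abs(left / n_atoms - 0.5)
            samples += 1
    return dev / samples

for n in (10, 100, 10_000):
    print(f"{n:>6} atoms: avg deviation from 50/50 = {avg_deviation(n):.4f}")
# More atoms -> the mix sits ever closer to 50/50 (deviations ~ 1/sqrt(n)).
```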

Physicists call the projected end of this process the Heat Death of the Universe. Just as the Big Bang was a beginning of time, this would be the end of time, with no telling whether it moves forwards or backwards: the universe in its most probable state.
dialectical materialism would like to have a word with you…
This won’t happen. Personally I’d say cyclic universe but idk.

Both low and high entropy are low in complexity. In the middle, complex structures arise.
Take for example pouring milk into a cup of tea (lower entropy individually than mixed). All of a sudden there are interesting patterns etc., until they are gone again and complexity is as low as in the initial state.

Why do eggs exist even though they are unlikely? Why isn’t everything in a perfect equilibrium?

What we do know is that the universe was once - as far as we can see back in time - at a very low entropy, an unlikely state.
Now the universe seems to be marching to higher and higher entropy.

In the meanwhile, entropy isn’t evenly distributed. Our sun is a low-entropy reservoir. In it, the likely thing, fusion, happens, and it sends out low entropy in the form of sunrays. We use that to increase the entropy of other things, which enables us to do work, gain motion, electricity etc. We use low entropy to shift energy around / make low-entropy structures at the expense of entropy in another place, all while this process has friction, i.e. increases total entropy.

Any time you create structure, you do so by drawing on a reservoir of low entropy.

In milk and in a cup of tea separately, matter is very evenly distributed. If you pour them together, after a beautiful period of complexity and rising entropy, they are evenly distributed again!
Same with the early and the (very) late universe (Heat Death): Evenly distributed with small fluctuations in density.
An even distribution of matter in the early universe was very unlikely (everything was close together, so gravity was strong and wanted to clump it) → low entropy.
Late universe: gravity is weak → an even distribution is likely (and matter indeed will be spread out) → high entropy.


To calculate the entropy, you average over the microstates → Throwing away information.
High entropy means low information and low entropy means high information needed to describe a system.
If there is a single microstate for your macrostate, you have perfect knowledge over the system and entropy is $S = k_B \ln 1 = 0$.

The universe with all its molecules is only ever in one concrete microstate / configuration, where the probability of this microstate is one and the probability of every other state is zero. A system is only ever in one microstate.
Crucially: as the state changes in time, this remains so. At every moment it is exactly one definite state the system is in.

The entropy of this one microstate is always $k_B \ln 1 = 0$. The entropy of the system (the universe), computed this way, is constant.

The Second Law of Thermodynamics doesn’t state that entropy increases; it only states that entropy doesn’t decrease. It may as well be constant (for the entire universe)!

We say entropy increases, since we - as human observers - lose information about systems as they go from more ordered to more chaotic (a higher number of equally probable configurations, increasing entropy and our uncertainty about the system).

e.g.: We put a bunch of molecules in the corner of a box; we don’t know where they are going, but we know where they are → they are released → they spread out (it is still just one definite microstate) → but now we can no longer tell where the particles were moments ago: we used to have information, and it got lost.
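
A sketch of exactly this information loss (my own toy model): the program knows the exact microstate at every step, yet the entropy of the coarse-grained description (particle counts per cell) still rises.

```python
import math, random

def coarse_entropy(positions, n_cells=4, size=100):
    """Shannon entropy of the coarse-grained description: we only record how
    many particles sit in each of the n_cells x n_cells big cells, not where
    exactly each particle is."""
    counts = {}
    for x, y in positions:
        cell = (x * n_cells // size, y * n_cells // size)
        counts[cell] = counts.get(cell, 0) + 1
    n = len(positions)
    return -sum(c / n * math.log(c / n) for c in counts.values())

rng = random.Random(0)
SIZE, N = 100, 200
# All particles start in one corner of the box: an "ordered" macrostate.
pos = [[rng.randrange(10), rng.randrange(10)] for _ in range(N)]

for step in range(9001):
    if step % 1500 == 0:
        print(f"t={step:>5}: coarse-grained entropy = {coarse_entropy(pos):.3f}")
    for p in pos:  # every particle takes one random-walk step, staying in the box
        axis = rng.randrange(2)
        p[axis] = min(SIZE - 1, max(0, p[axis] + rng.choice((-1, 1))))
# The program knows the exact microstate (all positions) at every single step,
# yet the coarse-grained entropy rises toward ln(16) ~ 2.77 (uniform over the
# 16 cells): the increase measures OUR discarded information, not something
# the particles themselves gained.
```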

Our notion of entropy is based on the macroscopic devices that we humans happen to have easy access to; it is not a fundamental property of nature.
As the universe gets older and entropy increases according to us, new complex systems will arise that rely on different macrostates, macrostates that we ourselves could never use. For those complex systems - call them living beings - the entropy will be small again. Life will go on, but in a form very different from us.

This is in-line with the dialectical nature of the universe, on an evolutionary spiral as described by the law of the negation of the negation.

In quantum mechanics, the universe is still just a single microstate, one big wave function.
It is a misunderstanding of Heisenberg’s uncertainty principle that there are quantities you cannot measure at the same time.

Why black holes don’t have a singularity at their core

A singularity would equate to a single microstate for their macrostate: All matter condensed to an infinitely small point. All black holes exactly the same. No motion. No change. dialectical materialism already tells us this is a bunch of malarkey.

Luckily, Stephen Hawking disproved that myth.

Black holes have entropy too; they radiate energy at a temperature of a trillionth of a Kelvin or so. They hold most of the universe’s entropy.
And since we know that they do radiate energy and thus have nonzero entropy, we know that there cannot be a singularity at their core.
From the formula $S = k_B \ln \Omega$ we know that, in order to have zero entropy, the system would need to have only a single microstate ($\Omega = 1$)… a singularity.
Which - hurray - it doesn’t!
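
For scale, a sketch using the standard Hawking-temperature and Bekenstein-Hawking entropy formulas (the formulas are standard; the specific masses are just examples):

```python
import math

# Physical constants (SI units)
hbar  = 1.054571817e-34  # J*s
c     = 2.99792458e8     # m/s
G     = 6.67430e-11      # m^3 kg^-1 s^-2
k_B   = 1.380649e-23     # J/K
M_sun = 1.98892e30       # kg

def hawking_temperature(M):
    """T = hbar c^3 / (8 pi G M k_B): heavier black holes are COLDER."""
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

def bekenstein_hawking_entropy(M):
    """S = k_B c^3 A / (4 G hbar), with horizon area A = 16 pi G^2 M^2 / c^4."""
    A = 16 * math.pi * (G * M)**2 / c**4
    return k_B * c**3 * A / (4 * G * hbar)

for name, M in [("1 solar mass", M_sun), ("Sagittarius A* (~4e6 M_sun)", 4e6 * M_sun)]:
    print(f"{name}: T = {hawking_temperature(M):.2e} K, "
          f"S = {bekenstein_hawking_entropy(M):.2e} J/K")
# Huge S at a tiny (but nonzero!) temperature: a black hole is very much not
# a single microstate (Omega = 1 would give S = 0), so no singularity-as-one-state.
```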

References

Veritasium

Physics

Footnotes

  1. https://www.desmos.com/calculator/wytdhbilxg