# On the Nature of Entropy

*A scientific essay on the multifaceted law of entropy in everyday life*

*You can also watch my lecture (it has fun maths and how to solve problems using entropy) here:*

Have you ever wondered why your desk never stays ordered, or why coffee mugs break but never unbreak?

The answer to these everyday questions is **Entropy.**

What then is it?

In this brief essay, I will take you through the world of disorder to understand that very thing controlling the order of your life, our planet, and the universe.

There’s no better way to begin than by echoing the words of Richard P. Feynman — a famous physicist well known for his eccentric personality, bongo-playing, teaching, and work on quantum electrodynamics. Feynman once began a lecture in the following way:

“If in some cataclysm, all human knowledge was to be destroyed and one sentence was passed to the next generation of creatures, it would be the atomic hypothesis.” [1]

He continued…

“All things are made of atoms, which are moving in perpetual motion, attracting each other at shorter distances and repelling each other at even shorter distances.” — Richard Feynman

The atomic hypothesis is the trail of breadcrumbs to help future thinkers find their way through the labyrinth of our current systems of knowledge. However, it is missing one aspect: the emergence of higher-order complexity, like you, chairs, and cats. Atoms that stick together build together, taking an infinite array of forms that are eternally in flux. Entropy is the reason these atoms can never truly stay put. They jostle around and take those complex shapes, the nature of randomness and impermanence. We will learn that entropy always increases, dictating the nature of atomic assemblies. Therefore, we will call this corollary to Feynman's sentence the *“atomic entropic hypothesis.”*

All things are made of atoms, which are moving in perpetual motion, attracting each other at shorter distances and repelling each other at even shorter distances, where the complexity and disorder of their microstate arrangements increases in a closed system — Richard Feynman & Bradley Nordell

Like most things in life, entropy can be viewed in different ways to reveal different forms. From far away, the Earth seems like a blue dot. Study it closely enough, and you realize it's a slightly flattened sphere whose hemispheres are filled with oceans, continents, and mountains. Look closer still, and you find a molten core at the center. Entropy is just like that. And today, we will venture into the mirror and peer upon its many faces.

**Entropy and Heat**

The first face appears in the collective behavior of many interacting atoms, i.e., thermodynamics. Here entropy is synonymous with “randomness” and “disorder”: the parent of chaos and the byproduct of thermal systems. It is so essential, in fact, that it appears in two of our three laws of thermodynamics.

**Law 1: The total change in all forms of energy equals zero.**

This is the conservation of energy: energy is neither created nor destroyed, only transformed from one form to another (potential to kinetic, chemical to thermal, and so on). In thermodynamic terms, the change in the *internal energy* (the energy stored in the arrangement of things like molecules and bonds, labeled Eint) of a thermodynamic system equals the *heat (Q)* added to the system minus the *work (W)* done by the system: ΔEint = Q − W.
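As a quick numeric sketch of the first law (the 500 J and 200 J figures are invented purely for illustration):

```python
# First law: the change in internal energy equals heat added TO the
# system minus work done BY the system (dE_int = Q - W).
def internal_energy_change(heat_added_J, work_by_system_J):
    """Change in internal energy, in joules."""
    return heat_added_J - work_by_system_J

# Add 500 J of heat to a gas while it does 200 J of work expanding:
dE = internal_energy_change(500.0, 200.0)
print(dE)  # 300.0 J remains stored as internal energy
```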

**Law 2:** **a change in entropy (dS) is equal to the change in heat (dQ) over its temperature (T), and this is always greater than or equal to zero.**

Hence Entropy always increases.

The second law states that the change in a closed system's entropy equals the heat added to the system divided by its temperature: dS = dQ/T. The units of entropy are joules (energy) per kelvin (temperature). This means entropy tracks how energy is arranged relative to temperature. And temperature, we recall, is a measure of the average kinetic energy of the particles [2]. Hence entropy is a measure of energy per unit of the particles' random motion. More energy means the particles wiggle, vibrate, and move more, leading to a higher temperature.
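A minimal sketch of the second law in action, with invented numbers: 100 J of heat flowing from a 400 K body to a 300 K body lowers the hot body's entropy, but raises the cold body's entropy by more, so the total change comes out positive:

```python
def entropy_change(dQ_J, T_K):
    """dS = dQ / T for heat dQ exchanged at a roughly constant temperature T."""
    return dQ_J / T_K

dS_hot = entropy_change(-100.0, 400.0)   # hot body loses heat: -0.25 J/K
dS_cold = entropy_change(+100.0, 300.0)  # cold body gains heat: +0.33 J/K
print(dS_hot + dS_cold)  # positive: the total entropy has increased
```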

According to the Clausius equality, for a reversible cycle the integral of dQ/T around the cycle is zero; any irreversibility makes the total entropy change strictly positive.

**Law 3: A system’s entropy approaches a constant value as its temperature approaches absolute zero.**

Heat always flows from hot systems to cold ones, never in reverse, and so the total entropy (dS) only ever increases. We call this irreversibility. And it has a peculiar effect on causality and the nature of time.

**Entropy and the arrow of time**

The second face of entropy is time. The laws of physics are time-reversible: the mathematics allows us to run time backward, and the physics still holds. This is true of gravity, electricity and magnetism, the decay of particles, and the interactions of protons and neutrons. Why, then, do we experience time as past, present, future? The answer is entropy. Since entropy flows only one way, the future has more disorder than the past. We call this the arrow of time, or time's arrow [3].

When we say things like “if only I could slow the hands of time,” what we really mean is the increase in entropy, or disorder, from one moment to the next.

Yet we are bound by time. Forever, as Nabokov said, stuck between two eternities. And by this, we are stuck in the causal structure of reality. Causes precede effects. An object accelerates because a force is applied to it, and not the other way around, for that would be ridiculous! It seems, then, that entropy limits the absurdity of natural phenomena by making certain events very, very… unlikely.

**Entropy and Microstates**

Finally, we reach the last face of entropy: its microscopic nature, discovered by Ludwig Boltzmann in 1877 [4]. He showed that entropy is nothing more than the logarithm of the statistical disorder in a system, S = kB ln(W), where W is the number of microstates a system can have and kB is the Boltzmann constant, approximately 1.381 x 10^(-23) J/K.
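A tiny sketch of Boltzmann's formula (the microstate counts passed in are arbitrary examples):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(num_microstates):
    """S = k_B * ln(W)."""
    return K_B * math.log(num_microstates)

print(boltzmann_entropy(1))    # 0.0: a single microstate means zero entropy
print(boltzmann_entropy(252))  # more microstates, more entropy
```

Note that W = 1 gives exactly zero entropy, which is the third law's limit of a system frozen into its one lowest state.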

A microstate is just a configuration a particle can be in, whether an amount of energy, a position in a volume, or a density. Higher temperature and heat mean more energy, and hence more available states a little wiggler can be in. Lower them and, eventually, the particle has only one state to settle into. This is the lowest state at 0 K, which connects to the third law [5].

The image below gives a visual representation of this law. As you increase the temperature, you get more boxes the black object can be in. At 0 K, only one box is allowed (a); at a higher temperature (b), there are more boxes and more degrees of freedom to move around; and at (c), many different combinations can be created via the arrows. And remember, all possible configurations are equally probable. Nature does not play favorites.

For example, take a box of 10 particles with a partition separating n1 from n2. We can count the microstates for an equal split of 5 particles per side and compare that with all 10 particles crowded into one side. The equal distribution turns out to be 252 times more likely than all 10 particles in the box's right partition (see below).

In fact, if you increased this to 100 particles, you would find the even split to be about 1x10²⁹ times more likely. If the particles settle into a new arrangement every nanosecond, it would take about 5 x 10¹² years (hundreds of times the universe's age) to ever catch all the particles in a single corner [4].
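The microstate counts above can be checked with a binomial coefficient, since each particle independently sits on either the left or the right of the partition:

```python
from math import comb

# 10 particles: count the ways to choose which ones sit on the left side.
even_split = comb(10, 5)     # 5 left, 5 right
one_sided = comb(10, 10)     # all 10 on one side
print(even_split // one_sided)  # 252: the even split is 252x more likely

# 100 particles: the imbalance explodes.
print(comb(100, 50))  # ~1.01e29 arrangements for a 50/50 split, vs 1 for 100/0
```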

**Entropy and Death**

As I said earlier, entropy is ubiquitous, which means it also affects you and me. For example, cellular death is in part a story of increasing disorder. This is why cryogenics is used to reduce the entropy of a system (remember: reduce heat and temperature, lower entropy). A folded protein has low entropy, because there is essentially only one microstate it can be in, whereas an unfolded one has higher entropy, since there are far more unfolded microstates; and entropy always goes from low to high [6].

We are back once more to time and age. As entropy increases, you decay and get older. Even with research and possible nanotechnology to keep those proteins from unfolding, we may extend this inevitability out farther, but never will we defeat it. If the speed of light is the universe's speed limit, then entropy is truly the limit of death and immortality.

## Final Random Thought

So, there is entropy between our two eternities of life and death, bound by ever-increasing energy configurations. It is also responsible for the heat death of the universe. Lastly, we will never have a perfect machine whose energy output exceeds its energy input, because of entropy (Law 2). So, when anyone tells you they have made a perpetual motion machine with free energy, you can say, “Hey, what about entropy?”

But it's not all death, nihilism, and lost keys. Entropy is also the reason for the complexity and diversity in nature. Because microstates are always increasing, there are more “states” for atoms to exist in, and hence more forms and different possibilities. These possibilities also filter into art, writing, story ideas, and poetry. One aspect of creativity may be that it emerges from random connections and an increase of entropy in your brain: you have more brain states in which to try out different combinations until something finally looks “beautiful.” And lastly, I believe entropy is truly what frees our will from determinism. Because no matter what you do, randomness will always give you a little bit of indeterminism in life. And therefore, the future can never be fully written or known.

So go now, my fellow reader, and choose your entropic course.

**References**

[1] Feynman, R.P., Six Easy Pieces, Basic Books, 1963, pg. 4.

[2] Shankar, R., Fundamentals of Physics, Yale University Press, 2014, pg. 411–441.

[3] Fermi, E., Thermodynamics, Dover Publications, 1956, pg. 46–75.

[4] Halliday, D., Resnick, R., and Walker, J., Fundamentals of Physics, Vol. 1, John Wiley & Sons, 6th edition, pg. 442–501.

[5] Hassani, S., From Atoms to Galaxies: A Conceptual Physics Approach to Scientific Awareness, CRC Press, 2010.

[6] Makhatadze, G.I., and Privalov, P.L., “On the Entropy of Protein Folding,” Protein Sci., 5(3), 1996. doi: 10.1002/pro.5560050312

© Bradley J Nordell 2020
