
Entropy Explained: S = k_B ln W, Why It Increases, Examples

Dr. Sarah Kim · Updated May 5, 2026 · 14 min read
[Figure: an ink drop spreading in water, illustrating the spontaneous increase in disorder and entropy]

Let a drop of ink fall into a glass of water. It spreads — slowly, irreversibly — until it is uniformly distributed. You will never see the ink spontaneously re-concentrate. Crack an egg: you cannot uncrack it. A hot cup of coffee cools to room temperature; a cold cup never spontaneously heats up. These everyday observations share a single physical explanation: entropy. Entropy is the quantity that increases in all natural processes and defines the direction of time. It is one of the most profound concepts in physics, touching thermodynamics, information theory, cosmology, and the deepest questions about why time flows in one direction.

Entropy — Two Definitions

Thermodynamic entropy (Clausius, 1865): ΔS = Q_rev/T — entropy change equals reversible heat transfer divided by absolute temperature. Unit: J/K.

Statistical entropy (Boltzmann, 1877): S = k_B ln W — entropy is proportional to the logarithm of the number of microstates W corresponding to a macrostate. The second definition shows entropy is fundamentally about probability and disorder.

Entropy as Disorder: Macrostates and Microstates

A macrostate is what we can observe: temperature, pressure, volume, colour. A microstate is the exact position and momentum of every single molecule. Many different microstates can correspond to the same macrostate.

Consider 4 gas molecules in a box with two equal halves. How many ways can they be arranged?

| Macrostate | Number of microstates W | Probability |
| --- | --- | --- |
| All 4 in left half | 1 | 1/16 = 6.25% |
| 3 left, 1 right | 4 | 4/16 = 25% |
| 2 left, 2 right | 6 | 6/16 = 37.5% (most likely) |
| 1 left, 3 right | 4 | 4/16 = 25% |
| All 4 in right half | 1 | 1/16 = 6.25% |

The most disordered macrostate (equal distribution) has the most microstates and is the most probable. The perfectly ordered macrostate (all molecules in one half) has only 1 microstate out of 16 — improbable with 4 molecules. With 10²³ molecules (a realistic gas), the ratio of microstates between ordered and disordered states is so astronomically large that spontaneous ordering is effectively impossible. This is why entropy increases: it is overwhelmingly probable.
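
To make the counting concrete, here is a minimal Python sketch (standard library only) that reproduces the table above with math.comb. The molecule count N = 4 is the article's example; the final line is an illustrative extra showing how fast the multiplicity of the balanced macrostate outgrows the ordered one.

```python
from math import comb

N = 4  # molecules, each independently in the left or right half

# Total number of equally likely microstates: 2^N
total = 2 ** N

for n_left in range(N + 1):
    W = comb(N, n_left)   # microstates with n_left molecules on the left
    p = W / total         # probability of this macrostate
    print(f"{n_left} left, {N - n_left} right: W = {W}, p = {p:.4f}")

# Already at N = 100, the balanced macrostate has ~1e29 microstates,
# versus exactly 1 for the "all in the left half" macrostate.
print(f"comb(100, 50) = {comb(100, 50):.3e}")
```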

Boltzmann's Entropy Formula

S = k_B ln W

where k_B = 1.38 × 10⁻²³ J/K is Boltzmann's constant and W is the number of microstates available to the system. This equation — so fundamental it is inscribed on Boltzmann's tombstone — unifies thermodynamic entropy with statistical mechanics. It shows that entropy measures the multiplicity of a system: high W (many microstates) = high entropy = high disorder.

For two systems in contact, the total number of microstates is the product W_total = W₁ × W₂. But entropy is additive: S_total = S₁ + S₂ = k_B ln W₁ + k_B ln W₂ = k_B ln(W₁W₂). The logarithm converts the multiplicative property of probabilities to the additive property we expect of extensive thermodynamic quantities.
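
As a quick numerical check of the additivity argument, the following sketch evaluates S = k_B ln W for two subsystems and for the combined system. The microstate counts W₁ = 6 and W₂ = 4 are toy values chosen only for illustration, not figures from the article.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W: int) -> float:
    """S = k_B ln W for a macrostate with W microstates."""
    return k_B * math.log(W)

W1, W2 = 6, 4                          # toy microstate counts
S_sum = boltzmann_entropy(W1) + boltzmann_entropy(W2)
S_total = boltzmann_entropy(W1 * W2)   # combined system: W_total = W1 * W2

# ln(W1 * W2) = ln W1 + ln W2, so the two agree exactly
print(f"S1 + S2 = {S_sum:.3e} J/K")
print(f"S_total = {S_total:.3e} J/K")
```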

The Thermodynamic Definition

For a reversible heat transfer at temperature T:

ΔS = Q_rev / T

Entropy increases (ΔS > 0) when heat is added at temperature T. Entropy decreases when heat is removed. For irreversible processes, the actual entropy change of the universe is always greater than Q/T — extra entropy is generated by irreversibilities (friction, mixing, unrestrained expansion).

Example: Melting ice

Melting 1 kg of ice at 0°C (273 K) requires L = 334,000 J. Entropy change:

ΔS = Q/T = 334,000 J / 273 K ≈ 1,223 J/K

The entropy of the water is 1,223 J/K greater than the entropy of the ice — water has more microstates available than ice because its molecules can move freely rather than being locked in a crystal lattice.
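
In code, the ice calculation takes one line. The second half of the sketch illustrates the irreversibility point from above: heat flowing from hot coffee to a cooler room generates net entropy. The heat transfer and temperatures (10 kJ, 350 K, 293 K) are assumed round numbers, not values from the article, and the coffee is treated as staying at a fixed temperature for simplicity.

```python
# Reversible case: melting 1 kg of ice at its melting point
L_fusion = 334_000.0   # J, latent heat for 1 kg of ice (from the article)
T_melt = 273.0         # K

dS_ice = L_fusion / T_melt   # ΔS = Q_rev / T
print(f"Melting ice: ΔS ≈ {dS_ice:.0f} J/K")   # ≈ 1223 J/K

# Irreversible case: heat Q flows from hot coffee to the cooler room.
# Assumed round numbers, for illustration only.
Q, T_hot, T_cold = 10_000.0, 350.0, 293.0
dS_universe = Q / T_cold - Q / T_hot   # room gains more entropy than coffee loses
print(f"Coffee to room: ΔS_universe ≈ {dS_universe:.1f} J/K > 0")
```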

Entropy and Information

In 1948, Claude Shannon developed information theory and defined a quantity he called information entropy — mathematically identical in form to Boltzmann's entropy. Shannon entropy measures the amount of information (or uncertainty) in a message. High entropy = highly random, unpredictable = maximum information content. Low entropy = highly ordered, predictable = minimum information.

This mathematical identity is not a coincidence. Erasing one bit of information in a computer requires a minimum of k_B T ln 2 of energy to be dissipated as heat — Landauer's principle — directly connecting information processing to thermodynamic entropy. Maxwell's demon thought experiment (a demon controlling a partition between two gases) seems to allow entropy to decrease, but the demon must store and erase information, generating entropy that more than compensates for the decrease.
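
A short sketch makes both halves of this section tangible: it computes the Shannon entropy (in bits per symbol) of a few arbitrary example strings, then the Landauer limit at an assumed room temperature of 300 K.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """H = Σ p_i log2(1/p_i), in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: perfectly predictable
print(shannon_entropy("abababab"))  # 1.0 bit: two equally likely symbols
print(shannon_entropy("abcdefgh"))  # 3.0 bits: maximally unpredictable here

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln 2
k_B, T = 1.380649e-23, 300.0   # J/K; assumed room temperature in kelvin
print(f"Landauer limit at 300 K: {k_B * T * math.log(2):.2e} J per bit")
```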

Entropy and the Arrow of Time

The fundamental laws of physics — Newton's laws, Maxwell's equations, quantum mechanics — are all time-symmetric: they work equally well forward and backward. Yet macroscopic processes have a clear direction. The arrow of time is the direction of increasing entropy.

A film of a glass shattering looks wrong in reverse because the reverse process would involve entropy spontaneously decreasing — overwhelmingly improbable but not impossible in principle. A film of two molecules colliding looks identical forward and backward — microscopic processes are reversible.
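
A toy simulation in the spirit of the Ehrenfest urn model shows this asymmetry emerging from reversible micro-dynamics: each step moves one randomly chosen molecule to the other half of the box, a rule with no built-in time direction, yet the multiplicity of the macrostate climbs toward its maximum and stays there. The molecule count and step count are arbitrary illustrative choices.

```python
import math
import random
from math import comb

random.seed(0)
N = 50                   # molecules (illustrative)
left = set(range(N))     # start perfectly ordered: all in the left half

def log2_W(n_left: int) -> float:
    """log2 of the number of microstates of the current macrostate."""
    return math.log2(comb(N, n_left))

for step in range(10_001):
    if step % 2_000 == 0:
        print(f"step {step:5d}: n_left = {len(left):2d}, "
              f"log2 W = {log2_W(len(left)):5.2f}")
    mol = random.randrange(N)   # pick a molecule at random...
    if mol in left:             # ...and move it to the other half
        left.remove(mol)
    else:
        left.add(mol)
```

Running it, n_left drifts toward N/2 and log2 W rises from 0 toward its maximum (about 47 bits for N = 50). Playing these moves in reverse is an equally valid trajectory of the same rule, just an overwhelmingly improbable one.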

The ultimate reason time appears to flow in one direction is that the universe started in an extraordinarily low-entropy state — the Big Bang. The universe has been evolving toward higher entropy ever since. Stars, planets, life, and complexity are all temporary low-entropy structures that can form only because large entropy increases happen elsewhere: the Earth receives low-entropy sunlight from the Sun and radiates high-entropy waste heat back into space.

Frequently Asked Questions

What is entropy?

Entropy (S) measures the number of microscopic arrangements (microstates) available to a system: S = k_B ln W. Higher entropy means more disorder and more possible arrangements. Thermodynamically: ΔS = Q_rev/T. The second law states that total entropy of the universe always increases in natural processes.

Why does entropy always increase?

Because disordered states have vastly more microstates than ordered ones. A system evolving randomly is overwhelmingly likely to move toward higher-entropy configurations (those with more microstates). With 10²³ molecules, the probability of spontaneous ordering is so vanishingly small that it is effectively impossible — entropy increases as a matter of statistics, not absolute prohibition.

Can entropy decrease?

Local entropy can decrease — ice forming from water decreases the water's entropy — but only by increasing entropy elsewhere (the surroundings gain heat and entropy). Total entropy of the universe always increases or stays the same. The second law applies to isolated systems or to the universe as a whole.

What is the relationship between entropy and disorder?

"Disorder" is an intuitive shorthand for entropy's precise meaning: the number of microstates. A disordered system (gas spread throughout a container) has many more possible molecular arrangements than an ordered one (all molecules in one corner), so it has higher entropy. The ordered state is not forbidden — just astronomically less probable.

What is entropy in everyday life?

Entropy is why heat flows from hot to cold (more microstates become available), why perfume spreads from a bottle (more positions available to molecules), why buildings decay unless maintained (destruction has more microstates than ordered structure), and why you must put energy into a refrigerator to move heat from cold to hot against the natural entropy increase.

Is entropy the same as energy?

No. Entropy (J/K) and energy (J) are different quantities, though deeply related. Energy can be conserved while entropy increases — the first law and second law are independent. High-entropy energy (dispersed thermal energy) is less useful for doing work than low-entropy energy (organised, like electricity or gravitational potential). Entropy measures the quality (usefulness) of energy, not its quantity.



Written by

Dr. Sarah Kim

Thermodynamics researcher with a PhD from MIT, specializing in statistical mechanics and energy transfer. Passionate about connecting molecular physics to everyday phenomena.


