
Of course, all of this is only statistical, not absolute. That is, it’s certainly possible that we could have started with an even distribution of molecules to the right and left, and just by chance a large number of them could jump to one side, leaving us with a very uneven distribution. As we’ll see, that’s unlikely, and it becomes more unlikely as we get more and more particles involved; but it’s something to keep in mind. For now, let’s ignore these very rare events and concentrate on the most likely evolution of the system.

ENTROPY À LA BOLTZMANN

We would like to do better than simply saying, “Yeah, it’s pretty obvious that the molecules will most likely move around until they are evenly distributed.” We want to be able to explain precisely why we have that expectation, and turn “evenly distributed” and “most likely” into rigorously quantitative statements. This is the subject matter of statistical mechanics. In the immortal words of Peter Venkman: “Back off, man, I’m a scientist.”

Boltzmann’s first major insight was the simple appreciation that there are more ways for the molecules to be (more or less) evenly distributed through the box than there are ways for them to be all huddled on the same side. Imagine that we had numbered the individual molecules, 1 through 2,000. We want to know how many ways we can arrange things so that there are a certain number of molecules on the left and a certain number on the right. For example, how many ways are there to arrange things so that all 2,000 molecules are on the left, and zero on the right? There is only one way. We’re just keeping track of whether each molecule is on the left or on the right, not any details about its specific position or momentum, so we simply put every molecule on the left side of the box.

But now let’s ask how many ways there are for there to be 1,999 molecules on the left and exactly 1 on the right. The answer is: 2,000 different ways—one for each of the specific molecules that could be the lucky one on the right side. If we ask how many ways there are to have 2 molecules on the right side, we find 1,999,000 possible arrangements. And when we get bold and consider 3 molecules on the right, with the other 1,997 on the left, we find 1,331,334,000 ways to make it happen.

It should be clear that these numbers are growing rapidly: 2,000 is a lot bigger than 1, and 1,999,000 is a lot bigger than 2,000, and 1,331,334,000 is bigger still. Eventually, as we imagine moving more and more molecules to the right and emptying out the left, they would begin to go down again; after all, if we ask how many ways we can arrange things so that all 2,000 are on the right and zero are on the left, we’re back to only one unique way.
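These counts are easy to verify. Here is a purely illustrative Python sketch (the variable names are mine), using the binomial coefficient “2,000 choose k” to count the ways of picking which k labeled molecules sit on the right:

import math

N = 2000  # total number of labeled molecules in the box

for k in (0, 1, 2, 3):
    ways = math.comb(N, k)  # ways to choose which k molecules are on the right
    print(f"{k} on the right: {ways:,} arrangements")

# Prints 1, then 2,000, then 1,999,000, then 1,331,334,000, matching the text.
# The counting is symmetric: all-on-the-right is just as special as all-on-the-left.
print(math.comb(N, 0) == math.comb(N, N))  # True

That same symmetry is why the counts eventually turn around and shrink back to 1 as the left side empties out.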

The situation corresponding to the largest number of different possible arrangements is, unsurprisingly, when things are exactly balanced: 1,000 molecules on the left and 1,000 molecules on the right. In that case, there are—well, a really big number of ways to make that happen. We won’t write out the whole thing, but it’s approximately 2 × 10^600 different ways; a 2 followed by 600 zeroes. And that’s with only 2,000 total particles. Imagine the number of possible arrangements of atoms we could find in a real roomful of air or even a glass of water. (Objects you can hold in your hand typically have about 6 × 10^23 molecules in them—Avogadro’s Number.) The age of the universe is only about 4 × 10^17 seconds, so you are welcome to contemplate how quickly you would have to move molecules back and forth before you explored every possible allowed combination.
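The balanced case can be checked in the same illustrative way; the exact count is an ordinary (if gigantic) integer, and a few lines confirm that it has 601 digits, which is what “approximately 2 × 10^600” means, and that it dwarfs the age of the universe in seconds:

import math

# Exact count of balanced arrangements: 1,000 of the 2,000 labeled molecules on each side.
ways_balanced = math.comb(2000, 1000)

print(len(str(ways_balanced)))     # 601 decimal digits, i.e. roughly 2 x 10^600
print(str(ways_balanced)[0])       # the leading digit is 2
print(ways_balanced > 4 * 10**17)  # True: far more arrangements than seconds since the Big Bang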

This is all very suggestive. There are relatively few ways for all of the molecules to be hanging out on the same side of the box, while there are very many ways for them to be distributed more or less equally—and we expect that a highly uneven distribution will evolve easily into a fairly even one, but not vice versa. But these statements are not quite the same. Boltzmann’s next step was to suggest that, if we didn’t know any better, we should expect systems to evolve from “special” configurations into “generic” ones—that is, from situations corresponding to a relatively small number of arrangements of the underlying particles, toward situations corresponding to a larger number of such arrangements.

Boltzmann’s goal in thinking this way was to provide a basis in atomic theory for the Second Law of Thermodynamics, the statement that the entropy will always increase (or stay constant) in a closed system. The Second Law had already been formulated by Clausius and others, but Boltzmann wanted to derive it from some simple set of underlying principles. You can see how this statistical thinking leads us in the right direction—“systems tend to evolve from uncommon arrangements into common ones” bears a family resemblance to “systems tend to evolve from low-entropy configurations into high-entropy ones.”

So we’re tempted to define “entropy” as “the number of ways we can rearrange the microscopic components of a system that will leave it macroscopically unchanged.” In our divided-box example, that would correspond to the number of ways we could rearrange individual molecules that would leave the total number on each side unchanged.

That’s almost right, but not quite. The pioneers of thermodynamics actually knew more about entropy than simply “it tends to go up.” For example, they knew that if you took two different systems and put them into contact next to each other, the total entropy would simply be the sum of the individual entropies of the two systems. Entropy is additive, just like the number of particles (but not, for example, like the temperature). But the number of rearrangements is certainly not additive; if you combine two boxes of gas, the number of ways you can rearrange the molecules between the two boxes is enormously larger than the number of ways you can rearrange them within each box.

Boltzmann was able to crack the puzzle of how to define entropy in terms of microscopic rearrangements. We use the letter W—from the German Wahrscheinlichkeit, meaning “probability” or “likelihood”—to represent the number of ways we can rearrange the microscopic constituents of a system without changing its macroscopic appearance. Boltzmann’s final step was to take the logarithm of W and proclaim that the result is proportional to the entropy.
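Why does taking the logarithm help? Here is a short illustrative calculation, using two independent toy boxes, each holding 1,000 labeled molecules split 500/500: the arrangement counts of the two boxes multiply, and the logarithm turns that product into a sum, which is exactly the additive behavior the thermodynamic pioneers demanded of entropy.

import math

# Two independent boxes, each with 1,000 labeled molecules split 500/500.
W1 = math.comb(1000, 500)  # arrangements of box 1 consistent with its macrostate
W2 = math.comb(1000, 500)  # arrangements of box 2 consistent with its macrostate

# Counts multiply: every arrangement of box 1 can pair with every arrangement of box 2.
W_both = W1 * W2

# Logarithms turn the product into a sum, so a log-based entropy is additive.
print(math.log10(W_both))
print(math.log10(W1) + math.log10(W2))  # same value, up to floating-point round-off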

The word logarithm sounds very highbrow, but it’s just a way to express how many digits it takes to express a number. If the number is a power of 10, its logarithm is just that power. So the logarithm of 10 is 1, the logarithm of 100 is 2, the logarithm of 1,000,000 is 6, and so on.

In the Appendix, I discuss some of the mathematical niceties in more detail. But those niceties aren’t crucial to the bigger picture; if you just glide quickly past any appearance of the word logarithm, you won’t be missing much. You only really need to know two things:

• As numbers get bigger, their logarithms get bigger.
• But not very fast. The logarithm of a number grows slowly as the number itself gets bigger and bigger. One billion is much greater than 1,000, but 9 (the logarithm of 1 billion) is not much greater than 3 (the logarithm of 1,000).

That last bit is a huge help, of course, when it comes to the gigantic numbers we are dealing with in this game. The number of ways to distribute 2,000 particles equally between two halves of a box is 2 × 10^600, which is an unimaginably enormous quantity. But the logarithm of that number is just 600.3, which is relatively manageable.
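That 600.3 can be confirmed directly; in this illustrative check, the logarithm tames a 601-digit count into a number you can read at a glance:

import math

ways_balanced = math.comb(2000, 1000)  # the roughly 2 x 10^600 count from before
print(math.log10(ways_balanced))       # about 600.3, as promised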

Boltzmann’s formula for the entropy, which is traditionally denoted by S (you wouldn’t have wanted to call it E, which usually stands for energy), states that it is equal to some constant k, cleverly called “Boltzmann’s constant,” times the logarithm of W, the number of microscopic arrangements of a system that are macroscopically indistinguishable. That is:

S = k log W.

This is, without a doubt, one of the most important equations in all of science—a triumph of nineteenth-century physics, on a par with Newton’s codification of dynamics in the seventeenth century or the revolutions of relativity and quantum mechanics in the twentieth. If you visit Boltzmann’s grave in Vienna, you will find this equation engraved on his tombstone (see Chapter Two).

Taking the logarithm does the trick, and Boltzmann’s formula leads to just the properties we think something called “entropy” should have—in particular, when you combine two systems, the total entropy is just the sum of the two entropies you started with. This deceptively simple equation provides a quantitative connection between the microscopic world of atoms and the macroscopic world we observe.
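In computational terms (an illustrative sketch, not part of the original derivation), the formula is a one-liner. In physics the logarithm in S = k log W is conventionally the natural logarithm, with k about 1.38 × 10^-23 joules per kelvin; choosing a different base merely rescales the constant, and the additivity survives either way.

import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def boltzmann_entropy(W: int) -> float:
    """Entropy S = k ln W for a macrostate realized by W microscopic arrangements."""
    return K_BOLTZMANN * math.log(W)

# Entropy of the balanced macrostate of our 2,000-molecule toy box.
W_balanced = math.comb(2000, 1000)
print(boltzmann_entropy(W_balanced))  # about 1.9e-20 joules per kelvin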

BOX OF GAS REDUX

As an example, we can calculate the entropy of the box of gas with a small hole in a divider that we illustrated in Figure 42. Our macroscopic observable is simply the total number of molecules on the left side or the right side. (We don’t know which particular molecules they are, nor do we know their precise coordinates and momenta.) The quantity W in this example is just the number of ways we could distribute the 2,000 total particles without changing the numbers on the left and right. If there are 2,000 particles on the left, W equals 1, and log W equals 0. Some of the other possibilities are listed in Table 1.

Table 1: The number of arrangements W, and the logarithm of that number, corresponding to a divided box of 2,000 particles with some on the left side and some on the right side.

In Figure 43 we see how the entropy, as defined by Boltzmann, changes in our box of gas. I’ve scaled things so that the maximum possible entropy of the box is equal to 1. It starts out relatively low, corresponding to the first configuration in Figure 42, where 1,600 molecules were on the left and only 400 on the right. As molecules gradually slip through the hole in the central divider, the entropy tends to increase. This is one particular example of the evolution; because our “law of physics” (each particle has a 0.5 percent chance of switching sides every second) involved probabilities, the details of any particular example will be slightly different. But it is overwhelmingly likely that the entropy will increase, as the system tends to wander into macroscopic configurations that correspond to larger numbers of microscopic arrangements. The Second Law of Thermodynamics in action.
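The behavior in Figure 43 can be mimicked with a minimal simulation sketch (illustrative only; since the rule is probabilistic, each run will differ slightly, just as the text says). Two thousand particles start in the lopsided 1,600/400 split, each one hops to the other side with probability 0.005 every second, and the entropy is scaled so that its maximum possible value is 1:

import math
import random

N = 2000          # total particles
n_left = 1600     # start in the lopsided configuration of Figure 42
P_SWITCH = 0.005  # each particle has a 0.5 percent chance per second of switching sides

# Entropy of the balanced macrostate, used to scale the maximum to 1.
MAX_LOG_W = math.log(math.comb(N, N // 2))

def scaled_entropy(n: int) -> float:
    """log W for a macrostate with n particles on the left, scaled to a maximum of 1."""
    return math.log(math.comb(N, n)) / MAX_LOG_W

for second in range(501):
    if second % 100 == 0:
        print(second, n_left, round(scaled_entropy(n_left), 3))
    # Each particle independently decides whether to slip through the hole this second.
    left_to_right = sum(random.random() < P_SWITCH for _ in range(n_left))
    right_to_left = sum(random.random() < P_SWITCH for _ in range(N - n_left))
    n_left += right_to_left - left_to_right

Run it a few times: the counts on each side drift toward 1,000, the scaled entropy creeps up toward 1, and the occasional small downward wiggle is one of the rare fluctuations mentioned at the start of this section.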

So this is the origin of the arrow of time, according to Boltzmann and his friends. We start with a set of microscopic laws of physics that are time-reversal invariant: They don’t distinguish between past and future. But we deal with systems featuring large numbers of particles, where we don’t keep track of every detail necessary to fully specify the state of the system; instead, we keep track of some observable macroscopic features. The entropy characterizes (by which we mean, “is proportional to the logarithm of”) the number of microscopic states that are macroscopically indistinguishable. Under the reasonable assumption that the system will tend to evolve toward the macroscopic configurations that correspond to a large number of possible states, it’s natural that entropy will increase with time. In particular, it would be very surprising if it spontaneously decreased. The arrow of time arises because the system (or the universe) naturally evolves from rare configurations into more common configurations as time goes by.
