Saturday, June 07, 2008

Entropy and the Direction of Time

Each year, I offer one million points of extra credit (or something along those lines!) to the first student who can define time for me in a way that actually makes sense. I have yet to award those points, because there is still no good understanding of time or of the 'direction' of time that people like to talk about. For a thoughtful and truly interesting look at present-day thinking on the direction of time, check out a recent Scientific American article on the topic. It also gives a few really good examples that define entropy and explain why it is natural for the entropy of a system to increase, which is the content of the 2nd law of thermodynamics. Here is one segment of the article on entropy:


"The Puzzle of Entropy
Physicists encapsulate the concept of time asymmetry in the celebrated second law of thermodynamics: entropy in a closed system never decreases. Roughly, entropy is a measure of the disorder of a system. In the 19th century, Austrian physicist Ludwig Boltzmann explained entropy in terms of the distinction between the microstate of an object and its macrostate. If you were asked to describe a cup of coffee, you would most likely refer to its macrostate—its temperature, pressure and other overall features. The microstate, on the other hand, specifies the precise position and velocity of every single atom in the liquid. Many different microstates correspond to any one particular macrostate: we could move an atom here and there, and nobody looking at macroscopic scales would notice.

Entropy is the number of different microstates that correspond to the same macrostate. (Technically, it is the number of digits, or logarithm, of that number.) Thus, there are more ways to arrange a given number of atoms into a high-entropy configuration than into a low-entropy one. Imagine that you pour milk into your coffee. There are a great many ways to distribute the molecules so that the milk and coffee are completely mixed together but relatively few ways to arrange them so that the milk is segregated from the surrounding coffee. So the mixture has a higher entropy.

From this point of view, it is not surprising that entropy tends to increase with time. High-entropy states greatly outnumber low-entropy ones; almost any change to the system will land it in a higher-entropy state, simply by the luck of the draw. That is why milk mixes with coffee but never unmixes. Although it is physically possible for all the milk molecules to spontaneously conspire to arrange themselves next to one another, it is statistically very unlikely. If you waited for it to happen of its own accord as molecules randomly reshuffled, you would typically have to wait much longer than the current age of the observable universe. The arrow of time is simply the tendency of systems to evolve toward one of the numerous, natural, high-entropy states."
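To make the counting in that passage a little more concrete, here is a quick Python sketch of my own (not from the article, and the numbers are made up purely for illustration): treat the cup as just 20 cells holding 10 identical milk molecules, count the microstates for a "segregated" macrostate where the milk is confined to the left half versus a "mixed" macrostate where the molecules can sit anywhere, and take the logarithm of each count to get the entropy.

# Toy illustration of "entropy = logarithm of the number of microstates."
# My own sketch, not from the Scientific American article; the cell and
# molecule counts are invented just to keep the numbers small.
from math import comb, log

N = 20   # cells in our tiny "cup"
m = 10   # identical milk molecules

# Macrostate 1: milk segregated in the left half of the cup.
# With identical molecules there is exactly one way to fill the left N//2 cells.
W_segregated = comb(N // 2, m)   # comb(10, 10) = 1

# Macrostate 2: milk fully mixed, molecules free to occupy any cell.
W_mixed = comb(N, m)             # comb(20, 10) = 184756

print("microstates, segregated:", W_segregated)
print("microstates, mixed:     ", W_mixed)

# Entropy (in units where Boltzmann's constant is 1) is the log of the count.
print("S_segregated =", log(W_segregated))        # 0.0
print("S_mixed      =", round(log(W_mixed), 2))   # about 12.13

Even in this tiny toy cup, the mixed macrostate has 184,756 microstates to the segregated macrostate's 1, and that gap grows astronomically fast as you add more cells and molecules.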

Another example would be an egg. There is essentially only one state for a perfectly uncracked egg, but there are many ways to put fractures and cracks in it, meaning many more states for cracked eggs...this makes a cracked egg a higher-entropy state. There are vastly more ways still for an egg to break apart altogether, into collections of shell pieces of every size from small fragments to large chunks, so a broken, shattered egg is the highest-entropy state of all. Cracked and broken eggs make up the overwhelming majority of possible states, which is why those are the states eggs tend to go to, and why a shattered egg never spontaneously reassembles itself.
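The same statistical tendency shows up in a simple simulation. Here is one more Python sketch of my own (again, just an illustration with made-up numbers): 50 particles all start on the left side of a box, which is a rare, low-entropy macrostate, and then get reshuffled at random. The entropy, taken as the log of the number of microstates that share the current left/right split, climbs toward its maximum and, for all practical purposes, never comes back down to zero.

# A second toy sketch (my own, for illustration): random reshuffling drives
# a system toward its numerous high-entropy macrostates and almost never back.
import random
from math import comb, log

random.seed(1)
N = 50
sides = [0] * N   # 0 = left, 1 = right; start with everything on the left

for step in range(201):
    left = sides.count(0)
    if step % 40 == 0:
        # Entropy of the current macrostate ("how many particles are on the
        # left") is the log of the number of microstates with that count.
        print(f"step {step:3d}: left = {left:2d}, S = {log(comb(N, left)):.2f}")
    # Reshuffle: move a randomly chosen particle to a randomly chosen side.
    sides[random.randrange(N)] = random.choice([0, 1])

This is the arrow of time in miniature: the random reshuffling finds the high-entropy macrostates simply because there are so many more of them, just as the article says.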
