Matter is not given. In the present-day view it has to be constructed out of a more fundamental concept in terms of quantum fields. In this construction of matter, thermodynamic concepts (irreversibility, entropy) have a role to play.
–Ilya Prigogine and Isabelle Stengers, Order Out of Chaos, 1984
It is, of course, one of the foundational concepts of modern thought, haunting our dreams with the prospect of the universe grinding to a halt in heat death, but also animating our hope of understanding how life arose in the universe. In a Latourian context one might even speculate that entropy is the concept that, more than any other (except perhaps biological evolution, with which it has become richly intertwined), gives the lie to the Moderns' conceit that they are here and nature is somewhere over there, separated from one another by a sharp line of clear and distinct ideas. For the concept of entropy, unlike relativity and quantum mechanics, has arisen from deep within the world of classical physics.
According to Wikipedia, the term was coined in 1865 by Rudolf Clausius, but the work leading to the concept originated earlier in the century with the research of Lazare Carnot, a mathematician whose
1803 paper Fundamental Principles of Equilibrium and Movement proposed that in any machine the accelerations and shocks of the moving parts represent losses of moment of activity. In other words, in any natural process there exists an inherent tendency towards the dissipation of useful energy. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of Fire, which posited that in all heat-engines, whenever "caloric", or what is now known as heat, falls through a temperature difference, work or motive power can be produced from the actions of the "fall of caloric" between a hot and cold body.
There you have it, the machine, a mechanical device with moving parts. We have Newtonian mechanics with its three laws of motion and the grand suggestion that the universe works like a clock, a vast device of many parts all ticking away in perfect order, except when they don’t. And there’s La Mettrie’s 1748 treatise, Man a Machine.
Oh! how easy our intellectual life would have become if only the universe were nothing but a clock and we but little tick-tocks within it.
But it is not, nor are we. The mechanistic vision ground to a halt in the analysis of fire and we became but especially clever monkeys through Darwin's elucidation of a pattern he traced through the geological, paleontological, botanical and zoological records.
Though my interest in entropy is long-standing, my recent thoughts have been occasioned by various and numerous remarks the philosopher Levi Bryant has made at Larval Subjects, his blog. The post Entropy and Me is a representative example. Or, consider this passage from his book, The Democracy of Objects (pp. 227-228):
Entropy refers to the degree of disorder within a system. Suppose you have a tightly closed glass box and somehow introduce a gas into it. During the initial phases following the introduction of the gas into the system, the gas will be characterized by a high degree of order or a low degree of entropy. This is so because the particles of gas will be localized in one or the other region of the box. However, as time passes, the degree of disorder and entropy within the system will increase as the gas becomes evenly distributed throughout the box. In this respect, entropy is a measure of probability. If the earlier phases of the gas distribution indicate a lower degree of entropy than the later stages, then this is because in the earlier phases there is a lower degree of probability that the gas will be localized in any one place in the box. As time passes, the probability of finding gas particles located evenly throughout the box increases and we subsequently conclude that the degree of entropy has increased.
This seemed a bit, well, “off” to me. For one thing Bryant doesn’t say just how the gas gets introduced into the box. Surely he doesn’t mean that it gets magically whisked there through a Star Trekkian transporter. But what DOES he mean?
Well, he probably meant something like poking a small hole somewhere in the box and letting the air rush in. So that’s what I did. Not physically, of course, as I have no convenient source of high-vacuum boxes, but in my imagination.
I began imagining lots and lots of tiny tiny air molecules going in through the hole. Does that first cohort march in formation like a highly trained marching band or drill team, or do they twist and tumble every which way, pushed by the molecules behind them, and those behind them, and so forth? How fast do they move? Who’s the first to make it to the other side? And how do you measure their positions?
It seemed reasonable to think, as Bryant more or less stated (except, remember, he said nothing about a hole), that they’d be bunched up near the hole at the beginning and that, at the end, they’d be scattered evenly throughout the box. But how’d they get from one state to the other? Getting from New Jersey to New York is easy, there’s the Holland Tunnel, the Verrazano-Narrows Bridge, and so forth. But the kind of states we’re talking about aren’t geographical regions and moving from one to the other is not like getting in a car, turning the key (or pushing the button) and driving away.
And, by the way, just what does “evenly” mean? It might mean that they’re at the vertices of a cubic lattice, or some other regular structure, but I suspect that that’s not what Bryant meant. If not THAT, though, then just what? Perhaps he was, in his imagination, dividing the box into lots of tiny cubes. We then count the number of molecules in each cube. It doesn’t matter just where they are in the cube, just so they’re inside it. Some place. And when we’ve done our count we find that there’s approximately the same number in each imaginary cube.
Now we’re getting somewhere, says I to myself, we’re making progress.
But no, we’re not, we’re just getting deeper and deeper into the quicksand. What’s the size of our imaginary cubes? Does it matter? And those molecules, they’re moving, right, always moving. Since we can’t possibly examine all these imaginary cubes at one time, but have to look at one after another, how do we keep those molecules inside their proper imaginary cubes? And, since the little critters are identical to one another, how can we be sure that some of them aren’t sneaking about from cube to cube just to mess up our count?
Now, you might say, this is all nonsense, this stuff about imaginary cubes and pesky molecules who are unwilling to sit still for the count. Well, yes, you’re right, it’s nonsense in a way. But, if Bryant’s talk about order and probability is to have any substantive meaning, then we really do have to have some way of locating and counting those molecules. We need some way of taking measurements and my imaginings, some of them anyhow, are aimed at the informal notion of evenness. If we're going to measure it, well, what does it mean? Without measurements we’re just talking gibberish.
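The counting procedure imagined above, dividing the box into imaginary cubes and tallying the molecules in each, is what physicists call coarse-graining, and it can actually be sketched in a few lines of Python. This is a toy sketch, not real physics: the particle count, the number of cubes, and the "bunched up near the hole" starting region are all my own arbitrary assumptions.

```python
import numpy as np

def coarse_grained_entropy(positions, n_cells=4, box=1.0):
    """Partition the unit box into n_cells**3 imaginary cubes, count the
    molecules in each, and return the Shannon entropy (in nats) of the
    resulting occupation distribution."""
    edges = np.linspace(0.0, box, n_cells + 1)
    counts, _ = np.histogramdd(positions, bins=(edges, edges, edges))
    p = counts.ravel() / len(positions)  # fraction of molecules per cube
    p = p[p > 0]                         # 0 * log(0) counts as 0
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
n = 100_000
# "Just introduced": all molecules bunched into one corner of the box.
bunched = rng.uniform(0.0, 0.25, size=(n, 3))
# "Evenly distributed": molecules anywhere in the box, uniformly.
spread = rng.uniform(0.0, 1.0, size=(n, 3))

print(coarse_grained_entropy(bunched))  # low: only one cube occupied
print(coarse_grained_entropy(spread))   # high: close to log(64)
```

The bunched configuration sits entirely inside one cube and scores zero; the even distribution scores close to the maximum, ln 64, about 4.16, for 64 cubes. And the worry about cube size is a real one: change n_cells and both numbers change, which is exactly why the size of the imaginary cubes has to be part of any honest definition of the measurement.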
Still, it’s clear that something isn’t working. My thinking was at an impasse. I’m in over my head. What to do?
Call the Plumber
My plumber is Tim Perper. Not that he’s actually a plumber; he’s not even a physicist. He was trained as a molecular biologist and geneticist, worked in industry for a bit, worked in academia for a bit, and then decided that he was really more interested in human courtship than in complex molecules. So he spent a couple of years hanging out in bars, night clubs, church socials and such and wrote down what he saw people doing, all courtesy of the Guggenheim Foundation. He wrote that work up in a book, Sex Signals (1985), along with, of course, a lot more, including Ovid and Durkheim.
Tim also has a long-standing interest in non-linear dynamics, complexity, and chaos, and has used such methods to analyze the back-and-forth moves of human courtship. That’s entropy territory.
So I copied Bryant’s paragraph into an email, added my own thoughts (including the cubic lattice), and sent it off to Tim: Help!
Here’s a couple of sentences from his reply:
"Entropy," [Bryant] writes, "refers to the degree of disorder within a system." No, that's not what entropy is: it’s a measure of how IRREVERSIBLE a physical process is, and disorder has nothing to do with it.
I knew that, said I to myself, meaning the irreversible bit. And I’m sure Bryant knows it too. Sorta. Tim’s reply continues:
A good physical-chemical example occurs during crystallization of a solution (say, of table salt): the water evaporates and one ends up with lovely sparkly white crystals. The crystals are not disordered at all compared to the solution; in fact the crystallized salt molecules have a MUCH higher degree of orderliness than they did when they were dissolved. But the process is IRREVERSIBLE given that the water has evaporated -- no one is going to chase down astronomically huge numbers of evaporated water molecules from where they wandered off when they evaporated. So that process is one way only. But, you say, one can add fresh water. That of course is true, but you are not adding the same water that evaporated. You have not reversed the PHYSICAL processes that led to crystallization -- you have just dissolved the salt in newly added water. It's like saying that Harold the Furniture Salesman isn't irreversibly dead -- we can always have ANOTHER baby! No, that doesn't count. Harold is defunct, and irreversibly so no matter how many babies other people have.
So, as Prigogine and Stengers have taught us—by the way, that’s one of Bryant’s favorite locutions, “as X has taught us”—as Prigogine and Stengers have taught us, “the concept of order (or disorder) is more complex than was thought” (Order Out of Chaos, p. 287).
Hoo, boy! Is it ever!
The problem with the concepts of disorder and entropy that were rattling about in my brain, as near as I can tell, is that the connection with temporality was not deep enough, not of the right kind, something like that. Beyond that, I offer you the last paragraph of Perper’s email:
The gas flowing into a vacuum is a classic example from late 19th century physical chemistry. It's a lot trickier than it looks, but I don't recommend trying to figure it out from a thermodynamic viewpoint, not if you're not PERFECTLY comfortable with statistical mechanics, advanced calculus, and related technical topics.
Of course that knowledge—“statistical mechanics, advanced calculus, and related technical topics”—is what I don’t have, nor, I suspect, does Bryant. That’s why I turned to Perper, and have been doing so for well over a decade: Thanks, Tim. His reply certainly didn’t give me the technical knowledge I lack. I wasn't expecting that and Perper didn't attempt to provide it. But we’ve been doing this dance for so long that he was able to read my email and tell me just what I needed (irreversible plus that example, crystallization) to clarify my understanding and intuition. The rest was up to me.
These days every humanist needs a Tim Perper.
Meanwhile, on the Table Top
Fortunately, that’s not the end of the story. I say “fortunately” because I still feel that things are sort of just, I don’t know, “hanging” in mid-air with respect to the nature of entropy. I mean, we now know that “disorder” is not a good synonym for “entropy” and we have some idea of why that is—or I do at any rate—but is there something more? I think there is; another example, and examples always help.
While I was thinking about those pesky gas molecules I was wondering if there was a simple way of visualizing what those molecules were up to, or at any rate, what I suspected they were up to. What would happen if we were to release a drop of black ink into a glass of water? We all know what would happen, the ink would whirl and twirl and eventually diffuse throughout the water. THAT sort of thing is as familiar as stirring cream into coffee. It’s called turbulence.
No sooner had I thought about it than I realized: I can do this. Now, I don’t own the kind of equipment you need to do this sort of thing in a serious way. But that’s not what I’m after. I just want some pictures that give some useful intuition into the evolution of entropy in a closed system.
So, I filled a tumbler with water and set it on a window sill (for the light). I let it settle awhile.
Why let it settle? Because any jostling and sloshing in the water would add structure to the ink flow, and I wanted to see how the ink would flow into still water.
OK, so how long did you let the water settle?
‘Till the reflection stopped moving.
The light coming in through the window and reflecting from the surface of the water onto the wall, like this:
Oh, I see. As long as the water was moving around, the bright spot on the wall jiggled about. When the light spot stopped moving, you knew that that water had settled.
Precisely. I figured the water wouldn’t be completely still at that point, but it would be still enough for my purposes.
I then held an ink dropper in my left hand (Higgins Black Magic Waterproof Drawing Ink, No. 44011, if you care to know), squeezed the bulb, and started taking pictures with the camera I was holding in my right hand. I wasn’t able to get the drops—I surmise there was more than one—just as they hit the water’s surface but I reckon I took my first shot within a second or two after I squeezed the bulb. I then took further shots at irregular and increasingly long intervals thereafter. I took the last photograph at about four and a half hours after the first.
Why so long?
Because this is what the tumbler looked like two hours and twenty-one minutes after drop-off:
First, notice the air bubbles that have coalesced and separated out of the water. Second, notice the structure of ink swirls in the water. In particular, look at that darkish area at the lower left where ink appears to have pooled at the bottom. When you look down from the top it doesn’t appear dark at all, but looking from the side you’re looking through a greater (horizontal) depth of ink, so it appears darker.
Now, here’s the first photo, followed by the last one I’d mentioned above, the four-and-a-half hour one:
Looking at the second photo, the end-state one, ignore those whitish streaks, they’re lighting artifacts (reflections on the front of the tumbler) having nothing to do with the phenomenon under investigation. How do I know? Because I was there and looked at the tumbler from various angles. No matter what my point of view, the fluid appeared to be a homogenous light grey. (Discriminating between artifacts and the real phenomenon, noise from data, is, of course, a real issue and often enough it is difficult and contentious. That’s one reason why results need to be replicated by other researchers in other laboratories.)
Now, the first photo, it’s not quite the beginning state, but close to it. I DO have a photo of the tumbler before the ink dropped, and I’ll show to you if you insist. But really, what’s there to see? Nothing, that’s what.
You see why I surmise there were several drops? That cascade looks as though there were three closely spaced drops, and perhaps a fourth (look to the left). The important point, however, is that structure is already evolving. We’re not looking at round or tear-shaped drops of ink; we’ve got stringy blobs. There’s structure there, but structure that’s difficult to describe in words. We’ve got mathematics that does a better job.
Entropy: What it means, as Perper said, is that the evolution of the system from the structure in that first photo to the structure in that next photo is irreversible. Never in a million years will the black ink particles pull themselves together and reform into the structure we see in that first photo nor, for that matter, will that structure go backwards and become the structure that existed when the ink first hit the water. Irreversible.
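That statistical sense of "never in a million years" can be illustrated with a toy random walk, a one-dimensional caricature of the ink particles. This is a hedged sketch under my own assumptions (the particle and step counts are arbitrary): treat each particle as a walker taking unbiased steps left or right, the analogue of molecular jostling, and watch how spread out the ensemble becomes.

```python
import random

random.seed(42)

def spread(positions):
    """Standard deviation of the positions: a crude measure of how
    bunched up the 'ink' is (0 means every particle at the same point)."""
    m = sum(positions) / len(positions)
    return (sum((x - m) ** 2 for x in positions) / len(positions)) ** 0.5

# All "ink particles" start at the same point, like the drop hitting the water.
particles = [0.0] * 2000
widths = [spread(particles)]
for _ in range(500):
    # Each particle is jostled one unit left or right, independently.
    particles = [x + random.choice((-1.0, 1.0)) for x in particles]
    widths.append(spread(particles))

print(widths[0], widths[-1])  # starts at exactly 0, ends around sqrt(500) ~ 22
```

Each individual step is perfectly reversible, yet the ensemble only ever gets wider, growing roughly as the square root of the number of steps, and it will never, statistically speaking, shrink back to zero on its own. Run it with any seed; the arrow points the same way.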
The next three photos are respectively 2, 10, and 31 seconds after the first:
The structure in the second photo appears closely related to that in the first photo. Now look at the third photo. Except for that streak at the upper left, the initial structure appears almost completely gone. The ink is spreading along the bottom of the tumbler and beginning to crawl up the left side. And now (sheesh! always with the now) look at photo four. Draw your own conclusions.
So far so good. This is pretty much what I expected. What I didn’t at all expect is emergent structure. I took these three photos at 7:53, 8:52 and 22:20 (minutes:seconds) after the first one:
If you examine the photographs carefully you’ll see vertical convection cells. The ink particles are now circulating between the top and bottom of the tumbler in side-by-side columns.
Where did THAT order come from?
It wasn’t in the tumbler itself and it certainly wasn’t in the ink. It can only have come from the evolution of the system as the ink circulated throughout the water. The structure is self-organized, but relatively short-lived. We perhaps see remnants of it at the 2-hour and 20-minute mark in the first photograph at the top, but it is gone by the very last one.
If we now review those photographs we can see, not in our mind’s eye, but visibly, in these images, why we cannot identify the informal notion of disorder with the formal notion of entropy. We know that as this system evolved from the state depicted in the first photo to the state depicted in the last, the entropy increased. The structure that was there in that relatively early state has completely disappeared by the last state.
But it had disappeared well before that. By eight minutes in (and probably before then, but I wasn’t photographing or watching continuously), the initial structure had been wiped out and a new structure emerged. That structure was also of a new kind: vertical convection cells. One kind of order had evolved to another—as in Perper’s crystallization example. Unlike the crystallization example, this structure then dissipated, evolving to a homogenous grey fluid.
So, what IS entropy?
As I’ve already said, I don’t really know. I can’t do the math. All I can do is what I’ve done above: Tell you what I know as clearly as I can.
If entropy at all interests you, and you want to use it in your intellectual work, even if only as background—then you have to make your own peace with the concept. If you have the time and inclination, by all means, learn the math. Otherwise you should find yourself someone like Tim Perper and work with him or her. Carefully and closely.
Sure, read popular expositions, good ones. Take a look at real physics textbooks. Do as much of that as you’ve got time for. But that is no substitute for talking with an expert. You can’t learn a concept unless you make active use of it, and that means interacting with others. You can’t interact with a book. You can with a person.
Plato knew what he was doing when he faked those dialogues. But you shouldn't fake your dialogues. Talk with people.
* * * * *
Addendum 1.11.15: There's an interesting discussion of entropy HERE, where we learn:
Briefly, spontaneous processes tend to proceed from states of low probability to states of higher probability. The higher-probability states tend to be those that can be realized in many different ways. Entropy is a measure of the number of different ways a state with a particular energy can be realized. Specifically,

S = k ln W

where k is Boltzmann's constant and W is the number of equivalent ways to distribute energy in the system. If there are many ways to realize a state with a given energy, we say it has high entropy. Often the many ways to realize a high entropy state might be described as "disorder", but the lack of order is beside the point; the state has high entropy because it can be realized in many different ways, not because it's "messy".
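That S = k ln W formula can be made concrete with the simplest possible microstate count. Suppose N labeled particles, each sitting in either the left or right half of a box, so that W for a given split is just a binomial coefficient. A toy sketch in Python (the two-compartment model and N = 100 are my own simplifications, not the full statistical-mechanical story):

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def entropy_of_split(n_total, n_left):
    """S = k ln W, where W = C(n_total, n_left) counts the ways to choose
    which n_left of the n_total particles sit in the left half of the box."""
    W = comb(n_total, n_left)
    return k_B * log(W)

N = 100
for n_left in (0, 25, 50):
    print(f"{n_left:3d} particles on the left: S = {entropy_of_split(N, n_left):.3e} J/K")
```

With all 100 particles crammed into one half there is exactly one way to do it, W = 1, so S = 0; the fifty-fifty split can be realized in C(100, 50), roughly 10^29, ways and has the maximum entropy. Nothing about "messiness" enters the calculation; it is pure counting, which is the addendum's point.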