Sunday, January 11, 2015

Entropy

There's an interesting discussion of entropy HERE, where we learn:
Briefly, spontaneous processes tend to proceed from states of low probability to states of higher probability. The higher-probability states tend to be those that can be realized in many different ways.
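To see what "realized in many different ways" means, here's a minimal sketch in Python (a coin-flip illustration of my own, not taken from the linked discussion). Each exact sequence of flips is a microstate; the total number of heads is the macrostate we actually observe; and the macrostate that can be realized in the most ways is the one we're most likely to find.

```python
from itertools import product
from collections import Counter

# Every sequence of 4 fair coin flips is an equally likely "microstate";
# the total number of heads is the "macrostate" we actually observe.
microstates = list(product("HT", repeat=4))
ways = Counter(seq.count("H") for seq in microstates)

for heads, w in sorted(ways.items()):
    print(f"{heads} heads: {w} of {len(microstates)} microstates, "
          f"probability {w / len(microstates):.4f}")
# "2 heads" can be realized 6 ways; "0 heads" only 1 way. A shuffled
# system is six times more likely to be found in the 2-heads macrostate.
```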

Entropy is a measure of the number of different ways a state with a particular energy can be realized. Specifically,
$S = k \ln W$
where k is Boltzmann's constant and W is the number of equivalent ways to distribute energy in the system. If there are many ways to realize a state with a given energy, we say it has high entropy. The many ways to realize a high-entropy state are often described as "disorder", but the lack of order is beside the point; the state has high entropy because it can be realized in many different ways, not because it's "messy".
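On that reading, entropy is just a logarithmic count of ways. Here's a quick sketch of the formula in Python (Boltzmann's constant is the SI value; the W values are arbitrary examples, including the 6-way macrostate from the coin-flip sketch above):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W: int) -> float:
    """S = k ln W: entropy of a state realizable in W equivalent ways."""
    return k_B * math.log(W)

# A state realizable in only one way has zero entropy; entropy grows
# (slowly, logarithmically) with the number of ways W.
for W in (1, 6, 10**23):
    print(f"W = {W:g}: S = {boltzmann_entropy(W):.3e} J/K")
```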
