Wednesday, December 08, 2010

Why entropy is a good thing.

Entropy is a measure of the information you don't have about a system.

This is the clearest definition of entropy I have come across. Thought of this way, it's easy to visualize why the measure is useful.
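To make the "missing information" reading concrete, here is a minimal sketch of my own (not from the quoted source) using Shannon's formula H = -Σ p·log₂(p), which literally counts the bits of information you are missing about an outcome:

    import math

    def shannon_entropy(probs):
        # Bits of missing information in a discrete distribution.
        # Zero-probability outcomes contribute nothing, so skip them.
        return sum(-p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit unknown
    print(shannon_entropy([0.99, 0.01]))  # biased coin: ~0.08 bits unknown
    print(shannon_entropy([1.0]))         # certainty: 0.0 bits unknown

The more evenly spread the possibilities, the more information you lack about which one holds, and the higher the entropy.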

It also prompts thoughts about the nature of knowledge - the implication is that the information you lack is, to some degree, knowable. But the model system being discussed is necessarily simplified, and it ignores the fact that in reality there are discontinuities along the curve of increasing knowledge of a system. A simple example: a chamber of hydrogen particles behaving as a gas differs from a chamber of uranium particles behaving as a gas in fundamental and discrete ways.

We can generalize this: within every system whose knowability we can measure, there is also the potential for that system, at a deeper level, to contain unknowability we cannot measure.
