Bekenstein's topical overview "A Tale of Two Entropies" describes
potentially profound implications of Wheeler's trend, in part by noting a
previously unexpected connection between the world of information theory
and classical physics. This connection was first described shortly
after the seminal 1948 papers of American applied mathematician Claude E. Shannon introduced today's most widely used measure of information content, now known as Shannon entropy.
As an objective measure of the quantity of information, Shannon entropy
has been enormously useful: the design of all modern communications
and data storage devices, from cellular phones to modems to hard disk drives and DVDs, relies on it.
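To make the measure concrete, here is a minimal sketch in Python (an illustration added here, not part of the original text) of Shannon's formula H = -Σ p·log2(p), summed over the outcome probabilities p of a source; the function name shannon_entropy is my own.

    import math

    def shannon_entropy(probabilities):
        # H = -sum(p * log2(p)), in bits; outcomes with p == 0 contribute nothing.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin: exactly 1.0 bit per toss
    print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits per toss

A fair coin maximizes the entropy; any bias makes the outcomes more predictable and so lowers the information carried by each toss.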
In thermodynamics (the branch of physics dealing with heat), entropy is popularly described as a measure of the "disorder" in a physical system of matter and energy. In 1877 the Austrian physicist Ludwig Boltzmann described it more precisely, in terms of the number of distinct microscopic states that the particles composing a macroscopic "chunk" of matter could be in while still looking
like the same macroscopic "chunk". As an example, for the air in a
room, the thermodynamic entropy is proportional to the logarithm of the count
of all the ways the individual gas molecules could be distributed
in the room and all the ways they could be moving.
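In symbols, this is Boltzmann's relation S = k ln W, where W is the microstate count and k is Boltzmann's constant. As a toy illustration in Python (my own sketch, not from the source; the particles-in-cells model is a deliberate simplification):

    import math

    K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by the 2019 SI definition)

    def boltzmann_entropy(microstates):
        # S = k_B * ln(W), where W is the number of microstates.
        return K_B * math.log(microstates)

    # Toy model: N distinguishable particles, each free to sit in any of M cells,
    # giving W = M**N position microstates (velocities ignored for simplicity).
    N, M = 100, 1000
    print(boltzmann_entropy(M ** N))  # ~9.54e-21 J/K for this tiny system

The logarithm is what keeps the numbers manageable: this toy system already has 10^300 microstates, yet its entropy is a small, well-behaved quantity.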