Talk:entropy


Background for addition to (2) and new definition (4)

"Entropy is 'disorder'" has been a standard definition in dictionaries and in science since 1898. Its origin is not in profound science or supported by scientific thought and analytical discussion. Instead, it was the informal, general language summary of a great theorist, Boltzmann, because he knew molecules moved randomly and thus chose 'disorder' as a summary of the final state of any system at equilibrium. Here are the details: http://en.wikipedia.org/wiki/Talk:Entropy/Archive2#Disorder . Because 'disorder' is fundamentally an unscientific term (and its meaning debated without agreement for a hundred years), it has been replaced by a scientific view of entropy (definition 4, here) in the majority of chemistry texts published since 2003. They are listed under "December 2005" in http://www.entropysite.com.

Definition 4, describing entropy as a measure of the amount and extent of energy dispersal, has been adopted by the majority of first-year chemistry textbooks and by several in physical chemistry, as listed in the Internet source above. It is part of the modern view of the second law of thermodynamics, which states the innate tendency of all types of energy to become dispersed and spread out in space and in time. (A discussion of the modern view of the second law for non-scientists and beginners in chemistry is at http://en.wikibooks.org/wiki/The_Second_Law_of_Thermodynaimcs [The misspelling was in the origin of the Wikibook page!]) FrankLambert 16:35, 15 July 2006 (UTC)
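
As a rough numerical sketch of "energy dispersal" (my own illustration with assumed values, not an example from the textbooks cited above): when heat flows from a hot object into cooler surroundings, the energy spreads out and the total entropy rises.

 # Sketch: entropy change when 1000 J of heat disperses from a 400 K block
 # into 300 K surroundings (assumed values).
 q = 1000.0                      # joules of heat dispersed
 t_hot, t_cold = 400.0, 300.0    # kelvin
 ds_hot = -q / t_hot             # the hot block loses entropy (-2.5 J/K)
 ds_cold = q / t_cold            # the cooler surroundings gain more (+3.33 J/K)
 print(ds_hot + ds_cold)         # net change ~ +0.83 J/K: dispersal raises entropy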

Public record from .gov site, not even bearing a copyright

Information Is Not Entropy, Information Is Not Uncertainty!
Dr. Thomas D. Schneider
National Institutes of Health, National Cancer Institute, Center for Cancer Research
Nanobiology Program, Molecular Information Theory Group
Frederick, Maryland 21702-1201
schneidt@mail.nih.gov
http://www.ccrnp.ncifcrf.gov/~toms/


There are many, many statements in the literature which say that information is the same as entropy. The reason for this was related by Tribus. The story goes that Shannon didn't know what to call his measure, so he asked von Neumann, who said "You should call it entropy ... [since] ... no one knows what entropy really is, so in a debate you will always have the advantage" (Tribus1971).

Shannon called his measure not only the entropy but also the "uncertainty". I prefer this term because it does not have physical units associated with it. If you correlate information with uncertainty, then you get into deep trouble. Suppose that:


information ~ uncertainty

but since they have almost identical formulae:


uncertainty ~ physical entropy

so


information ~ physical entropy


BUT as a system gets more random, its entropy goes up:


randomness ~ physical entropy


so


information ~ physical randomness


How could that be? Information is the very opposite of randomness!

The confusion comes from neglecting to do a subtraction:


Information is always a measure of the decrease of uncertainty at a receiver (or molecular machine).

If you use this definition, it will clarify all the confusion in the literature.

Note: Shannon understood this distinction and called the uncertainty which is subtracted the 'equivocation'. Shannon (1948) said on page 20:


R = H(x) - H_y(x)

"The conditional entropy H_y(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal."

The mistake is almost always made by people who are not actually trying to use the measure. As a practical example, consider sequence logos. Further discussion of this topic is in the FAQ at http://www.ccrnp.ncifcrf.gov/~toms/bionet.info-theory.faq.html under the topic "I'm Confused: How Could Information Equal Entropy?"
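
A hedged sketch of the sequence-logo calculation just mentioned (the alignment column is invented, and Schneider's small-sample correction is ignored): the information at a position is the 2 bits of uncertainty before looking minus the uncertainty of the observed base frequencies.

 import math
 from collections import Counter
 column = list("AAAAAAAATAAAAAAC")                # assumed DNA alignment column
 freqs = [n / len(column) for n in Counter(column).values()]
 h_before = math.log2(4)                          # 2 bits: four equally likely bases
 h_after = -sum(f * math.log2(f) for f in freqs)  # uncertainty of the observed frequencies
 print(h_before - h_after)                        # ~1.33 bits: the logo height at this position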

For a more mathematical approach, see the Information Theory Primer.

Some questions and answers might make these issues more clear.



References

@article{Tribus1971, author = "M. Tribus and E. C. McIrvine", title = "Energy and Information", journal = "Sci. Am.", volume = "225", number = "3", pages = "179-188", month = "September", year = "1971", note = "(Note: the table of contents in this volume incorrectly lists this as volume 224)"}


Examples of the error

@article{Machta1999, author = "J. Machta", title = "{Entropy, Information, and Computation}", journal = "Am. J. Phys.", volume = "67", pages = "1074-1077", year = "1999"}

"The results of random processes usually have high information content". "Randomness and information are formally the same thing." He also shows an equation relating "Shannon information" to the uncertainty function. This is a perfect example of total confusion on this issue!


@article{Padian2002, author = "K. Padian", title = "{EVOLUTION AND CREATIONISM: Waiting for the Watchmaker}", journal = "Science", volume = "295", pages = "2373-2374", year = "2002"}

"In information theory, the term can imply increasing predictability or increasing entropy, depending on the context." Kevin Padian, who wrote the review, reports that the error came from the book he was reviewing:

Intelligent Design Creationism and Its Critics: Philosophical, Theological, and Scientific Perspectives. Robert T. Pennock, Ed. MIT Press, Cambridge, MA, 2001. 825 pp. $110, £75.95; paper, $45, £30.95.


@article{Allahverdyan.Nieuwenhuizen2001, author = "A. E. Allahverdyan and T. H. Nieuwenhuizen", title = "{Breakdown of the Landauer bound for information erasure in the quantum regime}", journal = "Phys. Rev. E", volume = "64", pages = "056117-1--056117-9", year = "2001"}

This is an example of the typical physicists' muddle about "erasure" in which they set the state of a device to one of several states and call this a "loss of information". But setting a device to one state (no matter what it is) decreases the entropy and increases the information. The main mistake that the physicists make is not having any real working examples. It's entirely theoretical for them. (These people believe that they can beat the Second Law. I would simply ask them to build the perpetual motion machine and run the world power grid from it before making such a claim.)
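
To make the point concrete (my own illustration with an assumed room temperature, not an example from the paper): resetting a bit whose state is unknown removes one bit of uncertainty, i.e. it decreases the device's entropy and, on the definition above, increases information, and the Landauer bound says the reset must dissipate at least kT ln 2 of heat.

 import math
 k_boltzmann = 1.380649e-23   # J/K
 temperature = 300.0          # K, assumed room temperature
 bits_reset = 1.0             # uncertainty removed by setting the device to one known state
 min_heat = k_boltzmann * temperature * math.log(2) * bits_reset
 print(min_heat)              # ~2.87e-21 J dissipated per bit, at minimum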


@article{Crow2001, author = "J. F. Crow", title = "{Shannon's brief foray into genetics}", journal = "Genetics", volume = "159", pages = "915--917", year = "2001"}

He confounds information with uncertainty, and he left out the minus sign in the H = -sum p log p formula. He also confounded information with entropy. Finally, he claimed that "a noisy system can send an undistorted signal provided that the appropriate error corrections or redundancy are built in". This is incorrect, since there will always be some error; Shannon's channel capacity theorem shows that the error can be made as low as desired, but not zero as this author claims.
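
A quick check of that sign convention (my own example distribution): the Shannon uncertainty is H = -sum p log2 p, and dropping the minus sign yields a negative number, which cannot be an uncertainty.

 import math
 probs = [0.5, 0.25, 0.25]                        # assumed example distribution
 no_minus = sum(p * math.log2(p) for p in probs)  # -1.5: what the formula gives without the minus sign
 print(no_minus, -no_minus)                       # the uncertainty is the positive value, 1.5 bits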

"Entropy measures lack of information; it also measures information. These two conceptions are complementary. " The meanings of entropy, Jean-Bernard Brissaud, Entropy 2005, 7[1], 68-96.

2006 Oct 19: Martin Van Staveren pointed out that at the top of page 22 of Shannon's 1948 paper it seems to be suggested that part of the received information is due to noise. This is obviously a slip of Shannon's pen, as he merely tries to explain, in words, that the information rate R is the initial uncertainty minus the uncertainty due to the noise; but he calls H "information" instead of "entropy". He further pointed out that much of the confusion may have come from Weaver: see http://www.uoregon.edu/~felsing/virtual_asia/info.html , which is part of the introduction that Weaver wrote for "The Mathematical Theory of Communication". Some people even refer to "Shannon-Weaver theory" because of this introduction. In section 2.5 of that introduction, noise is said to generate "spurious", or "undesirable", information, whatever that may mean. The section also introduces the esoteric notion of "meaningless information", contrary to what Shannon himself says in the body of the text. I think that Weaver's arrogance in thinking that he had to "explain" Shannon has done a big disservice to information theory, which really is only probability theory.

"A Bit Confused. Creationism and Information Theory" Skeptical Inquirer, 3/1/01 David Roche. (link 1) (link 2). He made a mistake in his description of Shannon information. What he described is the Shannon uncertainty. Shannon information is a difference between uncertainties. Because of his confusion, he associates disorder with information.

2009 Jan 21: 6.050J Information and Entropy (Spring 2008), an MIT OpenCourseWare course. The preface reads: "Only recently has entropy been widely accepted as a form of information.", which is, of course, backwards.

Also, the statement "Second Law states that entropy never decreases as time goes on" is wrong, since the entropy of a system can decrease if heat leaves the system - that's how snowflakes form!
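
A rough numerical sketch of the snowflake point (approximate handbook values, not taken from the MIT course): when a gram of water freezes, heat leaves the water and its entropy drops, but the colder surroundings gain more entropy than the water loses, so the total still increases.

 latent_heat = 334.0                # J released when 1 g of water freezes (approx.)
 t_water = 273.15                   # K, freezing point
 t_air = 263.15                     # K, assumed -10 degC surroundings
 ds_water = -latent_heat / t_water  # the system's entropy decreases, ~ -1.22 J/K
 ds_air = latent_heat / t_air       # the surroundings gain entropy, ~ +1.27 J/K
 print(ds_water + ds_air)           # total ~ +0.05 J/K: still positive overall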

At least they admit: "In fact, we may not yet have it right."!!



See also Pitfalls in Information Theory and Molecular Information Theory


Schneider Lab. Origin: 1997 January 4. Updated: 2009 Jan 21 (MIT OCW).

Removal of seemingly-prescriptive statement in the definition line

(I'm not even going to bother responding to the mess above). I removed " Originally a tongue-in-cheek coinage, has fallen into disuse to avoid confusion with thermodynamic entropy." from the definition line of the second sense, as it seemed to be a prescriptive statement added here. Nibiko (talk) 21:44, 12 December 2015 (UTC)