information entropy

From Wiktionary, the free dictionary

English

Noun

information entropy (uncountable)

  1. (information theory) A measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable (usually in units such as bits); the average amount of information (measured in, say, bits) per character in a stream of characters (see the worked sketch below).

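For the character-stream sense above, the entropy of a source with symbol probabilities pᵢ is H = −Σᵢ pᵢ log₂ pᵢ, i.e. the average number of bits per character. A minimal sketch of that calculation, assuming Python and probabilities estimated from the stream's own character frequencies (the name shannon_entropy is illustrative, not part of the entry):

import math
from collections import Counter

def shannon_entropy(text):
    """Average information content of a string, in bits per character.

    Symbol probabilities are estimated from the character frequencies
    of the string itself (one common choice, not the only one).
    """
    if not text:
        return 0.0
    counts = Counter(text)
    total = len(text)
    # H = -sum over distinct characters of p * log2(p)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A single repeated character carries no uncertainty: 0 bits per character.
print(shannon_entropy("aaaaaaaa"))      # 0.0
# A uniform mix of four characters carries log2(4) = 2 bits per character.
print(shannon_entropy("abcdabcdabcd"))  # 2.0
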
Synonyms

Translations

See also