joint entropy

From Wiktionary, the free dictionary

English


Noun

joint entropy (countable and uncountable, plural joint entropies)

  1. (information theory) The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts.
    If random variables X and Y are mutually independent, then their joint entropy H(X, Y) is just the sum of their component entropies: H(X, Y) = H(X) + H(Y). If they are not mutually independent, then their joint entropy will be H(X, Y) = H(X) + H(Y) − I(X; Y), where I(X; Y) is the mutual information of X and Y.
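
    A minimal numerical sketch of the identity above, assuming a small made-up joint distribution for two dependent binary variables (the probability table and variable names are illustrative only):

    import math

    # Hypothetical joint probability table p(x, y) for two dependent binary variables.
    joint = {
        (0, 0): 0.4,
        (0, 1): 0.1,
        (1, 0): 0.1,
        (1, 1): 0.4,
    }

    def entropy(probs):
        """Shannon entropy, in bits, of a collection of probabilities."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Joint entropy H(X, Y): entropy over the pairs (x, y), i.e. the product "script".
    joint_entropy = entropy(joint.values())

    # Marginal distributions and entropies H(X), H(Y).
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    hx, hy = entropy(px.values()), entropy(py.values())

    # Mutual information from its definition: I(X; Y) = sum p(x,y) * log2( p(x,y) / (p(x) p(y)) ).
    mi = sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in joint.items() if p > 0)

    print(f"H(X, Y)              = {joint_entropy:.4f} bits")
    print(f"H(X) + H(Y) - I(X;Y) = {hx + hy - mi:.4f} bits")  # matches H(X, Y)

    Because the two variables in this table are dependent, I(X; Y) is positive and the joint entropy is strictly less than H(X) + H(Y).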

Related terms