Category:Mutual information

From ETHW
Occasionally called transinformation, mutual information is the quantity that measures the mutual dependence of two random variables.
[[File:Mutual Information 2009 Conditional Entropy Attribution.png|200px|thumb|right|A 2009 diagram of the joint entropy, conditional entropies, individual entropies, and mutual information of two random variables X and Y. Image by Peter.prettenhofer.]]
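For a pair of discrete random variables, the mutual dependence described above can be computed directly from the joint distribution as I(X;Y) = Σ p(x,y) log₂[ p(x,y) / (p(x)p(y)) ]. The sketch below is only an illustration of that standard discrete formula; the joint distribution `p_xy` is made up for the example and does not come from this page.

```python
import math

# Hypothetical joint distribution p(x, y) over two binary variables,
# chosen only to illustrate the computation.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y), obtained by summing out the
# other variable.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) * p(y)) ),
# in bits; zero-probability cells contribute nothing.
mi = sum(p * math.log2(p / (p_x[x] * p_y[y]))
         for (x, y), p in p_xy.items() if p > 0)
print(round(mi, 4))
```

If X and Y were independent, every ratio p(x,y)/(p(x)p(y)) would equal 1 and the sum would be exactly zero; the positive value here reflects the dependence built into the example table.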


[[Category:Information_theory|{{PAGENAME}}]]

Latest revision as of 20:32, 31 January 2013

