IEEE

Category:Mutual information


From GHN

Mutual information, occasionally called transinformation, is the quantity that measures the mutual dependence of two random variables.
[[File:Mutual Information 2009 Conditional Entropy Attribution.png|200px|thumb|right|A 2009 diagram of joint entropy, conditional entropies, individual entropies and mutual information of two random variables X and Y - Image by Wikimedia User: Peter.prettenhofer.]]
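The relationship shown in the diagram gives a practical way to compute mutual information from entropies: I(X;Y) = H(X) + H(Y) - H(X,Y). A minimal sketch in Python (the joint distribution values are illustrative and not from this page):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative joint distribution p(x, y) of two binary random variables.
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

# Marginal distributions p(x) and p(y).
px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

h_x = entropy(px.values())       # individual entropy H(X)
h_y = entropy(py.values())       # individual entropy H(Y)
h_xy = entropy(joint.values())   # joint entropy H(X,Y)

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = h_x + h_y - h_xy
```

With these values the marginals are uniform, so H(X) = H(Y) = 1 bit, and the dependence between X and Y yields a positive mutual information; for independent variables the identity gives exactly zero.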
  
 
[[Category:Information_theory|{{PAGENAME}}]]
 

Latest revision as of 20:32, 31 January 2013


Pages in category "Mutual information"

This category contains only the following page.

H