Mutual information

Jesse Russell, Ronald Cohn

paperback book



ISBN: 978-5-5087-2627-0

High-quality content from Wikipedia articles! In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures the mutual dependence between them. The most common unit of measurement for mutual information is the bit, used when logarithms are taken to base 2.
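
As a brief sketch of the standard definition (not stated in the listing itself), for two discrete random variables X and Y with joint distribution p(x, y) and marginal distributions p(x) and p(y), the mutual information can be written as:

I(X; Y) = \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log_2 \frac{p(x, y)}{p(x)\, p(y)}

With base-2 logarithms, as noted above, the result is measured in bits; I(X; Y) equals zero exactly when X and Y are independent.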