Self-Information

Lambert M. Surhone, Miriam T. Timpledon, Susan F. Marseken

Paperback



Publisher: Книга по требованию
Release date: July 2011
ISBN: 978-6-1311-5568-0
Length: 100 pages
Weight: 172 g
Dimensions (H x W x D), cm: 23 x 16 x 1

High Quality Content by WIKIPEDIA articles! In information theory (developed by Claude E. Shannon, 1948), self-information is a measure of the information content associated with the outcome of a random variable. It is expressed in a unit of information, for example bits, nats, or hartleys, depending on the base of the logarithm used in its calculation. The term self-information is also sometimes used as a synonym of entropy, i.e. the expected value of self-information in the first sense, because I(X;X) = H(X), where I(X;X) is the mutual information of X with itself. These two meanings are not equivalent, and this article covers the first sense only; for the other sense, see entropy. By definition, the amount of self-information contained in a probabilistic event depends only on the probability of that event: the smaller its probability, the larger the self-information associated with learning that the event has indeed occurred.
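The blurb states the definition in words only; in standard notation, the self-information of an outcome x with probability P(x) is I(x) = -log_b P(x), where the base b sets the unit (b = 2 for bits, b = e for nats, b = 10 for hartleys). A minimal Python sketch of this standard formula follows; the function name and example probabilities are illustrative, not taken from the book:

import math

def self_information(p: float, base: float = 2.0) -> float:
    # Self-information I(x) = -log_b(p) of an event with probability p.
    # base 2 -> bits, base e -> nats, base 10 -> hartleys.
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return -math.log(p, base)

# A fair coin flip carries 1 bit of self-information.
print(self_information(0.5))           # 1.0 bit
# Rarer events carry more information: P = 1/1024 -> 10 bits.
print(self_information(1 / 1024))      # 10.0 bits
# The same coin flip expressed in nats and hartleys.
print(self_information(0.5, math.e))   # ~0.693 nats
print(self_information(0.5, 10))       # ~0.301 hartleys

As the last two calls show, changing the logarithm base only rescales the value, matching the blurb's point that the unit depends on the base used in the calculation.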

This is not an original edition. The book is printed using print-on-demand technology after an order is received.