Entropy (information theory)

Jesse Russell Ronald Cohn

Paper book

ISBN: 978-5-5093-8514-8

High-quality content from Wikipedia articles! In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans.
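
To make the definition concrete, here is a minimal Python sketch (not taken from the book) that computes the Shannon entropy H(X) = -Σ p(x) log p(x) of a discrete distribution; the function name `shannon_entropy` and its parameters are illustrative. Choosing the logarithm base 2, e, or 10 yields the units mentioned above: bits, nats, or bans.

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H = -sum(p * log(p)) of a discrete distribution.

    base=2 gives bits, base=math.e gives nats, base=10 gives bans.
    Terms with p == 0 contribute nothing, by the convention 0 log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))          # 1.0 bit
print(shannon_entropy([0.9, 0.1]))          # ~0.469 bits
print(shannon_entropy([0.5, 0.5], math.e))  # ~0.693 nats
```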