Kullback–Leibler Divergence

Frederic P. Miller, Agnes F. Vandome, John McBrewster

Paperback

Publisher: Книга по требованию
Release date: July 2011
ISBN: 978-6-1338-9768-7
Length: 84 pages
Weight: 147 g
Dimensions (H x W x D), cm: 23 x 16 x 1

High Quality Content from WIKIPEDIA articles! In probability theory and information theory, the Kullback–Leibler divergence (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. The KL divergence measures the expected number of extra bits required to code samples from P when using a code based on Q rather than a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution, while Q represents a theory, model, description, or approximation of P. Although it is often intuited as a distance metric, the KL divergence is not a true metric; for example, the KL divergence from P to Q is not necessarily the same as the KL divergence from Q to P. The KL divergence is a special case of a broader class of divergences called f-divergences. Originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions, it is not the same as a divergence in calculus. However, the KL divergence can be derived from the Bregman divergence.
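
As a brief illustration of the definition described above (this sketch is not part of the book, and the example distributions p and q are arbitrary), the divergence D_KL(P || Q) of two discrete distributions is the sum over the support of P(x) * log2(P(x)/Q(x)), measured in bits; computing it in both directions shows that it is not symmetric:

import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(P || Q) in bits, for two discrete
    distributions given as probability lists over the same support."""
    # Terms with P(x) = 0 contribute nothing by convention (0 * log 0 = 0).
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]  # the "true" distribution P
q = [0.4, 0.4, 0.2]  # a model or approximation Q

print(kl_divergence(p, q))  # D_KL(P || Q): extra bits per sample when coding P with a code built for Q
print(kl_divergence(q, p))  # D_KL(Q || P): generally a different value, so the divergence is not symmetric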

This edition is not an original. The book is produced with print-on-demand technology after an order is received.