Publisher: | Книга по требованию |
Release date: | July 2011 |
ISBN: | 978-6-1311-2405-1 |
Pages: | 68 |
Weight: | 123 g |
Dimensions (H x W x D), cm: | 23 x 16 x 1 |
High Quality Content by WIKIPEDIA articles! In statistical learning theory, or sometimes computational learning theory, the VC dimension (for Vapnik–Chervonenkis dimension) is a measure of the capacity of a statistical classification algorithm, defined as the cardinality of the largest set of points that the algorithm can shatter. It is a core concept in Vapnik–Chervonenkis theory and was originally defined by Vladimir Vapnik and Alexey Chervonenkis. Informally, the capacity of a classification model is related to how complicated it can be. For example, consider thresholding a high-degree polynomial: a point is classified as positive if the polynomial evaluates above zero, and as negative otherwise. A high-degree polynomial can be very wiggly, so it can fit a given set of training points well, but one can expect it to make errors on other points precisely because it is so wiggly; such a polynomial has a high capacity. A much simpler alternative is to threshold a linear function. This function may not fit the training set well, because it has a low capacity. This notion of capacity is made rigorous in the book.
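To make "shattering" concrete, here is a minimal Python sketch (an illustration under stated assumptions, not material from the book). It brute-forces the labelings realizable by one-sided threshold classifiers h_t(x) = 1 if x >= t else 0 on the real line, and confirms that a single point can be shattered while two points cannot, so this hypothesis class has VC dimension 1.

```python
from itertools import product

def threshold_labelings(points):
    """All labelings of `points` realizable by h_t(x) = 1 if x >= t else 0.

    Only the gaps between sorted points matter, so it suffices to try one
    threshold below the minimum, one between each adjacent pair, and one
    above the maximum.
    """
    cuts = sorted(points)
    candidates = ([cuts[0] - 1]
                  + [(a + b) / 2 for a, b in zip(cuts, cuts[1:])]
                  + [cuts[-1] + 1])
    return {tuple(1 if x >= t else 0 for x in points) for t in candidates}

def is_shattered(points):
    """A set is shattered if every one of the 2^n labelings is realizable."""
    return threshold_labelings(points) == set(product((0, 1), repeat=len(points)))

print(is_shattered([0.5]))       # True: both labelings of one point are realizable
print(is_shattered([0.2, 0.8]))  # False: no threshold labels 0.2 positive and 0.8 negative
# Largest shattered set has size 1, so the VC dimension of this class is 1.
```

The same brute-force idea extends to richer classes (e.g., linear separators in the plane), though the search over hypotheses then needs more care than a simple sweep of thresholds.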
This edition is not an original. The book is printed on demand (print-on-demand technology) after an order is received.