Publisher: Книга по требованию
Release date: July 2011
ISBN: 978-3-8381-1637-2
Pages: 192
Weight: 313 g
Dimensions (H × W × D), cm: 23 × 16 × 1
We present a Regularization Network approach based on Kolmogorov's superposition theorem (KST) to reconstruct higher-dimensional continuous functions from their function values on discrete data points. The ansatz is based on a new constructive proof of a version of the theorem. In addition, the thesis gives a comprehensive overview of the various versions of KST and of their relation to well-known approximation schemes and neural networks. The efficient representation of higher-dimensional continuous functions as superpositions of univariate continuous functions suggests the conjecture that, in a reconstruction, the exponential dependence of the numerical cost on the dimensionality, the so-called curse of dimensionality, can be circumvented. However, this is not the case, since the univariate functions involved are either unknown or not smooth. We therefore develop a Regularization Network approach in a reproducing kernel Hilbert space setting, in which the restriction of the underlying approximation spaces defines a nonlinear model for function reconstruction. Finally, the model is verified and analysed by means of various numerical examples.
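For orientation, the two central notions of the abstract can be sketched in their standard textbook forms; these are the classical statements, not necessarily the particular version constructed in the thesis.

```latex
% Kolmogorov's superposition theorem (classical formulation): every continuous
% function on the n-dimensional unit cube can be written as a superposition of
% univariate continuous outer functions \Phi_q and inner functions \phi_{q,p}.
f(x_1,\dots,x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\Big(\sum_{p=1}^{n} \phi_{q,p}(x_p)\Big),
\qquad f \in C\big([0,1]^n\big).

% Regularization Network (Tikhonov) functional in a reproducing kernel Hilbert
% space H: fit the data (x_i, y_i), i = 1,...,N, while penalizing the RKHS norm
% of the reconstruction with a regularization parameter \lambda > 0.
\min_{u \in H}\;\; \frac{1}{N}\sum_{i=1}^{N} \big(u(x_i) - y_i\big)^2
\;+\; \lambda\,\|u\|_H^2 .
```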
This edition is not the original. The book is printed using print-on-demand technology after an order is received.