Publisher: | Книга по требованию (Book on Demand) |
Release date: | July 2011 |
ISBN: | 978-6-1303-5329-2 |
Length: | 84 pages |
Weight: | 147 g |
Dimensions (H x W x D), cm: | 23 x 16 x 1 |
High Quality Content by WIKIPEDIA articles! Total least squares, also known as errors-in-variables, rigorous least squares, or orthogonal regression, is a least squares data modeling technique in which observational errors on both dependent and independent variables are taken into account. It is a generalization of Deming regression and can be applied to both linear and non-linear models.

In the least squares method of data modeling, the objective function

$$S = \mathbf{r}^{\mathsf{T}} \mathbf{W} \mathbf{r}$$

is minimized. In linear least squares the model is defined as a linear combination of the parameters $\boldsymbol{\beta}$, so the residuals are given by

$$\mathbf{r} = \mathbf{y} - \mathbf{X}\boldsymbol{\beta}.$$

There are $m$ observations in $\mathbf{y}$ and $n$ parameters in $\boldsymbol{\beta}$, with $m > n$. $\mathbf{X}$ is an $m \times n$ matrix whose elements are either constants or functions of the independent variables $\mathbf{x}$. The weight matrix $\mathbf{W}$ is, ideally, the inverse of the variance-covariance matrix $\mathbf{M}_y$ of the observations $\mathbf{y}$. The independent variables are assumed to be error-free. The parameter estimates are found by setting the gradient equations to zero, which results in the normal equations

$$\mathbf{X}^{\mathsf{T}} \mathbf{W} \mathbf{X} \boldsymbol{\beta} = \mathbf{X}^{\mathsf{T}} \mathbf{W} \mathbf{y}.$$
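For illustration only (not part of the original text), the following minimal Python sketch contrasts the weighted least-squares solution of the normal equations above with a total least squares (orthogonal regression) fit of a straight line obtained via the SVD. The synthetic data, identity weight matrix, and all variable names are assumptions chosen for the example.

    import numpy as np

    # Illustrative sketch: weighted least squares via the normal equations
    # X^T W X beta = X^T W y, versus total least squares (orthogonal
    # regression) for a straight line y = a + b*x using the SVD.

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 50)
    y_true = 2.5 * x + 1.0
    # Observational errors on BOTH variables, as total least squares assumes.
    x_obs = x + rng.normal(scale=0.3, size=x.size)
    y_obs = y_true + rng.normal(scale=0.3, size=x.size)

    # --- Weighted linear least squares: errors only in y ---
    X = np.column_stack([np.ones_like(x_obs), x_obs])  # design matrix, m x n
    W = np.eye(x_obs.size)                              # identity weights for simplicity
    beta_ols = np.linalg.solve(X.T @ W @ X, X.T @ W @ y_obs)

    # --- Total least squares: errors in both x and y ---
    # Center the data and take the right singular vector belonging to the
    # smallest singular value; it is normal to the best-fit line.
    xc, yc = x_obs - x_obs.mean(), y_obs - y_obs.mean()
    _, _, Vt = np.linalg.svd(np.column_stack([xc, yc]))
    nx, ny = Vt[-1]                                     # normal vector (nx, ny); ny != 0 here
    b_tls = -nx / ny
    a_tls = y_obs.mean() - b_tls * x_obs.mean()

    print("OLS intercept, slope:", beta_ols)
    print("TLS intercept, slope:", a_tls, b_tls)

The SVD step works because the right singular vector with the smallest singular value of the centered data matrix is the direction along which the summed squared orthogonal distances are minimized, which is exactly the criterion total least squares optimizes for a straight line.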
This edition is not an original. The book is printed using print-on-demand technology after an order is received.