{{refimprove|date=April 2013}}
In [[statistics]], the '''residual sum of squares (RSS)''' is the [[sum]] of squares of [[errors and residuals in statistics|residuals]]. It is also known as the '''sum of squared residuals (SSR)''' or the '''sum of squared errors of prediction (SSE)'''. It is a measure of the discrepancy between the data and an estimation model. A small RSS indicates a tight fit of the model to the data.
 
In general, [[total sum of squares]] = [[explained sum of squares]] + '''residual sum of squares'''.  For a proof of this in the multivariate [[ordinary least squares]] (OLS) case, see [[Explained sum of squares#Partitioning in the general OLS model|partitioning in the general OLS model]].
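
This partition can be checked numerically. The following sketch (a minimal illustration only, assuming [[NumPy]] is available and using made-up data) fits an [[ordinary least squares]] line with an intercept and confirms that the total sum of squares equals the explained sum of squares plus the RSS, up to floating-point rounding:

<syntaxhighlight lang="python">
import numpy as np

# Made-up data for illustration only
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# OLS fit with an intercept: design matrix [1, x]
X = np.column_stack([np.ones_like(x), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat

tss = np.sum((y - y.mean()) ** 2)      # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
rss = np.sum((y - y_hat) ** 2)         # residual sum of squares

print(np.isclose(tss, ess + rss))  # True: TSS = ESS + RSS
</syntaxhighlight>

Note that the partition in this form requires the model to include an intercept (or, more generally, a constant in the column space of the regressors).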
 
==One explanatory variable==
 
In a model with a single explanatory variable, RSS is given by
 
:<math>RSS = \sum_{i=1}^n (y_i - f(x_i))^2, </math>
 
where ''y''<sub>''i''</sub> is the ''i''<sup>th</sup> value of the variable to be predicted, ''x''<sub>''i''</sub> is the ''i''<sup>th</sup> value of the explanatory variable, and <math>f(x_i)</math> is the predicted value of ''y''<sub>''i''</sub> (also written <math>\hat{y}_i</math>).
In a standard simple linear [[regression model]], <math>y_i = a+bx_i+\varepsilon_i\,</math>, where ''a'' and ''b'' are [[coefficient]]s, ''y'' and ''x'' are the [[regressand]] and the [[regressor]], respectively, and &epsilon; is the [[errors and residuals in statistics|error term]]. The sum of squares of residuals is the sum of squares of the [[estimator|estimates]] of &epsilon;<sub>''i''</sub>; that is
 
:<math>RSS = \sum_{i=1}^n \hat{\varepsilon}_i^{\,2} = \sum_{i=1}^n \left(y_i - (\hat{a} + \hat{b} x_i)\right)^2, </math>

where <math>\hat{a}</math> is the estimated value of the constant term ''a'' and <math>\hat{b}</math> is the estimated value of the slope coefficient ''b''.
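
A minimal numerical sketch (hypothetical data; NumPy assumed) computes the least-squares estimates in closed form and then evaluates the RSS of the fitted line:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical data for illustration
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.9, 4.1, 6.0, 8.2])

# Closed-form OLS estimates for y = a + b*x
b_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a_hat = y.mean() - b_hat * x.mean()

# Residual sum of squares of the fitted line
rss = np.sum((y - (a_hat + b_hat * x)) ** 2)
print(a_hat, b_hat, rss)
</syntaxhighlight>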
 
==Matrix expression for the OLS residual sum of squares==
 
The general regression model with ''n'' observations and ''k'' explanators, the first of which is a constant unit vector whose coefficient is the regression intercept, is
 
:<math> y = X \beta + e</math>
 
where ''y'' is an ''n'' × 1 vector of dependent variable observations, each column of the ''n'' × ''k'' matrix ''X'' is a vector of observations on one of the ''k'' explanators, <math>\beta </math> is a ''k'' × 1 vector of true coefficients, and ''e'' is an ''n'' × 1 vector of the true underlying errors. The [[ordinary least squares]] estimator for <math>\beta</math> is
 
:<math> \hat \beta = (X^T X)^{-1}X^T y.</math>
 
The residual vector <math>\hat e</math> is <math>y - X \hat \beta = y - X (X^T X)^{-1}X^T y</math>, so the residual sum of squares <math>\hat e ^T \hat e</math> is, after simplification,
 
:<math>RSS = y^T y - y^T X(X^T X)^{-1} X^T y = y^T [I - X(X^T X)^{-1} X^T] y = y^T [I - H] y,</math>

where ''H'' is the [[hat matrix]], or the prediction matrix, in linear regression.
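
This identity can be checked numerically. The sketch below (NumPy assumed; random data purely for illustration) builds the hat matrix explicitly and verifies that the quadratic form <math>y^T [I - H] y</math> agrees with the RSS computed directly from the residual vector:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 3
# Design matrix: constant column plus k - 1 random regressors
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = rng.normal(size=n)

# Hat matrix H = X (X^T X)^{-1} X^T
H = X @ np.linalg.inv(X.T @ X) @ X.T

# RSS as the quadratic form y^T (I - H) y
rss_quadratic = y @ (np.eye(n) - H) @ y

# RSS from the residual vector e_hat = y - X beta_hat
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
rss_direct = np.sum((y - X @ beta_hat) ** 2)

print(np.isclose(rss_quadratic, rss_direct))  # True
</syntaxhighlight>

In practice the ''n'' × ''n'' hat matrix is rarely formed explicitly for large ''n''; numerically stable methods such as the QR decomposition compute the residuals without it. It appears here only to illustrate the quadratic form.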
 
==See also==
*[[Sum of squares (statistics)]]
*[[Squared deviations]]
*[[Errors and residuals in statistics]]
*[[Lack-of-fit sum of squares]]
*[[Degrees of freedom (statistics)#Sum of squares and degrees of freedom]]
*[[Chi-squared distribution#Applications]]
 
==References==
* {{cite book
|title = Applied Regression Analysis
|edition = 3rd
|last1= Draper |first1=N.R. |last2=Smith |first2=H.
|publisher = John Wiley
|year = 1998
|isbn = 0-471-17082-8}}
 
[[Category:Regression analysis]]
[[Category:Least squares]]
