Let $\hat{\beta}$ be the $(k+1) \times 1$ vector of OLS estimates.

(i) Show that for any $(k+1) \times 1$ vector $b$, we can write the sum of squared residuals as
$$\mathrm{SSR}(b) = \hat{u}'\hat{u} + (\hat{\beta} - b)'X'X(\hat{\beta} - b).$$
(Hint: Write $(y - Xb)'(y - Xb) = [\hat{u} + X(\hat{\beta} - b)]'[\hat{u} + X(\hat{\beta} - b)]$ and use the fact that $X'\hat{u} = 0$.)
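A sketch of the algebra the hint suggests, writing $y - Xb = \hat{u} + X(\hat{\beta} - b)$ since $\hat{u} = y - X\hat{\beta}$:
$$
\begin{aligned}
\mathrm{SSR}(b) &= [\hat{u} + X(\hat{\beta} - b)]'[\hat{u} + X(\hat{\beta} - b)] \\
&= \hat{u}'\hat{u} + 2(\hat{\beta} - b)'X'\hat{u} + (\hat{\beta} - b)'X'X(\hat{\beta} - b) \\
&= \hat{u}'\hat{u} + (\hat{\beta} - b)'X'X(\hat{\beta} - b),
\end{aligned}
$$
where the cross term vanishes because $X'\hat{u} = 0$ by the OLS first-order conditions. Since $X'X$ is positive semidefinite, the identity also shows $\mathrm{SSR}(b) \geq \mathrm{SSR}(\hat{\beta})$ for every $b$, so $b = \hat{\beta}$ minimizes the sum of squared residuals.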
6. Consider the setup of the Frisch-Waugh Theorem.
(i) Using partitioned matrices, show that the first-order conditions $(X'X)\hat{\beta} = X'y$ can be written as
$$X_1'X_1\hat{\beta}_1 + X_1'X_2\hat{\beta}_2 = X_1'y$$
$$X_2'X_1\hat{\beta}_1 + X_2'X_2\hat{\beta}_2 = X_2'y.$$
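A minimal sketch, with $X = (X_1 \;\; X_2)$ partitioned conformably with $\hat{\beta} = (\hat{\beta}_1', \hat{\beta}_2')'$ as in the Frisch-Waugh setup:
$$
X'X\hat{\beta} =
\begin{pmatrix} X_1' \\ X_2' \end{pmatrix}
(X_1 \;\; X_2)
\begin{pmatrix} \hat{\beta}_1 \\ \hat{\beta}_2 \end{pmatrix}
=
\begin{pmatrix} X_1'X_1\hat{\beta}_1 + X_1'X_2\hat{\beta}_2 \\ X_2'X_1\hat{\beta}_1 + X_2'X_2\hat{\beta}_2 \end{pmatrix},
\qquad
X'y =
\begin{pmatrix} X_1'y \\ X_2'y \end{pmatrix}.
$$
Equating the stacked blocks row by row gives exactly the two sets of equations displayed above.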
(ii) Multiply the first set of equations by $X_2'X_1(X_1'X_1)^{-1}$ and subtract the result from the second set of equations to show that
$$(X_2'M_1X_2)\hat{\beta}_2 = X_2'M_1y,$$
where $M_1 = I_n - X_1(X_1'X_1)^{-1}X_1'$. Conclude that
$$\hat{\beta}_2 = (\ddot{X}_2'\ddot{X}_2)^{-1}\ddot{X}_2'y,$$
where $\ddot{X}_2 \equiv M_1X_2$ denotes the matrix of residuals from regressing the columns of $X_2$ on $X_1$.
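A sketch of the elimination step, assuming $X_1'X_1$ (and, at the end, $\ddot{X}_2'\ddot{X}_2$) is invertible: premultiplying the first set of equations by $X_2'X_1(X_1'X_1)^{-1}$ yields
$$X_2'X_1\hat{\beta}_1 + X_2'X_1(X_1'X_1)^{-1}X_1'X_2\hat{\beta}_2 = X_2'X_1(X_1'X_1)^{-1}X_1'y,$$
and subtracting this from the second set cancels the $\hat{\beta}_1$ terms:
$$X_2'\left[I_n - X_1(X_1'X_1)^{-1}X_1'\right]X_2\,\hat{\beta}_2 = X_2'\left[I_n - X_1(X_1'X_1)^{-1}X_1'\right]y,$$
which is $(X_2'M_1X_2)\hat{\beta}_2 = X_2'M_1y$. Because $M_1$ is symmetric and idempotent, $X_2'M_1X_2 = (M_1X_2)'(M_1X_2) = \ddot{X}_2'\ddot{X}_2$ and $X_2'M_1y = \ddot{X}_2'y$, so solving for $\hat{\beta}_2$ gives the stated formula.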