A First Course in Probability (10th Edition)
10th Edition
ISBN: 9780134753119
Author: Sheldon Ross
Publisher: Pearson

Related questions

Question

Please answer all four parts of 3.4.16 on page 195.

3.5.22. Readers may have encountered the multiple regression model in a previous
course in statistics. We can briefly write it as follows. Suppose we have a vector
of n observations Y which has the distribution Nₙ(Xβ, σ²I), where X is an n × p
matrix of known values, which has full column rank p, and β is a p × 1 vector of
unknown parameters. The least squares estimator of β is

β̂ = (X'X)⁻¹X'Y.

(a) Determine the distribution of β̂.
(b) Let Ỹ = Xβ̂. Determine the distribution of Ỹ.
(c) Let ê = Y − Ỹ. Determine the distribution of ê.
(d) By writing the random vector (Ỹ', ê')' as a linear function of Y, show that
the random vectors Ỹ and ê are independent.
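As a supplementary illustration (not part of the textbook problem), the algebra behind parts (a)–(d) can be checked numerically. Writing H = X(X'X)⁻¹X' for the hat matrix, we have Ỹ = HY and ê = (I − H)Y; since H is symmetric and idempotent, H(I − H) = 0, which is the key fact that makes Ỹ and ê uncorrelated, hence independent under normality. The sketch below, with made-up dimensions and parameter values, verifies these matrix identities with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma = 50, 3, 2.0

# Design matrix with full column rank p and an arbitrary true beta.
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, -2.0, 0.5])

# One draw from Y ~ N_n(X beta, sigma^2 I).
Y = X @ beta + sigma * rng.normal(size=n)

# Least squares estimator and hat matrix H = X (X'X)^{-1} X'.
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ Y
H = X @ XtX_inv @ X.T

Y_fit = H @ Y        # fitted values Y-tilde = X beta-hat = H Y
e_hat = Y - Y_fit    # residuals e-hat = (I - H) Y

# H is symmetric and idempotent, so H(I - H) = 0; this is why
# Cov(Y-tilde, e-hat) = sigma^2 H (I - H) = 0 in part (d).
assert np.allclose(H, H.T)
assert np.allclose(H @ H, H)
assert np.allclose(H @ (np.eye(n) - H), 0.0)

# Consequently the fitted values and residuals are orthogonal.
assert np.isclose(Y_fit @ e_hat, 0.0)
```

A single simulated draw cannot exhibit the full sampling distributions, but the idempotency checks above hold exactly (up to floating-point error) and are precisely the identities used in the distributional derivations.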