Repeat Exercises 4 and 5 using the data in Table 13.5
Exercise 5
Consider the VAR models with one and two lags in Exercise 4.
(a) Estimate the characteristic roots and characteristic vectors.
(b) If there is a cointegrating regression, estimate it from the characteristic vectors and also from the static regression as suggested by Granger and Engle.
(c) Are the results different for the VAR models with one and two lags? Are they different from those from the static regressions? What do you conclude from these results?
(d) Repeat parts (a) to (c) with the seasonally adjusted data (residuals from the regression on seasonal dummies).
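A minimal computational sketch for Exercise 5 in Python with statsmodels, assuming the Table 13.4 series have already been loaded into a DataFrame; the file name "table_13_4.csv" and the column names "Ct" and "Yt" are placeholders, not from the text, and the eigen-analysis shown is for the VAR(1) case only.

```python
# Sketch for Exercise 5: characteristic roots/vectors of the fitted VAR(1)
# and the Engle-Granger static regression. File and column names are
# hypothetical placeholders for the Table 13.4 data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.api import VAR
from statsmodels.tsa.stattools import coint

data = pd.read_csv("table_13_4.csv")       # hypothetical file
y = data[["Ct", "Yt"]]                     # hypothetical column names
fit1 = VAR(y).fit(1)

# (a) For the VAR(1) z_t = A1 z_{t-1} + e_t, a left eigenvector w of A1 with
#     root |lambda| < 1 gives a stationary combination w'z_t, i.e. a candidate
#     cointegrating vector.  (For the VAR(2) model the same idea applies to
#     the companion matrix.)
A1 = fit1.coefs[0]                         # 2x2 lag-1 coefficient matrix
roots, left_vectors = np.linalg.eig(A1.T)  # columns = left eigenvectors of A1
print("characteristic roots:", roots)
print("characteristic vectors (columns):\n", left_vectors)

# (b) Static regression of Ct on Yt (Engle-Granger first step) plus the
#     residual-based cointegration test, for comparison with the eigenvector.
static = sm.OLS(y["Ct"], sm.add_constant(y["Yt"])).fit()
print("static regression coefficients:\n", static.params)
t_stat, p_value, _ = coint(y["Ct"], y["Yt"])
print("Engle-Granger test: t =", t_stat, " p =", p_value)
```

Running the same steps on the VAR(2) fit and on the seasonally adjusted residuals (parts (c) and (d)) only changes the input series and the dimension of the companion matrix.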
Exercise 4
Using the data in Table 13.4, estimate a VAR model for Ct and Yt with one lag and with two lags.
(a) Is the model with two lags better than the model with one lag? Use the AIC and BIC criteria (see Section 13.5). Also check for residual autocorrelations.
(b) Since the data are quarterly, regress the data on seasonal dummies, and compute the residuals. Repeat the analysis with these residuals (assuming that they are the observations).
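A minimal sketch for Exercise 4 in Python with statsmodels, under the same assumptions as above (hypothetical file name "table_13_4.csv" and column names "Ct" and "Yt"); the seasonal-dummy construction additionally assumes the sample starts in the first quarter.

```python
# Sketch for Exercise 4: compare VAR(1) and VAR(2) by AIC/BIC, check residual
# autocorrelation, then repeat on residuals from a regression on seasonal
# dummies. File and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.api import VAR

data = pd.read_csv("table_13_4.csv")          # hypothetical file
y = data[["Ct", "Yt"]]                        # hypothetical column names

# (a) Fit VAR(1) and VAR(2); compare AIC/BIC and test residual autocorrelation
fit1 = VAR(y).fit(1)
fit2 = VAR(y).fit(2)
print("VAR(1): AIC =", fit1.aic, " BIC =", fit1.bic)
print("VAR(2): AIC =", fit2.aic, " BIC =", fit2.bic)
print(fit1.test_whiteness(nlags=8).summary())  # Portmanteau test
print(fit2.test_whiteness(nlags=8).summary())

# (b) Regress each series on quarterly dummies and keep the residuals
#     (assumes the sample begins in the first quarter)
quarters = pd.get_dummies(pd.Series(range(len(y)), index=y.index) % 4,
                          prefix="Q").astype(float)
resid = y.copy()
for col in y.columns:
    resid[col] = sm.OLS(y[col].values, quarters.values).fit().resid

# Repeat the comparison treating the residuals as the observations
fit1_sa = VAR(resid).fit(1)
fit2_sa = VAR(resid).fit(2)
print("Adjusted VAR(1): AIC =", fit1_sa.aic, " BIC =", fit1_sa.bic)
print("Adjusted VAR(2): AIC =", fit2_sa.aic, " BIC =", fit2_sa.bic)
```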