Economics 528
HKUST
In Choi
Spring 2004
Review Problems for Midterm
1 Least Squares
1. In the linear regression model
y = Xβ + ε,
there is a need for changing the unit of measurement of the dependent variable
y. So y* = cy (c is a constant) is now used as the dependent variable.
(a) Does this practice change R²?
(b) What happens to R² if the unit of measurement is changed only for the
regressor?
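A quick numerical sketch of what the problem is asking (not a proof, and not part of the original problem set): the data, the scale factor 100, and the coefficients below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

def r_squared(y, X):
    """R^2 from an OLS fit of y on X (X includes a constant column)."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    return 1.0 - (e @ e) / ((y - y.mean()) @ (y - y.mean()))

X = np.column_stack([np.ones(n), x])
r2 = r_squared(y, X)
r2_scaled_y = r_squared(100.0 * y, X)                                  # y* = cy
r2_scaled_x = r_squared(y, np.column_stack([np.ones(n), 100.0 * x]))   # rescaled regressor

print(r2, r2_scaled_y, r2_scaled_x)
```

Comparing the three printed values for this simulated sample hints at the answer to both (a) and (b).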
2. Consider the linear regression model
yi = α + β′Xi + εi,   εi ~ iid(μ, σ²),   μ ≠ 0.
(a) Is the OLS estimator of β affected by the nonzero mean of εi?
(b) Can the least squares estimator of α estimate it accurately?
3. Prove that the OLS estimators of β1 in the following linear models are identical:
yt = xt β1 + t β2 + εt
ỹt = x̃t β1 + εt
where ỹt and x̃t are the de-trended yt and xt, obtained by regressing yt and xt on t
and setting ỹt and x̃t equal to the respective residuals.
4. Discuss the validity of the following statements.
(a) Sum of residuals is always zero.
(b) If a regression produces an R² greater than 0.5, the regression is a reliable
one.
(c) In a regression model
yi = αxi + εi,
switching the independent and dependent variables and running least
squares provides a valid estimator of 1/α.
(d) R̄² tends to favor larger models.
5. Instead of the regressor matrix X, its rotation Z = XA, defined with a nonsingular
K × K matrix A, is used as the regressor. Show that the residuals from this regression
are the same as those from the regression using the regressor X.
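A small numerical check of the claimed invariance (dimensions and data below are arbitrary; a randomly drawn A is almost surely nonsingular):

```python
import numpy as np

rng = np.random.default_rng(2)
n, K = 50, 3
X = rng.normal(size=(n, K))
y = X @ np.array([1.0, -0.5, 2.0]) + rng.normal(size=n)
A = rng.normal(size=(K, K))   # almost surely nonsingular

# Residuals from y on X, and from y on Z = XA
e_X = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
Z = X @ A
e_Z = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]

print(np.max(np.abs(e_X - e_Z)))  # ~0: X and Z span the same column space
```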
2 Finite sample properties of the OLS estimator
1. Suppose that the regression model is
yi = α + βxi + εi,
where every εi has density
f(x) = exp(−x/λ)/λ,   x ≥ 0.
Note that E(εi) = λ for all i. Show that the least squares estimator of β is
unbiased but that the LSE of α is not.
2. Suppose that you unnecessarily included a constant term in a bivariate linear
regression model. In other words, the true model is yi = βxi + εi, whereas you
estimated the model yi = α + βxi + εi.
(a) Is the OLS estimator of β unbiased?
(b) Is the OLS estimator of β more efficient than that from the true model?
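A Monte Carlo sketch of this setup (not part of the problem set; β = 2, σ² = 1, and the sample size are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
beta, n, reps = 2.0, 100, 2000
x = rng.normal(size=n)   # regressor held fixed across replications
b_true = np.empty(reps)
b_over = np.empty(reps)
for r in range(reps):
    y = beta * x + rng.normal(size=n)
    b_true[r] = (x @ y) / (x @ x)   # LSE under the true (no-intercept) model
    X = np.column_stack([np.ones(n), x])
    b_over[r] = np.linalg.lstsq(X, y, rcond=None)[0][1]   # LSE with intercept

# Averages of both estimators across replications
print(b_true.mean(), b_over.mean())

# Exact sampling variances given x (with sigma^2 = 1):
# sigma^2 / sum(x^2)  versus  sigma^2 / sum((x - x_bar)^2)
print(1.0 / (x @ x), 1.0 / ((x - x.mean()) @ (x - x.mean())))
```

The two printed variance formulas differ only in whether x is demeaned, which is where the efficiency comparison in (b) comes from.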
3. (Extension of 2). The true model is
yi = β′xi + εi.
But the model
yi = α′zi + β′xi + εi
was estimated.
(a) Is the OLS estimator of β from this model more efficient than that from
the true model?
(b) Is the OLS estimator of α unbiased?
4. Consider a simple linear regression model
ln(yt) = α + βt + εt,   εt ~ iid(0, σ²),   t = 1,···,n.
One considers estimating β using
β̄ = (ln(yn) − ln(y1)) / (n − 1).
(a) Is β̄ linear and unbiased?
(b) Is β̄ more efficient than the OLS estimator of β?
(c) Does the variance of β̄ decrease as n increases?
(d) Can β̄ be interpreted as the growth rate of yt?
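A sketch of the estimator in this problem: β̄ uses only the first and last observations of ln(yt), while OLS uses them all. The parameter values (α = 1, β = 0.05, σ = 0.1) are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
alpha, beta, n = 1.0, 0.05, 200
t = np.arange(1.0, n + 1)
ln_y = alpha + beta * t + 0.1 * rng.normal(size=n)

# Two-point estimator from the problem
beta_bar = (ln_y[-1] - ln_y[0]) / (n - 1)

# OLS of ln(y_t) on a constant and t, for comparison
X = np.column_stack([np.ones(n), t])
b_ols = np.linalg.lstsq(X, ln_y, rcond=None)[0][1]

print(beta_bar, b_ols)
```

For a single sample both estimates land near the true β; the questions above concern how their sampling variances compare and behave as n grows.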
5. In Example 4.3 of our text, the earnings equation is estimated as follows:
ln(earnings)^ = 3.24 + 0.20 age − 0.823 age² + 0.07 edu − 0.35 kids
               (1.76)  (0.08)     (0.001)      (0.03)     (0.15)
R² = 0.04; n = 428
The numbers in parentheses are standard errors.
(a) Are all the coefficients statistically significant at the 5% level? Assume a
normal distribution for the errors.
(b) Construct 95% confidence interval for the coefficients.
(c) Are the coefficients jointly significant at the 5% level? Assume normality
for the errors.
(Hint: The F-test is defined by F = [R²/(K−1)] / [(1−R²)/(n−K)].)
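The hint's F statistic can be evaluated directly from the reported fit. Reading K = 5 off the fitted equation (constant, age, age², edu, kids) is an assumption, not stated in the problem:

```python
# F statistic from the hint: F = [R^2/(K-1)] / [(1-R^2)/(n-K)]
R2, n, K = 0.04, 428, 5
F = (R2 / (K - 1)) / ((1 - R2) / (n - K))
print(F)  # 4.40625
```

The computed value would then be compared with the 5% critical value of the F(K−1, n−K) distribution.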
6. Prove the independence of the OLS estimator and the unbiased estimator of
error variance in a standard linear regression model with normally distributed
errors.
3 Large sample properties of the LSE
1. Consider the linear regression model
yt = βt + εt,   εt ~ iid(0, σ²),   t = 1,···,n.
(a) Show that the least squares estimator b of β is consistent.
(b) Derive the asymptotic distribution of b.
2. For the linear regression model
yt = βxt + εt,   εt ~ iid(0, σ²),   t = 1,···,n,
an estimator
b̄ = (y1 + ··· + yn) / (x1 + ··· + xn)
is considered. Assume {xt} is a sequence of constants.
(a) What assumptions are required for the consistency of b̄?
(b) Derive the asymptotic distribution of b̄.
3. Suppose that the Xi have a binomial distribution b(m, p) and that X1,···,Xn are
independent.
(a) What is the probability limit of X̄ = (1/n)(X1 + ··· + Xn)?
(b) What is the limiting distribution of √n(X̄ − mp)?
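A simulation sketch of what (a) and (b) are after (m = 10 and p = 0.3 are arbitrary choices; the moments mp and mp(1−p) are standard binomial facts):

```python
import numpy as np

rng = np.random.default_rng(5)
m, p = 10, 0.3

# (a) The sample mean of many draws should settle near E(X_i) = mp
X = rng.binomial(m, p, size=100_000)
print(X.mean())   # close to mp = 3.0

# (b) Scaled deviations sqrt(n)*(X_bar - mp) across replications:
# their spread should be close to Var(X_i) = mp(1-p)
reps, n = 2000, 1000
means = rng.binomial(m, p, size=(reps, n)).mean(axis=1)
z = np.sqrt(n) * (means - m * p)
print(z.var())    # close to mp(1-p) = 2.1
```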
4. Show that mean square convergence implies convergence in probability.
5. Consider the linear regression model
yi = βxi + εi,   εi ~ iid(0, σ²),   i = 1,···,n,
where {xi} is a sequence of constants. What assumptions are required for the
consistency of the LSE of β? Are these assumptions reasonable for empirical
analysis?
6. Consider the linear regression model
yt = βt + εt,   εt ~ iid(0, σ²),   t = 1,···,n.
Is the OLS estimator of β consistent?