90% confidence interval: Ȳ − 1.64 SE(Ȳ) ≤ μY ≤ Ȳ + 1.64 SE(Ȳ)
95% confidence interval: Ȳ − 1.96 SE(Ȳ) ≤ μY ≤ Ȳ + 1.96 SE(Ȳ)
99% confidence interval: Ȳ − 2.58 SE(Ȳ) ≤ μY ≤ Ȳ + 2.58 SE(Ȳ)
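The three intervals differ only in the critical value multiplying SE(Ȳ). A minimal sketch, using simulated data (hypothetical sample, not from the slides) and the critical values above:

```python
import math
import random

random.seed(0)

# Hypothetical i.i.d. sample (illustration only).
Y = [random.gauss(5.0, 2.0) for _ in range(400)]

n = len(Y)
Y_bar = sum(Y) / n                                # sample mean
s2 = sum((y - Y_bar) ** 2 for y in Y) / (n - 1)   # sample variance
se = math.sqrt(s2 / n)                            # SE(Y_bar) = s / sqrt(n)

# Critical values from the slide: 1.64 (90%), 1.96 (95%), 2.58 (99%).
for level, z in [(90, 1.64), (95, 1.96), (99, 2.58)]:
    lo, hi = Y_bar - z * se, Y_bar + z * se
    print(f"{level}% CI for mu_Y: [{lo:.3f}, {hi:.3f}]")
```

Note that the intervals are nested: a higher confidence level uses a larger critical value and so produces a wider interval around the same Ȳ.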
Homoskedasticity and heteroskedasticity
• False.
• A big difference between the White standard errors and the conventional
standard errors is a signal of heteroskedasticity.
• The point estimates, however, will be the same whether the White matrix is
used or not.
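A minimal sketch of this comparison, assuming simulated heteroskedastic data (the data-generating process is hypothetical) and computing both variance formulas by hand with NumPy rather than a regression package:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical regression with heteroskedastic errors:
# the error spread grows with x, so Var(u|x) is not constant.
n = 2000
x = rng.uniform(0.0, 10.0, n)
u = rng.normal(0.0, 1.0, n) * x              # Var(u|x) = x^2
y = 2.0 + 0.5 * x + u

X = np.column_stack([np.ones(n), x])         # design matrix with intercept
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y                     # OLS point estimates
resid = y - X @ beta

k = X.shape[1]
# Conventional (homoskedasticity-only) variance: s^2 (X'X)^-1
s2 = resid @ resid / (n - k)
V_conv = s2 * XtX_inv

# White (HC0) variance: (X'X)^-1 X' diag(u_i^2) X (X'X)^-1
meat = (X * resid[:, None] ** 2).T @ X
V_white = XtX_inv @ meat @ XtX_inv

se_conv = np.sqrt(np.diag(V_conv))
se_white = np.sqrt(np.diag(V_white))
print("slope SE, conventional:", se_conv[1])
print("slope SE, White (HC0):", se_white[1])
```

In this setup the two slope standard errors differ noticeably, the signal referred to above, while the point estimates in `beta` are computed once and do not depend on which variance matrix is used.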
True or false
• We worry about heteroskedasticity because the estimated coefficient is
biased when heteroskedasticity is present.
• False.
• We worry about heteroskedasticity because the estimated standard errors
are biased.
• Since we need unbiased standard errors to construct the t-statistic,
heteroskedasticity leads to misleading results in hypothesis testing.
• Heteroskedasticity is not a reason for biased estimates of the coefficients.
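The last point can be illustrated with a small Monte Carlo sketch (the data-generating process is hypothetical): even with strongly heteroskedastic errors, the OLS slope estimates average out to the true value.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical model: y = 1 + 0.5 x + u, with Var(u|x) = x^2.
true_slope = 0.5
n, reps = 200, 2000
slopes = np.empty(reps)
for r in range(reps):
    x = rng.uniform(0.0, 10.0, n)
    u = rng.normal(0.0, 1.0, n) * x      # heteroskedastic errors
    y = 1.0 + true_slope * x + u
    X = np.column_stack([np.ones(n), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    slopes[r] = beta[1]

# The average of the 2000 slope estimates is close to true_slope:
print("mean of estimated slopes:", slopes.mean())
```

The individual estimates vary from sample to sample, but their mean sits near 0.5, consistent with OLS being unbiased under heteroskedasticity; it is the conventional standard errors, not the coefficients, that go wrong.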
Gauss-Markov theorem
• What is the Gauss-Markov theorem for multiple regression?
• What are the conditions for the Gauss-Markov theorem to hold?
Gauss-Markov theorem
The Gauss-Markov theorem for multiple regression provides conditions under which the
OLS estimator is efficient among the class of linear conditionally unbiased estimators
(BLUE). The Gauss-Markov conditions for multiple regression are: