
Heteroscedasticity

Homoscedasticity is also one of the assumptions of the
Linear Regression Model.
"Homo" means equal and "scedasticity" means spread or
variance. Homoscedasticity thus refers to equal or
constant variances:

E(u_i²) = σ²; the variance remains constant while i varies.
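
As a purely illustrative sketch (simulated data, not from the study),
the following Python snippet contrasts homoscedastic errors with
heteroscedastic errors whose spread grows with the regressor:

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(1, 10, 200)

# Homoscedastic errors: the standard deviation is the same for every i.
u_homo = rng.normal(0, 1.0, size=x.size)

# Heteroscedastic errors: the spread grows with x, so the variance changes with i.
u_hetero = rng.normal(0, 0.5 * x)

# Compare the spread of the heteroscedastic errors at low and high x.
print(u_homo.std(), u_hetero[x < 5].std(), u_hetero[x >= 5].std())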

Consequences of Heteroscedasticity
1. Due to heteroscedasticity, the variances of the
coefficients (β_i) are larger; consequently, their
standard errors and confidence intervals are large,
while the t-ratios are correspondingly small and
insignificant.
2. Estimated results are misleading.
3. OLS estimators are no longer efficient.
Detection of Heteroscedasticity
Tests used for the detection of heteroscedasticity:
Park Test:
Run a usual regression, such as:

ln Y_i = β_0 + β_1 ln X_i + u_i

Obtain the residuals e_i, square them, and run a
regression of the following form:

ln e_i² = α_0 + α_1 ln X_i + v_i

If α_1 turns out to be statistically significant, it
indicates the existence of a heteroscedasticity problem.
Let us apply the Park test to our job satisfaction and
organizational justice case to check for the existence
of a heteroscedasticity problem.

Detection of Heteroscedasticity
Park Test: (Cont)
Convert the data on all dependent and independent
variables (JS, DJ, PJ, IJ, INJ and AEE) into logs using
the TRANSFORM and COMPUTE VARIABLE commands
in SPSS; give the new log-variables the names
LJS, LDJ, LPJ, LIJ, LINJ and LAEE.

Detection of Heteroscedasticity
Park Test: (Cont)
After converting the data into logs, regress the model
as follows:

LJS = β_0 + β_1 LDJ + β_2 LPJ + β_3 LIJ + β_4 LINJ + β_5 LAEE + u_i

Obtain the residuals using the SPSS commands:
ANALYZE → REGRESSION → LINEAR → SAVE → RESIDUALS →
UNSTANDARDIZED → CONTINUE → OK

Detection of Heteroscedasticity
Park Test: (Cont)
The command on the previous slide estimates the
residuals and places them in the last column of the
data file under the name RES_1 (this residual will be
in logarithmic form, as all the variables in the
regression are in log form). Square this variable
(as we need ln e_i²) using the TRANSFORM and
COMPUTE commands.
You can then run the regression:

ln e_i² = α_0 + α_1 LDJ + α_2 LPJ + α_3 LIJ + α_4 LINJ + α_5 LAEE + v_i

Detection of Heteroscedasticity
Park Test: (Cont)

Coefficients of the Park auxiliary regression:

Model          B       Std. Error   Beta     t        Sig.
1 (Constant)   .240    .098                  2.450    .015
  LDJ          -.157   .026         -.455    -6.124   .000
  LPJ          -.008   .022         -.027    -.341    .733
  LIJ          .026    .024         .069     1.075    .283
  LINJ         -.056   .032         -.129    -1.748   .082
  LAEE         .021    .024         .046     .848     .397
Detection of Heteroscedasticity
Park Test: (Cont)
Result interpretation of the Park test:
Except for LDJ, all the coefficients (LPJ, LIJ, LINJ and
LAEE) are statistically insignificant, suggesting no
serious heteroscedasticity problem.


Detection of Heteroscedasticity
Goldfeld-Quandt Test:
The Goldfeld-Quandt test suggests ordering (ranking) the
observations according to the values of X_i, beginning
with the lowest X_i value. Some central observations are
then omitted so that the remaining observations are
divided into two equal groups.


Detection of Heteroscedasticity
Goldfeld-Quandt Test: (Cont)
These two data groups are used to run two separate
regressions, and the residual sums of squares (RSS) are
obtained; these RSSs (RSS_1 and RSS_2) are then used to
compute the Goldfeld-Quandt F test, namely:

F = (RSS_2 / df) / (RSS_1 / df)

If F is found to be significant (F-calculated > F-tabulated),
the problem of heteroscedasticity is likely to exist.


Detection of Heteroscedasticity
Goldfeld-Quandt Test: (Cont)
Let us run this test for the organizational justice and
job satisfaction case. The Park test indicated that the
log of the variable DJ was the most strongly related to
the log of the squared residuals; this suggests arranging
the data in ascending order using DJ as the sorting
variable and then omitting the central 14 observations,
which leaves 250 observations to be divided equally into
two groups of 125 observations each.
The SPSS command is: DATA → SORT CASES → take DJ
into the SORT-BY box → ASCENDING.


Detection of Heteroscedasticity
Goldfeld-Quandt Test: (Cont)
Remove the 14 central observations and save the data in
two separate files, one containing the Group I data (the
first 125 observations) and the other containing the
Group II data (the last 125 observations).
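
A comparable computation can be sketched in Python; statsmodels'
het_goldfeldquandt sorts the sample by a chosen column and drops a
central fraction before running the two regressions, so the manual
file-splitting above is handled in one call. The file and column
names are assumptions, so this is only an illustrative sketch.

import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_goldfeldquandt

# Hypothetical data file and column names (assumptions).
df = pd.read_csv("justice_satisfaction.csv")

y = df["JS"].to_numpy()
X = sm.add_constant(df[["DJ", "PJ", "IJ", "INJ", "AEE"]]).to_numpy()

# Sort by DJ (column 1 of the design matrix, after the constant) and
# drop the central 14 observations, expressed as a fraction of n.
f_stat, p_value, _ = het_goldfeldquandt(
    y, X, idx=1, drop=14 / len(df), alternative="two-sided"
)
print(f_stat, p_value)  # a significant F suggests heteroscedasticity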


Detection of Heteroscedasticity
Goldfeld-Quandt Test: (Cont)
Running the required two regressions gives the following
two ANOVA tables:

ANOVA(b): Group I

Model          Sum of Squares   df    Mean Square   F       Sig.
1 Regression   17.457           5     3.491         7.913   .000(a)
  Residual     52.504           119   .441
  Total        69.961           124

a Predictors: (Constant), AEE, Procedural justice, Interactive justice, INJ, Distributive justice
b Dependent Variable: Job satisfaction
Detection of Heteroscedasticity
Goldfeld-Quandt Test: (Cont)

ANOVA(b): Group II

Model          Sum of Squares   df    Mean Square   F       Sig.
1 Regression   2.502            5     .500          3.112   .011(a)
  Residual     19.139           119   .161
  Total        21.641           124

a Predictors: (Constant), AEE, Procedural justice, Interactive justice, INJ, Distributive justice
b Dependent Variable: Job satisfaction
Detection of Heteroscedasticity
Goldfeld-Quandt Test: (Cont)
The residual sums of squares (RSS) of the two groups are:

RSS_I  = 52.504 with df = 119
RSS_II = 19.139 with df = 119

Calculating F using the above values:

F = (RSS_II / df) / (RSS_I / df)
  = (19.139 / 119) / (52.504 / 119)
  = 0.3645

F-calculated = 0.3645 < F-tabulated = 2.29 (at p = 0.05),
suggesting that no heteroscedasticity problem exists.



Detection of Heteroscedasticity
White's General Heteroscedasticity Test
Consider the following three-variable regression model:

Y_i = β_1 + β_2 X_2i + β_3 X_3i + u_i

Step 1: Run the above regression and obtain the residuals u_i.
Step 2: Square the residuals.



Detection of Heteroscedasticity
White's General Heteroscedasticity Test (Cont)
Step 3: Run the following auxiliary regression:

u_i² = α_1 + α_2 X_2i + α_3 X_3i + α_4 X_2i² + α_5 X_3i² + α_6 X_2i X_3i + v_i

Obtain the R² from this (auxiliary) regression.

Step 4: Multiply R² by n.
Under the null hypothesis of no heteroscedasticity, n
times R² asymptotically follows the chi-square
distribution, with df equal to the number of regressors
(excluding the constant) in the auxiliary regression, i.e.

n·R² ~ χ²_df (asymptotically)


Detection of Heteroscedasticity
White's General Heteroscedasticity Test (Cont)
Step 5: Compare the calculated chi-square with the
tabulated value.
If χ²_df (calculated) > χ²_df (tabulated), a
heteroscedasticity problem is indicated; i.e., if the
calculated chi-square exceeds the tabulated chi-square,
we reject the null hypothesis that
α_2 = α_3 = α_4 = α_5 = α_6 = 0, so that u_i² is not
equal to a constant σ². The variance does not remain
constant, which is the problem of heteroscedasticity.
Detection of Heteroscedasticity
White's General Heteroscedasticity Test (Cont)
Take the example of job satisfaction.
The SPSS commands are:
ANALYZE → REGRESSION → SAVE → RESIDUALS →
UNSTANDARDIZED → CONTINUE → OK
Square this residual (as we need e_i²) using the
TRANSFORM and COMPUTE commands.

Detection of Heteroscedasticity
White's General Heteroscedasticity Test (Cont)
Now run the following auxiliary regression:

e_i² = a_1 + a_2 DJ + a_3 PJ + a_4 IJ + a_5 INJ + a_6 AEE
       + a_7 SDJ + a_8 SPJ + a_9 SIJ + a_10 SINJ + a_11 SAEE
       + a_12 DJPJ + a_13 DJINJ + a_14 DJAEE + a_15 IJINJ
       + a_16 IJAEE + a_17 INJAEE + v_i

First create the required square and cross-product
variables (SDJ, SPJ, SIJ, SINJ, SAEE, DJPJ, DJIJ, DJINJ,
DJAEE, PJIJ, PJINJ, PJAEE, IJINJ, IJAEE, INJAEE) using the
SPSS TRANSFORM → COMPUTE VARIABLE commands.

Detection of Heteroscedasticity
White's General Heteroscedasticity Test (Cont)
Using the SPSS commands:
ANALYZE → REGRESSION → LINEAR → dependent variable: e_i²;
independent variables: SDJ, SPJ, SIJ, SINJ, SAEE, DJPJ,
DJIJ, DJINJ, DJAEE, PJIJ, PJINJ, PJAEE, IJINJ, IJAEE,
INJAEE → OK
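
An equivalent check can be sketched in Python; statsmodels'
het_white builds the squares and cross products of the regressors
internally and reports the n·R² statistic against the chi-square
distribution. The file and column names are assumptions, so this is
only an illustrative sketch.

import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_white

# Hypothetical data file and column names (assumptions).
df = pd.read_csv("justice_satisfaction.csv")

y = df["JS"]
X = sm.add_constant(df[["DJ", "PJ", "IJ", "INJ", "AEE"]])
resid = sm.OLS(y, X).fit().resid

# het_white regresses the squared residuals on the regressors, their
# squares and cross products, and returns the LM statistic n*R^2.
lm_stat, lm_pvalue, f_stat, f_pvalue = het_white(resid, X)
print(lm_stat, lm_pvalue)  # a small p-value indicates heteroscedasticity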

Detection of Heteroscedasticity
White's General Heteroscedasticity Test (Cont)

Model Summary

Model   R         R Square   Adjusted R Square   Std. Error
1       .522(a)   .272       .228                .50813

a Predictors: (Constant), INJAEE, SIJ, SDJ, SPJ, SINJ, SAEE, IJAEE, DJIJ, PJAEE, DJAEE, PJIJ, DJINJ, IJINJ, DJPJ, PJINJ

df = all independent variables excluding the constant
(in the auxiliary regression); in our case df = 15.
Detection of Heteroscedasticity
White's General Heteroscedasticity Test (Cont)

χ²_15 (calculated) = R² × n

We have n = 264, so

χ²_15 (calculated) = .272 × 264 = 71.808

Detection of Heteroscedasticity
White's General Heteroscedasticity Test (Cont)
Compare χ²_15 (calculated) = 71.808 with the chi-square
table (page 968 of Gujarati):

                     10%      5%       1%
χ²_15 (tabulated)    28.41    31.41    37.57

χ²_15 (calculated) = 71.808 > χ²_15 (tabulated) = 37.57,
showing a heteroscedasticity problem.


Remedies of Heteroscedasticity
1. If we know σ_i, then we use the weighted least squares
(WLS) estimation technique, i.e., divide the model through
by σ_i:

Y_i/σ_i = β_0 (1/σ_i) + β_1 (X_i/σ_i) + u_i/σ_i

where σ_i is the standard deviation of the disturbance u_i
for observation i, so that the transformed error u_i/σ_i
has constant variance.

2. Log transformation:

ln Y_i = β_0 + β_1 ln X_i + u_i

This reduces the heteroscedasticity.
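
A minimal WLS sketch in Python (simulated data; the variance pattern
is an assumption chosen only for illustration) shows how weighting
each observation by 1/σ_i² implements the transformation above.

import numpy as np
import statsmodels.api as sm

# Simulated heteroscedastic data: the error sd is assumed to grow with x.
rng = np.random.default_rng(0)
x = rng.uniform(1, 10, 200)
sigma = 0.5 * x
y = 2 + 3 * x + rng.normal(0, sigma)

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()
# WLS with weights 1/sigma_i^2 is equivalent to dividing the equation
# through by sigma_i, as in remedy 1 above.
wls = sm.WLS(y, X, weights=1.0 / sigma**2).fit()
print(ols.params, wls.params)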
Remedies of Heteroscedasticity
3. Other transformations:
(a) If the error variance is taken to be proportional to
X_i², divide both sides of the model by X_i:

Y_i/X_i = β_0 (1/X_i) + β_1 + u_i/X_i

After estimating the transformed model, both sides are
then multiplied by X_i to return to the original form.

(b) If the error variance is taken to be proportional to
[E(Y_i)]², divide both sides by the fitted value Ŷ_i:

Y_i/Ŷ_i = β_0 (1/Ŷ_i) + β_1 (X_i/Ŷ_i) + u_i/Ŷ_i

Note: With transformed data, the diagnostic statistics
(t-ratios and the F-statistic) are valid only in large
samples.
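
A small sketch of transformation (a) on simulated data, assuming for
illustration that the error variance is proportional to X_i²:

import numpy as np
import statsmodels.api as sm

# Simulated data whose error sd is proportional to x (variance to x^2).
rng = np.random.default_rng(1)
x = rng.uniform(1, 10, 200)
y = 2 + 3 * x + rng.normal(0, 0.5 * x)

# Transformed model: Y/X = b0*(1/X) + b1 + v. The regressors are 1/X and
# a column of ones, so b1 becomes the "intercept" of the transformed model.
Z = np.column_stack([1.0 / x, np.ones_like(x)])
fit = sm.OLS(y / x, Z).fit()
print(fit.params)  # first element estimates b0, second estimates b1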
