
Coefficient of Determination and Multiple Correlation

Y = 1 + 2 X2 + 3 X3 +…+ k Xk + u

PowerPoint® Slides
by Yana Rohmana
Indonesia University of Education

© 2007 Laboratorium Ekonomi & Koperasi Publishing Jl. Dr. Setiabudi 229 Bandung, Telp. 022 2013163 - 2523
Coefficient of Determination and Multiple Correlation

• We want to know what proportion (percentage) of the variation (rise and fall) in Y is contributed by X2 and X3 jointly.
• The size of this proportion/percentage contribution is called the coefficient of multiple determination, denoted R².
• The formula for R² follows from the definition:

R² = ESS / TSS = Σŷi² / Σyi²

R² = (b12.3 Σx2i yi + b13.2 Σx3i yi) / Σyi²
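As a minimal numerical sketch of this definition (hypothetical data and variable names, not the case data from these slides), R² can be computed as ESS/TSS after an ordinary least-squares fit:

```python
import numpy as np

# Hypothetical data for a two-regressor model Y = b1 + b2*X2 + b3*X3 + e
X2 = np.array([10., 12., 15., 18., 20., 22., 25., 28., 30., 33.])
X3 = np.array([ 5.,  6.,  6.,  7.,  8.,  8.,  9., 10., 10., 11.])
Y  = np.array([20., 23., 27., 31., 35., 37., 42., 46., 48., 53.])

# OLS fit with an intercept
X = np.column_stack([np.ones_like(X2), X2, X3])
b, *_ = np.linalg.lstsq(X, Y, rcond=None)

Y_hat = X @ b
ESS = np.sum((Y_hat - Y.mean()) ** 2)   # explained sum of squares (deviation form)
TSS = np.sum((Y - Y.mean()) ** 2)       # total sum of squares (deviation form)
print(f"R^2 = {ESS / TSS:.4f}")
```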

Application to Case 2
b12.3  x2i yi  b1 3. 2  x3i yi
R  2

y i
2

1.203,533  20,0028

1.260,889
 0,9387

The multiple linear regression equation (case 2)

Ŷ = b1.23 + b12.3 X2 + b13.2 X3


Ŷ = −17.8685 + 0.9277 X2 + 0.2532 X3
Standard errors:      (0.0972)    (0.1464)
R² = 0.9387
Se = 3.5907

The adjusted R² (R̄²) as an indicator of overall fit

ESS RSS e^i2


R2 = =1- =1-
TSS TSS yi2
_ ei2 / (n-k)
R2 = 1 - k: # of independent
yi2 / (n-1) variables plus the
_ constant term.
Se2
R2 = 1 -
Sy2
n : # of obs.
_ e2 (n-1)
R2 = 1 -
y2 (n-k)
_
n-1
R2 = 1 - (1 - R2)
n-k
_
R2  R2 0 < R2 < 1
Adjusted R2 can be negative: R2  0
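A small sketch of this adjustment, using the last formula above (the sample sizes here are illustrative placeholders, not taken from a particular data set):

```python
def adjusted_r2(R2: float, n: int, k: int) -> float:
    """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k),
    where k counts the regressors plus the constant term."""
    return 1.0 - (1.0 - R2) * (n - 1) / (n - k)

# Illustrative values: R^2 = 0.9387 from case 2, with a hypothetical n = 10
print(adjusted_r2(R2=0.9387, n=10, k=3))   # slightly below 0.9387, as expected
# A weak fit in a small sample can push the adjusted R^2 below zero:
print(adjusted_r2(R2=0.05, n=12, k=5))     # negative -> treated as 0 in practice
```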
Y = 1 + 2 X2 + 3 X3 + u

Y TSS
Y n-1
u
^

Suppose X4 is not an explanatory variable but is included in the regression.

[Figure: Venn (Ballentine) diagram with circles for C, X2, X3, and the irrelevant X4.]

Partial Correlation Coefficients and the Relationships Among the Various Correlation and Regression Coefficients

• Y = b1.23 + b12.3 X2 + b13.2 X3 + ei

• r12 = correlation coefficient between Y and X2 (between X2 and Y)
• r13 = correlation coefficient between Y and X3 (between X3 and Y)
• r23 = correlation coefficient between X2 and X3 (between X3 and X2)

• Between X and Y:    r = Σxi yi / √(Σxi² · Σyi²)

• Between X2 and Y:   r12 = Σx2i yi / √(Σx2i² · Σyi²)

Partial Correlation Coefficients and the Relationships Among the Various Correlation and Regression Coefficients

• Between X3 and Y:   r13 = Σx3i yi / √(Σx3i² · Σyi²)

• Between X2 and X3:  r23 = Σx2i x3i / √(Σx2i² · Σx3i²)
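A sketch that computes these zero-order (simple) correlations directly from the deviation-form sums (hypothetical data; `numpy.corrcoef` would give the same values):

```python
import numpy as np

X2 = np.array([10., 12., 15., 18., 20., 22., 25., 28., 30., 33.])
X3 = np.array([ 5.,  6.,  6.,  7.,  8.,  8.,  9., 10., 10., 11.])
Y  = np.array([20., 23., 27., 31., 35., 37., 42., 46., 48., 53.])

def simple_r(a, b):
    """r = sum(a_i * b_i) / sqrt(sum(a_i^2) * sum(b_i^2)), on deviations from the mean."""
    a, b = a - a.mean(), b - b.mean()
    return np.sum(a * b) / np.sqrt(np.sum(a**2) * np.sum(b**2))

r12 = simple_r(Y, X2)    # between Y and X2
r13 = simple_r(Y, X3)    # between Y and X3
r23 = simple_r(X2, X3)   # between X2 and X3
print(r12, r13, r23)
```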

Partial Correlation Coefficient

• r12.3 = correlation coefficient between Y and X2, holding X3 constant
• r13.2 = correlation coefficient between Y and X3, holding X2 constant
• r23.1 = correlation coefficient between X2 and X3, holding Y constant

r12.3 = (r12 − r13 r23) / √((1 − r13²)(1 − r23²))

r13.2 = (r13 − r12 r23) / √((1 − r12²)(1 − r23²))

r23.1 = (r23 − r12 r13) / √((1 − r12²)(1 − r13²))

1. Individual partial coefficient test

(1) Holding X3 constant: does X2 have an effect on Y?

H0 : β2 = 0      (∂Y/∂X2 = β2 = 0 ?)
H1 : β2 ≠ 0

t = (β̂2 − 0) / Se(β̂2) = 0.9277 / 0.0972 = 9.544

Compare with the critical value tc(0.025, 6) = 2.447

Since t > tc  ==>  reject H0

Answer: Yes, β̂2 is statistically significant and is significantly different from zero.

1. Individual partial coefficient test (cont.)

(2) Holding X2 constant: does X3 have an effect on Y?

H0 : β3 = 0      (∂Y/∂X3 = β3 = 0 ?)
H1 : β3 ≠ 0

t = (β̂3 − 0) / Se(β̂3) = (0.2532 − 0) / 0.1464 = 1.730

Critical value: tc(0.025, 6) = 2.447

Since | t | < | tc |  ==>  do not reject H0

Answer: No, β̂3 is not statistically significant and is not significantly different from zero.
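Both t-tests in one sketch; the estimates, standard errors, and the 6 degrees of freedom are taken from these slides, and `scipy.stats.t` is used only to reproduce the 5% two-sided critical value:

```python
from scipy import stats

df = 6                                 # degrees of freedom used on these slides
t_crit = stats.t.ppf(1 - 0.025, df)    # two-sided 5% critical value, ~2.447

for name, b_hat, se in [("b2", 0.9277, 0.0972), ("b3", 0.2532, 0.1464)]:
    t = (b_hat - 0) / se
    decision = "reject H0" if abs(t) > t_crit else "do not reject H0"
    print(f"{name}: t = {t:.3f}, critical value = {t_crit:.3f} -> {decision}")
```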

2. Testing overall significance of the multiple regression

3-variable case: Y = β1 + β2 X2 + β3 X3 + u

H0 : β2 = 0, β3 = 0   (all slope coefficients are zero)

H1 : β2 ≠ 0 or β3 ≠ 0   (at least one slope is not zero)

1. Compute the F-statistic.

2. Look up the critical value Fc(α, k−1, n−k).

3. Compare F and Fc: if F > Fc  ==>  reject H0 (see the sketch below).
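The three steps as a sketch (the sums of squares and sample sizes below are hypothetical; `scipy.stats.f` supplies the critical value):

```python
from scipy import stats

# Hypothetical sums of squares and sample sizes for a 3-variable model
ESS, RSS = 1183.53, 77.36
n, k = 10, 3

# 1. Compute the F statistic
F = (ESS / (k - 1)) / (RSS / (n - k))

# 2. Critical value Fc(alpha, k-1, n-k)
F_crit = stats.f.ppf(1 - 0.05, dfn=k - 1, dfd=n - k)

# 3. Compare
print(f"F = {F:.2f}, Fc = {F_crit:.2f} ->",
      "reject H0" if F > F_crit else "do not reject H0")
```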

Analysis of variance: since y = ŷ + û  ==>  Σy² = Σŷ² + Σû², i.e. TSS = ESS + RSS

ANOVA TABLE

Source of variation      | Sum of squares (SS) | df   | Mean sum of squares (MSS)
Due to regression (ESS)  | Σŷ²                 | k−1  | Σŷ² / (k−1)
Due to residuals (RSS)   | Σû²                 | n−k  | Σû² / (n−k) = σ̂u²
Total variation (TSS)    | Σy²                 | n−1  |

Note: k is the total number of parameters including the intercept term.

F = (MSS of ESS) / (MSS of RSS) = [ESS/(k−1)] / [RSS/(n−k)] = [Σŷ²/(k−1)] / [Σû²/(n−k)]

H0 : β2 = … = βk = 0
H1 : not all slope coefficients are zero (at least one βj ≠ 0)
If F > Fc(k−1, n−k)  ==>  reject H0
ANOVA table for the three-variable regression

Source of variation      | Sum of squares (SS)            | df        | Mean sum of squares (MSS)*
Due to regression (ESS)  | b12.3 Σx2i yi + b13.2 Σx3i yi  | 2 (k−1)   | (b12.3 Σx2i yi + b13.2 Σx3i yi) / 2
Residuals (RSS)          | Σei²                           | n−3 (n−k) | Σei² / (n−3) = Se²
Total (TSS)              | Σyi²                           | n−1       |

*Mean Sum of Squares.
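A sketch that assembles this ANOVA table from a fitted three-variable regression on hypothetical data; ESS is obtained as TSS − RSS, which for OLS with an intercept equals b12.3 Σx2i yi + b13.2 Σx3i yi:

```python
import numpy as np

X2 = np.array([10., 12., 15., 18., 20., 22., 25., 28., 30., 33.])
X3 = np.array([ 5.,  6.,  6.,  7.,  8.,  8.,  9., 10., 10., 11.])
Y  = np.array([20., 23., 27., 31., 35., 37., 42., 46., 48., 53.])
n, k = len(Y), 3

X = np.column_stack([np.ones(n), X2, X3])
b = np.linalg.lstsq(X, Y, rcond=None)[0]
e = Y - X @ b                          # residuals

TSS = np.sum((Y - Y.mean()) ** 2)
RSS = np.sum(e ** 2)
ESS = TSS - RSS

print(f"{'Source':<12}{'SS':>12}{'df':>6}{'MSS':>12}")
print(f"{'Regression':<12}{ESS:>12.3f}{k-1:>6}{ESS/(k-1):>12.3f}")
print(f"{'Residual':<12}{RSS:>12.3f}{n-k:>6}{RSS/(n-k):>12.3f}")
print(f"{'Total':<12}{TSS:>12.3f}{n-1:>6}")
```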

Three-variable case (in deviation form):

ŷ = β̂2 x2 + β̂3 x3 + û

Σy² = β̂2 Σx2 y + β̂3 Σx3 y + Σû²

TSS = ESS + RSS

ANOVA TABLE

Source of variation | SS                    | df (k = 3) | MSS
ESS                 | β̂2 Σx2 y + β̂3 Σx3 y  | 3−1        | ESS / (3−1)
RSS                 | Σû²                   | n−3 (n−k)  | RSS / (n−3)
TSS                 | Σy²                   | n−1        |

F-statistic = [ESS/(k−1)] / [RSS/(n−k)] = [(β̂2 Σx2 y + β̂3 Σx3 y)/(3−1)] / [Σû²/(n−3)]
An important relationship between R² and F

F = [ESS/(k−1)] / [RSS/(n−k)] = [ESS / (TSS − ESS)] · (n−k)/(k−1)
  = [(ESS/TSS) / (1 − ESS/TSS)] · (n−k)/(k−1)
  = [R² / (1 − R²)] · (n−k)/(k−1)

F = [R²/(k−1)] / [(1 − R²)/(n−k)]

For the three-variable case:  F = (R²/2) / [(1 − R²)/(n−3)]

Reverse:  R² = (k−1)F / [(k−1)F + (n−k)]
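A quick sketch checking the identity in both directions (the R², n, and k used here are illustrative placeholders):

```python
def F_from_R2(R2, n, k):
    # F = [R^2/(k-1)] / [(1-R^2)/(n-k)]
    return (R2 / (k - 1)) / ((1 - R2) / (n - k))

def R2_from_F(F, n, k):
    # Reverse: R^2 = (k-1)F / [(k-1)F + (n-k)]
    return (k - 1) * F / ((k - 1) * F + (n - k))

R2, n, k = 0.9387, 10, 3
F = F_from_R2(R2, n, k)
print(F, R2_from_F(F, n, k))   # the reverse formula recovers the original R^2
```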

Overall significance test:

H0 : β2 = β3 = β4 = 0

H1 : at least one coefficient is not zero
     (β2 ≠ 0, or β3 ≠ 0, or β4 ≠ 0)

F* = [R²/(k−1)] / [(1 − R²)/(n−k)]
   = (0.9710/3) / [(1 − 0.9710)/16]
   = 179.13   (using the unrounded R² = 0.971088)

Fc(0.05, 4−1, 20−4) = Fc(0.05, 3, 16) = 3.24, with k−1 = 3 and n−k = 16 degrees of freedom.

Since F* > Fc  ==>  reject H0.
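Reproducing this computation (R² = 0.9710, n = 20, k = 4 are taken from the slide; `scipy.stats.f` is used for the critical value):

```python
from scipy import stats

R2, n, k = 0.9710, 20, 4
F_star = (R2 / (k - 1)) / ((1 - R2) / (n - k))    # ~178.6; the slide's 179.13 uses the unrounded R^2 = 0.971088
F_crit = stats.f.ppf(0.95, dfn=k - 1, dfd=n - k)  # ~3.24
print(F_star, F_crit, F_star > F_crit)            # True -> reject H0
```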
Construct the ANOVA table (Table 8.4; information from EViews)

Source of variation      | SS                                                    | df       | MSS
Due to regression (ESS)  | R²(Σy²) = (0.971088)(28.97771)² × 19 = 15493.171      | k−1 = 3  | R²(Σy²)/(k−1) = 5164.3903
Due to residuals (RSS)   | (1 − R²)(Σy²) = Σûi² = (0.028912)(28.97771)² × 19 = 461.2621 | n−k = 16 | (1 − R²)(Σy²)/(n−k) = 28.8288
Total (TSS)              | Σy² = (28.97771)² × 19 = 15954.446                    | n−1 = 19 |

Since Sy² = Var(Y) = Σy²/(n−1)  ==>  (n−1)Sy² = Σy²

F* = (MSS of regression) / (MSS of residuals) = 5164.3903 / 28.8288 = 179.1339
Example: Gujarati (2003), Table 6.4, p. 185

H0 : β2 = β3 = 0

F* = [ESS/(k−1)] / [RSS/(n−k)] = [R²/(k−1)] / [(1 − R²)/(n−k)]
   = (0.707665/2) / [(1 − 0.707665)/61]

F* = 73.832

Fc(0.05, 3−1, 64−3) = Fc(0.05, 2, 61) = 3.15

Since F* > Fc  ==>  reject H0.

Construct the ANOVA table (Table 8.4; information from EViews)

Source of variation      | SS                                                    | df       | MSS
Due to regression (ESS)  | R²(Σy²) = (0.707665)(75.97807)² × 64 = 261447.33      | k−1 = 2  | R²(Σy²)/(k−1) = 130723.67
Due to residuals (RSS)   | (1 − R²)(Σy²) = Σûi² = (0.292335)(75.97807)² × 64 = 108003.37 | n−k = 61 | (1 − R²)(Σy²)/(n−k) = 1770.547
Total (TSS)              | Σy² = (75.97807)² × 64 = 369450.7                     | n−1 = 63 |

Since Sy² = Var(Y) = Σy²/(n−1)  ==>  (n−1)Sy² = Σy²

F* = (MSS of regression) / (MSS of residuals) = 130723.67 / 1770.547 = 73.832
Y = 1 + 2 X2 + 3 X3 + u

H0 : 2 = 0, 3= 0,

H1 : 2  0 ; 3  0

Fc0.01, 2, 61 = 4.98
Compare F* and Fc, checks the F-table: Fc0.05, 2, 61 = 3.15

Decision Rule:
Since F*= .73.832 > Fc = 4.98 (3.15) ==> reject Ho

Answer : The overall estimators are statistically significant


different from zero.

QUIZ

 1
 2
 3
 4

THANK YOU

