Example: fitting the straight line m = a + b·l to five observations.

   m:   1.0   1.5   2.0   2.5   3.0
   l:  15.0  17.0  18.0  19.5  21.0

[Scatter plot of the five (l, m) points with the fitted line]

The least-squares estimates are a = -4 and b = +0.33, so the fitted line is

   m = a + b·l = -4 + 0.33·l

For l = 20.7 the fitted line predicts m = 2.9.
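This fit can be reproduced with a short least-squares sketch (standard library only; variable names are mine, and the slide's a = -4, b = +0.33, m = 2.9 are rounded versions of the exact values):

```python
# Least-squares fit of m = a + b*l to the five observations above.
m = [1.0, 1.5, 2.0, 2.5, 3.0]       # response values
l = [15.0, 17.0, 18.0, 19.5, 21.0]  # predictor values

n = len(m)
l_bar = sum(l) / n   # mean of l
m_bar = sum(m) / n   # mean of m

# Slope and intercept from the least-squares formulas.
b = sum((li - l_bar) * (mi - m_bar) for li, mi in zip(l, m)) \
    / sum((li - l_bar) ** 2 for li in l)
a = m_bar - b * l_bar

print(round(a, 2), round(b, 2))  # -4.19 0.34
print(round(a + b * 20.7, 2))    # predicted m at l = 20.7: 2.89
```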
Simple Linear Regression

Possible regression lines:

[Three panels plotting E(y) against x:
 - Positive linear relationship: the slope β1 is positive
 - Negative linear relationship: the slope β1 is negative
 - No relationship: the slope β1 is 0
 In each panel the line meets the E(y) axis at the intercept β0.]
Estimated Regression Equation

   ŷ = b0 + b1·x

The sample statistics b0 and b1 provide estimates of the population parameters β0 and β1.
where:
   yi = observed value of the dependent variable for the ith observation
   ŷi = estimated value of the dependent variable for the ith observation
Relationship among SST, SSR, SSE:

   Σ(yi − ȳ)² = Σ(ŷi − ȳ)² + Σ(yi − ŷi)²
       SST    =     SSR     +     SSE

where:
   SST = Total Sum of Squares
   SSR = Sum of Squares due to Regression
   SSE = Sum of Squares due to Error
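The decomposition can be verified numerically; a sketch with illustrative data (the x and y values here are not from the slides):

```python
# Numerical check of SST = SSR + SSE on illustrative data.
x = [1.0, 2.0, 3.0, 4.0, 5.0]   # illustrative predictor values
y = [2.1, 3.9, 6.2, 7.8, 10.1]  # illustrative response values

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# Least-squares slope, intercept, and fitted values.
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * xi for xi in x]

sst = sum((yi - y_bar) ** 2 for yi in y)               # total sum of squares
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)           # sum of squares due to regression
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # sum of squares due to error

print(round(sst, 6), round(ssr + sse, 6))  # the two totals agree
```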
The coefficient of determination is r² = 0.8772. Hence, the sample correlation coefficient is rxy = +√r² = +0.9366 (taking the sign of the estimated slope b1).
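rxy has magnitude √r² and carries the sign of the estimated slope; a one-line check of the value above:

```python
import math

# From the slides: coefficient of determination r^2 = 0.8772,
# and the estimated slope b1 is positive.
r_squared = 0.8772

# The correlation coefficient has magnitude sqrt(r^2) and the sign of b1.
r_xy = +math.sqrt(r_squared)
print(round(r_xy, 4))  # 0.9366
```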
An Estimate of σ²

The mean square error (MSE) provides the estimate of σ², and the notation s² is also used.

   s² = MSE = SSE/(n − 2)

where:
   SSE = Σ(yi − ŷi)² = Σ(yi − b0 − b1·xi)²
An Estimate of σ

To estimate σ we take the square root of s².

   s = √MSE = √(SSE/(n − 2))
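A quick numeric sketch of these two estimates; the SSE and sample size here are assumed values, not from the slides:

```python
import math

# Estimating sigma^2 and sigma from the error sum of squares.
sse = 8.0  # sum of squared residuals (assumed value)
n = 5      # number of observations (assumed value)

s_squared = sse / (n - 2)  # MSE: the point estimate of sigma^2
s = math.sqrt(s_squared)   # point estimate of sigma

print(round(s_squared, 4), round(s, 4))  # 2.6667 1.633
```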
t Test

Hypotheses:
   H0: β1 = 0
   H1: β1 ≠ 0

Test Statistic:
   t = b1/sb1

where sb1 is the estimated standard deviation of b1.
Rejection Rule:

   Reject H0 if p-value < α or |t| ≥ tα/2

where:
   tα/2 is based on a t distribution with n − 2 degrees of freedom
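A sketch of computing the test statistic; the values of b1 and its standard error sb1 are illustrative, not from the slides:

```python
# t statistic for H0: beta1 = 0; b1 and sb1 are assumed values, where
# sb1 is the estimated standard deviation (standard error) of b1.
b1 = 5.0    # estimated slope (assumed)
sb1 = 1.08  # standard error of the slope (assumed)

t = b1 / sb1
print(round(t, 2))  # 4.63
```

The computed t is then compared with ±tα/2 from a t table with n − 2 degrees of freedom.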
1. Determine the hypotheses.

      H0: β1 = 0
      H1: β1 ≠ 0
F Test

Hypotheses:
   H0: β1 = 0
   H1: β1 ≠ 0

Test Statistic:
   F = MSR/MSE

Rejection Rule:
   Reject H0 if p-value < α or F > Fα

where:
   Fα is based on an F distribution with 1 degree of freedom in the numerator and n − 2 degrees of freedom in the denominator
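A sketch with assumed sums of squares (none of these numbers are from the slides):

```python
# F statistic for the overall significance test in simple linear regression.
ssr = 100.0  # sum of squares due to regression (assumed)
sse = 8.0    # sum of squares due to error (assumed)
n = 5        # number of observations (assumed)

msr = ssr / 1        # mean square regression: 1 numerator df in simple regression
mse = sse / (n - 2)  # mean square error: n - 2 denominator df
f = msr / mse

print(round(f, 2))  # 37.5
```

In simple linear regression this F statistic equals the square of the t statistic for the same hypotheses, so the two tests reach the same conclusion.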
Residual Analysis

[Residual plots of y − ŷ against x:
 - Good pattern: the residuals scatter randomly about zero
 - Non-constant variance: the spread of the residuals changes with x]
Residuals

   Observation   Predicted Cars Sold   Residual
        1                 15              -1
        2                 25              -1
        3                 20              -2
        4                 15               2
        5                 25               2
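A sketch recomputing the residual column; the observed values below are recovered from the table itself (observed = predicted + residual):

```python
# Recomputing the residual column: residual = observed - predicted.
observed  = [14, 24, 18, 17, 27]  # observed cars sold (predicted + residual)
predicted = [15, 25, 20, 15, 25]  # predicted cars sold from the fitted line

residuals = [o - p for o, p in zip(observed, predicted)]
print(residuals)  # [-1, -1, -2, 2, 2]
```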
Multiple Regression
The simple linear regression model was used to analyze how one
variable (the dependent variable y) is related to one other
variable (the independent variable x).
Multiple regression allows for any number of independent
variables.
We expect to develop models that fit the data better than would
a simple linear regression model.
Simple regression considers the relation between a single explanatory variable and a response variable.

Multiple regression simultaneously considers the influence of multiple explanatory variables on a response variable Y:

   Y = β0 + β1·x1 + β2·x2 + … + βk·xk + ε

where ε is the error variable and β0, β1, …, βk are the coefficients.
In the one-variable, two-dimensional case we drew a regression line; here we imagine a response surface.
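A minimal sketch of a two-predictor fit via the centered normal equations; the data are synthetic, built from y = 2 + 3·x1 − x2 with no noise so the known coefficients should be recovered exactly (all names here are mine):

```python
# Two-predictor multiple regression y = b0 + b1*x1 + b2*x2 via the
# centered normal equations, solved with Cramer's rule.
x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [2.0, 1.0, 4.0, 3.0, 5.0]
y  = [2 + 3 * a - 1 * b for a, b in zip(x1, x2)]  # exact, noise-free responses

n = len(y)
m1, m2, my = sum(x1) / n, sum(x2) / n, sum(y) / n

# Centered sums of squares and cross-products.
s11 = sum((a - m1) ** 2 for a in x1)
s22 = sum((b - m2) ** 2 for b in x2)
s12 = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
s1y = sum((a - m1) * (c - my) for a, c in zip(x1, y))
s2y = sum((b - m2) * (c - my) for b, c in zip(x2, y))

# Solve the 2x2 system s11*b1 + s12*b2 = s1y, s12*b1 + s22*b2 = s2y.
det = s11 * s22 - s12 ** 2
b1 = (s1y * s22 - s2y * s12) / det
b2 = (s11 * s2y - s12 * s1y) / det
b0 = my - b1 * m1 - b2 * m2

print(b0, b1, b2)  # 2.0 3.0 -1.0
```

With more predictors the same normal equations are solved in matrix form, which is where the picture shifts from a fitted line to a fitted response surface.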
Regression Modeling