
Multiple Regression:

1. Check for correlation.


2. Then check whether the correlation is positive or negative.
3. A high R-square means a large share of the variance in the dependent variable is explained by the independent variables (see the sketch below).
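A minimal sketch of these steps in Python with statsmodels (the course uses SPSS/Minitab); the file name, DataFrame and column names are hypothetical, not from the notes:

import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("bookings.csv")           # hypothetical dataset
y = df["sales"]                            # hypothetical dependent variable
X = df[["price", "ads", "visits"]]         # hypothetical independent variables

# Steps 1-2: check the correlations and their signs (positive or negative).
print(df[["sales", "price", "ads", "visits"]].corr())

# Step 3: fit the regression and read R-square (variance explained).
model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.rsquared, model.rsquared_adj)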

Interpretation:
Some independent variables may merely duplicate other variables.
Multicollinearity: when the independent variables are correlated with each other, the estimates of their effects on the dependent variable suffer.
Adjusted R-square: penalizes the model when independent variables are not contributing.
There may be an interaction effect between independent variables.
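A small sketch of a multicollinearity check via variance inflation factors (VIF is not mentioned in the notes; it is one standard diagnostic), reusing the hypothetical X from the sketch above:

import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

Xc = sm.add_constant(X)   # include the intercept before computing VIFs
vif = pd.Series([variance_inflation_factor(Xc.values, i) for i in range(Xc.shape[1])],
                index=Xc.columns)
print(vif)                # large VIFs (commonly above 5-10) flag duplicated predictors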
Step-wise Regression and Backward Regression:
Minitab software.
The researcher has to decide which model to use.
Stepwise and backward are the two approaches.
The stepwise model is generally better than the backward model.
If you have domain experience, you may prefer the backward model.
Neither is a good approach unless applied with discipline (a rough sketch follows).
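A rough sketch of automated backward elimination by p-value, the kind of undisciplined automation the note warns about; the 0.05 threshold and the y/X from the earlier sketch are assumptions:

import statsmodels.api as sm

def backward_eliminate(y, X, alpha=0.05):
    cols = list(X.columns)
    while cols:
        model = sm.OLS(y, sm.add_constant(X[cols])).fit()
        pvals = model.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] <= alpha:          # everything left is significant
            return model, cols
        cols.remove(worst)                 # drop the least significant predictor
    return None, cols

model, kept = backward_eliminate(y, X)
print(kept)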
Factor Analysis:
Reducing the total set of variables to a few meaningful factors.
Online Booking is an independent variable.
If variables are highly correlated with one another, they can be grouped into factors.
Two different factors should not be highly correlated.
First build a conceptual model; the data comes from historical sources.
Exploratory factor analysis:
Multi-variate analysis: Diagnostic tools
If you want to run a regression on the factors later, use "Save as variables" to save the factor scores.
0.7 is the cut-off for communalities.
To retain a factor, the eigenvalue must be at least one.
Explaining 67% of the variance is not enough for decision making when you have to put money in.
The rotated component matrix contains the factor loadings.
If factors are highly correlated with each other, go for oblique rotation.
If the sample size is above 250-300, the cut-off for loadings in the rotated component matrix is 0.5; if it is above 500, the cut-off is 0.3.
Here we have taken 0.5 as the cut-off.
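A minimal exploratory factor analysis sketch with the Python factor_analyzer package (the class demo was in SPSS); the survey file, the choice of three factors and the item names are hypothetical:

import pandas as pd
from factor_analyzer import FactorAnalyzer

survey = pd.read_csv("airline_survey.csv")             # hypothetical item-level data

fa = FactorAnalyzer(n_factors=3, rotation="varimax")   # switch to "oblimin" for oblique rotation
fa.fit(survey)

print(fa.get_communalities())      # compare against the 0.7 cut-off
print(fa.get_eigenvalues())        # retain factors with eigenvalue >= 1
loadings = pd.DataFrame(fa.loadings_, index=survey.columns)
print(loadings.where(loadings.abs() >= 0.5))   # suppress loadings below the 0.5 cut-off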
Millward Brown.
Core Airline Factors: Component C1
Additional features: Component C2
Compensating Factors: Component C3
Factor analysis can be used to measure construct validity.
Construct validity: it is good when every variable has loaded onto some component.
If reliability is low but the factor still emerged, add additional questions to that factor in a pilot study.
Reliability is tested through the scale, e.g. Cronbach's alpha.
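A minimal sketch of Cronbach's alpha computed from its standard formula; the four item columns (assumed to load on one factor) are hypothetical:

def cronbach_alpha(items):
    items = items.dropna()
    k = items.shape[1]                          # number of items in the scale
    item_var = items.var(axis=0, ddof=1).sum()  # sum of the item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the total score
    return (k / (k - 1)) * (1 - item_var / total_var)

print(cronbach_alpha(survey[["q1", "q2", "q3", "q4"]]))   # hypothetical items of one factor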
Communality: the proportion of a variable's variance explained by the extracted factors.
Factor analysis is an interdependence technique.
Andy Field textbook.
The rejections happen
KMO above 0.5 means factor analysis is appropriate and you can go ahead.
KMO of 0.7 or 0.8 means it is robust.
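A small sketch of the KMO and Bartlett diagnostics with factor_analyzer, reusing the hypothetical survey data above:

from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

kmo_per_item, kmo_total = calculate_kmo(survey)
chi_sq, p_value = calculate_bartlett_sphericity(survey)
print(kmo_total)   # above 0.5 is acceptable, 0.7-0.8 is robust
print(p_value)     # should be significant before proceeding with factor analysis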
The higher the eigenvalue, the more variance the factor explains.
A large difference (between the observed and reproduced correlations) is called a residual.
Factors are rotated to force a cleaner fit.
Surrogate variable: also called a compensating factor.
The correlation matrix is essential for factor analysis; if the variables are not sufficiently correlated, factor analysis cannot be done.
Principal axis factoring: the 14 variables were reduced to 3 factors.
Malhotra's book on marketing research.
Discriminant and Logit Analysis:
In regression, the dependent variable was a continuous variable.
Discriminant analysis tests whether two groups are statistically different.
In the case of discriminant analysis, the dependent variable is a categorical variable.
Through discriminant analysis we try to get a decision score.
Logistic Regression:
The dependent variable is metric (continuous) in ANOVA and regression, but categorical in discriminant analysis and logit.
In discriminant analysis the groups should stay well apart from each other; the dependent variable is categorical.
Discriminant analysis is used for classification and prediction, by means of the discriminant function.
Wilks' lambda in discriminant analysis: the closer the value is to zero, the greater the difference between the groups.
The higher the eigenvalue, the more significantly the two groups differ.
Function:
If Wilks' lambda is 0.8, the three independent variables are not significantly differentiating the two groups.
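A minimal sketch of reading Wilks' lambda from a MANOVA in statsmodels (the lecture reads it from SPSS discriminant output); the DataFrame resp, the group column and the three predictors are hypothetical:

from statsmodels.multivariate.manova import MANOVA

mano = MANOVA.from_formula("income + travel + vacation ~ group", data=resp)
print(mano.mv_test())   # the Wilks' lambda row: values near 0 mean the groups differ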
Discriminant analysis is a multivariate analysis; you need at least 250 samples.
Canonical discriminant function: take the unstandardized coefficients.
Come out with a decision score; the group centroid is the cut-off.
The standardized coefficients tell the order of importance of the independent variables.
Multiple discriminant analysis:
D = -6.212 + 0.015*income - 0.003*travel + 0.175*vacation + 0.510*hsize + 0.003*age + 1.304*amount
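A worked use of the discriminant function above; the respondent's values are hypothetical, while the coefficients are the ones in the notes:

income, travel, vacation, hsize, age, amount = 60.0, 5.0, 3.0, 4.0, 35.0, 2.0
D = (-6.212 + 0.015 * income - 0.003 * travel + 0.175 * vacation
     + 0.510 * hsize + 0.003 * age + 1.304 * amount)
print(D)   # compare D with the group centroids to assign the group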
Divide the 1000 samples into two groups of 500 each, then find the discriminant function, beta coefficients, and Wilks' lambda, and check that the results are comparable across the two halves; otherwise divide the samples into new groups again (see the sketch below).
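A rough sketch of that split-sample check with sklearn's LinearDiscriminantAnalysis; the 1000-row DataFrame resp, the group column and the predictor names are hypothetical:

from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

cols = ["income", "travel", "vacation", "hsize", "age", "amount"]
shuffled = resp.sample(frac=1, random_state=0)            # random split into two halves
half1, half2 = shuffled.iloc[:500], shuffled.iloc[500:1000]

lda1 = LinearDiscriminantAnalysis().fit(half1[cols], half1["group"])
lda2 = LinearDiscriminantAnalysis().fit(half2[cols], half2["group"])
print(lda1.coef_, lda2.coef_)                    # coefficients should be comparable across halves
print(lda1.score(half2[cols], half2["group"]))   # classification accuracy on the holdout half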
Structure matrix: it is between groups.
Logistic Regression: Logit Model
The dependent variable is binary and the several independent variables are metric.
It is important in choice modelling, where the choice depends on different attributes.
For example, marketing mix variables such as price.
Use Binary Logistic in SPSS to run the logit model.
The independent variables may have different units; you may also have dummy variables.
Context is buyer behavior.
a_i are the parameters that you will be estimating (a sketch follows).
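A minimal logit sketch with statsmodels (the class runs Binary Logistic in SPSS); the buy, price and promo columns of the hypothetical resp data are assumptions:

import statsmodels.api as sm

y = resp["buy"]                                  # binary dependent variable (0/1)
X = sm.add_constant(resp[["price", "promo"]])    # metric predictor plus a dummy
logit = sm.Logit(y, X).fit()
print(logit.summary())                           # the a_i parameters being estimated
print(logit.predict(X)[:5])                      # predicted purchase probabilities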
