# Quantitative Data Analysis: Hypothesis Testing

Published in: Technology, Economy & Finance

### Slide transcript

1. Research Methodology, Chapter 12: Quantitative Data Analysis: Hypothesis Testing
2. Type I Errors, Type II Errors & Statistical Power. Type I error: the probability of rejecting the null hypothesis when it is actually true. Type II error: the probability of failing to reject the null hypothesis when the alternative hypothesis is actually true.
3. Statistical power (1 − β): the probability of correctly rejecting the null hypothesis when it is false. Power is determined by alpha, the sample size, and the effect size.
4. Testing Hypotheses on a Single Mean. One-sample t-test: a statistical technique used to test the hypothesis that the mean of the population from which a sample is drawn is equal to a comparison standard.
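The one-sample t statistic can be sketched in a few lines of Python; the scores below are made up for illustration and the helper name `one_sample_t` is not from the slides:

```python
import math

def one_sample_t(sample, mu0):
    """t = (mean − mu0) / (s / √n), with s the sample standard deviation."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)  # unbiased variance
    return (mean - mu0) / math.sqrt(var / n)

scores = [52, 55, 48, 60, 57, 50]   # hypothetical sample
t = one_sample_t(scores, mu0=50)    # test against a comparison standard of 50
```

The resulting t (about 2.0 here) would then be compared with a critical value on n − 1 degrees of freedom.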
5. Testing hypotheses about two related means. Paired samples t-test: used to examine differences in the same group before and after a treatment. The Wilcoxon signed-rank test: a nonparametric test for examining significant differences between two related samples or repeated measurements on a single sample; used as an alternative to the paired samples t-test when the population cannot be assumed to be normally distributed.
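The paired samples t-test is simply a one-sample t-test on the pairwise differences; the before/after values below are invented for illustration:

```python
import math

def paired_t(before, after):
    """Paired-samples t: a one-sample t on the pairwise differences."""
    d = [a - b for a, b in zip(after, before)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)
    return mean / math.sqrt(var / n)

before = [60, 65, 58, 70, 62]   # hypothetical pre-treatment scores
after  = [66, 70, 61, 74, 65]   # hypothetical post-treatment scores
t = paired_t(before, after)
```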
6. Research Methodology scores of ten students in the first week and last week of the semester
7. Testing hypotheses about two related means. McNemar's test: a non-parametric method used on nominal data. It assesses the significance of the difference between two dependent samples when the variable of interest is dichotomous, and is used primarily in before-after studies to test for an experimental effect.
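McNemar's χ² uses only the two discordant cells of the 2×2 before/after table; the counts below (one student got worse, five improved) are hypothetical:

```python
def mcnemar_chi2(b, c):
    """McNemar's chi-square from the discordant cells:
    b = pass-then-fail count, c = fail-then-pass count."""
    return (b - c) ** 2 / (b + c)

chi2 = mcnemar_chi2(b=1, c=5)   # compared against chi-square with 1 df
```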
8. Performance of students before and after an extra class
9. Testing hypotheses about two unrelated means. Independent samples t-test: used to test whether there are significant differences in the means of two groups on the variable of interest.
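A minimal sketch of the independent samples t statistic with a pooled variance (which assumes equal variances in the two groups); the data are made up:

```python
import math

def independent_t(x, y):
    """Independent-samples t with a pooled variance estimate."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    ssx = sum((v - mx) ** 2 for v in x)
    ssy = sum((v - my) ** 2 for v in y)
    sp2 = (ssx + ssy) / (nx + ny - 2)              # pooled variance
    return (mx - my) / math.sqrt(sp2 * (1 / nx + 1 / ny))

group_a = [10, 12, 11, 13]   # hypothetical group means to compare
group_b = [8, 9, 7, 10]
t = independent_t(group_a, group_b)
```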
10. Testing hypotheses about several means. Analysis of Variance (ANOVA) examines significant mean differences among more than two groups on an interval- or ratio-scaled dependent variable.
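The ANOVA F ratio compares between-group to within-group variability; a bare-bones sketch with invented groups:

```python
def anova_f(groups):
    """One-way ANOVA: F = (between-group mean square) / (within-group mean square)."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    df_b = len(groups) - 1
    df_w = len(all_vals) - len(groups)
    return (ss_between / df_b) / (ss_within / df_w)

f = anova_f([[5, 6, 7], [8, 9, 10], [11, 12, 13]])   # three hypothetical groups
```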
11. Regression Analysis. Simple regression analysis is used when one metric independent variable is hypothesized to affect one metric dependent variable.
12. Scatter plot of LKLHD_DATE (vertical axis, 20–100) against PHYS_ATTR (horizontal axis, 30–90)
13. Simple Linear Regression. The population model is Y_i = β₀ + β₁X_i + ε_i; the estimated regression line is Ŷ = β̂₀ + β̂₁X.
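The least-squares estimates have closed forms, b₁ = Sxy/Sxx and b₀ = ȳ − b₁x̄; the x/y pairs below are made up (loosely echoing the PHYS_ATTR / LKLHD_DATE scatter plot):

```python
def ols(x, y):
    """Least-squares estimates: b1 = Sxy / Sxx, b0 = ybar − b1 * xbar."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b1 = sxy / sxx
    b0 = my - b1 * mx
    return b0, b1

x = [30, 45, 60, 75, 90]   # hypothetical PHYS_ATTR values
y = [25, 40, 55, 65, 85]   # hypothetical LKLHD_DATE values
b0, b1 = ols(x, y)
```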
14. Standardized regression coefficients. Standardized regression coefficients (beta coefficients) are the estimates resulting from a multiple regression analysis performed on variables that have been standardized. This is usually done so that the researcher can compare the relative effects of the independent variables on the dependent variable when the independent variables are measured in different units of measurement.
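Standardizing just means z-scoring each variable before fitting; in simple regression the resulting slope equals the Pearson correlation. A small sketch on made-up data:

```python
import math

def slope(x, y):
    """Least-squares slope Sxy / Sxx."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def zscores(v):
    """Transform values to mean 0, sample standard deviation 1."""
    n = len(v)
    m = sum(v) / n
    s = math.sqrt(sum((a - m) ** 2 for a in v) / (n - 1))
    return [(a - m) / s for a in v]

x = [1, 2, 3, 4]   # hypothetical predictor, arbitrary units
y = [2, 3, 5, 6]   # hypothetical outcome, different units
beta = slope(zscores(x), zscores(y))   # standardized (beta) coefficient
```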
15. Regression with dummy variables. A dummy variable (also known as an indicator, design, categorical, binary, or qualitative variable) allows us to use a nominal or ordinal variable as an independent variable to explain, understand, or predict the dependent variable.
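Dummy coding turns a nominal variable with k levels into k − 1 binary columns, with one level held out as the reference group; the category names below are illustrative, not from the slides:

```python
# Hypothetical nominal variable: employment category of five respondents.
categories = ["clerical", "manual", "clerical", "professional", "manual"]

levels = sorted(set(categories))   # ['clerical', 'manual', 'professional']
baseline = levels[0]               # reference group: gets no dummy column

# One 0/1 column per non-reference level.
dummies = {
    lvl: [1 if c == lvl else 0 for c in categories]
    for lvl in levels if lvl != baseline
}
```

Each dummy's regression coefficient is then interpreted as the shift in the dependent variable relative to the reference group.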
16. MULTICOLLINEARITY. A statistical phenomenon in which two or more independent variables in a multiple regression model are highly correlated. It makes the estimates of the regression coefficients unstable and sometimes unreliable (and, when the correlation is perfect, estimation becomes impossible). To detect multicollinearity, first check the correlation matrix for the independent variables: high correlations are the first sign of sizeable multicollinearity. Two measures, the tolerance value and the variance inflation factor (VIF), indicate the degree to which one independent variable is explained by the other independent variables.
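For the two-predictor case the diagnostics reduce to the squared correlation between the predictors: tolerance = 1 − r² and VIF = 1 / tolerance. A sketch on invented data:

```python
def vif_two_predictors(x1, x2):
    """With exactly two independent variables, VIF = 1 / (1 − r²),
    where r is the correlation between them."""
    n = len(x1)
    m1, m2 = sum(x1) / n, sum(x2) / n
    s12 = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
    s11 = sum((a - m1) ** 2 for a in x1)
    s22 = sum((b - m2) ** 2 for b in x2)
    r2 = s12 ** 2 / (s11 * s22)
    tolerance = 1 - r2
    return 1 / tolerance, tolerance

x1 = [1, 2, 3, 4, 5]
x2 = [2, 4, 5, 8, 11]   # strongly but not perfectly correlated with x1
vif, tol = vif_two_predictors(x1, x2)
```

A VIF above roughly 10 (tolerance below 0.1) is a common rule of thumb for problematic multicollinearity.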
17. A display of the FEV data in SPSS
18. To fit a multiple linear regression model in SPSS using the FEV data: choose Analyze > Regression > Linear, move forced expiratory volume into the Dependent box and Smoke and Age into the Independent(s) box, then click OK. This produces the model summary table, the ANOVA table, and the regression coefficients table in the output window.
19. A demonstration of how to start fitting the multiple regression model in SPSS
20. A demonstration of how to select the dependent and independent variable(s) for fitting a multiple regression in SPSS
21. A demonstration of how to select diagnostic statistics for checking outliers and multicollinearity issues in SPSS
22. When the objective is simply to predict the dependent variable, multicollinearity is not a serious problem, even though the estimates of the regression coefficients may be unstable. But when the objective of the study is to reliably estimate the individual regression coefficients, multicollinearity is a problem. Methods to reduce it: reduce the set of independent variables to a set that is not collinear; use more sophisticated ways to analyze the data, such as ridge regression; or create a new variable that is a composite of the highly correlated variables.
23. Testing moderation using regression analysis: interaction effects. An interaction effect exists when the effect of one variable (X1) on Y depends on the value of another variable (X2). A moderating variable modifies the original relationship between an independent variable and the dependent variable. Example: H1: The students' judgement of the university's library is affected by the students' judgement of the computers in it. H2: The relationship between the judgement of the computers in the library and the judgement of the library is moderated by computer ownership.
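Moderation is tested by adding the product term X1·X2 to the regression Y = b0 + b1·X1 + b2·X2 + b3·(X1·X2), where a significant b3 indicates an interaction. The variable names and values below are invented to mirror the library example:

```python
# Hypothetical moderator setup for the library example.
judgement_computers = [3, 4, 2, 5, 4]   # X1: judgement of the library's computers
owns_computer       = [1, 0, 1, 1, 0]   # X2: computer ownership dummy (moderator)

# The interaction term is just the element-wise product of X1 and X2;
# it enters the regression alongside X1 and X2.
interaction = [a * b for a, b in zip(judgement_computers, owns_computer)]
```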
24. Other multivariate tests and analyses: discriminant analysis, logistic regression, conjoint analysis, two-way ANOVA, MANOVA, and canonical correlation.
25. Other multivariate tests and analyses. Discriminant analysis helps to identify the independent variables that discriminate a nominally scaled dependent variable of interest.
26. Other multivariate tests and analyses. Logistic regression is used when the dependent variable is nonmetric; it is typically used when the DV has only two groups, and it allows the researcher to predict a discrete outcome.
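Logistic regression models the probability of a dichotomous outcome through the logistic function applied to a linear predictor; the coefficients below are invented for illustration:

```python
import math

def predict_prob(x, b0=-4.0, b1=1.0):
    """Logistic model: P(Y = 1) = 1 / (1 + e^-(b0 + b1*x)).
    The coefficients b0 and b1 here are made-up, not fitted."""
    return 1 / (1 + math.exp(-(b0 + b1 * x)))

p = predict_prob(4.0)   # logit = 0, so the predicted probability is 0.5
```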
27. Other multivariate tests and analyses. Conjoint analysis is a statistical technique used in many fields to understand how consumers develop preferences for products or services. It is built on the idea that consumers evaluate the value of a product or service by combining the value provided by each attribute.
28. Other multivariate tests and analyses. Two-way ANOVA is used to examine the effect of two nonmetric independent variables on a single metric dependent variable; it enables us to examine main effects and also interaction effects between the independent variables.
29. Other multivariate tests and analyses. Two-way ANOVA example: DV: satisfaction with a toy; IVs: (i) toy colour (pink vs. blue), (ii) gender (male vs. female). Main effect of toy colour: pink toys give significantly more satisfaction than blue toys. Main effect of gender: females are more satisfied with the toy than males.
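Main effects in a two-way design are read off the marginal means, averaging over the other factor; the satisfaction scores below are invented to match the toy-colour × gender example:

```python
# Hypothetical 2x2 design: (toy colour, gender) -> satisfaction scores.
scores = {
    ("pink", "female"): [8, 9], ("pink", "male"): [7, 8],
    ("blue", "female"): [6, 7], ("blue", "male"): [4, 5],
}

def marginal_mean(factor_index, level):
    """Mean over all cells where the given factor takes the given level."""
    vals = [v for key, vs in scores.items() if key[factor_index] == level
            for v in vs]
    return sum(vals) / len(vals)

pink = marginal_mean(0, "pink")   # main effect of colour: pink marginal mean
blue = marginal_mean(0, "blue")
```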
30. Other multivariate tests and analyses. Multivariate Analysis of Variance (MANOVA) is a multivariate extension of analysis of variance; the IVs are measured on a nominal scale and the DVs on an interval or ratio scale. (i) The null hypothesis: H₀: µ1 = µ2 = µ3 = … = µn. (ii) The alternative hypothesis: HA: not all the µi are equal.
31. Other multivariate tests and analyses. Canonical correlation examines the relationship between two or more dependent variables and several independent variables.
32. Data warehousing. Most companies are now aware of the benefits of creating a data warehouse that serves as the central repository of all data collected from disparate sources, including those pertaining to the company's finance, manufacturing, sales, and the like.
33. Data Mining. Complementary to the functions of data warehousing, many companies resort to data mining as a strategic tool for reaching new levels of business intelligence. Using algorithms to analyze data in a meaningful way, data mining leverages the data warehouse more effectively by identifying hidden relations and patterns in the data stored in it.
34. Operations Research. Operations research (OR), or management science (MS), is another sophisticated tool used to simplify and thus clarify certain types of complex problems that lend themselves to quantification.