
Quantitative techniques for psychology



  2. Lesson 1: Introduction to Quantitative Methods & Measurement
• TYPES OF RESEARCH: Nomothetic, Quantitative, Qualitative, Idiographic
  3. Classification of Psychological Research
• Descriptive Research
• Correlational Research
• Experimental Research
• Naturalistic Observation
• Self-Report
  4. Quantitative Techniques
• Need
• Advantages
• Limitations
• Classification
• Tests
  5. MEASUREMENT
• Nominal Scale
• Ordinal Scale
• Interval Scale
• Ratio Scale
  6. Scales
• Nominal – mere numbers: 1, 2, 3, 4, 5, 6, 7, 8, 9
• Ordinal – 1st rank, 2nd rank, …
• Interval – equal intervals, e.g. temperature
• Ratio – true (absolute) zero, e.g. height, weight, reaction time
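The four scales differ in which summary statistics are meaningful. A minimal Python sketch (with made-up data, not from the lesson) to illustrate:

```python
# A sketch of which summary statistic suits each scale of measurement.
from statistics import mode, median, mean

jersey_numbers = [7, 23, 7, 10, 7]      # nominal: labels only -> mode
ranks = [1, 2, 3, 4, 5]                 # ordinal: order only -> median
temps_celsius = [20.0, 22.5, 25.0]      # interval: equal intervals -> mean
reaction_ms = [250, 300, 500]           # ratio: true zero -> ratios valid

print(mode(jersey_numbers))             # most frequent label: 7
print(median(ranks))                    # middle rank: 3
print(mean(temps_celsius))              # average temperature: 22.5
print(reaction_ms[2] / reaction_ms[0])  # "twice as slow" is meaningful: 2.0
```

Only ratio data supports statements like "twice as large", because only it has a true zero point.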
  7. ERRORS IN MEASUREMENT
• Respondent Factors
• Situation
• Measurer Effect
• Instrument/Tools
  8. LESSON 2
• CONCEPT OF OBJECTIVITY and Types of Tests
  9. LESSON 3: TEST CONSTRUCTION
• QUESTIONNAIRES
• A questionnaire, as per Goode (1981), refers to a device for securing answers to questions by using a form which the respondent fills in himself.
  10. Advantages & Disadvantages
• Advantages – Low cost; free from the bias of the interviewer; adequate time; large samples
• Disadvantages – Low rate of return; suits only educated respondents; no control once it is sent; slowest method; possibility of ambiguous, untrue, incomplete answers
  11. Types of Questionnaires
• Standardized/structured questionnaire
• Non-standardized/unstructured questionnaire
• Closed-ended/open-ended questionnaire
  12. Types of Responses
• True/False items
• Multiple-choice items
• Open-ended items
• Likert's Attitude Scale – a multiple agree-disagree 5-point scale
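Scoring a 5-point Likert scale can be sketched as follows; the item names, responses, and reverse-keyed item here are hypothetical, not from the lesson:

```python
# A sketch of Likert-scale scoring: map agree-disagree responses to 1..5,
# flip reverse-keyed items (6 - value), and sum across items.
AGREEMENT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
             "agree": 4, "strongly agree": 5}

def score_likert(responses, reverse_items=()):
    """Sum 5-point responses; reverse-keyed items become 6 - value."""
    total = 0
    for item, answer in responses.items():
        value = AGREEMENT[answer]
        if item in reverse_items:
            value = 6 - value
        total += value
    return total

answers = {"q1": "agree", "q2": "strongly disagree", "q3": "neutral"}
print(score_likert(answers, reverse_items={"q2"}))  # 4 + (6-1) + 3 = 12
```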
  13. ADMINISTRATION OF Questionnaires
• Face-to-face
• Written administration
• Computerized administration
• Telephone administration
  14. Psychological Test
• Tests in psychology are a formal technique of measuring feelings, abilities, traits, beliefs and behavior in a particular situation.
TYPES OF TESTS
• Performance/Verbal Tests
• Paper-Pencil Tests
• Personality Tests
• Situational Tests
• Projective Tests
  15. Types of Tests (contd.)
• Individual test/group test
• Speed test and power test
• Researcher-made/standardized test
• Oral/written/performance test
  16. Lesson 4: TEST STANDARDIZATION
• A good test should be a standardized test; its characteristics are:
• Reliability
• Validity
• Objectivity
• Norms
  17. Test Construction
• 1. Planning the test
• 2. Preparing the preliminary draft of the test
• 3. Trying out the preliminary draft of the test
• 4. Evaluating the test
• 5. Constructing the final draft of the test
  18. Item Writing
• Address a single issue per item
• Avoid bias
• Make alternatives clear
• Beware of social desirability
• Determine the format of the item
• Sequence the items
  19. Item Analysis
Item analysis is a process which examines student responses to individual test items (questions) in order to assess the quality of those items and of the test as a whole. Item analysis is especially valuable in improving items which will be used again in later tests, but it can also be used to eliminate ambiguous or misleading items in a single test administration. In addition, item analysis is valuable for increasing instructors' skill in test construction and for identifying specific areas of course content which need greater emphasis or clarity.
Two characteristics of item analysis: difficulty value and discriminative power.
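The two characteristics named above can be computed directly. A minimal sketch with a hypothetical response matrix (1 = correct), using the common upper-group/lower-group definition of discriminative power (the 27% cut is conventional; the example uses 50% for such a small group):

```python
# A sketch of the two item-analysis statistics: difficulty value and
# discriminative power (upper-group minus lower-group difficulty).
def difficulty(item_scores):
    """Proportion of examinees answering the item correctly (0..1)."""
    return sum(item_scores) / len(item_scores)

def discrimination(item_scores, total_scores, fraction=0.27):
    """Difficulty in the top-scoring group minus the bottom-scoring group."""
    order = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    k = max(1, int(len(order) * fraction))
    lower, upper = order[:k], order[-k:]
    return (difficulty([item_scores[i] for i in upper])
            - difficulty([item_scores[i] for i in lower]))

item = [1, 1, 0, 1, 0, 0]       # 1 = correct on this item, per examinee
totals = [9, 8, 7, 6, 3, 2]     # total test scores per examinee
print(difficulty(item))                           # 3/6 = 0.5
print(discrimination(item, totals, fraction=0.5)) # 2/3 - 1/3
```

A positive discrimination index means high scorers on the whole test also tend to get this item right, which is what a well-functioning item should show.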
  20. Reliability & Validity
• Reliability refers to the consistency of scores obtained by the same persons when they are re-examined with the same test on different occasions (consistency).
• Validity refers to what the test measures and how well it does so: the degree to which the test measures what it purports to measure.
  21. Types of Reliability
• i. Test-retest reliability
• ii. Alternate-form reliability
• iii. Split-half reliability
• iv. Inter-rater reliability (scorer reliability)
• The standard error of measurement
• Analysis of variance
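As one concrete case, split-half reliability and the standard error of measurement can be sketched in Python (the half-scores and SD below are hypothetical; the Spearman-Brown formula steps the half-test correlation up to full test length):

```python
# A sketch of split-half reliability: correlate odd-item and even-item
# half-scores, apply Spearman-Brown, then SEM = SD * sqrt(1 - r).
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman_brown(half_r):
    """Estimate full-length reliability from the half-test correlation."""
    return 2 * half_r / (1 + half_r)

odd_half = [10, 12, 9, 14, 11]    # each person's score on odd items
even_half = [11, 13, 8, 15, 10]   # each person's score on even items
r_full = spearman_brown(pearson_r(odd_half, even_half))
sd_total = 6.0                    # assumed SD of total test scores
sem = sd_total * math.sqrt(1 - r_full)
print(round(r_full, 2), round(sem, 2))
```

The higher the reliability, the smaller the standard error of measurement, i.e. the less an observed score is expected to wobble around the true score.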
  22. Types of Validity
• Content validity
• Face validity
• Criterion-related validity – predictive & concurrent validity
• Construct validity
• Factorial validity
• The error of estimate – correlation coefficient
  23. Norms
• Freeman defines a norm in measurement as the 'average or standard score on a particular test made by a specified population'.
• It is a device for transforming raw scores into standard scores in a group.
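Freeman's idea of transforming raw scores into standard scores can be sketched as follows (the norm-group mean and SD are assumed for illustration):

```python
# A sketch of norm-referenced scoring: convert a raw score to a z-score
# relative to the norm group, then to a derived standard score.
def z_score(raw, pop_mean, pop_sd):
    """How many SDs the raw score lies above the norm-group mean."""
    return (raw - pop_mean) / pop_sd

def standard_score(raw, pop_mean, pop_sd, new_mean=50, new_sd=10):
    """Rescale the z-score, e.g. to a T-score (mean 50, SD 10)."""
    return new_mean + new_sd * z_score(raw, pop_mean, pop_sd)

# Hypothetical standardization sample: mean 40, SD 8
print(z_score(52, 40, 8))         # (52 - 40) / 8 = 1.5
print(standard_score(52, 40, 8))  # 50 + 10 * 1.5 = 65.0
```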
  24. VARIABLES
• What is a variable?
• Types of variables
• IV & DV
• Confounding variable
• Quantitative & categorical variables
• Continuous & discrete variables
  25. Sampling Design
• What is a sample?
• Universe – Population – Samples
• Systematic sampling
• Simple random sampling
• Stratified random sampling
• Purposive sampling
• Cluster sampling
• Convenience sampling
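Three of the probability designs listed can be sketched with Python's random module (the population IDs and strata below are hypothetical):

```python
# A sketch of simple random, systematic, and stratified sampling.
import random

population = list(range(1, 101))  # IDs 1..100
random.seed(42)                   # fixed seed so the illustration repeats

# Simple random sampling: every unit has an equal chance of selection.
simple = random.sample(population, 10)

# Systematic sampling: every k-th unit after a random start.
k = len(population) // 10
start = random.randrange(k)
systematic = population[start::k]

# Stratified random sampling: sample within each stratum proportionally.
strata = {"male": population[:60], "female": population[60:]}
stratified = [unit for group in strata.values()
              for unit in random.sample(group, len(group) // 10)]

print(len(simple), len(systematic), len(stratified))  # 10 10 10
```

Stratified sampling guarantees each subgroup is represented in proportion to its size, which simple random sampling only achieves on average.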
  26. LESSON 5: RESEARCH DESIGNS
• RESEARCH – knowing the 'what' of a study
• Research problem – the variables to be studied
• Formulate the hypothesis to be tested
• To test the hypothesis, a research design is required.
  27. Research Design (RD)
• Green et al. (2008) define research design as 'the specification of methods and procedures for acquiring the information needed'.
• Purpose of RD:
• 1. Maximize the variance
• 2. Control extraneous variance
• 3. Minimize error variance
  28. Important Elements of RD
• Introduction – research plan/problem
• Statement of the problem
• Review of literature
• Scope of the study
• Objectives of the study
• Conceptual model
• Hypotheses
• Operational definitions of variables
• Significance of the study
• Sampling design/plan
  29. Important Elements of RD (contd.)
• Tools
• Analysis
• Time budget
• Financial budget
  30. Classification of RD
• Research Design → Exploratory RD & Conclusive RD
• Conclusive RD → Descriptive Research & Causal Research
• Descriptive Research → Cross-sectional Design & Longitudinal Design
• Cross-sectional Design → Single cross-sectional Design & Multiple cross-sectional Design
  31. Informal Research Design
• Exploratory research design
• Secondary resource analyses
• Comprehensive case method
• Expert opinion survey
• Focus group discussions