Scales of Measurement
• Nominal – mere numbers: 1, 2, 3, …
• Ordinal – 1st rank, 2nd rank, …
• Interval – equal intervals, e.g. temperature
• Ratio – absolute zero, e.g. height, weight
ERRORS IN MEASUREMENT
• Respondent factors
• Situational factors
• Measurer effect
• Instrument/tools
LESSON 2 – CONCEPT OF OBJECTIVITY AND TYPES OF TESTS
LESSON 3 – TEST CONSTRUCTION
• QUESTIONNAIRES
• A questionnaire, as per Goode (1981), refers to a device for securing answers to questions by using a form which the respondent fills in himself.
Advantages & Disadvantages
• Advantages – low cost; free from the bias of the interviewer; adequate time to respond; large samples possible
• Disadvantages – low rate of return; usable only with educated respondents; no control once it is sent; slowest method; possibility of ambiguous, untrue, or incomplete answers
Types of Questionnaires
• Standardized/structured questionnaire
• Non-standardized/unstructured questionnaire
• Close-ended/open-ended questionnaire
Types of Responses
• True/False items
• Multiple-choice items
• Open-ended items
Likert's Attitude Scale
• A multiple agree-disagree 5-point scale
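Scoring a Likert attitude scale can be sketched as follows; the items and responses here are hypothetical, and the reverse-scoring of negatively worded items (6 minus the response on a 5-point scale) is the usual convention, not something specified in these notes.

```python
# Sketch of scoring a 5-point Likert attitude scale (hypothetical data).
# Responses are coded 1 (strongly disagree) to 5 (strongly agree);
# negatively worded items are reverse-scored as 6 - response.

def score_likert(responses, reverse_items):
    """Return the total attitude score for one respondent."""
    total = 0
    for i, r in enumerate(responses):
        total += (6 - r) if i in reverse_items else r
    return total

# One respondent's answers to five items; item 2 is negatively worded.
answers = [4, 5, 1, 3, 4]
print(score_likert(answers, reverse_items={2}))  # 4+5+(6-1)+3+4 = 21
```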
ADMINISTRATION OF QUESTIONNAIRES
• Face-to-face
• Written administration
• Computerized administration
• Telephone administration
Psychological Test
• Tests in psychology are a formal technique of measuring feelings, abilities, traits, beliefs and behavior in a particular situation.
TYPES OF TESTS
• Performance/verbal tests
• Paper-pencil tests
• Personality tests
• Situational tests
• Projective tests
• Individual tests/group tests
• Speed tests and power tests
• Researcher-made/standardized tests
• Oral/written/performance tests
LESSON 4 – TEST STANDARDIZATION
• The characteristic of a good test is that it should be a standardized test, with:
• Reliability
• Validity
• Objectivity
• Norms
Test Construction
• 1. Planning the test
• 2. Preparing the preliminary draft of the test
• 3. Trying out the preliminary draft of the test
• 4. Evaluating the test
• 5. Constructing the final draft of the test
Item Writing
• Address a single issue per item
• Avoid bias
• Make alternatives clear
• Beware of social desirability
• Determine the format of the item
• Sequence the items
Item Analysis
Item analysis is a process which examines student responses to individual test items (questions) in order to assess the quality of those items and of the test as a whole. Item analysis is especially valuable in improving items which will be used again in later tests, but it can also be used to eliminate ambiguous or misleading items in a single test administration. In addition, item analysis is valuable for increasing instructors' skill in test construction and for identifying specific areas of course content which need greater emphasis or clarity.
The two characteristics of item analysis are difficulty value and discriminative power.
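The two indices can be computed directly from scored responses. This is a minimal sketch on hypothetical 0/1 item data: difficulty value is the proportion of examinees answering the item correctly, and discriminative power compares how the top and bottom scorers did on the item (simple halves are used here instead of the conventional upper and lower 27% groups).

```python
# Sketch of the two item-analysis indices on hypothetical 0/1 item
# scores (1 = correct answer on the item being analysed).

def difficulty(item_scores):
    """Difficulty value: proportion of examinees answering correctly."""
    return sum(item_scores) / len(item_scores)

def discrimination(item_scores, total_scores):
    """Discriminative power: proportion correct in the high-scoring
    group minus proportion correct in the low-scoring group."""
    order = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    half = len(order) // 2
    low, high = order[:half], order[-half:]
    p_high = sum(item_scores[i] for i in high) / half
    p_low = sum(item_scores[i] for i in low) / half
    return p_high - p_low

item = [1, 1, 0, 1, 0, 0, 1, 0]            # 8 examinees' answers to one item
totals = [38, 35, 20, 30, 15, 12, 33, 18]  # their total test scores
print(difficulty(item))                    # 0.5
print(discrimination(item, totals))        # 1.0 (perfectly discriminating)
```

A difficulty value near 0.5 and a large positive discrimination, as here, mark a good item; a negative discrimination would flag an item the weaker examinees answer correctly more often than the stronger ones.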
Reliability & Validity
• Reliability refers to the consistency of scores obtained by the same persons when they are re-examined with the same test on different occasions (consistency).
• Validity refers to what the test measures and how well it does so: the degree to which the test measures what it purports to measure.
Types of Reliability
• i. Test-retest reliability
• ii. Alternate-form reliability
• iii. Split-half reliability
• iv. Inter-rater reliability (scorer reliability)
The standard error of measurement
Analysis of variance
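Split-half reliability and the standard error of measurement listed above can be sketched numerically. The scores below are hypothetical; the half-test correlation is stepped up to full-test length with the Spearman-Brown formula, r_full = 2r / (1 + r), and SEM = SD × √(1 − r_full).

```python
# Sketch of split-half reliability and the standard error of
# measurement (SEM), using hypothetical test scores.

import math

def pearson_r(x, y):
    """Pearson correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def split_half_reliability(odd_scores, even_scores):
    """Correlate odd- and even-item half scores, then apply the
    Spearman-Brown correction for full test length."""
    r_half = pearson_r(odd_scores, even_scores)
    return 2 * r_half / (1 + r_half)

def sem(total_scores, reliability):
    """Standard error of measurement: SD * sqrt(1 - reliability)."""
    n = len(total_scores)
    m = sum(total_scores) / n
    sd = math.sqrt(sum((s - m) ** 2 for s in total_scores) / n)
    return sd * math.sqrt(1 - reliability)

odd = [10, 12, 9, 14, 11]    # scores on odd-numbered items
even = [11, 13, 8, 15, 10]   # scores on even-numbered items
r = split_half_reliability(odd, even)
totals = [o + e for o, e in zip(odd, even)]
print(round(r, 2), round(sem(totals, r), 2))  # 0.97 0.7
```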
Types of Validity
• Content validity
• Face validity
• Criterion-related validity – predictive & concurrent validity
• Construct validity
• Factorial validity
The error of estimate – correlation coefficient
Norms
• Freeman defines a norm in measurement as the 'average or standard score on a particular test made by a specified population'.
• It is a device for transforming raw scores into standard scores in a group.
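The transformation of raw scores into standard scores can be sketched as below. The norm group is hypothetical and stands in for a test's published norms; z-scores express a raw score in SD units from the norm mean, and T-scores (mean 50, SD 10) are one common standard-score convention.

```python
# Minimal sketch of transforming raw scores into standard (z and T)
# scores against a hypothetical norm group.

import math

def z_score(raw, mean, sd):
    """Raw score expressed in SD units from the norm-group mean."""
    return (raw - mean) / sd

def t_score(raw, mean, sd):
    """T-score: z rescaled to mean 50, SD 10."""
    return 50 + 10 * z_score(raw, mean, sd)

norm_group = [40, 45, 50, 55, 60]          # hypothetical norm sample
mean = sum(norm_group) / len(norm_group)   # 50.0
sd = math.sqrt(sum((x - mean) ** 2 for x in norm_group) / len(norm_group))
print(round(z_score(62, mean, sd), 2), round(t_score(62, mean, sd), 1))  # 1.7 67.0
```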
VARIABLES
• What is a variable?
• Types of variables
• Independent (IV) & dependent (DV) variables
• Confounding variables
• Quantitative & categorical variables
• Continuous & discrete variables
Sampling Design
• What is a sample?
• Universe – Population – Samples
• Systematic sampling
• Simple random sampling
• Stratified random sampling
• Purposive sampling
• Cluster sampling
• Convenience sampling
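One of the designs above, stratified random sampling, can be sketched as follows: the population is divided into strata and a proportional simple random sample is drawn from each. The population, strata, and sizes here are hypothetical illustrations, not from the notes.

```python
# Sketch of proportional stratified random sampling on a hypothetical
# population of students stratified by gender.

import random

def stratified_sample(population, strata_key, sample_size):
    """Draw a proportional simple random sample from each stratum."""
    strata = {}
    for unit in population:
        strata.setdefault(strata_key(unit), []).append(unit)
    sample = []
    for members in strata.values():
        # Each stratum contributes in proportion to its share of the population.
        k = round(sample_size * len(members) / len(population))
        sample.extend(random.sample(members, k))
    return sample

# Hypothetical population: 10 students, 6 female and 4 male.
population = [("s%d" % i, "F" if i < 6 else "M") for i in range(10)]
sample = stratified_sample(population, strata_key=lambda u: u[1], sample_size=5)
print(len(sample), sorted({g for _, g in sample}))  # 5 ['F', 'M']
```

With these numbers the sample always contains 3 females and 2 males, mirroring the 6:4 split in the population; a simple random sample of 5 would not guarantee that balance.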
LESSON 5 – RESEARCH DESIGNS
• RESEARCH – knowing the 'what' of a study
• Research problem – the variables to be studied
• Formulate the hypothesis to be tested
• To test the hypothesis, a research design is required.
RESEARCH DESIGN (RD)
• Green et al. (2008) define research design as 'the specification of methods and procedures for acquiring the information needed'.
Purpose of Research Design
• 1. Maximize the variance
• 2. Control extraneous variance
• 3. Minimize error variance
Important Elements of Research Design
• Introduction – research plan/problem
• Statement of the problem
• Review of literature
• Scope of the study
• Objectives of the study
• Conceptual model
• Hypotheses
• Operational definitions of variables
• Significance of the study
• Sampling design/plan
• Tools
• Analysis
• Time budget
• Financial budget