This document provides an overview of quantitative techniques for psychology. It discusses types of psychological research, including nomothetic, idiographic, quantitative, and qualitative approaches, and describes research designs such as descriptive research, correlational research, experimental research, and naturalistic observation. It also covers measurement, scales, errors in measurement, test construction, questionnaire design, reliability, validity, norms, variables, and sampling. The goal is to introduce students to key concepts and methods in quantitative psychological research.
In psychology we use various types of tests to assess different human attributes; researchers, clinicians, and practising psychologists all draw on several forms of tests to study something as complex as a human being.
1. Types of psychological tests, by S. Lakshmanan, Psychologist
My sincere thanks to Professor Dr. V. Suresh,
Annamalai University
Dear viewers, please see the updated version of Types of Psychological Tests on www.slideshare.net.
A short note on the concept of the psychological test: introduction, definition, characteristics, needs, classification, types, and some selected psychological tests.
2. Lesson 1
Introduction to Quantitative Methods & Measurement
• TYPES OF RESEARCH
• Nomothetic
• Idiographic
• Quantitative
• Qualitative
3. Classification of Psychological Research
• Descriptive Research
• Correlational Research
• Experimental Research
• Naturalistic Observation
• Self-Report
6. Scales of Measurement
• Nominal – mere numbers or labels (1, 2, 3 …)
• Ordinal – ranks (1st, 2nd, 3rd …)
• Interval – equal intervals, no true zero (e.g., temperature)
• Ratio – equal intervals with an absolute zero (e.g., height, weight, reaction time)
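The scale of measurement determines which descriptive statistics are meaningful. A minimal sketch of that mapping (the dictionary and function names are illustrative, not from the slides):

```python
# Illustrative mapping from scale of measurement to permissible statistics.
PERMISSIBLE_STATS = {
    "nominal":  ["mode", "frequency counts"],
    "ordinal":  ["mode", "frequency counts", "median", "percentiles"],
    "interval": ["mode", "frequency counts", "median", "percentiles",
                 "mean", "standard deviation"],
    "ratio":    ["mode", "frequency counts", "median", "percentiles",
                 "mean", "standard deviation", "coefficient of variation"],
}

def allowed(scale: str) -> list:
    """Return the statistics that are meaningful for a given scale."""
    return PERMISSIBLE_STATS[scale.lower()]

print(allowed("ordinal"))
```

Note that each scale permits everything the weaker scales permit, plus more: ranks allow a median but not a mean, while a true zero additionally makes ratio statements ("twice as heavy") meaningful.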
8. LESSON – 2
• CONCEPT OF OBJECTIVITY and Types of Tests
9. LESSON 3 – TEST CONSTRUCTION
• QUESTIONNAIRES
• A questionnaire, as per Goode (1981), refers to a device for securing answers to questions by using a form which the respondent fills in himself.
10. Advantages & Disadvantages
• Advantages – low cost; free from interviewer bias; respondents get adequate time; large samples are possible
• Disadvantages – low rate of return; usable only with educated respondents; no control once it is sent; slowest method; possibility of ambiguous, untrue, or incomplete answers
12. Types of Responses
• True/False items
• Multiple-choice items
• Open-ended items
Likert's Attitude Scale
• A multiple-item agree-disagree scale with 5 response points
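Scoring a Likert scale can be sketched in a few lines. This is a minimal, hypothetical example (item names and the reverse-scoring convention are assumptions, not from the slides); negatively worded items are flipped so that a higher total always means stronger agreement:

```python
# Hypothetical 5-point Likert scoring sketch:
# 1 = strongly disagree ... 5 = strongly agree.
def score_likert(responses, reverse_items=()):
    """responses: dict item_id -> rating (1-5); reverse_items: ids to flip."""
    total = 0
    for item, rating in responses.items():
        if not 1 <= rating <= 5:
            raise ValueError(f"rating out of range for item {item}")
        # A reverse-keyed rating r becomes 6 - r on a 5-point scale.
        total += (6 - rating) if item in reverse_items else rating
    return total

# Example: item 'q3' is negatively worded, so its 5 counts as 1.
print(score_likert({"q1": 4, "q2": 5, "q3": 5}, reverse_items={"q3"}))  # 4 + 5 + 1 = 10
```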
13. ADMINISTRATION of Questionnaires
• Face to Face
• Written administration
• Computerized administration
• Telephone administration
14. Psychological Test
• Tests in psychology are formal techniques for measuring feelings, abilities, traits, beliefs, and behavior in a particular situation.
TYPES OF TESTS
• Performance / Verbal Tests
• Paper Pencil Test
• Personality Tests
• Situational Tests
• Projective Tests
15. • Individual tests / group tests
• Speed tests and power tests
• Researcher-made / standardized tests
• Oral / written / performance tests
16. Lesson 4 – TEST STANDARDIZATION
• A good test should be standardized; its characteristics are:
• Reliability
• Validity
• Objectivity
• Norms
17. Test Construction
• 1. Planning the test
• 2. Preparing the preliminary draft of the test
• 3. Trying out the preliminary draft of the test
• 4. Evaluating the test
• 5. Constructing the final draft of the test
18. Item Writing
• Address a single issue per item
• Avoid bias
• Make the alternatives clear
• Beware of social desirability
• Determine the format of the item
• Sequence the items
19. Item analysis
Item analysis is a process which examines student
responses to individual test items (questions) in order
to assess the quality of those items and of the test as a
whole. Item analysis is especially valuable in improving
items which will be used again in later tests, but it can
also be used to eliminate ambiguous or misleading
items in a single test administration. In addition, item
analysis is valuable for increasing instructors' skills in
test construction, and identifying specific areas of
course content which need greater emphasis or clarity.
Two characteristics are examined in item analysis: the
difficulty value and the discriminative power.
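The two indices above can be sketched directly. This is a minimal illustration with made-up data: the difficulty value is the proportion of examinees answering the item correctly, and the discrimination index here uses the common upper-lower groups method (top and bottom 27% by total score), one of several accepted ways to compute it:

```python
# Item-analysis sketch: difficulty value and discrimination index.
def difficulty(item_scores):
    """Difficulty value p: proportion answering the item correctly (0-1)."""
    return sum(item_scores) / len(item_scores)

def discrimination(item_scores, total_scores, fraction=0.27):
    """Discrimination index D = p(upper group) - p(lower group),
    with groups taken from the top and bottom `fraction` of total scores."""
    n = max(1, round(len(total_scores) * fraction))
    order = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    lower, upper = order[:n], order[-n:]
    p_upper = sum(item_scores[i] for i in upper) / n
    p_lower = sum(item_scores[i] for i in lower) / n
    return p_upper - p_lower

item = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]      # 1 = correct on this item
totals = [9, 8, 3, 7, 2, 4, 8, 1, 6, 9]    # total test scores
print(difficulty(item), discrimination(item, totals))
```

In this toy data the item is of moderate difficulty (p = 0.6) and discriminates well: every high scorer got it right and every low scorer got it wrong.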
20. Reliability & Validity
• Reliability refers to the consistency of scores
obtained by the same persons when they are
re-examined with the same test on different
occasions (consistency).
• Validity refers to what the test measures and
how well it does so: the degree to which the
test measures what it purports to measure.
21. Types of Reliability
• i. Test-retest reliability
• ii. Alternate-form reliability
• iii. Split-half reliability
• iv. Inter-rater reliability (scorer reliability)
The standard error of measurement
Analysis of variance
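Split-half reliability and the standard error of measurement can be sketched together. This is an illustrative example with hypothetical scores (the data and helper names are assumptions): the half-test correlation is stepped up with the Spearman-Brown correction, and SEM = SD * sqrt(1 - r):

```python
import statistics

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def split_half_reliability(odd_half, even_half):
    """Correlate the two halves, then apply the Spearman-Brown correction."""
    r_half = pearson_r(odd_half, even_half)
    return 2 * r_half / (1 + r_half)

def sem(total_scores, reliability):
    """Standard error of measurement: SD * sqrt(1 - reliability)."""
    return statistics.stdev(total_scores) * (1 - reliability) ** 0.5

odd  = [10, 12, 9, 14, 11, 8]    # scores on odd-numbered items
even = [11, 13, 8, 15, 10, 9]    # scores on even-numbered items
r = split_half_reliability(odd, even)
totals = [o + e for o, e in zip(odd, even)]
print(round(r, 3), round(sem(totals, r), 3))
```

The Spearman-Brown step matters because correlating two half-tests understates the reliability of the full-length test.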
22. Types of Validity
• Content validity
• Face validity
• Criterion-related validity – predictive &
concurrent validity
• Construct validity
• Factorial validity
The standard error of estimate – correlation coefficient
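Criterion-related validity is usually reported as the correlation between test scores and a criterion, and the standard error of estimate follows from it as SE_est = SD_criterion * sqrt(1 - r²). A hypothetical sketch (the scores below are invented for illustration):

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation as the validity coefficient."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

test_scores = [55, 60, 48, 72, 65, 58]
criterion   = [50, 62, 45, 75, 60, 55]   # e.g., later job-performance ratings

r = pearson_r(test_scores, criterion)
# Standard error of estimate: how far predicted criterion values miss, on average.
se_est = statistics.stdev(criterion) * (1 - r ** 2) ** 0.5
print(round(r, 3), round(se_est, 3))
```

A higher validity coefficient shrinks the error of estimate: at r = 1 prediction is exact, at r = 0 the error equals the criterion's own standard deviation.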
23. Norms
• Freeman defines a norm in measurement as
the 'average or standard score on a particular
test made by a specified population'.
• A norm is a device for transforming raw scores
into standard scores within a group.
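The raw-to-standard-score transformation can be sketched with z-scores and the common T-score convention (T = 50 + 10z). A minimal illustration, assuming the sample itself stands in for the specified population:

```python
import statistics

def to_standard_scores(raw):
    """Transform raw scores to z-scores and T-scores (T = 50 + 10z)."""
    mean, sd = statistics.mean(raw), statistics.pstdev(raw)
    z = [(x - mean) / sd for x in raw]
    t = [50 + 10 * zi for zi in z]
    return z, t

raw = [40, 45, 50, 55, 60]
z, t = to_standard_scores(raw)
print([round(v, 2) for v in t])
```

After the transformation the group mean sits at T = 50, so any individual's standing relative to the norm group can be read off directly.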
24. VARIABLES
• What is a Variable?
• Types of variables
• IV & DV
• Confounding variable
• Quantitative & Categorical variable
• Continuous & Discrete Variable
25. Sampling Design
• What is a sample?
• Universe – Population – Samples
• Systematic sampling
• Simple random sampling
• Stratified random sampling
• Purposive sampling
• Cluster sampling
• Convenience sampling
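Two of the probability designs above can be sketched in a few lines. This is an illustrative example (the strata names and sizes are invented); the stratified version uses proportionate allocation, drawing the same fraction from every stratum:

```python
import random

def simple_random_sample(population, n, seed=0):
    """Every member has an equal chance of selection."""
    rng = random.Random(seed)
    return rng.sample(population, n)

def stratified_random_sample(strata, fraction, seed=0):
    """strata: dict stratum_name -> list of members; draw the same
    fraction from every stratum (proportionate allocation)."""
    rng = random.Random(seed)
    sample = {}
    for name, members in strata.items():
        n = max(1, round(len(members) * fraction))
        sample[name] = rng.sample(members, n)
    return sample

population = list(range(100))
print(simple_random_sample(population, 5))

strata = {"urban": list(range(60)), "rural": list(range(60, 100))}
print({k: len(v) for k, v in stratified_random_sample(strata, 0.1).items()})
```

Stratification guarantees that each subgroup appears in the sample in proportion to its size, which a simple random sample only achieves on average.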
26. LESSON 5 – RESEARCH DESIGNS
• RESEARCH – knowing the 'what' of a study
• Research problem – the variables to be studied
• Formulate the hypothesis to be tested
• A research design is required to test the hypothesis
27. Research Design (RD)
• Green et al. (2008) define research design as
'the specification of methods and procedures
for acquiring the information needed'.
• Purpose of RD:
• 1. Maximise the variance
• 2. Control extraneous variance
• 3. Minimise error variance
28. Important Elements of RD
• Introduction – research plan/problem
• Statement of the problem
• Review of literature
• Scope of the study
• Objectives of the study
• Conceptual model
• Hypotheses
• Operational definitions of variables
• Significance of the study
• Sampling design/plan