Presentation1
Presentation1

1. Scholar: Tahira Altaf
   Subject: Applied Research in Education
   Submitted to: Dr Muhammad Ramzan
   Department of Educational Training
   THE ISLAMIA UNIVERSITY OF BAHAWALPUR
2. - Measurement as a tool of research
   - Four scales of measurement
   - Validity and reliability in measurement
   - Statistics as a tool of research
   - Functions of statistics in educational research
   - The human mind as a tool of research
   - Inductive and deductive logic
   - Scientific method
   - Critical thinking
3. According to the Advanced Learner's Dictionary (p. 954),
   “The act or process of finding the size, quality or degree of something is called measurement.”
   According to Leedy, Paul,
   “Measurement is limiting the data of any phenomenon, substantial or insubstantial, so that those data may be interpreted and ultimately compared to an acceptable qualitative or quantitative standard.”
4. Scales of Measurement:
   According to Gay L. R., Mills Geoffrey E. and Airasian Peter, in the book Educational Research (p. 145),
   “A measurement scale is a system for organizing data so that it may be inspected, analyzed and interpreted. In other words, the scale is the instrument used to provide the range of values or scores for each variable.”
5. Nominal Scale
   The word “nominal” comes from the Latin root “nomen”, meaning name. A nominal scale measures nominal variables.
   According to Gay L. R. (p. 145),
   “A nominal variable is also called a categorical variable because the values include two or more named categories.”
   These variables include sex, employment status, etc.
   Ordinal Scale
   According to Gay L. R. (p. 421),
   “An ordinal scale not only classifies subjects but also ranks them in terms of the degree to which they possess a characteristic of interest.”
   Ordinal scales permit us to describe performance as higher or lower, better or worse, etc.
6. Interval Scales
   According to Gay L. R. (p. 421),
   “An interval scale has all the characteristics of nominal and ordinal scales, but in addition it is based upon predetermined equal intervals.”
   Achievement tests, aptitude tests and intelligence tests are examples of interval scales.
   Ratio Scales
   According to Gay L. R. (p. 422),
   “A ratio scale represents the highest, most precise level of measurement. A ratio scale has all the advantages of the other types of scales and, in addition, it has a meaningful true zero point.”
   Height, weight, time, distance and speed are examples of ratio scales.
7. Comparison of Measurement Scales:
   Scale    | Description                                        | Example
   Nominal  | Categorical                                        | Northern, Southern; Dictators, Democrats; eye color; male, female; public, private; gifted students, typical students
   Ordinal  | Rank order with unequal units                      | Scores of 5, 6 and 10 are not the same distance apart as scores of 1, 2 and 3.
   Interval | Rank order with equal units but no true zero point | The difference between scores of 10 and 30 is the same as the difference between scores of 60 and 80.
   Ratio    | All of the above plus a true zero point            | A woman is 5 feet tall and her friend is two-thirds as tall as she is.
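The comparison above can be illustrated in code. The sketch below shows which summary statistics each scale supports; the data values are invented for the example, and the mapping of permissible statistics to scales is the standard one, not taken from these slides.

```python
from statistics import mode, median, mean

# Invented example data for each of the four scales.
eye_color = ["brown", "blue", "brown", "green"]  # nominal: named categories only
class_rank = [1, 2, 3, 4, 5]                     # ordinal: rank order, unequal units
test_scores = [55, 60, 72, 90]                   # interval: equal units, no true zero
heights_cm = [150, 160, 165, 180]                # ratio: equal units plus a true zero

print(mode(eye_color))                # nominal data supports only the mode -> brown
print(median(class_rank))             # ordinal data adds the median -> 3
print(mean(test_scores))              # interval data adds the mean -> 69.25
print(heights_cm[3] / heights_cm[0])  # only ratio data makes ratios meaningful -> 1.2
```

Each scale inherits the statistics of the scales below it, which is why the mean is meaningful for interval and ratio data but not for eye color.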
8. “Validity refers to the degree to which a test measures what it is supposed to measure and consequently permits appropriate interpretation of scores.”
9. - Content validity
   - Criterion-related validity
   - Concurrent validity
   - Predictive validity
   - Construct validity
   - Consequential validity
10. Definition:
   “Content validity is the degree to which a test measures an intended content area.”
   To establish this type of validity, the researcher first selects the target content and then constructs the test to check its validity. Content validity has two further sub-types.
11. According to Airasian Peter,
   “Criterion validity is the degree to which scores on one test are related to scores on a similar, preexisting test administered in the same time frame, or to some other available valid measure at the same time.”
12. “Concurrent validity is the degree to which scores on one test are related to scores on a similar, preexisting test administered at the same time.”
13. Procedure for determining concurrent validity:
   - Administer the new test to a defined group of individuals.
   - Administer a previously established, valid criterion test to the same group at the same time or shortly thereafter.
   - Correlate the two sets of scores.
   - Evaluate the results.
14. According to Geoffrey E. Mills,
   “Predictive validity is the degree to which a test can predict how well an individual will do in a future situation.”
15. Gay L. R., Mills Geoffrey E. and Airasian Peter (p. 156) describe the following procedure for determining predictive validity:
   - Identify and carefully define the criterion.
   - Administer the predictor variable to a group.
   - Wait until the behavior to be predicted, the criterion variable, occurs.
   - Obtain measures of the criterion for the same group.
   - Correlate the two sets of scores.
   - Evaluate the results.
16. According to Gay L. R. (p. 140),
   “Construct validity is the degree to which a test measures an intended hypothetical construct. A construct is a non-observable trait, such as intelligence.”
17. Consequential validity is concerned with the consequences that follow from tests. Because tests reach more and more individuals, the consequences of testing become increasingly important.
18. Threats to validity:
   - Unclear test directions.
   - Confusing and ambiguous test items.
   - Vocabulary too difficult for test takers.
   - Overly difficult and complex sentence structure.
   - Inconsistent and subjective scoring methods.
   - Untaught items included on an achievement test.
   - Failure to follow standardized test administration procedures.
   - Cheating, either by participants or by someone teaching the correct answers to the specific test items.
19. Types of validity compared:
   Type | Method | Purpose
   Content validity | Compare the content of the test to the domain being measured. | To what extent does this test represent the general domain of interest?
   Criterion-related validity | Correlate scores from one instrument with scores on a criterion measure, either at the same time (concurrent) or at a different time (predictive). | To what extent does this test correlate highly with another test?
   Construct validity | Amass convergent, divergent, and content-related evidence to determine that the presumed construct is what is being measured. | To what extent does this test reflect the construct it is intended to measure?
   Consequential validity | Observe and determine whether the test has adverse consequences for test takers or users. | To what extent does the test create harmful consequences for test takers?
20. According to Gay L. R.,
   “Reliability is the degree to which a test consistently measures whatever it measures.”
21. - Test-retest reliability
   - Equivalent-forms reliability
   - Split-half reliability
   - Scorer/rater reliability
22. Test-retest is a form of reliability in which one test is administered at two different times to the same participants.
23. Procedure:
   - Administer the test to an appropriate group.
   - After some time has passed, administer the same test to the same group.
   - Correlate the two sets of scores.
   - Evaluate the results.
24. In equivalent-forms reliability, two tests are administered that do not contain the same items but are identical in every other way.
25. Procedure:
   - Administer one form of the test to an appropriate group.
   - In the same session, or shortly thereafter, administer the second form of the test to the same group.
   - Correlate the two sets of scores.
   - Evaluate the results.
26. Rational-equivalence reliability is not established by correlating two sets of scores; instead, it estimates internal consistency by determining how the items within the test relate to one another.
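A classic rational-equivalence estimate is the Kuder-Richardson formula 20 (KR-20) for tests scored right/wrong. The sketch below uses an invented five-examinee, four-item score matrix; KR-20 is the standard formula, brought in here for illustration rather than quoted from the slides.

```python
# KR-20: internal consistency for dichotomously scored items.
# Rows are examinees, columns are items (1 = correct, 0 = incorrect).
items = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]

k = len(items[0])                      # number of items
n = len(items)                         # number of examinees
totals = [sum(row) for row in items]   # each examinee's total score
mean_total = sum(totals) / n
var_total = sum((t - mean_total) ** 2 for t in totals) / n  # population variance

# Sum of p*q over items, where p is the proportion answering correctly.
sum_pq = 0.0
for j in range(k):
    p = sum(row[j] for row in items) / n
    sum_pq += p * (1 - p)

kr20 = (k / (k - 1)) * (1 - sum_pq / var_total)
print(round(kr20, 3))  # 0.8 for this matrix
```

Because the coefficient is computed from relationships among the items themselves, no second administration or parallel form is needed, which is the point of rational equivalence.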
27. Statistical analysis involves collecting and analysing data and then summarizing the data in numerical form.
28. Statistics answers questions such as:
   - Where does the center of the body of data lie?
   - How broadly is the data spread?
   - How strongly are two or more variables interrelated?
29. Descriptive statistics
   - How much variability exists among different pieces of data?
   - How are two or more characteristics interrelated?
   Inferential statistics
   - They help the researcher make decisions about data.
   - They answer questions of a quantitative nature.
30. - All other tools and logic become useless without the effective involvement of the human mind.
   - The researcher's mind is the key element in research work.
31. - Deductive reasoning
   - Inductive reasoning
   - Scientific method
   - Critical thinking
32. Deductive reasoning arrives at specific conclusions on the basis of general principles: separate, individual premises lead toward a single conclusion.
33. - Inductive reasoning does not begin with pre-established truths or assumptions.
   - It moves from specific examples to general principles.
34. The goal of scientific endeavor is to explain, predict and control phenomena. This goal rests on the assumptions that all behaviors and events are orderly and that they are effects which have discoverable causes.
35. Steps of the scientific method:
   - Recognition and definition of the problem (i. sensation, ii. conception, iii. perception, iv. observation)
   - Formulation of hypotheses
   - Collection of data (i. observation, ii. interview, iii. questionnaire)
   - Analysis of data
   - Conclusion
36. Critical thinking involves the following steps:
   - Verbal reasoning
   - Argument analysis
   - Decision making
   - Critical analysis of prior research.
