Concepts and Measurement
Transcript

  • 1. Concepts
    • creating explanations
    • building concepts
    • measurement
  • 2. creating explanations
    • Independent Variable
    • Explanatory Variable
    • Driving Force
    • Dependent Variable
    • Outcome
    X + Y → Z
  • 3. units
    • Unit of analysis: the level at which the dependent variable is measured; the case
      • Ex: Nation (proportion Christian)
    • Unit of observation: the unit on which data are being gathered
      • Ex: Individual within the nation (data on that individual’s belief)
  • 4. what is a concept?
    • Gerring: “an alignment among three intertwined components: the term (a linguistic label comprised of one or a few words), the phenomena to be defined (the referents, extension, or denotation of a concept), and the properties or attributes that define those phenomena (the definition, intension, or connotation of a concept)” (39)
  • 5. How do we build (background/systematized) concepts?
    • process of adjustment to maximize fulfillment of the criteria for a good concept (Gerring; see table)
    • “successful definition involves the identification of attributes that provide necessary and sufficient conditions for locating examples of the term”
    • avoid homonymy, synonymy, instability
    • strategies
      • choose a definition from a classic work
      • find a causal-explanatory understanding of a term (term defined by what explains it or what it explains)
      • intellectual history of the concept (genealogy is a version of this)
      • focus on specific definitional attributes, grouping attributes that are similar
  • 6. “ideology”
  • 7.  
  • 8. from concept to indicator
  • 9. Measurement: sensitive, valid, reliable
    • Sensitivity: extent to which cases are homogeneous within each value of your variable; precision of measurement
      • Keep the variable as sensitive as possible
      • But keep in mind the limits of the measurement method
    There is a sensitivity-reliability trade-off in self-reports with rating scales: people can generally distinguish up to 7 positions, but more than that leads to greater error.
  • 10. measurement levels
    • nominal: variation in kind or type; unordered categories
      • Example: married, living with partner, never married, separated, divorced, widowed
    • ordinal: variation in degree along a continuum; rank order only
      • Example: support for Obama (strongly favor, favor, neutral, oppose, strongly oppose)
    • interval: variation in degree along a continuum; equal distances between positions, but no meaningful zero
      • Example: year in which an event occurred
    • ratio: variation in degree along a continuum; numbers signify absolute position on the continuum and zero is meaningful
      • Example: number of battlefield deaths; age of a person
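A minimal sketch, not part of the original slides, of how the four measurement levels might be encoded in a dataset using pandas; the variable names and values are hypothetical, echoing the examples above.

```python
import pandas as pd

# Hypothetical respondents illustrating the four measurement levels.
df = pd.DataFrame({
    # Nominal: categories with no inherent order.
    "marital_status": pd.Categorical(
        ["married", "never married", "divorced"], ordered=False),
    # Ordinal: ordered categories; distances between levels are undefined.
    "obama_support": pd.Categorical(
        ["strongly favor", "neutral", "oppose"],
        categories=["strongly oppose", "oppose", "neutral",
                    "favor", "strongly favor"],
        ordered=True),
    # Interval: equal spacing, but zero is arbitrary (year 0 is not "no time").
    "event_year": [1989, 2001, 2011],
    # Ratio: equal spacing and a meaningful zero, so ratios are interpretable.
    "battle_deaths": [0, 1500, 45000],
})

# Order comparisons are meaningful for ordinal data...
print(df["obama_support"] > "neutral")
# ...while ratios are meaningful only for ratio data (45000 deaths is 30x 1500).
print(df["battle_deaths"].iloc[2] / df["battle_deaths"].iloc[1])
```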
  • 11. Measurement: sensitive, valid, reliable
    • Valid: extent to which what you measure is what you say you measure
      • Face Validity: plausible on its “face”
      • Content Validity: extent to which all components of a systematized concept are measured in the indicator; matching a list of attributes
      • Criterion-related Validity: extent to which an indicator matches an external criterion; predictive and concurrent variants
      • Construct Validity: extent to which what you measure behaves as it should within a system of related concepts; an attribute of a measure/indicator
      • Convergent/Divergent Tests
      • Internal Validity: extent to which a research design yields strong evidence of causality; an attribute of the research design
      • External Validity: extent to which a research design yields findings that generalize; an attribute of the research design
  • 12. Measurement: Construct Validity
    • Construct Validity: extent to which what you measure behaves as it should within a system of related concepts; an attribute of a measure/indicator
      • Nomological Net
        • does not have to be causal, could be correlational
        • should be completely non-controversial, established in literature or common sense
        • example: GRE scores and first-year grad school GPA
  • 13. Measurement: convergence and divergence
    • Convergent: alternative measures of a concept should be strongly correlated
    • Divergent: different concepts should not correlate highly with each other
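A rough sketch of how convergent and divergent checks are often run in practice, using simulated data: alternative indicators of the same concept should correlate highly, while indicators of different concepts should not. All variable names here are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulate a latent trait (e.g. "political liberalism") and an unrelated trait.
liberalism = rng.normal(size=n)
religiosity = rng.normal(size=n)

# Two alternative indicators of the same concept (noisy versions of it).
lib_scale_a = liberalism + rng.normal(scale=0.5, size=n)
lib_scale_b = liberalism + rng.normal(scale=0.5, size=n)
# An indicator of a different concept.
relig_scale = religiosity + rng.normal(scale=0.5, size=n)

# Convergent: the two liberalism measures should correlate highly (~0.8 here).
print(np.corrcoef(lib_scale_a, lib_scale_b)[0, 1])
# Divergent: a liberalism measure and the religiosity measure should not (~0).
print(np.corrcoef(lib_scale_a, relig_scale)[0, 1])
```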
  • 14. Measurement: sensitive, valid, reliable
    • Reliable: extent to which a measure is free from random error; measures are repeatable, consistent, dependable
    • Assessments:
      • test-retest
      • interobserver; intercoder
      • split-half
      • Cronbach’s Alpha
  • 15. True score theory & reliability
    • The error term has two components:
    • Random Error: the more reliable the measure, the less the random error
    • Systematic Error: the more valid the measure, the less the systematic error
    Observed score = True ability + Random error
    X = T + e
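A small simulation sketch of the true-score model above, under assumed normal scores and error: as the random-error variance grows, the test-retest correlation of the observed scores, a common reliability estimate, falls toward var(T) / var(X).

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
true_score = rng.normal(size=n)          # T: stable "true ability"

for error_sd in (0.2, 0.5, 1.0):
    # Two administrations of the same measure: X = T + e, fresh random error each time.
    x1 = true_score + rng.normal(scale=error_sd, size=n)
    x2 = true_score + rng.normal(scale=error_sd, size=n)
    # Test-retest correlation approximates reliability = var(T) / var(X).
    retest_r = np.corrcoef(x1, x2)[0, 1]
    theoretical = 1 / (1 + error_sd**2)
    print(f"error sd={error_sd}: retest r={retest_r:.2f}, theory={theoretical:.2f}")
```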
  • 16. Assessing reliability
    • test-retest: consistency of responses when the same measure is administered again; assumes the characteristic being measured is stable over time
    • interobserver; intercoder: two or more people observe or code; can be done as a test on a random subset of cases
    • alternative-form: measure the same attribute more than once using two different measures of the same concept (e.g. to measure liberalism, use two different sets of questions on the same respondents at two different times)
    • split-half: two measures of the same concept applied at the same time (e.g. a survey of political opinions with ten questions related to liberalism; take two different subsets of the questions as different measures of liberalism)
    • Cronbach’s Alpha: the average of all split-half coefficients across all possible combinations; a statistical technique
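A sketch of computing Cronbach's alpha from a respondents-by-items score matrix, using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale). The response data below are hypothetical (a small four-item battery rather than the ten-item liberalism example, just to keep the array readable).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = items (numeric scores)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 5 respondents answering a 4-item battery on a 1-5 scale.
responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [1, 2, 1, 2],
    [3, 3, 3, 4],
])
print(round(cronbach_alpha(responses), 2))   # high alpha: items agree strongly
```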
  • 17. Elkins example
    • graded v. dichotomous measures of democracy
  • 18. Concept & Measurement Exercise
    • describe each level for your concept
    • identify a scale (nominal, ordinal, etc.)
    • identify strategies for considering the reliability and validity of your measure
    • is your concept embedded in any particular ontological or epistemological understanding of the world?
