Measuring variables 3
  • "The lower the precision, the more subjects you'll need in your study to make up for the 'noise' in your measurements. Even with a larger sample, noisy data can be hard to interpret. And if you are an applied scientist in the business of testing and assessing clients, you need special care when interpreting results of noisy tests."
  • From Bryman & Cramer (1997) “Concepts & their Measurement”
  • This researcher was interested in measuring satisfaction with game performance as perceived by the player. She brainstormed the behaviors one would expect to find if someone was pleased with his or her performance in a game, and from those behaviors she developed a self-report scale.
  • Another researcher wanted to look at the concepts associated with feeling empowered.
  • From Figure 4.2 in Bryman & Cramer (1997): this researcher was looking at professionalism and came up with five dimensions. He then provided behaviours indicative of each dimension.
  • Likert scales basically ask: How do you feel on a scale of 1-5 (or 1-4, or 1-10), with the range chosen to suit the purpose of the scale and the age of the person completing it.
  • This is an example of a tally sheet, used when the researcher is interested in decreasing negative or increasing positive behaviors. This scale is found in the resources for this lecture.
  • Reliability is about the consistency of a measure. Validity is about whether a test actually measures what it's meant to measure. Validity requires reliability, but reliability alone is not sufficient for validity.
  • Reliable; not valid Reliable; valid

Presentation Transcript

  • Kinds of Data
    • Demographic
    • Interview
    • Survey
    • Questionnaire
    • Observations
    • Rating Scales
    • Grades
    • Test Scores
    • Performance Ratings
  • Data Collection Instruments
    • Researcher Completes
      • Rating Scales
      • Interview Schedules
      • Tally Sheets
      • Flowcharts
      • Performance Checklist
      • Anecdotal records
      • Time & motion logs
      • Rubrics
    • Subject Completes
      • Questionnaires
      • Self Checklists
      • Attitude scales
      • Personality or Character inventories
      • Achievement / aptitude tests
      • Performance tests
      • Projective devices
      • Sociometric Devices
      • Rubrics
  • Data Collection Matrix: What am I measuring? How? From whom?
    • Sources: Students, Parents, Teacher, Other
    • 1. Math achievement: math chapter test; observe them solving problems
    • 2. Attitude: Likert scale
    • 3.
  • Create a Likert Scale: Measuring Concepts - Process
    • Define concept
    • Brainstorm indicators of concept
    • Operationalise – draft a measurement device
    • Pilot test
    • Examine psychometric properties – how precise are the measures?
    • Redraft/refine and re-test
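The "examine psychometric properties" step above is typically where an internal-consistency statistic such as Cronbach's alpha is computed on the pilot data. A minimal sketch in plain Python, with invented pilot responses (5 respondents, 4 Likert items scored 1-5):

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    """rows: one list of item scores per respondent.
    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance of totals)."""
    k = len(rows[0])                         # number of items
    items = list(zip(*rows))                 # transpose: one column per item
    totals = [sum(r) for r in rows]          # each respondent's scale total
    item_var = sum(variance(col) for col in items)
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Hypothetical pilot data, made up for illustration only.
pilot = [
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [2, 2, 1, 2],
    [5, 4, 5, 4],
    [1, 2, 2, 1],
]
print(round(cronbach_alpha(pilot), 2))  # 0.96
```

An alpha this high suggests the items hang together; values below roughly 0.7 would usually prompt the redraft/re-test step.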
  • Operationalisation
    • Operationalisation is the act of making a fuzzy concept measurable.
    • Social sciences often use multi-item measures to assess related but distinct aspects of a fuzzy concept.
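To make the multi-item idea concrete, here is a hedged sketch of operationalising one fuzzy concept as a short Likert measure; the concept, item wordings, and responses are all invented for illustration:

```python
# Hypothetical operationalisation of a fuzzy concept ("game satisfaction")
# as three Likert items (1 = strongly disagree ... 5 = strongly agree).
items = {
    "enjoyed_playing":      "I enjoyed playing the game.",
    "proud_of_performance": "I am proud of how I performed.",
    "would_play_again":     "I would play this game again.",
}

# One respondent's (invented) answers, keyed by item.
responses = {"enjoyed_playing": 4, "proud_of_performance": 5, "would_play_again": 4}

# A simple composite: the mean of the item scores stands in for the concept.
score = sum(responses.values()) / len(responses)
print(round(score, 2))  # 4.33
```

The design point is that no single question measures "satisfaction" directly; the composite over several related items is what gets analysed.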
  • Fuzzy Concepts - Mindmap
  • Fuzzy Concepts - Mindmap
  • Factor Analysis Process
  • Method / Instrument: Greenleaf (1992) scale (16 items)
  • Reliability & Validity
  • Reliability vs. Validity
  • Reliability: reproducibility of a measurement
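One common way to quantify reproducibility is test-retest reliability: administer the same measure twice and correlate the two sets of scores. A plain-Python sketch with invented scores for six participants:

```python
def pearson_r(x, y):
    """Pearson correlation between paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical scores from two administrations of the same test.
time1 = [12, 18, 9, 15, 20, 11]
time2 = [13, 17, 10, 14, 19, 12]
print(round(pearson_r(time1, time2), 2))  # 0.99
```

A correlation near 1 indicates the measure ranks people almost identically on both occasions, i.e. it is reliable; as the slides note, that alone does not make it valid.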
  • Action Research Proposal
    • Step 1: What is your problem?
    • Step 2: What is your research question?
    • Step 3: Detailed description of your intervention
    • Step 4: Description of sample
    • Step 5: Data collection matrix
    • Step 6: Likert scale