Validating Quality Rating and Improvement Systems

Webinar
March 15, 2012

                     2
Welcome
• Ivelisse Martinez-Beck
  – Child Care Research Coordinator, Office of
    Planning, Research, and Evaluation




                                                 3
Welcome
• Shannon Rudisill
  – Director, Office of Child Care




                                     4
5
Welcome
• Kathryn Tout
  – Co-Director of Early Childhood Research,
    Child Trends




                                               6
Goals of the Webinar
• Introduce a framework for QRIS validation
• Describe real examples of state validation
  efforts
• Highlight challenges and lessons learned
• Offer guidance on developing an
  individualized state plan for QRIS
  validation

                                               7
QRIS Validation Panelists
•   Gail Zellman, RAND Corporation
•   Kelly Maxwell, University of North Carolina
•   Michel Lahti, University of Southern Maine
•   Jim Elicker, Purdue University
•   Kim Boller, Mathematica Policy Research
•   Kathryn Tout, Child Trends


                                              8
What is QRIS Validation?
• An ongoing, iterative process that assesses
  whether design decisions about program quality
  standards and measurement strategies are
  producing meaningful and accurate ratings

• QRIS validation studies assess whether rating
  components and summary ratings can be relied
  on as accurate indicators of program quality

• Validation studies can identify needed changes
  and support continuous quality improvement

                                                   9
Why is Validation Important?
• Promotes increased credibility and support for
  the QRIS
  – Parents can rely on ratings in selecting care
  – Providers more willing to participate
• Supports effective deployment of limited rating
  resources (measuring only those things that
  contribute to quality)
• Promotes efficient use of limited QI resources
  – Technical assistance can target key aspects of care
  – Providers can use ratings to target QI efforts

                                                          10
What is QRIS Validation?
• A complex iterative process
• Relies on multiple sources of evidence
  – Expert judgments of degree to which
    measures capture key quality components
  – Scores on different measures of the same
    concept
  – Patterns of relationships
     • Across scores on different measures
     • Among the items within a measure


                                               11
What is QRIS Validation?
•   Four approaches may be used
    1. Examine validity of key underlying concepts
    2. Examine the psychometric properties of measures
       used to assess quality
    3. Assess the outputs of the rating process
    4. Relate ratings to expected child outcomes
•   Approaches vary in terms of timing, cost,
    difficulty
•   Approaches are not rigid; may overlap in time
    and goals
                                                         12
1. Examine the Validity of Key
    Underlying Concepts in QRIS
• Assesses whether basic concepts included in
  QRIS rating are the “right” ones by examining
  level of empirical and expert support
• Addresses questions like:
  – Do the rating components capture the key elements
    of quality?
  – Is there sufficient empirical support for including each
    component?
• Ideally conducted prior to QRIS implementation


                                                           13
1. Examine the Validity of Key
  Underlying Concepts in QRIS
• Data needed
  – Empirical literature on relationship of
    components to high quality care
  – Expert views
• Analysis methods
  – Synthesis of available data to determine level
    of support for each component
  – Consensus process

                                                 14
Example: Indiana
• Indiana Paths to QUALITY
  – Purdue University
• Questions: What does research tell us
  about
  – Whether the QRIS components and levels
    result in increasing quality of programs?
  – Whether the QRIS will improve developmental
    outcomes for children?

                                             15
Example: Indiana
• Methods:
  – Comprehensive review
  – Classified each QRIS indicator as having
    “some,” “moderate,” or “substantial” evidence
• Found “substantial” evidence for 75% of
  indicators



                                                16
Example: Georgia
• Georgia Department of Early Care and
  Learning
• Question: What are the key indicators of
  quality?
• Methods for Addressing:
  –   Stakeholder group
  –   Expert review
  –   Crosswalk with other program standards
  –   Statewide study of quality of care
• Validation helped identify the set of indicators
  included in the pilot QRIS
                                                 17
Example: Kentucky
• Kentucky STARS for KIDS NOW
  – Child Trends
• Question: How do current standards align
  with existing quality frameworks?
• Methods:
  – Crosswalk comparison of standards &
    frameworks
• Confirmed some standards and identified
  possible gaps
                                             18
2. Examine the Psychometric Properties of
    the Measures Used to Assess Quality
• Assesses whether component measures and
  overall ratings perform as claimed and expected
  by theory
• Addresses questions like:
   – Do component measures which claim four
     scales actually have four scales?

  – Do measures of similar concepts relate more
    closely to each other than to other measures?

  – Do different cut scores produce better
    distributions or more meaningful distinctions
    among programs?
                                                    19
2. Examine the Psychometric Properties of
   the Measures Used to Assess Quality
• Data needed
  – Rating data from participating programs
  – Data on additional quality measures
• Analysis methods
  – Factor analyses of some measures
  – Correlations among components
  – Correlations of selected components with
    other measures of quality

                                               20
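For illustration, a minimal sketch of these analyses in Python, assuming a hypothetical ratings.csv with one row per rated program; the file name and all column names are placeholders, not a specific state's data:

```python
# Minimal sketch of basic psychometric checks on QRIS rating data.
# File name and all column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import spearmanr
from sklearn.decomposition import FactorAnalysis

df = pd.read_csv("ratings.csv")  # one row per rated program
components = ["staff_qualifications", "curriculum", "family_engagement", "environment"]

# Correlations among rating components: do related components hang together?
print(df[components].corr(method="spearman").round(2))

# Correlation of a selected component with an external quality measure (ERS).
rho, p = spearmanr(df["environment"], df["ers_total"])
print(f"environment vs. ERS total: rho = {rho:.2f}, p = {p:.3f}")

# Exploratory factor analysis: does the claimed factor structure emerge?
fa = FactorAnalysis(n_components=2, random_state=0).fit(df[components])
print(pd.DataFrame(fa.components_, columns=components).round(2))  # factor loadings
```

Full psychometric work would add reliability estimates and confirmatory models, but even descriptive checks like these can flag components that do not behave as the rating theory expects.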
Example: Maine

• Maine – Quality for ME
  – University of Southern Maine
• Random selection of programs by type and Step
  Level over time (3-year period)
• Data sources for the validation study
  –   On-site observations using the ERSs
  –   Confidential staff questionnaire
  –   Anonymous parent questionnaire
  –   Administrative data:
       • QRIS enrollment data (self-report through web-based system)
       • ME DHHS licensing data
       • ECE Training and Technical Assistance data (Registry data)

                                                                 21
Example: Maine
• Supports to parents are measured by the QRIS (provider self-
  report).
• Do parents’ survey reports of services align with providers’
  self-report?
   – “Did you receive this support/service?”
   – Higher step levels were more likely to provide more supports.
   – Differences were found across family child care and Head Start centers.
• Use of the Emlen (2000) 15-item scale of perceptions of quality:
   – Good reliability
   – Did not discern differences in total mean scores by step level.
   – Step level rating did correlate with 6 individual items:
      • Healthy place for my child / Caregiver knows a lot about children / Lots
        of creative activities going on / Interesting place for my child / Child is
        treated with respect / Caregiver is open to new information
   – Found total mean score differences by program types.

                                                                               22
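A minimal sketch of this kind of survey analysis, assuming a hypothetical parent_survey.csv with one row per parent respondent, 15 placeholder item columns, and the program's step level attached to each row:

```python
# Sketch: scale reliability (Cronbach's alpha) and item-level association with
# QRIS step level. File name and column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import spearmanr

surveys = pd.read_csv("parent_survey.csv")       # one row per parent respondent
items = [f"item_{i}" for i in range(1, 16)]      # 15 perception-of-quality items

# Cronbach's alpha from item and total-score variances.
k = len(items)
sum_item_var = surveys[items].var(ddof=1).sum()
total_var = surveys[items].sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - sum_item_var / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")

# Do the total score and each individual item track the program's step level?
rho, p = spearmanr(surveys["step_level"], surveys[items].sum(axis=1))
print(f"total score: rho = {rho:.2f}, p = {p:.3f}")
for col in items:
    rho, p = spearmanr(surveys["step_level"], surveys[col])
    print(f"{col}: rho = {rho:.2f}, p = {p:.3f}")
```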
Example: Maine
• Staff development and supports and resources for
  staff are measured in the QRIS (provider self report)
• Do staff/provider surveys of job stress, demands
  and resources align with step level?
  – Higher step level program standards focus more on staff
    development, supports and resources for staff.
     • Secondary data available from Registry (Education / Training)
• Measures:
  – Job Stress Inventory (Curbow et al., 2000): Job Demands,
    Job Control, and Job Resources subscales – no
    relationship to step levels.
  – Differences were found by type of program.
                                                                       23
Example: Kentucky
• Question: What strategy for combining
  indicators will produce valid rating levels?
• Method: Use data from the existing STARS
  rating and additional data from a survey of
  providers to simulate alternative ratings
  – Current structure is a block design
  – Alternative models included a point system and a
    hybrid model using blocks and points
• “New” and existing ratings were compared to
  understand the impact of different structures
                                                       24
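A minimal sketch of this simulation idea, assuming a hypothetical indicators.csv of 0/1 indicators per program; the block groupings and point cut scores below are placeholders, not Kentucky's actual standards:

```python
# Sketch: simulate a block rating and a points rating from the same indicator
# data and compare them. Indicator names, blocks, and cut scores are hypothetical.
import pandas as pd

df = pd.read_csv("indicators.csv")   # one row per program, 0/1 indicator columns
blocks = {
    1: ["licensed"],
    2: ["lead_teacher_cda", "written_curriculum"],
    3: ["ers_score_4_plus", "director_degree"],
    4: ["accredited"],
}
all_indicators = [ind for group in blocks.values() for ind in group]

def block_rating(row):
    # Block design: a level is earned only if every indicator at that level
    # and at all lower levels is met.
    level = 0
    for lvl in sorted(blocks):
        if all(row[ind] == 1 for ind in blocks[lvl]):
            level = lvl
        else:
            break
    return level

df["block_star"] = df.apply(block_rating, axis=1)
df["points"] = df[all_indicators].sum(axis=1)                  # 0-6 points
df["point_star"] = pd.cut(df["points"], bins=[-1, 1, 3, 5, 6], labels=[1, 2, 3, 4])

# How do programs shift between the two structures?
print(pd.crosstab(df["block_star"], df["point_star"]))
```

Cross-tabulating the simulated ratings against the existing levels shows how many programs would move up or down under each alternative structure.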
3. Assess the Outputs of the
           Rating Process
• Examines program-level ratings scores to
  assess rating distribution and relationship of
  ratings to other quality measures

• Addresses questions like:
  – Are providers that received 4 stars actually
    providing higher quality care than those that
    earned 3 stars?
  – Do rating distributions for programs of
    different types (e.g., center vs. home-based)
    vary?
  – Are cut scores and combining rules producing
    appropriate distributions?
                                                   25
3. Assess the Outputs of the
          Rating Process
• Data needed
  – Program-level ratings from participating programs
  – Data from additional quality measures
• Analysis methods
  – Examination of rating distributions by program type
  – Correlations of program ratings with other measures
  – Changes in rating distributions over time




                                                          26
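A minimal sketch of these descriptive checks, assuming a hypothetical program_ratings.csv with program type, star level, and rating date columns:

```python
# Sketch: rating distributions by program type and over time.
# File name and column names are hypothetical placeholders.
import pandas as pd

ratings = pd.read_csv("program_ratings.csv", parse_dates=["rating_date"])

# Share of programs at each star level, by program type (center vs. home-based).
by_type = pd.crosstab(ratings["program_type"], ratings["star_level"], normalize="index")
print(by_type.round(2))

# Change in the distribution over time: share at each level, by rating year.
by_year = pd.crosstab(ratings["rating_date"].dt.year, ratings["star_level"],
                      normalize="index")
print(by_year.round(2))
```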
Example: Indiana

• Question:
  – Do Indiana PTQ level ratings correlate with quality, as
    measured by established measures: ECERS-R, ITERS-R,
    FCCERS-R?



                                        27
Example: Indiana

• Method:
  – ERS quality assessments using the scales in a stratified
    random sample of Level 1, 2, 3, and 4 PTQ-rated child care
    providers


                                     28
Key Findings:
All Providers: Average ERS scores by PTQ level (n=314)
(ERS scale: 1 = Inadequate, 3 = Minimal, 5 = Good, 7 = Excellent)

                   Space &       Personal   Language &                             Program     Parents &   Global
                   Furnishings   Care       Reasoning    Activities   Interaction  Structure   Staff       Quality Score
  Level 1 (n=84)   3.2           2.2        3.7          2.7          3.9          3.0         4.8         3.2
  Level 2 (n=90)   3.8           2.3        4.0          3.3          4.5          3.7         5.3         3.7
  Level 3 (n=74)   3.5           2.3        4.3          3.4          4.6          4.0         5.9         3.8
  Level 4 (n=66)   4.2           2.7        4.5          4.0          4.9          4.7         6.2         4.3

(Correlation between ERS global scores and PTQ ratings: r = .45***)

                                                                                                                 29
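A minimal sketch of how figures like those above could be produced, assuming a hypothetical ers_observations.csv with one row per observed provider; the actual study's procedures may differ:

```python
# Sketch: mean ERS global score by PTQ level, plus the level/score correlation.
# File name and column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import pearsonr, spearmanr

obs = pd.read_csv("ers_observations.csv")     # one row per observed provider

# Mean ERS global score at each PTQ level, as in the table above.
print(obs.groupby("ptq_level")["ers_global"].agg(["mean", "count"]).round(1))

# Association between PTQ level (ordinal) and the observed ERS global score.
r, p = pearsonr(obs["ptq_level"], obs["ers_global"])
rho, p_s = spearmanr(obs["ptq_level"], obs["ers_global"])
print(f"Pearson r = {r:.2f} (p = {p:.3f}); Spearman rho = {rho:.2f} (p = {p_s:.3f})")
```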
Example: Indiana

• How findings are being used:
  – Findings were presented to stakeholders in
    11 regional meetings around the state
  – Series of research briefs for general public
    and policy makers in development
  – Follow-up study: Identify lowest-rated quality
    areas in the evaluation sample


                                                 30
Example: Indiana
• Issues:
• What should be the quality validation
  standard(s) for each state?
  – Based on national research-validated
    measures of quality? (Accreditation;
    research measures)
  – Or based on local definitions of quality?
• What are the criteria for acceptable
  quality validation evidence?
                                                31
Example: Indiana

• Question:
  – Among Level 3 and 4 PTQ-rated providers, what are the
    quality areas most in need of improvement?


                                  32
Example: Indiana

• Detailed examination of ERS data for
  Level 3 and Level 4 rated providers in the
  evaluation sample
• Separate analyses for ECERS-R, ITERS-R,
  and FCCERS-R
• Identified ERS items with means < 4 and
  ERS items with means < 3

                                               33
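A minimal sketch of that item-level screen, assuming hypothetical item-level ERS data with one column per item and the provider's PTQ level on each row:

```python
# Sketch: flag ERS items with low average scores among Level 3 and Level 4
# providers. File name and column names are hypothetical placeholders.
import pandas as pd

items = pd.read_csv("ers_item_scores.csv")            # one row per observed provider
item_cols = [c for c in items.columns if c.startswith("item_")]

for level in (3, 4):
    means = items.loc[items["ptq_level"] == level, item_cols].mean()
    below_4 = means[means < 4].sort_values()
    below_3 = means[means < 3].sort_values()
    print(f"Level {level}: {len(below_4)} items averaging below 4 "
          f"({len(below_3)} below 3)")
    print(below_4.round(2))
```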
Example: Indiana
• Level 4 Providers had 10 ECERS-R
  items that averaged below 4.0:
  (*7 items that averaged below 3)

• Level 3 Providers had 15 ECERS-R
  items that averaged below 4:
  (* 7 items that averaged below 3)

                                      34
Example: Indiana
• Findings presented to PTQ committees:
  evaluation, standards revision, provider
  resources
• Data are used as a guide. ERS data
  alone will not drive revisions.
• Committees will consider changes in PTQ
  standards, rating procedures, and T/TA
  based on these findings.

                                         35
Example: Minnesota
• Parent Aware
  – Child Trends
• Question: Does the quality of teacher-
  child interactions differ by star level?
• Method: Examine patterns of scores on
  the CLASS (which is included in the rating
  tool - points are awarded for higher
  scores)
                                           36
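A minimal sketch of one way to test for star-level differences in CLASS domain scores, assuming a hypothetical class_observations.csv with one row per observed classroom; with groups this small, a rank-based test is shown alongside the ANOVA:

```python
# Sketch: compare CLASS domain scores across star levels.
# File name and column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import f_oneway, kruskal

obs = pd.read_csv("class_observations.csv")   # one row per observed classroom
domains = ["emotional_support", "classroom_organization", "instructional_support"]

for domain in domains:
    groups = [g[domain].dropna() for _, g in obs.groupby("star_level")]
    _, p_anova = f_oneway(*groups)
    _, p_kw = kruskal(*groups)
    print(f"{domain}: ANOVA p = {p_anova:.3f}, Kruskal-Wallis p = {p_kw:.3f}")
```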
Example: Minnesota
There were no differences in observed teacher/child interaction quality at
different star levels (CLASS scores; no significant differences at initial ratings):

                                       Emotional   Classroom      Instructional
                                       Support     Organization   Support
  2-stars (n=28)                       5.35        4.77           2.42
  3-stars (n=19)                       5.74        5.34           2.49
  4-stars fully-rated (n=12)           5.65        5.21           2.64
  4-stars automatically rated (n=25)   5.46        4.96           2.74
                                                                            37
Example: Minnesota
• Challenges:
  – Small numbers of programs overall and
    limited numbers of programs at the lower star
    levels
  – CLASS was conducted only in center-based
    preschool rooms. What about other age
    groups?
• Findings were used in the revision of
  Parent Aware standards and statewide
  rollout
                                                38
4. Relate Ratings to Expected
          Child Outcomes
• Examines the extent to which exposure to
  higher quality providers is associated with
  better child functioning

• Addresses questions like:
  – Do higher-rated programs produce better
    learning outcomes?



                                              39
4. Relate Ratings to Expected
          Child Outcomes
• Data needed
  – Rating data from participating programs
  – Assessments of child functioning
• Analysis methods
  – Examine statistical relationship between
    ratings and child outcomes
  – Rigorous analytic methods should be used to
    account for selection factors and sampling
    bias
                                              40
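A minimal sketch of a multilevel model in this spirit, assuming a hypothetical child_assessments.csv with one row per child; the handful of covariates here only gestures at the selection adjustments a real study would need:

```python
# Sketch: mixed-effects model of fall-to-spring gains on program star level,
# with children nested in programs. File and variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

kids = pd.read_csv("child_assessments.csv")          # one row per assessed child
kids["gain"] = kids["spring_score"] - kids["fall_score"]

# Random intercept for program handles nesting of children within programs;
# child-level covariates provide a (partial) adjustment for selection.
model = smf.mixedlm(
    "gain ~ C(star_level) + fall_score + age_months + low_income",
    data=kids,
    groups=kids["program_id"],
)
result = model.fit()
print(result.summary())
```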
Example: Minnesota
• Question: Do gains in children’s school readiness vary
  by star level or quality component?
• Method: 700 four-year-olds were recruited from 138
  QRIS-rated programs (up to 6 from selected classrooms
  and 4 from family child care programs). Low-income
  children were prioritized.
   – In the fall and spring of the year before kindergarten, children
     completed direct assessments of expressive and receptive
     vocabulary, phonological awareness, print knowledge, and early
     math skills
   – Teachers/caregivers completed assessments of children’s
     social-emotional development and approach to learning

                                                                    41
Example: Minnesota
• Fall to spring gains on measures were
  calculated and compared across star rating
  levels (combining children from 1- and 2-stars)
  and quality categories using multilevel models
   – Four quality categories: Family Partnerships, Teacher
     Training and Education, Tracking Learning, and
     Teaching Materials and Strategies
• No consistent evidence that children’s gains
  varied by star rating or by points earned in the
  different quality categories
                                                         42
Example: Virginia
• Virginia Star Quality Initiative
  – University of Virginia
• Questions:
  – Do ratings of pre-kindergarten programs
    relate to children’s early literacy skills as they
    enter kindergarten?
  – Do gains in early literacy from pre-
    kindergarten to kindergarten relate to program
    ratings?
                                                    43
Example: Virginia
• Method: Multi-level models were estimated
  using rigorous controls at the child- and center-
  level and community fixed effects
• No significant differences in literacy skills at
  kindergarten entry by star level
• Children in 3- and 4-star programs did show
  more significant growth in literacy skills in the
  year before kindergarten compared to children in
  2-star programs
• Differences in growth were not sustained into
  kindergarten
                                                     44
Issues with Inclusion of Child Outcomes in
         QRIS Validation Studies
• Studies are costly when they include direct assessments
• Limited measures to use with children who are English
  Language Learners
• Difficult to measure attendance and exposure to a
  program
• Studies are not measuring how the QRIS affects
  children; they measure associations between ratings and
  measures of children’s development
• Methods should account for nesting of children within
  programs and control for selection factors
• See new paper by Zellman and Karoly
                                                       45
Approaching Validation with a Plan
• Given complexity, useful to develop a plan
  early in the process, before QRIS
  implementation
  – Thinking about validation may help in the
    design phase
  – Some validation data can be collected as part
    of ratings or other QRIS activities
• Plan ideally should include all four
  approaches
                                                46
Validation Plan Considerations
Approach: Examine the Validity of Key Underlying Concepts
  Timing: Ideally, do before implementation; should take just a few months
  Cost issues: Relatively inexpensive
  Getting by: Can rely on other states’ efforts for many measures

Approach: Examine the Psychometric Properties of the Measures Used to Assess Quality
  Timing: Must wait until ratings occur; can conduct several studies using the same data set
  Cost issues: Depends on data quality and amount of analysis; additional measures will increase costs
  Getting by: Can rely to some extent on available resources

Approach: Assess the Outputs of the Rating Process
  Timing: Must wait until ratings occur; can conduct several studies using the same data set
  Cost issues: Depends on data quality and amount of analysis; additional measures will increase costs
  Getting by: This work is system-dependent, but lessons learned about structure and cut-points can be shared

Approach: Relate Ratings to Expected Child Outcomes
  Timing: Best to delay until the ratings process is stable and sufficient programs have been rated
  Cost issues: Child data collection costs are very high; some agencies may collect these data
  Getting by: Requires significant funds and expertise; sampling children and programs reduces costs
                                                                                                   47
Five Takeaways
1. QRIS validation is a process
2. A validation plan charts a course for short-
   term and long-term validation activities
3. Usually the four validation approaches are
   assessed sequentially, but they can overlap
   in time
4. Validation is a measure of a successful
   QRIS and requires assessment at all stages
   of implementation
5. Validation tools and technical assistance
   resources are available
                                              48
QRIS Validation is a Process
• Validation is not a destination
• Validation activities can be done on their
  own or as part of a research and
  evaluation agenda
• Validation assesses whether the system
  and its measures of quality are working as
  designed
• Never too late to prepare and implement a
  validation plan
                                           49
Benefits of Planning for the Long Haul

• A shared, comprehensive, phased plan for
  validation can help weather funding uncertainties
   – Stepping outside of immediate budget constraints
     encourages deeper thinking and setting of both short-
     term and long-term goals
   – Getting cross agency and stakeholder buy-in
     leverages support for a comprehensive long-term
     plan
• Long-term validation, implementation, and
  outcome evaluation planning reveals synergies
  that may reduce future data collection and
  analysis costs
                                                         50
The Four Validation Approaches
       Build on Each Other
• Assessment of the validity of key concepts
  is foundational for the other approaches
• Assessment of psychometric properties is
  often done as part of an output or outcome
  assessment
• Ideally, assessment of outputs comes
  before assessment of child outcomes


                                           51
Validity Assessment is Important at Each
       Stage of QRIS Implementation
• Pilot and scale-up validation usually focuses on
  the first two approaches
• The early operation stage (2-5 years) may focus
  on any of the four approaches, but the first three
  approaches predominate
• Mature stage (>5 years) validation assessments
  provide the opportunity to refresh standards and
  measures and inform system changes
• Significant changes to the ratings or measures
  of quality should trigger a validity
  assessment
                                                   52
Validation Tools and Resources
• QRIS Evaluation Toolkit (Lugo-Gil et al.
  2011)
    – Includes a section on validation
    – Provides examples of study designs to answer
      validation questions
    – Provides examples of the measures used in state
      studies
    – Includes costs for actual studies as well as
      features of a strong request for proposals
•   Recently released state reports
•   Forthcoming OPRE validation brief
•   TA centers
•   INQUIRE members
                                                     53
Validation Study: QRIS Evaluation Toolkit
A validation study assesses the degree to which the quality standards component of the
QRIS reflects meaningful quality levels that are linked to desired outcomes.

What is this approach?
  It is a way to ask critical questions about the tools used in a QRIS and how they
  are functioning.

What can be learned with this approach?
  Whether the quality ratings actually mean something important to programs, parents,
  and children.

What research questions can be answered with this approach?
  • Do the quality standards reflect the current research base?
  • Do the quality standards represent distinct areas that do not overlap with other
    standards in the QRIS?
  • Can improvement be detected in the quality levels?
  • Are the quality levels related to children’s functioning?

What are the key factors to consider when using this approach?
  • The timing of the validation study and how findings can inform QRIS design and
    implementation
  • The degree to which the QRIS includes a representative sample of providers from
    the communities served by the QRIS

                                                                                        54
Ongoing Validation Assessment Issues
• Characteristics and density of providers
  enrolled in the system
  – In voluntary systems that have over-
    representation of higher quality settings or a
    small number of participants, results may be
    skewed
• Validation does not take the place of
  implementation and outcome evaluation
• Cost and fluctuation in commitment to
  validation efforts
                                                     55
Validation Efforts Can Be Scoped to
        Available Resources




                                      56
Questions and Discussion




                           57
Recording and Resources will
       be available on:
• http://researchconnections.org




                                   58
Presenter Contact Information
• Ivelisse Martinez-Beck –
  ivelisse.martinezbeck@acf.hhs.gov
• Shannon Rudisill – shannon.rudisill@acf.hhs.gov
• Kathryn Tout – ktout@childtrends.org
• Gail Zellman – zellman@rand.org
• Kelly Maxwell – maxwell@unc.edu
• Michel Lahti - mlahti@usm.maine.edu
• Jim Elicker - elickerj@purdue.edu
• Kimberly Boller – kboller@mathematica-mpr.com


                                                    59
Reports
• Indiana - Evaluation of Paths to Quality
•   http://www.cfs.purdue.edu/cff/publications/publications.html
• Minnesota – Evaluation of Parent Aware
•   http://www.melf.us/
• Kentucky
•   http://www.kentuckypartnership.org/starsevaluation.aspx
• Maine
•   Contact Michel Lahti for further information
• QRIS Evaluation Toolkit
•   http://www.acf.hhs.gov/programs/opre/cc/childcare_quality/qris_toolkit/qris_toolkit.pdf
• INQUIRE products
•   http://www.acf.hhs.gov/programs/opre/cc/childcare_technical/index.html



                                                                                              60
Reports
• Virginia
•   Contact Terri Sabol for further information
•   terri.sabol@northwestern.edu
• RAND (Zellman and Karoly) report on child assessments
  in QRIS
•   http://www.rand.org/pubs/occasional_papers/OP364.html
•   http://www.rand.org/pubs/research_briefs/RB9639.html




                                                            61

More Related Content

What's hot

Program evaluation
Program evaluationProgram evaluation
Program evaluationspanishpvs
 
Evaluation Principles: Theory-Based, Utilization-Focused, Participatory
Evaluation Principles: Theory-Based, Utilization-Focused, ParticipatoryEvaluation Principles: Theory-Based, Utilization-Focused, Participatory
Evaluation Principles: Theory-Based, Utilization-Focused, ParticipatoryMHTP Webmastere
 
Evaluating Student Success Initiatives
Evaluating Student Success InitiativesEvaluating Student Success Initiatives
Evaluating Student Success InitiativesBradley Vaden
 
Program evaluation 20121016
Program evaluation 20121016Program evaluation 20121016
Program evaluation 20121016nida19
 
Can We Demonstrate the Difference that Norwegian Aid makes? - Evaluation of t...
Can We Demonstrate the Difference that Norwegian Aid makes? - Evaluation of t...Can We Demonstrate the Difference that Norwegian Aid makes? - Evaluation of t...
Can We Demonstrate the Difference that Norwegian Aid makes? - Evaluation of t...Itad Ltd
 
Reviewing the Research and PEAC Recommendations around Principal Evaluation
Reviewing the Research and PEAC Recommendations around Principal EvaluationReviewing the Research and PEAC Recommendations around Principal Evaluation
Reviewing the Research and PEAC Recommendations around Principal EvaluationRichard Voltz
 
Program evaluation
Program evaluationProgram evaluation
Program evaluationYen Bunsoy
 
Seeking Evidence of Impact: Answering "How Do We Know?"
Seeking Evidence of Impact: Answering "How Do We Know?"Seeking Evidence of Impact: Answering "How Do We Know?"
Seeking Evidence of Impact: Answering "How Do We Know?"EDUCAUSE
 
Using Assessment Data for Educator and Student Growth
Using Assessment Data for Educator and Student GrowthUsing Assessment Data for Educator and Student Growth
Using Assessment Data for Educator and Student GrowthNWEA
 
Assessment in SEBD schools
Assessment in SEBD schoolsAssessment in SEBD schools
Assessment in SEBD schoolsGav Weedy
 
Research In Action #2
Research In Action #2Research In Action #2
Research In Action #2guestea0a10
 
Introduction to Logic Models
Introduction to Logic ModelsIntroduction to Logic Models
Introduction to Logic ModelsDiscoveryCenterMU
 
Monitoring and Evaluation at the Community Level : A Strategic Review of ME...
Monitoring and Evaluation at the Community Level: A Strategic Review of ME...Monitoring and Evaluation at the Community Level: A Strategic Review of ME...
Monitoring and Evaluation at the Community Level : A Strategic Review of ME...MEASURE Evaluation
 
Pseudo-evaluation and Quasi-evaluation Approach
Pseudo-evaluation and Quasi-evaluation ApproachPseudo-evaluation and Quasi-evaluation Approach
Pseudo-evaluation and Quasi-evaluation Approachkhrystyl23
 
Caballes characteristics of evaluation
Caballes  characteristics of evaluationCaballes  characteristics of evaluation
Caballes characteristics of evaluationYouise Saculo
 
Evaluating an Integrated Family Planning and Mother/Child Health Program
Evaluating an Integrated Family Planning and Mother/Child Health ProgramEvaluating an Integrated Family Planning and Mother/Child Health Program
Evaluating an Integrated Family Planning and Mother/Child Health ProgramMEASURE Evaluation
 

What's hot (20)

Program evaluation
Program evaluationProgram evaluation
Program evaluation
 
Evaluation Principles: Theory-Based, Utilization-Focused, Participatory
Evaluation Principles: Theory-Based, Utilization-Focused, ParticipatoryEvaluation Principles: Theory-Based, Utilization-Focused, Participatory
Evaluation Principles: Theory-Based, Utilization-Focused, Participatory
 
Educ 148 2.1
Educ 148 2.1Educ 148 2.1
Educ 148 2.1
 
Evaluating Student Success Initiatives
Evaluating Student Success InitiativesEvaluating Student Success Initiatives
Evaluating Student Success Initiatives
 
cipp model
 cipp model cipp model
cipp model
 
Program evaluation 20121016
Program evaluation 20121016Program evaluation 20121016
Program evaluation 20121016
 
Can We Demonstrate the Difference that Norwegian Aid makes? - Evaluation of t...
Can We Demonstrate the Difference that Norwegian Aid makes? - Evaluation of t...Can We Demonstrate the Difference that Norwegian Aid makes? - Evaluation of t...
Can We Demonstrate the Difference that Norwegian Aid makes? - Evaluation of t...
 
Reviewing the Research and PEAC Recommendations around Principal Evaluation
Reviewing the Research and PEAC Recommendations around Principal EvaluationReviewing the Research and PEAC Recommendations around Principal Evaluation
Reviewing the Research and PEAC Recommendations around Principal Evaluation
 
Program evaluation
Program evaluationProgram evaluation
Program evaluation
 
Seeking Evidence of Impact: Answering "How Do We Know?"
Seeking Evidence of Impact: Answering "How Do We Know?"Seeking Evidence of Impact: Answering "How Do We Know?"
Seeking Evidence of Impact: Answering "How Do We Know?"
 
Using Assessment Data for Educator and Student Growth
Using Assessment Data for Educator and Student GrowthUsing Assessment Data for Educator and Student Growth
Using Assessment Data for Educator and Student Growth
 
Assessment 101 Part 3
Assessment 101 Part 3Assessment 101 Part 3
Assessment 101 Part 3
 
Assessment in SEBD schools
Assessment in SEBD schoolsAssessment in SEBD schools
Assessment in SEBD schools
 
Research In Action #2
Research In Action #2Research In Action #2
Research In Action #2
 
Introduction to Logic Models
Introduction to Logic ModelsIntroduction to Logic Models
Introduction to Logic Models
 
Monitoring and Evaluation at the Community Level : A Strategic Review of ME...
Monitoring and Evaluation at the Community Level: A Strategic Review of ME...Monitoring and Evaluation at the Community Level: A Strategic Review of ME...
Monitoring and Evaluation at the Community Level : A Strategic Review of ME...
 
Pseudo-evaluation and Quasi-evaluation Approach
Pseudo-evaluation and Quasi-evaluation ApproachPseudo-evaluation and Quasi-evaluation Approach
Pseudo-evaluation and Quasi-evaluation Approach
 
Caballes characteristics of evaluation
Caballes  characteristics of evaluationCaballes  characteristics of evaluation
Caballes characteristics of evaluation
 
Evaluating an Integrated Family Planning and Mother/Child Health Program
Evaluating an Integrated Family Planning and Mother/Child Health ProgramEvaluating an Integrated Family Planning and Mother/Child Health Program
Evaluating an Integrated Family Planning and Mother/Child Health Program
 
Rti ppt
Rti pptRti ppt
Rti ppt
 

Viewers also liked

Viewers also liked (8)

Contents
ContentsContents
Contents
 
CIDOC 2014
CIDOC 2014CIDOC 2014
CIDOC 2014
 
The Props List
The Props ListThe Props List
The Props List
 
A special story of antiques lover
A special story of antiques loverA special story of antiques lover
A special story of antiques lover
 
【社内提案】弊社のステッカー欲しいです。Vitalify.Inc Sticker Plan.
【社内提案】弊社のステッカー欲しいです。Vitalify.Inc Sticker Plan.【社内提案】弊社のステッカー欲しいです。Vitalify.Inc Sticker Plan.
【社内提案】弊社のステッカー欲しいです。Vitalify.Inc Sticker Plan.
 
Booosting nieuwsbrief 53 (Okt 1999)
Booosting nieuwsbrief 53 (Okt 1999)Booosting nieuwsbrief 53 (Okt 1999)
Booosting nieuwsbrief 53 (Okt 1999)
 
6yr 09 #13 Litter
6yr 09 #13 Litter6yr 09 #13 Litter
6yr 09 #13 Litter
 
Smack
SmackSmack
Smack
 

Similar to QRIS Validation Webinar

Step 1 Project Initiation and get organized Rev1_print.pptx
Step 1 Project Initiation and get organized Rev1_print.pptxStep 1 Project Initiation and get organized Rev1_print.pptx
Step 1 Project Initiation and get organized Rev1_print.pptxARNELUSMAN2
 
2010 bush foundation inputs, process indicators, and intermediate outcomes
2010 bush foundation inputs, process indicators, and intermediate outcomes2010 bush foundation inputs, process indicators, and intermediate outcomes
2010 bush foundation inputs, process indicators, and intermediate outcomesChristopher Thorn
 
Using case-based methods to assess scalability and sustainability: Lessons fr...
Using case-based methods to assess scalability and sustainability: Lessons fr...Using case-based methods to assess scalability and sustainability: Lessons fr...
Using case-based methods to assess scalability and sustainability: Lessons fr...Barb Knittel
 
Building agency capacity
Building agency capacityBuilding agency capacity
Building agency capacitySCSUTRIO
 
Qc2 0 revised july2013_website
Qc2 0 revised july2013_websiteQc2 0 revised july2013_website
Qc2 0 revised july2013_websiteDiana Lane
 
[Merit trac webinar] - it is assessments that cause improvement
[Merit trac webinar] - it is assessments that cause improvement[Merit trac webinar] - it is assessments that cause improvement
[Merit trac webinar] - it is assessments that cause improvementMeritTracSvc
 
Needs Assessment
Needs AssessmentNeeds Assessment
Needs AssessmentLeila Zaim
 
Introduction to Quality Counts 2.0
Introduction to Quality Counts 2.0Introduction to Quality Counts 2.0
Introduction to Quality Counts 2.0Diana Lane
 
Organizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program EvaluationOrganizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program EvaluationINGENAES
 
14.30 pre registration standards - geraldine walters
14.30 pre registration standards - geraldine walters14.30 pre registration standards - geraldine walters
14.30 pre registration standards - geraldine waltersNHS England
 
Neeraj Trivedi - Training of district officials in Bihar
Neeraj Trivedi - Training of district officials in BiharNeeraj Trivedi - Training of district officials in Bihar
Neeraj Trivedi - Training of district officials in BiharPOSHAN
 
Using Nursing Exam Data Effectively in Preparing Nursing Accreditation
Using Nursing Exam Data Effectively in Preparing Nursing AccreditationUsing Nursing Exam Data Effectively in Preparing Nursing Accreditation
Using Nursing Exam Data Effectively in Preparing Nursing AccreditationExamSoft
 
Evaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation MethodsEvaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation MethodsDebbie_at_IDS
 
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptxEDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptxwelfredoyu2
 

Similar to QRIS Validation Webinar (20)

Step 1 Project Initiation and get organized Rev1_print.pptx
Step 1 Project Initiation and get organized Rev1_print.pptxStep 1 Project Initiation and get organized Rev1_print.pptx
Step 1 Project Initiation and get organized Rev1_print.pptx
 
2010 bush foundation inputs, process indicators, and intermediate outcomes
2010 bush foundation inputs, process indicators, and intermediate outcomes2010 bush foundation inputs, process indicators, and intermediate outcomes
2010 bush foundation inputs, process indicators, and intermediate outcomes
 
Policy imperatives for systems-oriented approaches to scaling up
Policy imperatives for systems-oriented approaches to scaling upPolicy imperatives for systems-oriented approaches to scaling up
Policy imperatives for systems-oriented approaches to scaling up
 
Using case-based methods to assess scalability and sustainability: Lessons fr...
Using case-based methods to assess scalability and sustainability: Lessons fr...Using case-based methods to assess scalability and sustainability: Lessons fr...
Using case-based methods to assess scalability and sustainability: Lessons fr...
 
Building agency capacity
Building agency capacityBuilding agency capacity
Building agency capacity
 
Qc2 0 revised july2013_website
Qc2 0 revised july2013_websiteQc2 0 revised july2013_website
Qc2 0 revised july2013_website
 
Chapter 30
Chapter 30Chapter 30
Chapter 30
 
EMIS
EMIS EMIS
EMIS
 
[Merit trac webinar] - it is assessments that cause improvement
[Merit trac webinar] - it is assessments that cause improvement[Merit trac webinar] - it is assessments that cause improvement
[Merit trac webinar] - it is assessments that cause improvement
 
Needs Assessment
Needs AssessmentNeeds Assessment
Needs Assessment
 
Introduction to Quality Counts 2.0
Introduction to Quality Counts 2.0Introduction to Quality Counts 2.0
Introduction to Quality Counts 2.0
 
Organizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program EvaluationOrganizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program Evaluation
 
14.30 pre registration standards - geraldine walters
14.30 pre registration standards - geraldine walters14.30 pre registration standards - geraldine walters
14.30 pre registration standards - geraldine walters
 
Neeraj Trivedi - Training of district officials in Bihar
Neeraj Trivedi - Training of district officials in BiharNeeraj Trivedi - Training of district officials in Bihar
Neeraj Trivedi - Training of district officials in Bihar
 
Using Nursing Exam Data Effectively in Preparing Nursing Accreditation
Using Nursing Exam Data Effectively in Preparing Nursing AccreditationUsing Nursing Exam Data Effectively in Preparing Nursing Accreditation
Using Nursing Exam Data Effectively in Preparing Nursing Accreditation
 
Evaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation MethodsEvaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation Methods
 
Quality management and process improvement layton
Quality management and process improvement   laytonQuality management and process improvement   layton
Quality management and process improvement layton
 
Chapter 14
Chapter 14Chapter 14
Chapter 14
 
Peer-to-Peer Webinar Series: Success Stories in EIDM 2018 / Webinar #3
Peer-to-Peer Webinar Series: Success Stories in EIDM 2018 / Webinar #3Peer-to-Peer Webinar Series: Success Stories in EIDM 2018 / Webinar #3
Peer-to-Peer Webinar Series: Success Stories in EIDM 2018 / Webinar #3
 
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptxEDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
 

Recently uploaded

APIForce Zurich 5 April Automation LPDG
APIForce Zurich 5 April  Automation LPDGAPIForce Zurich 5 April  Automation LPDG
APIForce Zurich 5 April Automation LPDGMarianaLemus7
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsMemoori
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr BaganFwdays
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...shyamraj55
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Patryk Bandurski
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLScyllaDB
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationSlibray Presentation
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr LapshynFwdays
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024Lorenzo Miniero
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clashcharlottematthew16
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Scott Keck-Warren
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxNavinnSomaal
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubKalema Edgar
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupFlorian Wilhelm
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Mattias Andersson
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024BookNet Canada
 

Recently uploaded (20)

APIForce Zurich 5 April Automation LPDG
APIForce Zurich 5 April  Automation LPDGAPIForce Zurich 5 April  Automation LPDG
APIForce Zurich 5 April Automation LPDG
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial Buildings
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQL
 
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck Presentation
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clash
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptx
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding Club
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project Setup
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 

QRIS Validation Webinar

  • 1. Validating Quality Rating and Improvement Systems Webinar March 15, 2012
  • 3. Welcome • Ivelisse Martinez-Beck – Child Care Research Coordinator, Office of Planning Research and Evaluation 3
  • 4. Welcome • Shannon Rudisill, – Director, Office of Child Care 4
  • 5. 5
  • 6. Welcome • Kathryn Tout – Co-Director of Early Childhood Research, Child Trends 6
  • 7. Goals of the Webinar • Introduce a framework for QRIS validation • Describe real examples of state validation efforts • Highlight challenges and lessons learned • Offer guidance on developing an individualized state plan for QRIS validation 7
  • 8. QRIS Validation Panelists • Gail Zellman, RAND Corporation • Kelly Maxwell, University of North Carolina • Michel Lahti, University of Southern Maine • Jim Elicker, Purdue University • Kim Boller, Mathematica Policy Research • Kathryn Tout, Child Trends 8
  • 9. What is QRIS Validation? • An ongoing, iterative process that assesses whether design decisions about program quality standards and measurement strategies are producing meaningful and accurate ratings • QRIS validation studies assess whether rating components and summary ratings can be relied on as accurate indicators of program quality • Validation studies can identify needed changes and support continuous quality improvement 9
  • 10. Why is Validation Important? • Promotes increased credibility and support for the QRIS – Parents can rely on ratings in selecting care – Providers more willing to participate • Supports effective deployment of limited rating resources (measuring only those things that contribute to quality) • Promotes efficient use of limited QI resources – Technical assistance can target key aspects of care – Providers can use ratings to target QI efforts 10
  • 11. What is QRIS Validation? • A complex iterative process • Relies on multiple sources of evidence – Expert judgments of degree to which measures capture key quality components – Scores on different measures of the same concept – Patterns of relationships • Across scores on different measures • Among the items within a measure 11
  • 12. What is QRIS Validation? • Four approaches may be used 1. Examine validity of key underlying concepts 2. Examine the psychometric properties of measures used to assess quality 3. Assess the outputs of the rating process 4. Relate ratings to expected child outcomes • Approaches vary in terms of timing, cost, difficulty • Approaches are not rigid; may overlap in time and goals 12
  • 13. 1. Examine the Validity of Key Underlying Concepts in QRIS • Assesses whether basic concepts included in QRIS rating are the “right” ones by examining level of empirical and expert support • Addresses questions like: – Do the rating components capture the key elements of quality? – Is there sufficient empirical support for including each component? • Ideally conducted prior to QRIS implementation 13
  • 14. 1. Examine the Validity of Key Underlying Concepts in QRIS • Data needed – Empirical literature on relationship of components to high quality care – Expert views • Analysis methods – Synthesis of available data to determine level of support for each component – Consensus process 14
  • 15. Example: Indiana • Indiana Paths to QUALITY – Purdue University • Questions: What does research tell us about – Whether the QRIS components and levels result in increasing quality of programs? – Whether the QRIS will improve developmental outcomes for children? 15
  • 16. Example: Indiana • Methods: – Comprehensive review – Classified each QRIS indicator as having “some,” “moderate,” or “substantial” evidence • Found “substantial” evidence for 75% of indicators 16
  • 17. Example: Georgia • Georgia Department of Early Care and Learning • Question: What are the key indicators of quality? • Methods for Addressing: – Stakeholder group – Expert review – Crosswalk with other program standards – Statewide study of quality of care • Validation helped identify set of indicators included in pilot QRIS 17
  • 18. Example: Kentucky • Kentucky STARS for KIDS NOW – Child Trends • Question: How do current standards align with existing quality frameworks? • Methods: – Crosswalk comparison of standards & frameworks • Confirmed some standards and identified possible gaps 18
  • 19. 2. Examine the Psychometric Properties of the Measures Used to Assess Quality • Assesses whether component measures and overall ratings perform as claimed and expected by theory • Addresses questions like: – Do component measures which claim four scales actually have four scales? – Do measures of similar concepts relate more closely to each other than to other measures? – Do different cut scores produce better distributions or more meaningful distinctions among programs? 19
  • 20. 2. Examine the Psychometric Properties of the Measures Used to Assess Quality • Data needed – Rating data from participating programs – Data on additional quality measures • Analysis methods – Factor analyses of some measures – Correlations among components – Correlations of selected components with other measures of quality 20
  • 21. Example: Maine • Maine – Quality for ME – University of southern Maine • Random selection of programs by type and Step Level over time. (3 year period) • Data sources for the validation study – On site observations using the ERSs. – Confidential staff questionnaire – Anonymous parent questionnaire – Administrative Data: • QRIS Enrollment Data (self-report thru web based system) • ME DHHS Licensing Data • ECE Training and Technical Assistance Data (Registry Data) 21
  • 22. Example: Maine • Supports to parents is measured by the QRIS (provider self report). • Do parents’ survey reports of services align with providers’ self report? – “Did you receive this support/service?” – Higher step levels more likely to provide more supports. – Differences found across family child care, Head Start centers • Use of Emlen (2000) 15-item scale – perceptions of quality: – Good reliability – Did not discern differences in total mean scores by step level. – Step level rating did correlate with 6 individual items. • Healthy place for my child / Caregiver knows a lot about children / Lots of creative activities going on / Interesting place for my child / Child is treated with respect / Caregiver is open to new information – Found total mean score differences by program types. 22
• 23. Example: Maine • Staff development and supports and resources for staff are measured in the QRIS (provider self-report) • Do staff/provider surveys of job stress, demands, and resources align with step level? – Higher step level program standards focus more on staff development, supports, and resources for staff. • Secondary data available from the Registry (Education / Training) • Measures: – Job Stress Inventory (Curbow et al., 2000): Job Demands, Job Control, and Job Resources subscales – no relationship to step levels. – Differences were found by type of program. 23
  • 24. Example: Kentucky • Question: What strategy for combining indicators will produce valid rating levels? • Method: Use data from the existing STARS rating and additional data from a survey of providers to simulate alternative ratings – Current structure is a block design – Alternative models included a point system and a hybrid model using blocks and points • “New” and existing ratings were compared to understand the impact of different structures 24
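As a hedged illustration of what "simulating alternative ratings" can mean in practice, the sketch below recomputes a block rating and a point rating from the same indicator-level data and cross-tabulates the results. The indicator names, point values, and cut-offs are invented for illustration and are not Kentucky's actual rules.

```python
# Hypothetical sketch of comparing a block design with a point system.
import pandas as pd

programs = pd.read_csv("stars_indicators.csv")  # hypothetical: one row per program
indicators = ["ratios", "training_hours", "curriculum", "family_engagement"]

def block_rating(row):
    # Block design: a program is credited with a level only if it meets
    # every indicator threshold at that level (and all levels below it).
    level = 0
    for threshold in (1, 2, 3, 4):
        if all(row[f"{ind}_level"] >= threshold for ind in indicators):
            level = threshold
        else:
            break
    return level

def point_rating(row):
    # Point system: sum points across indicators, then map totals to levels.
    total = sum(row[f"{ind}_points"] for ind in indicators)
    cutoffs = [(30, 4), (20, 3), (10, 2), (0, 1)]  # invented cut-offs
    return next(level for cutoff, level in cutoffs if total >= cutoff)

programs["block_level"] = programs.apply(block_rating, axis=1)
programs["point_level"] = programs.apply(point_rating, axis=1)

# Compare how programs would shift between the two structures
print(pd.crosstab(programs["block_level"], programs["point_level"]))
```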
• 25. 3. Assess the Outputs of the Rating Process • Examines program-level rating scores to assess the rating distribution and the relationship of ratings to other quality measures • Addresses questions like: – Are providers that received 4 stars actually providing higher quality care than those that earned 3 stars? – Do rating distributions vary for programs of different types (e.g., center vs. home-based)? – Are cut scores and combining rules producing appropriate distributions? 25
  • 26. 3. Assess the Outputs of the Rating Process • Data needed – Program-level ratings from participating programs – Data from additional quality measures • Analysis methods – Examination of rating distributions by program type – Correlations of program ratings with other measures – Changes in rating distributions over time 26
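A minimal sketch, assuming a hypothetical program-level data set, of how the analyses listed above might be run: rating distributions by program type, the relationship of ratings to an independent quality measure, and change in the distribution over time.

```python
# Illustrative sketch only; data file and column names are hypothetical.
import pandas as pd
from scipy.stats import spearmanr

ratings = pd.read_csv("program_ratings.csv")

# Rating distribution by program type (e.g., center vs. family child care)
print(pd.crosstab(ratings["program_type"], ratings["star_level"],
                  normalize="index").round(2))

# Do higher star levels correspond to higher observed quality (e.g., ERS scores)?
rho, p = spearmanr(ratings["star_level"], ratings["ers_total"],
                   nan_policy="omit")
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")

# Change in the rating distribution over time
print(ratings.groupby(["rating_year", "star_level"]).size().unstack(fill_value=0))
```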
• 27. Example: Indiana • Question: – Do Indiana PTQ level ratings correlate with quality, as measured by established measures: ECERS-R, ITERS-R, FCCERS-R? 27
• 28. Example: Indiana • Method: – ERS quality assessments using the scales in a stratified random sample of Level 1, 2, 3, and 4 PTQ-rated child care providers 28
• 29. Key Findings: All Providers: Average ERS scores by PTQ level (n=314; ERS scale: 1 = Inadequate, 3 = Minimal, 5 = Good, 7 = Excellent)
– Level 1 (n=84): Space & Furnishings 3.2, Personal Care 2.2, Language & Reasoning 3.7, Activities 2.7, Interaction 3.9, Program Structure 3.0, Parents & Staff 4.8, Global Quality Score 3.2
– Level 2 (n=90): Space & Furnishings 3.8, Personal Care 2.3, Language & Reasoning 4.0, Activities 3.3, Interaction 4.5, Program Structure 3.7, Parents & Staff 5.3, Global Quality Score 3.7
– Level 3 (n=74): Space & Furnishings 3.5, Personal Care 2.3, Language & Reasoning 4.3, Activities 3.4, Interaction 4.6, Program Structure 4.0, Parents & Staff 5.9, Global Quality Score 3.8
– Level 4 (n=66): Space & Furnishings 4.2, Personal Care 2.7, Language & Reasoning 4.5, Activities 4.0, Interaction 4.9, Program Structure 4.7, Parents & Staff 6.2, Global Quality Score 4.3
– Correlation between ERS global scores and PTQ ratings: r = .45*** 29
  • 30. Example: Indiana • How findings are being used: – Findings were presented to stakeholders in 11 regional meetings around the state – Series of research briefs for general public and policy makers in development – Follow-up study: Identify lowest-rated quality areas in the evaluation sample 30
  • 31. Example: Indiana • Issues: • What should be the quality validation standard(s) for each state? – Based on national research-validated measures of quality? (Accreditation; research measures) – Or based on local definitions of quality? • What are the criteria for acceptable quality validation evidence? 31
  • 32. Example: Indiana • Question: – Among Level 3 and 4 PTQ rated providers, what are the quality areas most in need of improvement? 32
• 33. Example: Indiana • Detailed examination of ERS data for Level 3 and Level 4 rated providers in the evaluation sample • Separate analyses for ECERS-R, ITERS-R, and FCCERS-R • Identified ERS items with means < 4, and ERS items with means < 3 33
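A minimal sketch of the kind of item-level screen described above, assuming a hypothetical file of ERS item scores; it is not the Indiana team's actual procedure.

```python
# Illustrative sketch: flag ERS items whose mean score falls below 4 or below 3
# for providers at a given PTQ level. File and column names are hypothetical.
import pandas as pd

ers = pd.read_csv("ecers_item_scores.csv")  # one row per observed classroom
level4 = ers[ers["ptq_level"] == 4]

item_cols = [c for c in ers.columns if c.startswith("item_")]
item_means = level4[item_cols].mean().sort_values()

print("Items averaging below 4:", list(item_means[item_means < 4].index))
print("Items averaging below 3:", list(item_means[item_means < 3].index))
```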
• 34. Example: Indiana • Level 4 providers had 10 ECERS-R items that averaged below 4.0 (7 of these items averaged below 3) • Level 3 providers had 15 ECERS-R items that averaged below 4.0 (7 of these items averaged below 3) 34
  • 35. Example: Indiana • Findings presented to PTQ committees: evaluation, standards revision, provider resources • Data are used as a guide. ERS data alone will not drive revisions. • Committees will consider changes in PTQ standards, rating procedures, and T/TA based on these findings. 35
• 36. Example: Minnesota • Parent Aware – Child Trends • Question: Does the quality of teacher-child interactions differ by star level? • Method: Examine patterns of scores on the CLASS (which is included in the rating tool; points are awarded for higher scores) 36
• 37. Example: Minnesota • [Figure: mean CLASS Emotional Support, Classroom Organization, and Instructional Support scores by rating group – 2-stars (n=28), 3-stars (n=19), 4-stars fully rated (n=12), 4-stars automatically rated (n=25)] • There were no differences in observed teacher/child interaction quality at different star levels; no significant differences at initial ratings. 37
  • 38. Example: Minnesota • Challenges: – Small numbers of programs overall and limited numbers of programs at the lower star levels – CLASS was conducted only in center-based preschool rooms. What about other age groups? • Findings were used in the revision of Parent Aware standards and statewide rollout 38
  • 39. 4. Relate Ratings to Expected Child Outcomes • Examines the extent to which exposure to higher quality providers is associated with better child functioning • Addresses questions like: – Do higher-rated programs produce better learning outcomes? 39
  • 40. 4. Relate Ratings to Expected Child Outcomes • Data needed – Rating data from participating programs – Assessments of child functioning • Analysis methods – Examine statistical relationship between ratings and child outcomes – Rigorous analytic methods should be used to account for selection factors and sampling bias 40
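To make the analysis methods above concrete, here is a hedged sketch of a multilevel model relating fall-to-spring gains to star level, with children nested in programs. It is a generic illustration, not any state's actual model; the data set and variable names are hypothetical, and the covariates shown offer only partial adjustment for selection.

```python
# Illustrative sketch: multilevel (mixed-effects) model of child gains by star level.
import pandas as pd
import statsmodels.formula.api as smf

children = pd.read_csv("child_assessments.csv")  # hypothetical child-level file
children["gain"] = children["spring_score"] - children["fall_score"]

# Random intercept for program accounts for nesting of children within programs;
# child-level covariates partially address selection into programs.
model = smf.mixedlm(
    "gain ~ C(star_level) + fall_score + child_age_months + low_income",
    data=children,
    groups=children["program_id"],
)
result = model.fit()
print(result.summary())
```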
• 41. Example: Minnesota • Question: Do gains in children’s school readiness vary by star level or quality component? • Method: 700 four-year-olds were recruited from 138 QRIS-rated programs (up to 6 children from selected classrooms and 4 from family child care programs). Low-income children were prioritized. – In the fall and spring of the year before kindergarten, children completed direct assessments of expressive and receptive vocabulary, phonological awareness, print knowledge, and early math skills – Teachers/caregivers completed assessments of children’s social-emotional development and approaches to learning 41
  • 42. Example: Minnesota • Fall to spring gains on measures were calculated and compared across star rating levels (combining children from 1- and 2-stars) and quality categories using multilevel models – Four quality categories: Family Partnerships, Teacher Training and Education, Tracking Learning, and Teaching Materials and Strategies • No consistent evidence that children’s gains varied by star rating or by points earned in the different quality categories 42
• 43. Example: Virginia • Virginia Star Quality Initiative – University of Virginia • Questions: – Do ratings of pre-kindergarten programs relate to children’s early literacy skills as they enter kindergarten? – Do gains in early literacy from pre-kindergarten to kindergarten relate to program ratings? 43
• 44. Example: Virginia • Method: Multilevel models were estimated using rigorous controls at the child and center levels and community fixed effects • No significant differences in literacy skills at kindergarten entry by star level • Children in 3- and 4-star programs did show significantly greater growth in literacy skills in the year before kindergarten compared to children in 2-star programs • Differences in growth were not sustained into kindergarten 44
  • 45. Issues with Inclusion of Child Outcomes in QRIS Validation Studies • Studies are costly when they include direct assessments • Limited measures to use with children who are English Language Learners • Difficult to measure attendance and exposure to a program • Studies are not measuring how the QRIS affects children; they measure associations between ratings and measures of children’s development • Methods should account for nesting of children within programs and control for selection factors • See new paper by Zellman and Karoly 45
  • 46. Approaching Validation with a Plan • Given complexity, useful to develop a plan early in the process, before QRIS implementation – Thinking about validation may help in the design phase – Some validation data can be collected as part of ratings or other QRIS activities • Plan ideally should include all four approaches 46
• 47. Validation Plan Considerations (Approach / Timing / Cost issues / Getting by)
– Examine the Validity of Key Underlying Concepts: Timing – ideally, do before implementation; should take just a few months. Cost issues – relatively inexpensive. Getting by – can rely on other states’ efforts for many measures.
– Examine the Psychometric Properties of the Measures Used to Assess Quality: Timing – must wait until ratings occur; can conduct several studies using the same data set. Cost issues – depends on data quality and amount of analysis; additional measures will increase costs. Getting by – can rely to some extent on available resources.
– Assess the Outputs of the Rating Process: Timing – must wait until ratings occur; can conduct several studies using the same data set. Cost issues – depends on data quality and amount of analysis; additional measures will increase costs. Getting by – this work is system-dependent, but lessons learned about structure and cut-points can be shared.
– Relate Ratings to Expected Child Outcomes: Timing – best to delay until the ratings process is stable and sufficient programs have been rated. Cost issues – child data collection costs are very high; some agencies may collect these data. Getting by – requires significant funds and expertise; sampling children and programs reduces costs. 47
  • 48. Five Takeaways 1. QRIS validation is a process 2. A validation plan charts a course for short- term and long-term validation activities 3. Usually the four validation approaches are assessed sequentially, but they can overlap in time 4. Validation is a measure of a successful QRIS and requires assessment at all stages of implementation 5. Validation tools and technical assistance resources are available 48
  • 49. QRIS Validation is a Process • Validation is not a destination • Validation activities can be done on their own or as part of a research and evaluation agenda • Validation assesses whether the system and its measures of quality are working as designed • Never too late to prepare and implement a validation plan 49
  • 50. Benefits of Planning for the Long Haul • A shared, comprehensive, phased plan for validation can help weather funding uncertainties – Stepping outside of immediate budget constraints encourages deeper thinking and setting of both short- term and long-term goals – Getting cross agency and stakeholder buy-in leverages support for a comprehensive long-term plan • Long-term validation, implementation, and outcome evaluation planning reveals synergies that may reduce future data collection and analysis costs 50
  • 51. The Four Validation Approaches Build on Each Other • Assessment of the validity of key concepts is foundational for the other approaches • Assessment of psychometric properties is often done as part of an output or outcome assessment • Ideally, assessment of outputs comes before assessment of child outcomes 51
• 52. Validity Assessment is Important at Each Stage of QRIS Implementation • Pilot and scale-up validation usually focuses on the first two approaches • The early operation stage (2-5 years) may focus on any of the four approaches, but the first three predominate • Mature-stage (>5 years) validation assessments provide the opportunity to refresh standards and measures and inform system changes • Significant changes to the ratings or measures of quality should trigger a validity assessment 52
  • 53. Validation Tools and Resources • QRIS Evaluation Toolkit (Lugo-Gil et al. 2011) – Includes a section on validation – Provides examples of study designs to answer validation questions – Provides examples of the measures used in state studies – Includes costs for actual studies as well as features of a strong request for proposals • Recently released state reports • Forthcoming OPRE validation brief • TA centers • INQUIRE members 53
• 54. Validation Study: QRIS Evaluation Toolkit
– What is this approach? A validation study assesses the degree to which the quality standards component of the QRIS reflects meaningful quality levels that are linked to desired outcomes. It is a way to ask critical questions about the tools used in a QRIS and how they are functioning.
– What can be learned with this approach? Whether the quality ratings actually mean something important to programs, parents, and children.
– What research questions can be answered with this approach? Do the quality standards reflect the current research base? Do the quality standards represent distinct areas that do not overlap with other standards in the QRIS? Can improvement be detected in the quality levels? Are the quality levels related to children’s functioning?
– What are the key factors to consider when using this approach? The timing of the validation study and how findings can inform QRIS design and implementation; the degree to which the QRIS includes a representative sample of providers from the communities served by the QRIS. 54
• 55. Ongoing Validation Assessment Issues • Characteristics and density of providers enrolled in the system – In voluntary systems that have over-representation of higher-quality settings or a small number of participants, results may be skewed • Validation does not take the place of implementation and outcome evaluation • Cost and fluctuation in commitment to validation efforts 55
  • 56. Validation Efforts Can Be Scoped to Available Resources 56
• 58. Recording and Resources will be available at: • http://researchconnections.org 58
  • 59. Presenter Contact Information • Ivelisse Martinez-Beck – ivelisse.martinezbeck@acf.hhs.gov • Shannon Rudisill – shannon.rudisill@acf.hhs.gov • Kathryn Tout – ktout@childtrends.org • Gail Zellman – zellman@rand.org • Kelly Maxwell – maxwell@unc.edu • Michel Lahti - mlahti@usm.maine.edu • Jim Elicker - elickerj@purdue.edu • Kimberly Boller – kboller@mathematica-mpr.com 59
  • 60. Reports • Indiana - Evaluation of Paths to Quality • http://www.cfs.purdue.edu/cff/publications/publications.html • Minnesota – Evaluation of Parent Aware • http://www.melf.us/ • Kentucky • http://www.kentuckypartnership.org/starsevaluation.aspx • Maine • Contact Michel Lahti for further information • QRIS Evaluation Toolkit • http://www.acf.hhs.gov/programs/opre/cc/childcare_quality/qris_toolkit/qris_toolkit.pdf • INQUIRE products • http://www.acf.hhs.gov/programs/opre/cc/childcare_technical/index.html 60
  • 61. Reports • Virginia • Contact Terri Sabol for further information • terri.sabol@northwestern.edu • RAND (Zellman and Karoly) report on child assessments in QRIS • http://www.rand.org/pubs/occasional_papers/OP364.html • http://www.rand.org/pubs/research_briefs/RB9639.html 61