• Any set of measurement activities that uses direct observation and recording of a student's performance in the local curriculum as a basis for gathering information to make instructional decisions
  o Standard directions (straightforward)
  o Timing
  o Set of materials (passages, sheets, lists, etc.)
  o Scoring rules
  o Standards for judging performance
  o Record forms and charts
• Curriculum-Based Assessment (CBA)
• Curriculum-Based Measurement (CBM)
• Curriculum-Based Evaluation (CBE)
• Most forms of classroom assessment are mastery measurement (MM)
  o NOT CBA
• Describes mastery of a series of short-term instructional objectives
  o Criterion-referenced tests determine success in each sequence of the curriculum
• Problems with MM
  o Hierarchy of skills is logical, not empirical
  o Assessment does not reflect maintenance or generalization
  o Designed by teachers, so reliability and validity are unknown
• Alignment
• Technical adequacy
• Criterion-referenced
• Sensitive to instruction/alterable variables
• Standardized
• Performance sampling
• Decision rules
• Repeated measures
• Efficient
• Efficacious
• Efficient summarizing of data
• Tests are of relatively short duration
• Direct link between data collection and intervention
• Unable to measure every aspect of academic content and should not replace all achievement tests
• Promoted as a one-size-fits-all measurement system that will answer all questions related to special education
• Assesses only a limited number of academic behaviors and cannot possibly generalize to all aspects of academic achievement; by measuring only fluency, it does not provide a comprehensive picture of skills
• Denounces the use of published, norm-referenced tests, but uses them to establish reliability and validity
• If a curriculum is inappropriate, then the outcomes of CBM are inappropriate
• It is impossible to choose among the different CBA models because they are so similar
• CBM is nothing more than a tack-on procedure for traditional assessments
• Using situation-based information to categorize a student as needing services means that a student with a disability in one school may not have a disability in another
• The goals set by CBM are too ambitious and unrealistic
• A type of CBA/CBM
• Used to sample across several goals at the same time
• Capstone tasks
  o Can only be accomplished by successfully applying a number of contributing skills
  o Ex: ORF
• Advantages
  o Efficient
  o Assessment in context
  o Adequate opportunities for progress monitoring and data-based modifications
• Disadvantages
  o Some curricular areas do not have a capstone task
  o Not good for determining where the holes in skills are
• CBA/CBE
• Goals/single skills in a curricular area
• Advantages
  o Can be used in areas where capstone skills are not available
  o Also usually measures annual or long-term goals, as long as it samples adequately from the curriculum
• Mastery Measures
  o Less complex
  o Fewer skills
  o Focuses on tool skills – skills that must be performed at high levels of proficiency or skills that are pivotal to many other operations
  o Used to troubleshoot a learning problem or look at short-term mastery of concepts
• Unique application of DA that is appropriate for problem-solving
• Primary purpose: determine the instructional level at which a student is successful
• Useful for a variety of educational decisions

Advantages
• Tests for success in the curriculum
  o Continue testing until the appropriate instructional level is found, as opposed to traditional testing goals
• Efficient
  o Obtains a broad sample of performance in a short period of time
  o Gives a large amount of data to analyze and form hypotheses
• Technically adequate
• Screening
  o Which students are currently at risk for failure?
• Progress-monitoring
  o Is the student making adequate progress toward important goals?
• Diagnostic
  o What and how should we teach this student?
• Outcome
  o Has this progress been a success?
   
• HIGH reliance on fluency (automaticity) for all areas
• Fluency = speed + accuracy
• Mostly concerned with basic skills:
  o Reading fluency
  o Math fluency
  o Writing fluency
  o Spelling
• Does not look at history, science, social studies, etc.
• Assessments are called probes
• Assessed frequently, as it is a capstone skill
• Can develop your own probes or use generic probes
  o AIMSWeb, DIBELS, EasyCBM, etc.
• Select probes that represent skills to be mastered by the end of the year or beginning of next year
• DIBELS
  o Letter Naming Fluency (LNF)
    • Risk indicator for reading
  o Letter Sound Fluency (LSF)
    • Closely related to decoding and reading
  o Initial Sound Fluency (ISF)
  o Phoneme Segmentation Fluency (PSF)
  o Nonsense Word Fluency (NWF)
    • Basic decoding skills
    • Short vowels and consonants
• Word Identification Fluency (WIF)
• Oral Reading Fluency (ORF)
  o Capstone skill
• Administration
  o Student is asked to read aloud for one minute from a grade-level text
  o Examiner copy of the probe has a cumulative word count
  o Start the stopwatch when the child says the first word, or within 3 seconds if he/she cannot read the first word
  o Mark all errors as the child reads
    • Count 3-second pauses as errors
  o Administer 3 probes
• Acceptable responses
  o Correctly pronounced words
  o Self-corrections within 3 seconds
  o Dialect differences
  o Repeated or inserted words are ignored, but indicated for error analysis
• Errors
  o Substitutions
  o Omitted words
  o Words mispronounced or not corrected within 3 seconds
  o Reversals
  o Numbers
• Other considerations
  o Hyphenated words
  o Abbreviations
  o Discontinuing
  o Dealing with interruptions
• Count correctly read words
  o WCPM (words correct per minute)
• Count errors
  o WCPM/E
• Calculate accuracy
  o WCPM / (WCPM + Errors) x 100 = % Accuracy
• Use the median score across the 3 probes (see the scoring sketch after this list)
• Also look for:
  o Errors that preserve rather than distort meaning
  o Effective strategies for dealing with unknown words
  o Reading with expression
  o Self-correction (self-monitoring)
  o Fluency or efficiency of reading
  o Adjusting pace for text complexity
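A minimal Python sketch of the scoring arithmetic above, assuming each probe is recorded as a (words correct, errors) pair; the function name and the use of per-probe medians are illustrative assumptions, not part of any published scoring manual.

```python
from statistics import median

def score_orf(probes):
    """Score a set of ORF probes given as (words_correct, errors) pairs.

    Returns the median WCPM, the median error count, and the accuracy
    computed as WCPM / (WCPM + errors) x 100. Computing accuracy from the
    two medians (rather than from a single probe) is a simplification.
    """
    wcpm = median(wc for wc, _ in probes)
    errors = median(e for _, e in probes)
    accuracy = wcpm / (wcpm + errors) * 100 if (wcpm + errors) else 0.0
    return wcpm, errors, accuracy

# Example: three one-minute probes from the same level
probes = [(52, 4), (48, 6), (55, 3)]
wcpm, errors, accuracy = score_orf(probes)
print(f"Median WCPM: {wcpm}, median errors: {errors}, accuracy: {accuracy:.1f}%")
```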
• Reading passage
  o Silent reading
  o Every 7th word is deleted and replaced with 3 word choices; the student restores the original word
• Better predictor of reading success for grade 4 and up
• Face validity due to its relationship to comprehension
• Group or individually administered
• Administration
  o Give each student a passage
  o Time for 1-3 minutes
  o Grade number of correct responses
    • WCPM
  o Count errors
• Assesses students' ability to generalize learned spelling rules to novel tasks, in addition to the number of words students can spell correctly
• Words sampled from the year's curriculum
• Administration:
  o Grades 1-2: dictate 1 word every 10 seconds
    • 12 words or 50-70 CLS
  o Grades 3+: dictate 1 word every 7 seconds
    • 17 words or 125-155 CLS
• Scored as correct letter sequences (CLS) and words spelled correctly (WSC); see the CLS sketch below
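The CLS tally can be illustrated in code. This is a simplified, position-based Python sketch (it assumes no insertions or deletions, which published scoring rules handle more carefully); all names are illustrative.

```python
def correct_letter_sequences(target, response):
    """Count correct letter sequences (CLS) for one spelling response.

    Pad both words with boundary markers, then count adjacent pairs of
    the target that appear intact at the same positions in the response.
    A correctly spelled n-letter word earns n + 1 CLS.
    """
    t = f"^{target.lower()}$"
    r = f"^{response.lower()}$"
    return sum(
        1
        for i in range(len(t) - 1)
        if i + 1 < len(r) and t[i] == r[i] and t[i + 1] == r[i + 1]
    )

print(correct_letter_sequences("train", "train"))  # 6 CLS (5 letters + 1)
print(correct_letter_sequences("train", "trane"))  # 3 CLS (^t, tr, ra)
```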
• Uses instructional story starters
• Administration
  o 1 minute thinking and 3 minutes writing
• Can count words written or number of words spelled correctly (see the sketch below)
• Scoring
  o Total Words Written (TWW)
  o Words Spelled Correctly (WSC)
  o Correct Writing Sequence (CWS)
    • Considered the most sensitive measure
  o Total Correct Punctuation (TCP)
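A minimal Python sketch of the two mechanical scores, TWW and WSC, assuming a word set stands in for a spelling dictionary; CWS and TCP require human judgment and are not attempted here.

```python
def score_writing(sample, dictionary):
    """Score a writing sample for TWW and WSC.

    TWW counts every word written regardless of spelling; WSC counts
    words found in the spelling dictionary after stripping punctuation.
    """
    words = sample.split()
    tww = len(words)
    wsc = sum(1 for w in words if w.strip(".,!?;:").lower() in dictionary)
    return tww, wsc

dictionary = {"the", "dog", "ran", "to", "park", "and", "played"}  # illustrative
tww, wsc = score_writing("The dog ran to the prak and playd.", dictionary)
print(f"TWW: {tww}, WSC: {wsc}")  # TWW: 8, WSC: 6
```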
• The subject of math is typically broken into:
  o Early Numeracy
  o Computation
  o Concepts and Application
• Types of CBM assessment
  o Single Skills
  o Facts
  o Grade Level or Multi-Skill
• Administration
  o 2 minutes
• Scoring
  o Digits correct (see the sketch below)
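A minimal Python sketch of digits-correct scoring for whole-number answers, assuming the usual right-aligned (place-value) comparison; names are illustrative.

```python
def digits_correct(student_answer, correct_answer):
    """Count digits correct (DC) for one computation item.

    Right-align the two answers (ones place to ones place) and count
    matching digits, the usual convention for whole-number probes.
    """
    s, c = str(student_answer), str(correct_answer)
    return sum(1 for sd, cd in zip(reversed(s), reversed(c)) if sd == cd)

print(digits_correct(418, 408))  # 2 of 3 digits correct
print(digits_correct(408, 408))  # 3 of 3 digits correct
```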
• Early Numeracy
  o Missing Numbers
  o Number Identification
  o Oral Counting
  o Quantity Array
  o Quantity Discrimination
• Concepts and Application
  o Estimation
  o Other math skills
    • Measurement
    • Time
    • Graph interpretation
   
• Instructional Placement Standards
  o Survey Level Assessments
• Benchmarks or Proficiency Levels
• National Norms
• Local Norms
• Prepare Materials
  o Select a minimum of 3 probes from each level of the curriculum
  o Begin at or below the student's grade level
• Administer and Score Probes
  o Administer 3 passages per level, in progressively lower order
  o Continue until the student's normative level or appropriate instructional level has been reached
• Summarize the Data
  o Enter the student's median scores into a table or graph (a classification sketch follows the placement table below)
Grade Level   Placement Level   WCPM     Errors Per Min
1-2           Frustration       <40      >4
              Instructional     40-60    4 or fewer
              Mastery           >60      4 or fewer
3-6           Frustration       <70      >6
              Instructional     70-100   6 or fewer
              Mastery           >100     6 or fewer
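As referenced above, a small Python sketch that encodes the ORF placement table into a classification function; treating either a low WCPM or a high error count as frustration-level is an assumption about boundary handling.

```python
def orf_placement(grade, wcpm, errors_per_min):
    """Classify a median ORF score against the placement standards above."""
    if grade <= 2:
        cut_low, cut_high, max_err = 40, 60, 4
    else:  # grades 3-6 thresholds
        cut_low, cut_high, max_err = 70, 100, 6
    if wcpm < cut_low or errors_per_min > max_err:
        return "Frustration"
    if wcpm <= cut_high:
        return "Instructional"
    return "Mastery"

print(orf_placement(grade=4, wcpm=85, errors_per_min=5))  # Instructional
```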
Grade Level   Placement Level   Digits Correct (DC)

Data from Deno and Mirkin (1977)
1-3           Frustration       <20
              Instructional     21-40
              Mastery           >41
4-12          Frustration       <40
              Instructional     41-80
              Mastery           >81

Data from Burns, VanDerHeyden, and Jiban (2006)
2-3           Frustration       <14
              Instructional     14-31
              Mastery           >31
4-5           Frustration       <24
              Instructional     24-49
              Mastery           >49
Reading SLA for Cassie
Level   WCPM Median   Errors Median   Placement
5       45            8               Frustration
4       50            7               Frustration
3       55            10              Frustration
2       55            4               Instructional
1       85            0               Mastery

Reading SLA for Lucy
Level   WCPM Median   Errors Median   Placement
5       60            7               Frustration
4       70            8               Frustration
3       70            7               Frustration?
2       70            5               Instructional?
1       85            3               Mastery
• Represent the lowest possible score one could obtain and still be considered "OK" in the subject area being assessed
• Usually aligned with Instructional Placement Standards
Grade          Task   Benchmark
Kindergarten   LSF    40 correct letter sounds per 1 minute
First          WIF    50 correct words from list per 1 minute
Grade    Fall (WCPM)   Winter (WCPM)   Spring (WCPM)
First    --            20              40
Second   44            68              90
Third    77            92              110
Fourth   104           115             124
Fifth    104           115             124
Sixth    109           120             125
Grade Level   Placement Level   Digits Correct (DC)

Data from Deno and Mirkin (1977)
1-3           Frustration       <20
              Instructional     21-40
              Mastery           >41
4-12          Frustration       <40
              Instructional     41-80
              Mastery           >81

Data from Burns, VanDerHeyden, and Jiban (2006)
2-3           Frustration       <14
              Instructional     14-31
              Mastery           >31
4-5           Frustration       <24
              Instructional     24-49
              Mastery           >49
• Norms are not available for each grade level, but the following can be used as a guideline:
  o 1st/2nd-grade lists should have 55-70 CLS
  o Upper lists should have 125-155 CLS
• Available from commercial programs
• Use caution with national norms, as they may be subject to the same criticisms leveled against NRTs
• Districts will set a threshold for the lowest possible scores before being concerned
  o Often set at the 25th percentile, but may differ depending on your district
Grade          Percentile   Fall   Winter   Spring
Kindergarten   90%          21     43       53
               75%          11     32       43
               50%          3      19       32
               25%          0      8        20
               10%          --     2        11
First          90%          47     59       62
               75%          37     49       52
               50%          27     38       41
               25%          16     27       30
               10%          8      16       20
Grade     Percentile   Fall   Winter   Spring
First     50%          2      4        9
Second    50%          7      14       20
Third     50%          17     24       29
Fourth    50%          26     33       39
Fifth     50%          34     40       45
Sixth     50%          40     46       50
Seventh   50%          44     50       52
Eighth    50%          50     52       56
Grade     Percentile   Fall   Winter   Spring
First     50%          28     41       45
Second    50%          49     55       61
Third     50%          80     90       100
Fourth    50%          92     107      107
Fifth     50%          115    116      120
Sixth     50%          120    119      129
Seventh   50%          112    121      124
Eighth    50%          129    122      118
Grade   Percentile   Fall   Winter   Spring
1       50th         5      11       15
2       50th         10     22       22
3       50th         15     24       28
4       50th         33     44       52
5       50th         30     38       47
6       50th         28     36       34
• A student's performance is compared to students who are participating in the same "local" curriculum
• The distance between the referred student's performance and that of his or her peers is examined for significance
• Most useful for screening decisions
• In small districts, all students complete the same tasks and are rank-ordered by performance
• In large districts, a random sample is taken and the same procedure is conducted
• Collect data for fall, winter, and spring
• Collect a base-year period of 3 years, then update norms every 3rd year
• If a new curriculum is adopted or there is a change in the size or other characteristics of the student population, collect new norms
   
• Provides data that fuels the system
  o Whom to serve
  o How best to intervene
  o Whether the interventions are effective
• Assessment method for screening, prescriptive decisions, and progress-monitoring
• Defensible
• Highly effective
• Frequently collected
• Precise as students progress through the tiers
• Schoolwide screening
  o Universal screening & PM of all students are the hallmarks of Tier I
• Early identification
• Direct academic methods

• CBM heavily relied upon here
  o Defensible, feasible, & repeatable
• Data need to identify the category of the deficit
• Sub-skill mastery measures are important
  o Identifying areas of remediation within Tier II intervention

• Most defensible of outcome data
• Multiple methods
  o Fewer students make more cumbersome methods feasible
   
• Improves understanding of the data
• Mechanism for communication
• Motivator and reinforcer
• Allows us to record changes in student learning over time
  o Level of performance
  o Rate of progress
• Important decision-making tool

• Ongoing access to a complete record of data
• Direct and continuous contact with the data allows for exploration of behavior as it occurs
• Judgmental aids
• Allows for independent judgment of meaning
• Effective source of feedback for the behavior
• Baseline data
• Student data
  o Data points – quantifiable amount of behavior and the time and condition under which measurement was conducted
  o Data path – connecting of the data points (relationship between the IV and DV)
• Figure legend – identifies IV and DV
• X and Y axes
• Intervention and phase lines
• Goal and/or aimline
• Trend line
• X-axis = time
• Y-axis = outcome data
• All axes and conditions should be labeled
• Scales should:
  o Begin at zero
  o Be evenly spaced (equal-interval)
  o Show scale breaks (Spring Break, holidays)
  o Cover the range of variability
  o Provide a "key" when multiple variables are presented
• A plotting sketch following these conventions appears below
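A short matplotlib sketch of a progress graph following these conventions, with a data path, an aimline, and a phase line; all scores, dates, and labels are invented for illustration.

```python
import matplotlib.pyplot as plt

# Hypothetical weekly ORF probes: time on the x-axis, outcome on the y-axis
weeks = list(range(1, 9))
wcpm = [22, 24, 23, 27, 29, 28, 32, 34]

plt.plot(weeks, wcpm, marker="o", label="WCPM (data path)")
plt.plot([1, 16], [23, 45], "--", label="Aimline (goal)")  # baseline median to goal
plt.axvline(x=4.5, color="gray")   # phase line: intervention change
plt.ylim(bottom=0)                 # scale begins at zero
plt.xlabel("School week")
plt.ylabel("Words correct per minute")
plt.legend()
plt.show()
```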
• Essential to determining if the intervention caused any change in the target behavior
  o Provides understanding of behavior before an intervention is implemented
• Critical to progress monitoring
• Allows for prediction of what the behavior will look like in the future if it is not remediated
• Steps:
  o Identify target behavior and intervention
  o Collect a series of stable data points
    • In general, 3 data points
   
• Essential component of RTI
• CBA indicates progress toward intervention goals
• Goals must be written in a clear, observable, measurable way
• Must include
  o Time
  o Learner
  o Behavior
  o Level
  o Content
  o Material
  o Criteria
o Time: the amount of time the goal is written for
   • "In 2 weeks…"
o Learner: the student for whom the goal is written
   • "…Jose will…"
o Behavior: the specific skill the student will demonstrate
   • "…read aloud…"
o Level: the grade/level the content is from
   • "…at the first-grade level…"
o Content: what the student is learning about
   • "…in reading…"
o Material: what the student is using
   • "…using first-grade passages from DIBELS ORF…"
o Criteria: expected level of performance, including time and accuracy
   • "…40 WCPM with greater than 95% accuracy."
• Time: decide when the student will be addressed again in the RTI meeting
• Behavior: determined by CBA/schoolwide screening
• Level: determined by CBA/SLA
• Material: determine an intervention after the behavior and level are identified
• Criteria: use SLAs, benchmarks, and growth rates to calculate where the student "should" be at the next RTI meeting if the intervention is effective
• End-of-Year Benchmarking
  o Mastery standards
• Growth Rates
• Intra-Individual Framework
• Local and National Norms
Grade    Realistic Growth Rate   Ambitious Growth Rate   Growth Rate
         per Week                per Week                per Week
First    2                       3                       1.80
Second   1.5                     2                       1.66
Third    1                       1.5                     1.18
Fourth   0.85                    1.1                     1.01
Fifth    0.5                     0.8                     0.58
Sixth    0.3                     0.65                    0.66
Grade   Realistic Growth Rate per Week   Ambitious Growth Rate per Week
1       0.4                              0.84
2       0.4                              0.84
3       0.4                              0.84
4       0.4                              0.84
5       0.4                              0.84
6       0.4                              0.84
Grade    Growth Rate per Week (CLS)
Second   1-1.5
Third    0.65-1
Fourth   0.45-0.85
Fifth    0.3-0.65
Sixth    0.3-0.65
Grade   Realistic Growth Rate per Week   Ambitious Growth Rate per Week
1       0.30                             0.50
2       0.30                             0.50
3       0.30                             0.50
4       0.70                             1.15
5       0.75                             1.20
6       0.45                             1.00
                            Duration of Treatment
Dose per Day      1 week   2 weeks   3 weeks   1 month   2 mos   4 mos
Every other day      2        4         6         7        15      30
1x                   5       10        15        20        41      81
2x                   6       13        19        26        52     103
4x                   7       15        22        29        58     117
8x                  12       24        35        47        95     189
• First, find the student's median baseline score (based on at least 3 scores) (e.g., 29)
• Find "normal" growth for the student (e.g., 0.85 for a 4th grader)
• Multiply the norm by the number of weeks left in the year (e.g., 16 x 0.85 = 13.6)
• Add this to the median (e.g., 13.6 + 29 = 42.6)
• 43 is your new end-of-year goal (see the sketch below)
• Connect to the baseline median for the Goal Line
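The goal arithmetic above as a small Python sketch; the inputs mirror the worked example (median 29, growth 0.85, 16 weeks), and rounding up to the next whole score matches the example's 42.6 → 43.

```python
import math
from statistics import median

def end_of_year_goal(baseline_scores, weekly_growth, weeks_left):
    """End-of-year goal = baseline median + (weekly growth x weeks left)."""
    return math.ceil(median(baseline_scores) + weekly_growth * weeks_left)

print(end_of_year_goal([27, 29, 31], weekly_growth=0.85, weeks_left=16))  # 43
```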
• May be more appropriate for students who you know are well below or above the norm
• Need at least 8 data points to begin
• Figure out the individual student's own rate of progress based on the first 8 data points:
  o Order scores from smallest to largest
  o Find the range (largest - smallest)
  o Divide by the number of weeks (not days) that the 8 points are based on
  o That is the student's current average weekly gain
• To set the goal, multiply the individual's own rate of improvement by 1.5
• Then multiply that "new" rate of improvement by the number of weeks left in the year
• Add that to the student's median to obtain the end-of-year goal

• David's first 8 scores
  o 10, 12, 9, 14, 12, 15, 12, 14 (one score per week)
• Median: 9, 10, 12, 12, 12, 14, 14, 15 → 12
• Range: difference between high and low scores
  o 15 - 9 = 6
• Divide by number of weeks elapsed
  o 6 / 8 = 0.75
• Multiply by 1.5
  o 0.75 x 1.5 = 1.125
• Multiply by number of weeks left
  o 1.125 x 14 = 15.75
• Add this product to the median score
  o 15.75 + 12 = 27.75
• The end-of-year performance goal is 28 (see the sketch below)
• Connect the median baseline score to the end-of-year goal = Goal Line
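David's calculation as a Python sketch; the 1.5 multiplier and week counts come straight from the steps above, and the function name is illustrative.

```python
from statistics import median

def intra_individual_goal(scores, weeks_elapsed, weeks_left):
    """Set a goal from the student's own rate of progress.

    Weekly gain = range / weeks elapsed; goal rate = 1.5 x that gain;
    end-of-year goal = median score + goal rate x weeks left.
    """
    weekly_gain = (max(scores) - min(scores)) / weeks_elapsed
    goal_rate = weekly_gain * 1.5
    return median(scores) + goal_rate * weeks_left

# David's first 8 weekly scores
scores = [10, 12, 9, 14, 12, 15, 12, 14]
print(round(intra_individual_goal(scores, weeks_elapsed=8, weeks_left=14)))  # 28
```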
• Local Norms
  o Set goals based on the performance of children from the same educational environment
  o Cautionary statements!
• National Norms
  o Set goals based on large samples of students
• Can be linear
  o Showing how the data "should" progress if the student is to meet the goal in the specified time frame
• Can be horizontal
  o Displaying the discrepancy/closing of the gap between current performance and goal performance
• Or both
   
• Formative evaluation
• Informs viewers if the student is on track to meet the goal
  o Predicts future performance if the intervention remains in place
  o Decisions about overall movement of the behavior can be made
• Applied to the graph after 7-11 intervention data points have been collected
• Can be applied via:
  o Software (Excel)
  o Tukey Method
• Graphed scores (not baseline) are divided into 3 fairly equal groups
• Draw vertical lines to separate the 3 groups (2 lines)
• In the first and third groups:
  o Find the median data point
  o Mark an X where the median data point intersects with the median date for that group
  o Draw a line between the X's of the 1st and 3rd groups
  o This is the trend line (see the sketch below)
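A Python sketch of the Tukey trend-line construction described above, splitting the scores into three roughly equal groups and connecting the median points of the first and third; how leftover points are assigned when the count is not divisible by three is an assumption.

```python
from statistics import median

def tukey_trend(scores):
    """Return the two anchor points and slope of a Tukey-style trend line.

    Uses the first and last thirds of the series: the median score of
    each group plotted at the median position (date) of that group.
    """
    n = len(scores)
    third = n // 3
    first, last = scores[:third], scores[-third:]
    x1 = median(range(1, third + 1))            # median date, first group
    x2 = median(range(n - third + 1, n + 1))    # median date, third group
    y1, y2 = median(first), median(last)
    slope = (y2 - y1) / (x2 - x1)
    return (x1, y1), (x2, y2), slope

p1, p2, slope = tukey_trend([20, 22, 21, 25, 24, 27, 26, 29, 31])
print(p1, p2, f"slope per probe: {slope:.2f}")
```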
[Graph: the trend line connects the median score of the first group of data points to the median score of the last group of data points.]
   
• Formative evaluation
• Visual analysis – a systematic form of graph interpretation
• Questions to be answered:
  o Did meaningful changes occur?
  o Can the change be attributed to the intervention or instructional program?
  o Should adjustments be made?
• Level
  o Comparing the level of data at baseline to the intervention phase
  o Comparing the level of data to the goal
• Immediacy/Latency of Change
  o If the change is immediate, it probably is due to the intervention
  o If the response is delayed, it is difficult to ascribe behavior change to the intervention
• Variability
  o Amount of variation in range, or consistency in the set of data
• Trend
  o Rate of change within a phase
• To analyze a trend, you must compare it to the goal/aimline
• The team compares actual progress (trendline) against expected progress (aimline)
  o If the trend is parallel to the aimline
    • Indicates that the student has made consistent growth
    • Continue as planned
  o If the trend is steeper than the aimline
    • Indicates that the student has exceeded the goal
    • Look for ways to maintain or generalize while fading the intervention, if appropriate
  o If the trend is flatter than the aimline
    • Indicates slower progress than anticipated
• Don't need a trendline for this
• Based on the 4 most recent consecutive data points (see the sketch after the IDEA steps below)
  o If scores are above the goal line, the goal needs to be increased
  o If scores are below the goal line, the student's instructional program or intervention needs revision

I   Inspect the last 4 data points.
D   Decide what the scores look like. Are they going up? Down? Variable?
E   Evaluate why the scores look this way (motivation, attendance, instruction?).
A   Apply a decision to improve achievement (continue the intervention without change, increase the goal, tweak the instruction or intervention, add a 2nd intervention, etc.).
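A minimal Python sketch of the four-point rule, assuming the goal line is available as per-probe expected values; the returned strings and function name are illustrative.

```python
def four_point_rule(recent_scores, goal_line_values):
    """Apply the four-point rule to the 4 most recent data points.

    All four above the goal line: raise the goal. All four below:
    revise the instruction/intervention. Otherwise: stay the course.
    """
    pairs = list(zip(recent_scores[-4:], goal_line_values[-4:]))
    if all(score > goal for score, goal in pairs):
        return "Increase the goal"
    if all(score < goal for score, goal in pairs):
        return "Revise the instructional program or intervention"
    return "Continue the intervention as planned"

print(four_point_rule([38, 40, 41, 43], [34, 35, 36, 37]))  # Increase the goal
```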
• Intervention Central
  o www.interventioncentral.org
• National Center on Student Progress Monitoring
• Big Ideas in Beginning Reading
• Research Institute on PM
• Florida Center for Reading Research
• DIBELS
• AIMSWeb

• Mathfactcafe.com
• www.aplusmath.com
• Themathworksheetsite.com
• Superkids.com/aweb/tools/math/
• www.schoolhousetech.com

• http://www.gosbr.net/
• http://ebi.missouri.edu/
• http://www.fsu.edu/~truancy/interventions.html
• http://www.pbis.org/
• https://www.msu.edu/course/cep/886/Reading%20Comprehension/main.html

More Related Content

What's hot

The 15 most influential learning theories in education (a complete summary)
The 15 most influential learning theories in education (a complete summary)The 15 most influential learning theories in education (a complete summary)
The 15 most influential learning theories in education (a complete summary)Paul Stevens-Fulbrook
 
Cooperative learning teaching method
Cooperative learning teaching methodCooperative learning teaching method
Cooperative learning teaching methodMonique Pringle
 
THEORIES OF LANGUAGE LEARNING
THEORIES OF LANGUAGE LEARNINGTHEORIES OF LANGUAGE LEARNING
THEORIES OF LANGUAGE LEARNINGssujatha3
 
Power Point Presentation Gagne's hierarchy of learning
Power Point Presentation Gagne's hierarchy of learningPower Point Presentation Gagne's hierarchy of learning
Power Point Presentation Gagne's hierarchy of learningmumthazmaharoof
 
Introduction to-learning-theories
Introduction to-learning-theoriesIntroduction to-learning-theories
Introduction to-learning-theoriescananbarnard
 
Cooperative Learning in Special Education
Cooperative Learning in Special EducationCooperative Learning in Special Education
Cooperative Learning in Special EducationKapil Rathi
 
Constructivism and the_5_e_s
Constructivism and the_5_e_sConstructivism and the_5_e_s
Constructivism and the_5_e_ssjmankoo
 
The humanistic approach
The humanistic  approachThe humanistic  approach
The humanistic approachHala Fawzi
 
Scaffolding- Lev Vygotsky
Scaffolding- Lev Vygotsky Scaffolding- Lev Vygotsky
Scaffolding- Lev Vygotsky Justin Ramdhanee
 
Concept of intelligence
Concept of intelligenceConcept of intelligence
Concept of intelligenceSarfraz Ahmad
 
Structured peer tutoring
Structured peer tutoring Structured peer tutoring
Structured peer tutoring BSEPhySci14
 
Education for collective living and peaceful living
Education for collective living and peaceful livingEducation for collective living and peaceful living
Education for collective living and peaceful livingThanavathi C
 
Information processing model or memory model
Information processing model or memory modelInformation processing model or memory model
Information processing model or memory modelmahamiqbalrajput
 
Gagnes Cognitive Theory
Gagnes Cognitive TheoryGagnes Cognitive Theory
Gagnes Cognitive TheoryNiena Majid
 

What's hot (20)

Metacognition
MetacognitionMetacognition
Metacognition
 
The 15 most influential learning theories in education (a complete summary)
The 15 most influential learning theories in education (a complete summary)The 15 most influential learning theories in education (a complete summary)
The 15 most influential learning theories in education (a complete summary)
 
Cooperative learning teaching method
Cooperative learning teaching methodCooperative learning teaching method
Cooperative learning teaching method
 
COGNITIVISM THEORY
COGNITIVISM THEORYCOGNITIVISM THEORY
COGNITIVISM THEORY
 
THEORIES OF LANGUAGE LEARNING
THEORIES OF LANGUAGE LEARNINGTHEORIES OF LANGUAGE LEARNING
THEORIES OF LANGUAGE LEARNING
 
Power Point Presentation Gagne's hierarchy of learning
Power Point Presentation Gagne's hierarchy of learningPower Point Presentation Gagne's hierarchy of learning
Power Point Presentation Gagne's hierarchy of learning
 
Introduction to-learning-theories
Introduction to-learning-theoriesIntroduction to-learning-theories
Introduction to-learning-theories
 
Cooperative Learning in Special Education
Cooperative Learning in Special EducationCooperative Learning in Special Education
Cooperative Learning in Special Education
 
Constructivism and the_5_e_s
Constructivism and the_5_e_sConstructivism and the_5_e_s
Constructivism and the_5_e_s
 
Cognitivism
CognitivismCognitivism
Cognitivism
 
The humanistic approach
The humanistic  approachThe humanistic  approach
The humanistic approach
 
Plurilingualism
PlurilingualismPlurilingualism
Plurilingualism
 
Creative child
Creative childCreative child
Creative child
 
Scaffolding- Lev Vygotsky
Scaffolding- Lev Vygotsky Scaffolding- Lev Vygotsky
Scaffolding- Lev Vygotsky
 
Concept of intelligence
Concept of intelligenceConcept of intelligence
Concept of intelligence
 
Structured peer tutoring
Structured peer tutoring Structured peer tutoring
Structured peer tutoring
 
Theories of Learning
Theories of LearningTheories of Learning
Theories of Learning
 
Education for collective living and peaceful living
Education for collective living and peaceful livingEducation for collective living and peaceful living
Education for collective living and peaceful living
 
Information processing model or memory model
Information processing model or memory modelInformation processing model or memory model
Information processing model or memory model
 
Gagnes Cognitive Theory
Gagnes Cognitive TheoryGagnes Cognitive Theory
Gagnes Cognitive Theory
 

Similar to CBA and Graphing

Spe 501 class 11
Spe 501 class 11Spe 501 class 11
Spe 501 class 11jzurheide
 
Administering,scoring and reporting a test ppt
Administering,scoring and reporting a test pptAdministering,scoring and reporting a test ppt
Administering,scoring and reporting a test pptManali Solanki
 
Languageassessmenttsl3123notes 141203115756-conversion-gate01 (1)
Languageassessmenttsl3123notes 141203115756-conversion-gate01 (1)Languageassessmenttsl3123notes 141203115756-conversion-gate01 (1)
Languageassessmenttsl3123notes 141203115756-conversion-gate01 (1)hakim azman
 
Tabe level l training ppt (fy11 final)[1]
Tabe level l training ppt (fy11 final)[1]Tabe level l training ppt (fy11 final)[1]
Tabe level l training ppt (fy11 final)[1]charlie_herbert
 
Six steps for avoiding misinterpretations
Six steps for avoiding misinterpretationsSix steps for avoiding misinterpretations
Six steps for avoiding misinterpretationsAbdul Majid
 
Language assessment tsl3123 notes
Language assessment tsl3123 notesLanguage assessment tsl3123 notes
Language assessment tsl3123 notesPeterus Balan
 
Basic Testing Terminology
Basic Testing TerminologyBasic Testing Terminology
Basic Testing TerminologyYee Bee Choo
 
Slo Demonstration For Web
Slo Demonstration For WebSlo Demonstration For Web
Slo Demonstration For Webtayapage
 
measurment, testing & eveluation
measurment, testing & eveluationmeasurment, testing & eveluation
measurment, testing & eveluationmpazhou
 
EDUC 554Lesson Plan Grading Rubric (for edTPA preparation)Crit
EDUC 554Lesson Plan Grading Rubric (for edTPA preparation)CritEDUC 554Lesson Plan Grading Rubric (for edTPA preparation)Crit
EDUC 554Lesson Plan Grading Rubric (for edTPA preparation)CritEvonCanales257
 
Educational Assessment - Presentation for Concord College
Educational Assessment - Presentation for Concord CollegeEducational Assessment - Presentation for Concord College
Educational Assessment - Presentation for Concord Collegenbteacher
 
Administering & scoring tabe 9 10
Administering & scoring tabe 9 10Administering & scoring tabe 9 10
Administering & scoring tabe 9 10SABES
 
Tools To Assess The Quality Of The Curriculum
Tools To  Assess The  Quality Of The  CurriculumTools To  Assess The  Quality Of The  Curriculum
Tools To Assess The Quality Of The Curriculumdbrady3702
 
Vertical Scale Scores
Vertical Scale ScoresVertical Scale Scores
Vertical Scale Scoresguest3921f8
 
Standardized assessments
Standardized assessmentsStandardized assessments
Standardized assessmentsjzurheide
 
Assessment in English for Specific Purposes
Assessment in English for Specific Purposes Assessment in English for Specific Purposes
Assessment in English for Specific Purposes Neny Isharyanti
 
Intro to e as t-tle
Intro to e as t-tleIntro to e as t-tle
Intro to e as t-tlejudeweavers
 
Optimum assessment of cognitive domain in medical education
Optimum assessment of cognitive domain in medical educationOptimum assessment of cognitive domain in medical education
Optimum assessment of cognitive domain in medical educationK Raman Sethuraman
 

Similar to CBA and Graphing (20)

Spe 501 class 11
Spe 501 class 11Spe 501 class 11
Spe 501 class 11
 
Administering,scoring and reporting a test ppt
Administering,scoring and reporting a test pptAdministering,scoring and reporting a test ppt
Administering,scoring and reporting a test ppt
 
Languageassessmenttsl3123notes 141203115756-conversion-gate01 (1)
Languageassessmenttsl3123notes 141203115756-conversion-gate01 (1)Languageassessmenttsl3123notes 141203115756-conversion-gate01 (1)
Languageassessmenttsl3123notes 141203115756-conversion-gate01 (1)
 
Tabe level l training ppt (fy11 final)[1]
Tabe level l training ppt (fy11 final)[1]Tabe level l training ppt (fy11 final)[1]
Tabe level l training ppt (fy11 final)[1]
 
Six steps for avoiding misinterpretations
Six steps for avoiding misinterpretationsSix steps for avoiding misinterpretations
Six steps for avoiding misinterpretations
 
Language assessment tsl3123 notes
Language assessment tsl3123 notesLanguage assessment tsl3123 notes
Language assessment tsl3123 notes
 
Basic Testing Terminology
Basic Testing TerminologyBasic Testing Terminology
Basic Testing Terminology
 
Slo Demonstration For Web
Slo Demonstration For WebSlo Demonstration For Web
Slo Demonstration For Web
 
measurment, testing & eveluation
measurment, testing & eveluationmeasurment, testing & eveluation
measurment, testing & eveluation
 
EDUC 554Lesson Plan Grading Rubric (for edTPA preparation)Crit
EDUC 554Lesson Plan Grading Rubric (for edTPA preparation)CritEDUC 554Lesson Plan Grading Rubric (for edTPA preparation)Crit
EDUC 554Lesson Plan Grading Rubric (for edTPA preparation)Crit
 
Educational Assessment - Presentation for Concord College
Educational Assessment - Presentation for Concord CollegeEducational Assessment - Presentation for Concord College
Educational Assessment - Presentation for Concord College
 
Administering & scoring tabe 9 10
Administering & scoring tabe 9 10Administering & scoring tabe 9 10
Administering & scoring tabe 9 10
 
Tools To Assess The Quality Of The Curriculum
Tools To  Assess The  Quality Of The  CurriculumTools To  Assess The  Quality Of The  Curriculum
Tools To Assess The Quality Of The Curriculum
 
Vertical Scale Scores
Vertical Scale ScoresVertical Scale Scores
Vertical Scale Scores
 
Wiat iii indtest report
Wiat iii indtest reportWiat iii indtest report
Wiat iii indtest report
 
Standardized assessments
Standardized assessmentsStandardized assessments
Standardized assessments
 
Assessment in English for Specific Purposes
Assessment in English for Specific Purposes Assessment in English for Specific Purposes
Assessment in English for Specific Purposes
 
Intro to e as t-tle
Intro to e as t-tleIntro to e as t-tle
Intro to e as t-tle
 
Optimum assessment of cognitive domain in medical education
Optimum assessment of cognitive domain in medical educationOptimum assessment of cognitive domain in medical education
Optimum assessment of cognitive domain in medical education
 
Nat faq
Nat faqNat faq
Nat faq
 

Recently uploaded

An Overview of Mutual Funds Bcom Project.pdf
An Overview of Mutual Funds Bcom Project.pdfAn Overview of Mutual Funds Bcom Project.pdf
An Overview of Mutual Funds Bcom Project.pdfSanaAli374401
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfagholdier
 
Class 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfClass 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfAyushMahapatra5
 
SECOND SEMESTER TOPIC COVERAGE SY 2023-2024 Trends, Networks, and Critical Th...
SECOND SEMESTER TOPIC COVERAGE SY 2023-2024 Trends, Networks, and Critical Th...SECOND SEMESTER TOPIC COVERAGE SY 2023-2024 Trends, Networks, and Critical Th...
SECOND SEMESTER TOPIC COVERAGE SY 2023-2024 Trends, Networks, and Critical Th...KokoStevan
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAssociation for Project Management
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDThiyagu K
 
Gardella_Mateo_IntellectualProperty.pdf.
Gardella_Mateo_IntellectualProperty.pdf.Gardella_Mateo_IntellectualProperty.pdf.
Gardella_Mateo_IntellectualProperty.pdf.MateoGardella
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxDenish Jangid
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingTechSoup
 
Introduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsIntroduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsTechSoup
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxnegromaestrong
 
Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Disha Kariya
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104misteraugie
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityGeoBlogs
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactPECB
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introductionMaksud Ahmed
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...EduSkills OECD
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingTeacherCyreneCayanan
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.christianmathematics
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17Celine George
 

Recently uploaded (20)

An Overview of Mutual Funds Bcom Project.pdf
An Overview of Mutual Funds Bcom Project.pdfAn Overview of Mutual Funds Bcom Project.pdf
An Overview of Mutual Funds Bcom Project.pdf
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdf
 
Class 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfClass 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdf
 
SECOND SEMESTER TOPIC COVERAGE SY 2023-2024 Trends, Networks, and Critical Th...
SECOND SEMESTER TOPIC COVERAGE SY 2023-2024 Trends, Networks, and Critical Th...SECOND SEMESTER TOPIC COVERAGE SY 2023-2024 Trends, Networks, and Critical Th...
SECOND SEMESTER TOPIC COVERAGE SY 2023-2024 Trends, Networks, and Critical Th...
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across Sectors
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 
Gardella_Mateo_IntellectualProperty.pdf.
Gardella_Mateo_IntellectualProperty.pdf.Gardella_Mateo_IntellectualProperty.pdf.
Gardella_Mateo_IntellectualProperty.pdf.
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
Introduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsIntroduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The Basics
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptx
 
Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writing
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17
 

CBA and Graphing

  • 1.
  • 2. Any set of measurement activities that uses direct observation and recording of a student’s performance in the local curriculum as a basis for gathering information to make instructional decisions o Standard directions (straightforward) o Timing o Set of materials (passages, sheets, lists, etc) o Scoring rules o Standards for judging performance o Record forms and charts • Curriculum Based Assessment (CBA) • Curriculum Based Measurement (CBM) • Curriculum Based Evaluation (CBE)
  • 3. Most forms of classroom assessment is MM o NOT CBA  Describes mastery of a series of short-term instructional objectives o Criterion-referenced tests to determine success in each sequence of the curriculum  Problems with MM o Hierarchy of skills is logical, not empirical o Assessment does not reflect maintenance or generalization o Designed by teachers, therefore reliability and validity is unknown
  • 4. Alignment  Technical adequacy  Criterion-Referenced  Sensitive to instruction/alterable variables  Standardized  Performance sampling  Decision rules  Repeated measures  Efficient  Efficacious  Efficient summarizing of data  Tests are of relatively short duration  Direct link b/t data collection and intervention
  • 5.  Unable to measure every aspect of academic content and should not replace all achievement tests  Promoted as a one-size-fits-all measurement system that will answer all questions related to special education  Only assesses a limited number of academic behaviors and cannot possibly generalize to all aspects of academic achievement – it does not provide a comprehensive picture of skills by only measuring fluency  Denounces the use of published, norm-referenced tests, but uses these to establish reliability and
  • 6.  If a curriculum is inappropriate, then the outcomes of CBM are inappropriate  It is impossible to choose among the different CBA models because they are so similar  CBM is nothing more than a tack on procedure for traditional assessments  Using situation-based information to categorize a student as needing services means that a student with a disability in one school may not have a disability in another  The goals set by CBM are too ambitious and unrealistic
  • 7.
  • 8. Type of CBA/CBM  Used to sample across several goals at the same time  Capstone tasks o can only be accomplished by successfully applying a # of contributing skills o Ex: ORF  Advantages o Efficient o Assessment in context o Adequate opportunities for progress monitoring and data-based modifications  Disadvantage o Some curricular areas do not have a capstone task o Not good for determining where holes are in skills
  • 9. CBA/CBE  Goals/single skills in a curricular area  Advantage o Can be used in areas where capstone skills are not available o Also usually measures annual or long-term goals as long as adequately sampling from curriculum  Mastery Measures o Less complex o Less skills o Focuses on tool skills – skills that must be performed at high levels of proficiency or skills that are pivotal to many other operations o Used to trouble shoot a learning problem or look at short-term mastery of concepts
  • 10. o Unique application of DA that is appropriate for problem- solving o Primary purpose: determine instructional level at which a student is successful o Useful for a variety of educational decisions Advantages  Tests for success in curriculum o Continue testing until find appropriate instructional level as opposed to traditional testing goals.  Efficient o Obtains a broad sample of performance in a short period of time o Gives large amount of data to analyze and form hypotheses  Technically Adequate
  • 11. Screening o Which students are currently at risk for failure?  Progress-monitoring o Is the student making adequate progress toward important goals?  Diagnostic o What and how should we teach this student?  Outcome o Has this progress been a success?
  • 12.
  • 13.
  • 14. HIGH reliance on fluency (automaticity) for all areas  Fluency = speed + accuracy  Mostly concerned with basic skills: o Reading fluency o Math fluency o Writing fluency o Spelling  Does not look at history, science, social studies, etc. Assessments are called probes  Assessed frequently as it is a capstone skill  Can develop your own probes or use generic probes o AIMSWeb, DIBELS, EasyCBM etc.  Select probes that represent skills to be mastered by the end of the year or beginning of next year
  • 15.  DIBELS o Letter Naming Fluency (LNF) • Risk indicator for reading o Letter Sound Fluency (LSF) • Closely related to decoding and reading o Initial Sound Fluency (ISF) o Phoneme Segmentation Fluency (PSF) o Nonsense Word Fluency (NWF) • Basic decoding skills • Short vowels and consonants  Word Identification Fluency
  • 16.  Oral Reading Fluency (ORF) o Capstone skill  Administration o Student is asked to read aloud for one minute from a grade-level text o Examiner copy of the probe has cumulative word count o Start stopwatch when child says first word or within 3 seconds if he/she cannot read the first word o Mark all errors as the child reads • Count 3 second pauses as errors o Administer 3 probes
  • 17. Acceptable responses o Correctly pronounced o Self-corrected w/in 3 seconds o Dialectical differences o Repeated or inserted words are ignored, but indicated for error analysis  Errors o Substitutions o Omitted words o Words mispronounced or not corrected w/in 3 seconds o Reversals o Numbers  Other considerations o Hyphenated words o Abbreviations o Discontinuing o Dealing with interruptions
  • 18. Count correctly read words o WCPM  Count errors o WCPM/E  Calculate accuracy o WCPM/(WCPM + Errors) x 100 = % Accuracy  Looking for the median score  Also look for: o Errors that preserve rather than distort meaning o Effective strategies for dealing with unknown words o Reading with expression o Self-correction (self-monitoring) o Fluency or efficiency of reading o Adjusting pace for text complexity
  • 19. Reading passage o Silent reading o Restore every 7th word that has been deleted and replaced with 3 words  Better predictor of reading success for grade 4 and up  Face validity due to its relationship to comprehension  Group or individually administered  Administration o Give each student a passage o Time for 1-3 minutes o Grade number of correct responses • WCPM o Count errors
  • 20.  Assesses students’ ability to generalize learned spelling rules in novel tasks in addition  Number of words students can spell correctly  Words sampled from the year’s curriculum  Administration: o Dictate 1 word every 10 seconds grades 1-2 • 12 words or 50-70 CLS o Dictate 1 word every 7 seconds grades 3+ • 17 words or 125-155 CLS  Scored as correct letter sequences (CLS) and words spelled correctly (WSC)
  • 21.
  • 22.  Usesinstructional story starters  Administration o 1 minute thinking and 3 minute writing  Can count words written or # of words spelled correctly  Scoring o Total Words Written (TWW) o Words Spelled Correctly (WSC) o Correct Writing Sequence (CWS) • Considered the most sensitive measure o Total Correct Punctuation (TCP)
  • 23.
  • 24. Subject of math typically broken into: o Early Numeracy o Computation o Concepts and Application  Types of CBM assessment o Single Skills o Facts o Grade level or Multi-Skill  Administration o 2 minutes  Scoring o Digits correct
  • 25.  Early Numeracy o Missing Numbers o Number Identification o Oral Counting o Quantity Array o Quantity Discrimination  Concepts and Application o Estimation o Other math skills • Measurement • Time • Graph interpretation
  • 26.
  • 27.
  • 28.  Instructional Placement Standards o Survey Level Assessments  Benchmarks or Proficiency Levels  National Norms  Local Norms
  • 29.  Prepare Materials o Select a minimum of 3 probes from each level of the curriculum o Begin at or below the student’s grade level  Administer and Score Probes o Administer 3 passages per level in progressively lower order o Continue until the student’s normative level or appropriate instructional level has been reached  Summarize the Data o Enter student median scores into a table or graph
  • 30. Grade Level Level WCPM Errors Per Min Frustration <40 >4 1-2 Instructional 40-60 4 or less Mastery >60 4 or less Frustration <70 >6 3-6 Instructional 70-100 6 or less Mastery >100 6 or less
  • 31. Grade Level Placement Level DC Data from Deno and Mirkin (1977) 1-3 Frustration <20 Instructional 21-40 Mastery >41 4-12 Frustration <40 Instructional 41-80 Mastery >81 Data from Burns, VanDerHeyden, and Jiban (2006) Frustration <14 2-3 Instructional 14-31 Mastery >31 Frustration <24 4-5 Instructional 24-49 Mastery >49
  • 32. Reading SLA for Cassie Reading SLA for Lucy Leve WCPM Errors Placement Level WCP Errors Placement l Median Median M Median Media 5 45 8 Frustration n 5 60 7 Frustration 4 50 7 Frustration 4 70 8 Frustration 3 55 10 Frustration 3 70 7 Frustration? 2 55 4 Instruction al 2 70 5 Instructional ? 1 85 0 Mastery 1 85 3 Mastery
  • 33.  Represent the lowest possible score one could obtain and still be considered “OK” in the subject area being assessed.  Usually aligned with Instructional Placement Standards…
  • 34. Grade Task Benchmark Kindergarten LSF 40 correct letter sounds per 1 minute First WIF 50 correct words from list per 1 minute
  • 35. Grade Fall (WCPM) Winter (WCPM) Spring (WCPM) First -- 20 40 Second 44 68 90 Third 77 92 110 Fourth 104 115 124 Fifth 104 115 124 Sixth 109 120 125
  • 36. Grade Level Placement Level DC Data from Deno and Mirkin (1977) 1-3 Frustration <20 Instructional 21-40 Mastery >41 4-12 Frustration <40 Instructional 41-80 Mastery >81 Data from Burns, VanDerHeyden, and Jiban (2006) Frustration <14 2-3 Instructional 14-31 Mastery >31 Frustration <24 4-5 Instructional 24-49 Mastery >49
  • 37.  Norms are not available for each grade level, but the following can be used as a guideline: o 1st/2nd grade lists should have 55-70 CLS o Upper lists should have 125-155 CLS
  • 38.  Available from commercial programs  Use caution with national norms as they may fall to the same criticisms lobbied against NRT’s  Districts will set a threshold for lowest possible scores before being concerned o Many times set at 25%tile, but may be different depending on your district
• 39. Grade          Percentile   Fall   Winter   Spring
      Kindergarten   90%          21     43       53
                     75%          11     32       43
                     50%          3      19       32
                     25%          0      8        20
                     10%          --     2        11
      First          90%          47     59       62
                     75%          37     49       52
                     50%          27     38       41
                     25%          16     27       30
                     10%          8      16       20
• 40. Grade     Percentile   Fall   Winter   Spring
      First     50%          2      4        9
      Second    50%          7      14       20
      Third     50%          17     24       29
      Fourth    50%          26     33       39
      Fifth     50%          34     40       45
      Sixth     50%          40     46       50
      Seventh   50%          44     50       52
      Eighth    50%          50     52       56
• 41. Grade     Percentile   Fall   Winter   Spring
      First     50%          28     41       45
      Second    50%          49     55       61
      Third     50%          80     90       100
      Fourth    50%          92     107      107
      Fifth     50%          115    116      120
      Sixth     50%          120    119      129
      Seventh   50%          112    121      124
      Eighth    50%          129    122      118
• 42. Grade   Percentile   Fall   Winter   Spring
      1       50th         5      11       15
      2       50th         10     22       22
      3       50th         15     24       28
      4       50th         33     44       52
      5       50th         30     38       47
      6       50th         28     36       34
• 43.  A student's performance is compared to students who are participating in the same "local" curriculum
 The distance between the referred student's performance and that of their peers is examined for significance
 Most useful for screening decisions
 In small districts, all students complete the same tasks and are rank ordered by performance
 In large districts, a random sample is taken and the same procedure is conducted
 Collect data for fall, winter, and spring
 Collect a base year period of 3 years, then update norms every 3rd year
 If a new curriculum is adopted or there is a change in the size or other characteristics of the student population, collect new norms
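A hedged sketch of the district-level screening logic: rank-order everyone on the same local task and flag students whose percentile rank falls below an agreed cutoff (the notes mention the 10th percentile). The function name, the percentile-rank formula, and the sample scores are my own illustration; a real local norm would use far more students than this toy group.

```python
def below_local_cutoff(scores, cutoff_percentile=10):
    """Flag students whose percentile rank (share of local peers scoring
    below them) falls under the cutoff percentile."""
    values = sorted(scores.values())
    n = len(values)
    def pct_rank(s):
        return 100 * sum(v < s for v in values) / n
    return [name for name, s in scores.items()
            if pct_rank(s) < cutoff_percentile]

# Hypothetical winter ORF scores for one grade in a small district:
peers = {"A": 68, "B": 72, "C": 55, "D": 80, "E": 34, "F": 61}
print(below_local_cutoff(peers))  # ['E']: only E falls below the cutoff
```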
  • 44.
• 45.  Provides data that fuels the system
  o Who to service
  o How best to intervene
  o Whether the interventions are effective
 Assessment method for screening, prescriptive decisions, and progress monitoring
 Defensible
 Highly effective
 Frequently collected
 Increasingly precise as students progress through the tiers
  • 46. Schoolwide screening o Universal screening & PM of all students are the hallmarks of Tier I  Early Identification  Direct academic methods
  • 47. CBM heavily relied upon here o Defensible, feasible, & repeatable  Data needs to identify the category of the deficit  Sub-skill mastery measures are important o Identifying areas of remediation within tier II intervention
• 48.  Most defensible of outcome data
 Multiple methods
  o Fewer students make more cumbersome methods feasible
  • 49.
  • 50.  Improves understanding of the data  Mechanism for communication  Motivator and reinforcer  Allows us to record changes in student learning over time o Level of performance o Rate of progress  Important decision-making tool
  • 51.  Ongoing access to complete record of data  Direct and continuous contact with the data allows for exploration of behavior as it occurs  Judgmental aids  Allows for independent judgment of meaning  Effective source of feedback for the behavior
  • 52.  Baseline data  Student Data o Data points – quantifiable amount of behavior and time and condition under which measurement was conducted o Data path – connecting of the data points (relationship between the IV and DV)  Figure legend – identifies IV and DV  X and Y axes  Intervention and Phase lines  Goal and/or Aimline  Trend line
• 53.  X-axis = time
 Y-axis = outcome data
 All axes and conditions should be labeled
 Scales should:
  o Begin at zero
  o Be evenly spaced (equal-interval)
  o Show scale breaks (Spring Break, holidays)
  o Cover the range of variability
  o Provide a "key" when multiple variables are presented
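For illustration, here is a minimal matplotlib sketch of these conventions applied to a progress-monitoring graph; the weekly WCPM values and the aimline endpoints are hypothetical, not from the deck.

```python
import matplotlib.pyplot as plt

weeks = range(1, 9)
wcpm = [22, 25, 24, 28, 30, 29, 33, 35]          # hypothetical scores

fig, ax = plt.subplots()
ax.plot(weeks, wcpm, marker="o", label="WCPM")   # data points + data path
ax.plot([1, 8], [22, 40], "--", label="Aimline") # baseline median to goal
ax.set_xlabel("School week")                     # x-axis = time
ax.set_ylabel("Words correct per minute")        # y-axis = outcome data
ax.set_ylim(bottom=0)                            # scale begins at zero
ax.legend(title="Key")                           # key for multiple variables
plt.show()
```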
  • 54.
  • 55.  Essential to determining if the intervention caused any change in the target behavior o Provides understanding of behavior before an intervention is implemented  Critical to progress monitoring  Allows for prediction of what the behavior will look like in the future if it is not remediated  Steps: o Identify target behavior and intervention o Collect a series of stable data points • In general, 3 data points
  • 56.
  • 57. Essential component of RTI  CBA indicates progress toward intervention goals  Goals must be written in a clear, observable, measurable way  Must include o Time o Learner o Behavior o Level o Content o Material o Criteria
• 58.  o Time: amount of time the goal is written for
    • "In 2 weeks…"
  o Learner: the student for whom the goal is written
    • "…Jose will…"
  o Behavior: the specific skill the student will demonstrate
    • "…read aloud…"
  o Level: the grade/level the content is from
    • "…at the first-grade level…"
  o Content: what the student is learning about
    • "…in reading…"
  o Material: what the student is using
    • "…using first-grade passages from DIBELS ORF…"
  o Criteria: expected level of performance, including time and accuracy
    • "…40 WCPM with greater than 95% accuracy."
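Because the seven components slot into a fixed template, assembling them can be mechanized. A small Python illustration follows, using the slide's own running example; the helper name and exact sentence frame are mine.

```python
def goal_statement(time, learner, behavior, level, content, material, criteria):
    # Each argument is one of the seven required goal components.
    return (f"In {time}, {learner} will {behavior} in {content} at the "
            f"{level}, using {material}, at {criteria}.")

print(goal_statement(
    time="2 weeks", learner="Jose", behavior="read aloud",
    level="first-grade level", content="reading",
    material="first-grade passages from DIBELS ORF",
    criteria="40 WCPM with greater than 95% accuracy"))
```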
  • 59. Time: decide when the student will be addressed again in the RTI meeting  Behavior: determined by CBA/Schoolwide Screening  Level: determined by CBA/SLA  Material: determine an intervention after behavior and level are identified  Criteria: use SLAs, benchmarks, growth rates to calculate where the student “should” be at the next RTI meeting if the intervention is effective
  • 60.  End of Year Benchmarking o Mastery Standards  Growth Rates  Intra-Individual Framework  Local and National Norms
• 61. Grade    Realistic Growth    Ambitious Growth    Growth Rate
               Rates per Week      Rates per Week      per Week
      First    2                   3                   1.80
      Second   1.5                 2                   1.66
      Third    1                   1.5                 1.18
      Fourth   0.85                1.1                 1.01
      Fifth    0.5                 0.8                 0.58
      Sixth    0.3                 0.65                0.66
• 62. Grade   Realistic Growth Rates   Ambitious Growth Rates
              per Week                 per Week
      1       0.4                      0.84
      2       0.4                      0.84
      3       0.4                      0.84
      4       0.4                      0.84
      5       0.4                      0.84
      6       0.4                      0.84
• 63. Grade    Growth Rates per Week (CLS)
      Second   1-1.5
      Third    0.65-1
      Fourth   0.45-0.85
      Fifth    0.3-0.65
      Sixth    0.3-0.65
• 64. Grade   Realistic Growth Rates   Ambitious Growth Rates
              per Week                 per Week
      1       0.30                     0.50
      2       0.30                     0.50
      3       0.30                     0.50
      4       0.70                     1.15
      5       0.75                     1.20
      6       0.45                     1.00
• 65. Dose Per Day      Duration of Treatment
                        1 week   2 weeks   3 weeks   1 month   2 mos   4 mos
      Every other day   2        4         6         7         15      30
      1x                5        10        15        20        41      81
      2x                6        13        19        26        52      103
      4x                7        15        22        29        58      117
      8x                12       24        35        47        95      189
• 66.  First, find the student's median baseline score, based on at least 3 scores (e.g., 29)
 Find "normal" growth for the student (e.g., 0.85 for a 4th grader)
 Multiply the norm by the number of weeks left in the year (e.g., 16 x 0.85 = 13.6)
 Add this to the median (e.g., 13.6 + 29 = 42.6)
 43 is your new end-of-year goal
 Connect to the baseline median for the Goal Line
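These steps amount to one line of arithmetic. A minimal Python sketch follows; the function name and the sample baseline scores are mine, while the growth rate and weeks come from the slide's example.

```python
from statistics import median

def end_of_year_goal(baseline, weekly_growth, weeks_left):
    """Growth-rate method: median baseline plus normative weekly growth
    projected over the weeks remaining in the year."""
    return round(median(baseline) + weekly_growth * weeks_left)

# Mirrors the slide: median 29, 4th-grade growth 0.85, 16 weeks left
# -> 29 + 13.6 = 42.6, so the end-of-year goal is 43.
print(end_of_year_goal([27, 29, 31], 0.85, 16))  # 43
```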
• 67.  May be more appropriate for students who you know are well below or above the norm
 Need at least 8 data points to begin
 Figure out the individual student's own rate of progress based on the first 8 data points:
  o Order the scores from smallest to largest
  o Find the range (largest – smallest)
  o Divide by the number of weeks (not days) that the 8 points span
  o That is the student's current average weekly gain
  • 68.  To set goal, multiply individual’s own rate of improvement by 1.5  Then, multiply that “new” rate of improvement by the number of weeks left in the year  Add that to the student’s median to obtain end-of-year goal
• 69.  David's first 8 scores: 10, 12, 9, 14, 12, 15, 12, 14 (one score per week)
 Ordered: 9, 10, 12, 12, 12, 14, 14, 15; median = 12
 Range: difference between high and low scores
  o 15 - 9 = 6
 Divide by the number of weeks elapsed
  o 6 / 8 = 0.75
 Multiply by 1.5
  o 0.75 x 1.5 = 1.125
 Multiply by the number of weeks left
  o 1.125 x 14 = 15.75
 Add this to the median score
  o 15.75 + 12 = 27.75
 End-of-year performance goal is 28
 Connect the median baseline score to the end-of-year goal = Goal Line
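A sketch of the same computation in Python, reproducing David's numbers exactly; the function name is mine, and the 1.5 multiplier is the ambition factor from the previous slide.

```python
from statistics import median

def intra_individual_goal(scores, weeks_elapsed, weeks_left):
    """Intra-individual framework: (range / weeks elapsed) * 1.5 gives a
    boosted weekly rate, projected over the remaining weeks and added to
    the median of the first scores."""
    rate = (max(scores) - min(scores)) / weeks_elapsed * 1.5
    return median(scores) + rate * weeks_left

# David: rate = 6/8 * 1.5 = 1.125; 1.125 * 14 = 15.75; 15.75 + 12 = 27.75
print(intra_individual_goal([10, 12, 9, 14, 12, 15, 12, 14], 8, 14))  # 27.75
```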
  • 70.  Local Norms o Set goals based on performance of children from same educational environment o Cautionary statements!  National Norms o Set goals based on large samples of students
  • 71.  Can be linear o Showing how the data “should” progress if the student is to meet the goal in the specified time frame  Can be horizontal o Displaying the discrepancy/closing of the gap between current performance and goal performance  Or both
  • 72.
  • 73.
  • 74.
• 75.  Formative evaluation
 Informs viewers if the student is on track to meet the goal
  o Predicts future performance if the intervention remains in place
  o Decisions about overall movement of the behavior can be made
 Applied to the graph after 7-11 intervention data points have been collected
 Can be applied via:
  o Software (Excel)
  o Tukey Method
• 76.  Graphed scores (not baseline) are divided into 3 fairly equal groups
 Draw vertical lines to separate the 3 groups (2 lines)
 In the first and third groups:
  o Find the median data point
  o Mark an X where the median data point intersects with the median date for that group
 Draw a line between the X's of the 1st and 3rd groups
 This is the trend line (see the sketch below)
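A minimal Python sketch of the Tukey procedure, under the simplifying assumption that any remainder scores fall into the middle group; the function name and sample scores are mine.

```python
from statistics import median

def tukey_trend(scores):
    """Tukey method: split the graphed scores into three roughly equal
    groups and return the medians of the first and third groups; plotting
    each at its group's median date and connecting them gives the trend line."""
    k = len(scores) // 3          # size of the first and third groups
    return median(scores[:k]), median(scores[-k:])

first_pt, last_pt = tukey_trend([22, 25, 24, 28, 30, 29, 33, 35, 34])
# first_pt = 24 (median of 22, 25, 24); last_pt = 34 (median of 33, 35, 34)
```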
• 77. (Graph: the trend line connects the median score for the first group of data points to the median score for the last group of data points.)
  • 78.
  • 79.
  • 80. Formative evaluation  Visual analysis – systematic form of graph interpretation  Questions to be answered: o Did meaningful changes occur? o Can the change be attributed to the intervention or instructional program? o Should adjustments be made?
• 81.  Level
  o Comparing the level of data at baseline to the intervention phase
  o Comparing the level of data to the goal
 Immediacy/Latency of Change
  o If the change is immediate, it is probably due to the intervention
  o If the response is delayed, it is difficult to ascribe behavior change to the intervention
 Variability
  o Amount of variation in range or consistency in the set of data
 Trend
  o Rate of change within a phase
  • 82. To analyze a trend, you must compare it to the goal/aimline  Team compares actual progress (trendline) against expected progress (aimline) o If trend is parallel to the aimline • Indicates that the student has made consistent growth • Continue as planned o If trend is steeper than the aimline • Indicates that the student has exceeded the goal • Look for ways to maintain or generalize while fading the intervention, if appropriate o If trend is flatter than the aimline • Indicates slower progress than anticipated
• 83.  Don't need a trendline for this
 Based on the 4 most recent consecutive data points
  o If scores are above the goal line, the goal needs to be increased
  o If scores are below the goal line, the student's instructional program or intervention needs revision
• 84.  I – Inspect the last 4 data points.
 D – Decide what the scores look like. Are they going up? Down? Variable?
 E – Evaluate why the scores are this way (motivation, attendance, instruction?).
 A – Apply a decision to improve achievement (continue the intervention without change, increase the goal, tweak the instruction or intervention, add a 2nd intervention, etc.).
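The decision step of IDEA reduces to a simple comparison against the goal line. A hedged Python sketch follows; the function name is mine, and the tie cases (scores straddling the aimline) default to continuing data collection.

```python
def idea_decision(scores, aimline):
    """4-point rule: compare the last four consecutive data points to the
    goal line; all above means raise the goal, all below means revise the
    instructional program or intervention, otherwise continue."""
    pairs = list(zip(scores[-4:], aimline[-4:]))
    if all(s > a for s, a in pairs):
        return "raise the goal"
    if all(s < a for s, a in pairs):
        return "revise the instructional program or intervention"
    return "continue as planned"
```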
  • 85. Intervention Central o www.interventioncentral.org  National Center on Student Progress Monitoring  Big Ideas in Beginning Reading  Research Institute on PM  Florida Center for Reading Research  DIBELS  AIMSWeb  Mathfactcafe.com  www.aplusmath.com  Themathworksheetsite.com
  • 86. Superkids.com/aweb/tools/math/  www.schoolhousetech.com  http://www.gosbr.net/  http://ebi.missouri.edu/  http://www.fsu.edu/~truancy/interventions.html  http://www.pbis.org/  https://www.msu.edu/course/cep/886/Reading%20Comp rehension/main.html

Editor's Notes

1. Direct assessment. Evaluates students' skill development across the entire curriculum. Curriculum-based: tied to the curriculum, which allows us to sample and measure what a student is actually taught, unlike norm-referenced tests, which are designed to avoid sampling specific curriculum or content areas. Also designed to function in a PS model. Interested in assessing whether a student has attained a certain level of skill proficiency with one particular aspect of the curriculum. CBA: umbrella term. CBM: broad, long-term goal objectives; structures the assessment throughout the entire school year, and the same performance objective is continually assessed. Allows for the assessment of retention and generalization of learning. Standardized administration and scoring. CBE procedures: survey-level assessment that samples from a wide range of skills within a curricular domain; diagnostic assessment by following a task-analytic procedure using skill-specific criterion-referenced tests (specific blends, digraphs, etc.).
2. Describes mastery of a series of short-term instructional objectives. CBM makes no assumptions about an instructional hierarchy for determining measurement (i.e., it fits with any instructional approach), and CBM incorporates automatic tests of retention and generalization. Criterion-referenced: uses a test to sample the student's behavior and then allows the user to compare the behavior to a performance criterion so that the meaning of the score can be interpreted; different from norm-referenced, which compares behavior to a norm.
3. Result of 30 years of research. Alignment: your educational efforts will be more effective if you test what you teach and teach what you test. What = curriculum; the goals and objectives that must be met to achieve social and academic competence. In other words, the content of what we test is the same as what is taught, the stimulus materials the student is given look the same, and the responses they are expected to make are the same as what is expected in their curriculum. Technical adequacy: the measures have been established reliably and validly; however, CBM is not considered an informal assessment (more on technical adequacy next lecture). Criterion-referenced: as opposed to norm-referenced; can be used to determine if students can demonstrate their knowledge by reaching specified performance levels. Helps with lesson planning: it is more important to know whether the student knows the skill than how others compare on the skill. Sensitive to instruction: sensitive enough to measure progress in a curriculum. Norm-referenced measures tell you if your relative standing in the group changed, not necessarily whether you gained skills on a task or how much gain you are making toward meeting the instructional criterion. Standard procedures are used: administration and scoring rules are very similar. Performance sampling: procedures are low-inference, direct measures. Low-inference measures: conclusions can be drawn without making inferences or ascribing to a particular theory. CBM was not developed to explain how learning does or doesn't occur, or to conform to a particular theory of how a student thinks, attends, remembers, or processes information; correct and incorrect responses are counted. In contrast, the WJ COG would be a high-inference measure; why? Decision rules are in place to provide those who use the data with information about what it means when students score at different levels of performance or illustrate different rates of progress on the measures over time (more to come later). Emphasizes repeated measures over time and can be used to generate rate-of-progress as well as level-of-performance data; can be used for progress monitoring to illustrate the rate of learning as it occurs. Efficient: measures are given in a short amount of time and training is easy. Efficacy: the measure is easy and understandable; less time spent assessing, more time spent teaching. Can be summarized efficiently using a variety of techniques (from paper-and-pencil graphs to elaborate data management systems). Short duration: 1-3 minutes in most cases. Because measures are drawn from the curriculum, there is a direct link between what is taught and what is measured; this makes it easier to link what is measured to providing intervention. Additionally, it lends itself well to ongoing progress monitoring, so we can make decisions about the effectiveness of an intervention in a shorter amount of time.
4. Confusion over technical issues.
Criticism: CBM is unable to measure every aspect of an academic content domain and should not replace all achievement tests.
Response: CBM was not designed to do this and is not supported as a replacement for other achievement tests. Rather, CBM is a measure of DIBS, which are integral aspects of achievement.
Criticism: CBM is promoted as a one-size-fits-all measurement system that will answer all questions related to specialized education.
Response: CBM is best used in a problem-solving model, and is best designed to answer questions about basic skill areas, not every academic area.
Criticism: CBM only assesses a limited sample of behaviors related to academic achievement and cannot possibly generalize to all aspects of academic performance; it does not provide a comprehensive picture of skills by only assessing things such as fluency.
Response: CBM measures are based on a scientific research base that has consistently found fluency to be one of the best indicators of basic skills.
Criticism: CBM denounces the use of published, norm-referenced tests, but has used these tests to establish its reliability and validity.
Response: PNTs may be well constructed and technically adequate for many applications and are good measures of general skills; therefore, CBM measures should correlate with them. However, PNTs are not appropriate for problem-solving model approaches like CBM because they do not lend themselves to easy repeated administration and are not as sensitive to change. Furthermore, the use of local norms as CBM employs them is more appropriate for defining problems as situationally based issues.
5. Criticism: If a curriculum is inappropriate, then the outcomes of CBM are inappropriate.
Response: This does not deal with CBM; if a curriculum is inappropriate, it should be changed. However, if general education requires that a student learn whatever curriculum is in place, then we should examine the student's performance in learning it. Furthermore, if the curriculum is poor, the deficits in ability are more likely to surface using a PNT than CBM!
Model discriminations.
Criticism: It is impossible to choose among the different CBM/CBA models because they are all so similar.
Response: Careful review of the different CBA models reveals that they have little in common with respect to purpose, test construction, adequacy, efficacy, utility, or research support.
Criticism: CBM is nothing more than a tack-on to traditional assessments.
Response: CBM has been proposed as both an addition to and a substitute for traditional test-place models. However, it is most appropriately used within a problem-solving model in which a student's progress is continually monitored and changes are made to the curriculum accordingly.
Disagreement over conceptual issues.
Criticism: Using situationally based information to categorize a student as needing services means that a student with a disability in one school may not have a disability in another.
Response: This already occurs under current models. Furthermore, the true issue is the difference in how deficits are defined. Traditional models define the problem as intrachild: the deficit lies in some characteristic inherent to the child. CBM as a problem-solving approach defines the problem as interchild: the deficit lies in the difference between the child and his/her peers.
Criticism: The goals set by CBM are too ambitious and unrealistic.
Response: This makes several assumptions. It assumes that students with achievement problems have limited capacity to learn, and it deletes the role of instruction as an important factor in achievement. It also implies that measures of aptitude accurately predict achievement, thus implying the need for lower goals. This is refuted by the finding that correlations between aptitude and achievement are imperfect, and research supports the setting of moderately to highly ambitious goals to encourage achievement.
6. General Outcome Measures. Ex: ORF requires the use of a variety of skills at the same time.
7. ORF: requires using a variety of skills at the same time (letters, letter combos, blending, vocab, syntax, content knowledge); you don't need to test each of these subskills separately. Serves as a "sign" of general achievement. Used to assess student progress toward annual goals. Useful for long-term progress monitoring and screening. Advantages: cuts down the number of different measures one has to develop, introduce, manage, administer, score, and track; recognizes the limitations of isolating subskills from the context in which they are normally expected to function; visual displays of progress on a GOM will show longer acquisition slopes, allowing adequate opportunities for progress monitoring and data-based instructional modifications. Disadvantage: some curriculum areas do not have a capstone task that represents the synthetic application of most of the content; difficult to develop in math beyond the early grades.
8. Constructed by identifying the set of goals that will be taught within a curricular area. Ex: math single skills. Advantage: can be used to screen, progress monitor, and do survey-level assessment in curriculum domains where capstone skills are not available. Mastery Measures: less complex, fewer skills; used on parts of the curriculum that contain discrete and easily identified sets of items closely related by some common skill, theme, concept, or solution strategy. Ex: punctuation, multiplying fractions, sounds for letters. Focuses on a particular skill: math facts, silent e, etc.
9. Survey Level Assessment: a process by which a student is tested in successively lower levels of the curriculum sequence until a level is located where the student performs successfully. Therefore, the primary purpose of SLA is to determine the instructional level at which a student is successful.
10. Have to have good information to make good decisions. Screening: decide which students need help and which don't, and whether the overall curriculum is adequate. Results only indicate that there is a problem; they don't necessarily indicate what the problem is. The term "at-risk" does not mean that a student has a disability, just that he/she may need additional support. Progress monitoring: decide when to move on to new goals or modify instruction; ensures the instruction or intervention is working. The procedure needs to be directly in line with what is taught, must be sensitive to learning, and has to be given frequently. Yields info that can be easily summarized and displayed on a chart or graph. Frequency must increase as a student progresses through the tiers. Diagnostic: what kind of help students need. Function: to develop an instructional plan in response to a significant problem. A personalized evaluation procedure that allows the careful and systematic examination of a student's skills, which in turn allows the selection of individual expectations and teaching approaches. Reserved for those relatively rare instances when progress monitoring shows that various educational supports have not worked. Outcome: when special services can be discontinued, and to document the overall effectiveness of efforts across all students. Function: to determine and document the effectiveness of an educational program. Use a procedure that will give the info needed to determine if program goals have been met.
  11. Reading, writing, math
12. Tests are drawn from the annual curriculum rather than from the exact point at which a student is being taught (not the same as a criterion-based assessment-of-knowledge type of test). High reliance on fluency = speed + accuracy. Fluency is the metric of interest across all CBM procedures; however, other data are also available. Fluency is rate (how fast) plus accuracy (how correct). CBM is not a speed test but a measure of the child's automaticity with the basic skills. Think about the multiplication tables: we teach those to the automatic level to facilitate more advanced math skills such as division, fractions, decimals, etc. The more automatic the basic skills, the more cognitive energy the student has to devote to more advanced activities, such as comprehension, problem-solving, and the like. Assessments are termed PROBES. Can use ready-made probes or design your own; research has shown that even if we do not draw reading probes directly from your reading curriculum, we can still use that data to make important decisions. Most schools use literature-based reading programs (an anthology of literature chosen for interest and motivation relative to a student's grade level; passages do not control for vocabulary or skill development, so books may vary widely in readability). In such programs, generic, ready-made probes are suggested (e.g., AIMSweb). When assessing ORF, passages should be different but equivalent in grade level/difficulty and should have at least 200 words per passage; 20-30 passages should be generated to monitor progress throughout the year. Should represent reading skills that the student is expected to master throughout the entire school year; typically draw passages from the end of the year or the beginning of the following year. Three equivalent passages are given when doing any type of assessment (screening, progress monitoring, or SLA).
13. DIBELS: Phonemic Awareness: word identification fluency, letter sound fluency, initial sound fluency, phoneme segmentation fluency, letter naming fluency, nonsense word fluency. LNF: a page of upper- and lower-case letters in random order; timed for 1 minute; accuracy and fluency scores. LSF: closely related to decoding and reading; sounds of letters; 1 minute. ISF: a page with 4 pictures; the examiner names each picture and asks the student to recognize and produce the first sound in a word the examiner says aloud; timed on how long it takes to respond to a total of 16 prompts; accuracy and fluency scores. PSF: the examiner reads a list of 2-, 3-, 4-, and 5-phoneme words and the student is asked to produce all the sounds in each word; timed for 1 minute. ISF and PSF measure phonological awareness. NWF: an indication of how well students can map the correct sounds/phonemes onto letters/graphemes; provides a clear understanding of how well students can use their basic decoding skills to read short vowel sounds and consonants; a page of 50 nonsense words (2-3 letters); timed for one minute; score correct individual sounds or whole words. WIF: the lists should have different items but be equivalent in difficulty and should consist of at least 50 words each; sample what the student is supposed to master throughout the year; timed 1 minute.
14. For reading fluency, students are asked to read for one minute from a grade-level text. On the examiner copy, the cumulative word count is located down the right-hand side of the paper; the student copy is exactly the same but without the numbered lines. While the student reads the story, the examiner marks all mispronunciations and other reading errors. Total words read correctly and total errors are recorded. Determine the median score, as that is what we use for progress monitoring. Start your stopwatch when the child says the first word and time for one minute; you will need to watch that stopwatch. If they don't say the first word within 3 seconds, give them the word and count it as an error; timing begins after those 3 seconds. Slash-mark each error (we will tell you which errors are to be crossed out). Record the total reading time if less than one minute. At the end of 1 minute, say "stop." Or, as an alternative, it is OK to let the child finish the sentence or paragraph, but be sure to put a bracket where the minute elapsed; this may be less disruptive, as stopping mid-sentence may frustrate some students. Just be sure to put the bracket in the right place. The three-second rule comes up a lot: you don't want the child attempting the word for the whole minute; instead, you want to see them do some reading. Some kids don't try the word and sit there silently; just tell them the word after 3 seconds. (HO from Proctor's Train the Trainer overheads to demo.) If the child is to answer comprehension questions but does not complete the section in 1 minute, let them complete the passage. Comprehension scores are graphed.
15. 1. The word must be pronounced correctly given the context of the sentence: "He will read the book" vs. "He will red the book." 2. Self-corrected words: within 3 seconds. 3. Repeated words and inserted words are ignored. Why are repeated words ignored? The student loses time by repeating, so we don't penalize them twice; the same goes for inserted words. Dialect: it helps if you know the student's background; also, known speech problems are not errors. Substitutions, such as "dig" for "dog," are errors. Omitted words, including complete lines, are all counted as errors; draw a line through the entire line if this occurs. Words not pronounced or read within 3 seconds are errors. Why the 3 seconds? (Comprehension is affected negatively, frustration may occur, and the student may not move on with the rest of the story.) Reversals: "it is" read as "is it"; both words are counted as errors. Numbers: for example, years; 1983 must be read "nineteen eighty-three," not "one nine eight three." Hyphenated words: the rule is that if the morpheme can stand alone, it is counted as a word. Examples: daughter-in-law counts as three words; re-evaluate counts as one word, since "re" cannot stand alone. Abbreviations: examples "Dr.", "St.", "Ave."; the student must read the word the abbreviation stands for, not d-r for Dr. Exception: when the abbreviation is commonly used, "TV" is correct as either "TV" or "television." Discontinue passages at the same difficulty level if the student reads fewer than 10 words in the 1st passage. Let the student finish a sentence, even though you mark where they were at the end of 1 minute. Discard a passage and re-administer a new one if interruptions happen.
16. Count the total # of words read and the total # of errors. Metric of interest: words read correctly per minute. Record the score as WCM/E. At each level, you are looking for median scores; the median is the middle score. CBM research has found that the median better represents the student's typical performance on 3 passages from the same grade level than calculating the mean. Additionally, it requires no calculation and controls for the potential effects of very difficult or very easy passages. Qualitative features checklist (see appendix of training form): reads fluently or efficiently; reads accurately (>95%); has effective strategies for dealing with unknown words; makes errors that preserve rather than distort meaning; reads with expression (prosodic features); self-corrects errors (comprehension self-monitoring); adjusts pace when the complexity or "considerateness" of the text changes.
17. Maze passage reading: reading a passage silently and restoring every seventh word, which has been deleted and replaced with 3 choices. More appropriate for grades 4 and up. Should include 300 words and 42 deleted words. The reading skills should represent the skills the student is expected to master by the end of the year.
18. Good spellers are always good readers; a poor speller can be a good reader or a poor reader. Taken directly from the annual curriculum; however, premade lists are recommended, as spelling curricula are often not connected across schools (e.g., some connect spelling to reading, others to a separate spelling series, and still others to teacher-generated lists). Requires students to be tested on a combination of words they have already learned and words they will learn in the upcoming weeks or months. Also provides information about a student's decoding; good spellers are always good readers, but not vice versa (a poor speller may or may not be a strong reader). Assesses a student's ability to generalize learned spelling rules in novel tasks, in addition to the number of words they can spell correctly. Dictate words every 10 seconds in grades 1-2; dictate words every 7 seconds in grades 3-8. Scoring: score for CLS and words spelled correctly (WSC). CLS can be diagnostic and can detect small improvements in a student's responses. CLS = the total number of pairs of letters that are in correct sequence; considered more sensitive for gauging student improvement. A correct sequence includes the beginning space next to the first letter, letter to letter, letter to punctuation, punctuation to letter, and the end letter to a space; it is equal to the number of letters in the word + 1, except when punctuation is used, in which case it is the number of letters + 2. In words with double letters: if one of the letters is missing, treat the present letter as the first letter (e.g., for "col," the c-o = 1 but the o-l = 0). Hyphens, periods in abbreviations, and apostrophes in contractions are counted as correct letters. Proper names must be capitalized. In compound words, if the words are split, it results in a loss of one CLS.
19. Short, simple measure of students' writing skill. Students are required to think for 1 minute and write for 3 minutes on an instructional-level story starter, and are scored on specific writing skills. TWW and WSC: easy to score; yield fluency information. CWS: supplies a great deal of useful information about error patterns and missing skills; more sensitive to instruction, so a better PM tool. Growth rates aren't available; norms are available.
20. Computation is the traditional focus of math CBM; it has the most research to support it. Does not involve capstone assessments or GOMs; these all use SBMs, which consist of all the specific skills that must be mastered by the end of the year. According to the National Research Council (2001), math comprises 5 intertwined strands of proficiency, including procedural fluency: skill in carrying out procedures flexibly, accurately, efficiently, and appropriately. In AIMSweb, M-CBM provides educators with narrow-band tests (lots of items across a limited grade level or type of computation problem) that are simple to administer. If designing your own probes, you must begin by obtaining your school district's sequence of instruction for computational skills. Types of probes: Single-skill probes (ex: all addition): can provide information about deficient and mastered math skills; typically used for assessment of instructional placement; can be used to monitor acquisition of newly taught skills; must begin by defining the specific types of math problems that are of interest; it is not necessary to assess every single computational objective between mastery and frustrational levels; for simpler problems, 30-35 problems per sheet is acceptable. Fact probes: all are given for 2 minutes; assess addition, subtraction, multiplication, and division. Grade-level or multiple-skill probes (ex: mixed review): represent problems drawn from the annual grade-level curriculum; assess more skills at once; used for progress monitoring and for determining where additional assessment may be necessary. Digits Correct: evaluates students' gradual acquisition of the skill; each digit below the answer line is counted; allows for error analysis. Small and minor errors in the computation process result in the student obtaining an incorrect answer, so we need a scoring metric that, across time, can be sensitive to the student's gradual acquisition of the skills required to complete computations accurately.
21. Missing Numbers: the student is presented with a box that contains 3 numbers and 1 blank; the numbers have a pattern (counting by 1's, 2's, 5's, or 10's); 1 minute. Number ID: a sheet of numbers (0-100 or 0-20) in random order; 1 minute. Oral Counting: the student counts orally starting at one; 1 minute. Quantity Array: the student is presented with a box containing several dots and is asked to identify how many dots are in each box and tell the teacher; 1 minute. Quantity Discrimination: the student is presented with 2 adjoining boxes that each contain a number and identifies which number is greater; 1 minute. Concepts and Application: National Center on Progress Monitoring and AIMSweb; the response format varies (some items are fill in the blank, some multiple choice); the first-grade measure is read to the student, others are independent; 6-8 minutes to complete. Estimation: an important math skill being explored as a useful criterion measure because it is thought to be a good indicator of number sense; generally consists of 40 problems (word and computation) with 3 answers provided; one of the answers is close to the correct answer but not exact, and the other 2 are far away; 3 minutes to complete.
22. Instructional Placement Standards were recommended by Fuchs and Deno in 1982; the formula is still applicable, especially when conducting Survey Level Assessments to determine appropriate placement in a curriculum. SLAs: frustration, instructional, and mastery levels; include fluency and accuracy rates. Benchmarks and Proficiency Levels are used more frequently and provide a nice method of determining current success and predicting future success on high-stakes assessments; they typically represent the lowest score we would accept before saying a child is not at risk for academic failure. Keep in mind: benchmark data are not available for all academic areas. National norms come from programs such as AIMSweb. Local norms must be developed within your district or school.
23. May need to do a survey-level assessment to determine instructional levels for interventions; it may be done before Tier II, but definitely needs to be done before Tier III. Prepare Materials: select a minimum of three probes from each level of the general education curriculum. You'll typically begin SLA at the student's grade level; when you know that the student's skills are lower, begin at a lower level than their grade placement. Administer and Score Probes: the student is administered three passages per level at progressively lower levels of the general education curriculum until the student's normative level or appropriate instructional level has been reached. Instructional Placement Level: the highest level of the curriculum in which the student is expected to profit from teacher-led instruction. Continue giving probes until the median scores are instructional and the level above is frustrational (optimal pattern: level 7 frustrational, level 6 instructional, level 5 instructional, level 4 mastery). If an instructional level is not found in first-grade material, you must go back to prereading skills. If you find 2 consecutive instructional levels, it is unnecessary to continue further; in that case, the placement is at the highest instructional level. These should be used as guidelines, not hard standards. See Table 4-2 for the Fuchs and Deno placement criteria. A student may be instructional in words read but frustrational in comprehension or words incorrect; this involves evaluating all three measures of reading and making a judgment call about instructional level. If they are instructional, move up in difficulty; if they are frustrational, move downward. Using IPS reveals a huge leap between reading skills in 2nd and 3rd grades (ex: a student may be at 60 WCPM in 3rd-grade materials but 70 WCPM in 2nd-grade materials, which is mastery); this says they are somewhere between 2nd- and 3rd-grade levels.
  24. Fuchs and Deno (1982)
25. For Cassie, notice that we are considering both errors and words read correctly using the Fuchs and Deno IPS's. In this case, she is instructional in second-grade materials. For Lucy, it's not as clear. Using Fuchs and Deno's criteria, where 3rd-6th grade students read 70-100 words with 6 or fewer errors, she is definitely at frustration with 4th- and 5th-grade reading materials. Although she is just within the instructional level for WCPM at the 3rd-grade level, her MEDIAN errors are above the recommended 6 or fewer; the actual accuracy rate for a passage where she read 70 words correctly with 7 errors would be 91%, which is in the instructional zone. The reason we would probably shy away from placing her in the 3rd-grade curriculum is that we are looking at MEDIAN scores: her performance across the 3 passages probably ranged from instructional to frustration, so she is on the bubble. We wouldn't want to risk placing her in material that is too difficult (and in this case, we may want to look at her actual performance rate on all 3 passages and consider the preponderance of evidence). At the 2nd-grade level, the Fuchs and Deno IPS changes: 40-60 WCPM with 4 or fewer errors is instructional. Her WCPM is 70 for 2nd-grade material, which is clearly at a mastery level, whereas her median errors were above the recommended cutoff of 4 or fewer; her actual accuracy rate is 93%. Once we drop down to 1st-grade material, she is clearly at mastery. The best evidence in her profile would be to place her in 2nd-grade material as a conservative estimate, while closely monitoring her performance and being prepared to adjust the level if needed. Also in Lucy's case, we'd probably want a finer-grained analysis of grade level; for example, a 2.5 grade-level placement may be exactly right for Lucy. CLINICAL JUDGMENT! Keep in mind that as you progressively regress through the curriculum, the child is held to the standard of that grade level. For example, if you are assessing a 3rd grader and you back up into the 2nd-grade curriculum, their WCPM and error rates are now held to the 2nd-grade standard (40-60/<4) rather than the 3rd-grade expectations (70-100/<6). Similarly, if using normative data, you would identify where their reading falls relative to other second graders. If we were administering an SLA to a 2nd-grade student, looking for their success level, we would give the three probes for each grade level, calculating both correct words per minute and accuracy rate. When would we stop, thinking that we had reached the appropriate instructional level? When the student had a median words correct per minute between 40 and 60 AND their median error rate was at 90% accuracy, meaning they were making 4 or fewer errors. If the student had been in 3rd grade and we backed down to a second-grade level, the criteria would be the same: we hold her to a second-grade standard.
26. Remember, benchmarks represent the lowest possible score one could obtain and still be OK in the subject area.
  27. Predictive of later student achievement
  28. Same as the instructional placement levels
  29. AIMSWeb, DIBELS, etc
30. Letter Sounds Correct (LSC): predictive of later student achievement. No data on growth rates.
  31. CWS
  32. AIMSWeb norms- can only use if you use AIMSWeb probes
33. Most useful for screening decisions: answers the question of how a child is doing compared to "typical performance" in a particular curriculum in a particular district or school. In small districts, all students complete the same tasks and are rank-ordered by performance. In large districts, a random sample is taken and the same procedure is conducted. Collect data for fall, winter, and spring. Collect a base year period of 3 years, then update norms every 3rd year. If a new curriculum is adopted or there is a change in the size or other characteristics of the student population, collect new norms. A word of caution about local norms: local norms are based on the local expectations of the district you are working in. This can be dangerous if you are working in a system that has depressed norms compared to national standards. Local norms do provide information about how a child is doing in a particular context, compared to peers; however, if the peers are also struggling, it will skew the results. Local norms must be examined in light of statewide and nationwide expectations for performance. Bottom line: if using local norms, do not disregard the importance of IPS or benchmarks as criterion scores! Local norms: normative data based on students who are participating in the local curriculum; can range from the classroom to the school to the district level. Local norms may be ideal, but not everyone has established them. You may want to ask the teacher to select several students who are progressing satisfactorily, not the advanced readers; the selected students should be meeting the teacher's standards for his or her grade level and be a good approximation of the norm for the grade. When local norms are available at the classroom or school level, discrepancy ratios are used to determine if the difference between the student and peers is significant. When local norms are available at the district level, percentile ranks are used to determine significance; agreed-upon cut scores are set, and typically scores below the 10th percentile in the district are considered significantly different.
  34. Should be around 80% of students
  35. Should be around 15% of students
  36. Should only be about 5% of students
37. Essential to graph baseline, intervention, and PM data to be presented at RTI meetings; graphs are also presented at the RTI meetings. The graph needs to paint a picture of the information; it's an art.
38. Improves the evaluator's understanding of the data. It's the mechanism for communicating with parents, teachers, and students about student progress, and a motivator and reinforcer for parents, teachers, and students. Allows us to record changes in student learning over time: level of performance (how well a student can do a task) and progress score (how quickly a student is learning to perform it).
39. Scales should begin at zero and be evenly spaced. Show scale breaks; don't distort time. Cover the range of variability; scaling should allow for the highest and lowest scores. Be aware of distortion: size the increments so that student growth can be accurately observed. Increments that are too large understate the student's growth; increments that are too small overestimate it. Provide a "key" when multiple variables are presented (Note to trainer: show example and point out the key); no more than 3 variables (target behaviors) per graph.
  40. Line graph is the most helpful and efficient method to present data
41. Also called "pre-intervention" data. Essential to determining if the intervention caused any change in the target behavior. Critical to progress monitoring because it allows for the comparison of post-intervention scores to pre-intervention scores. Allows for prediction of what the behavior will look like in the future if an intervention is not utilized to remediate it. Steps: identify the target behavior and intervention, then collect a series of stable data points. In general, a sufficient baseline should contain at least 3 data points to ensure there is no naturally occurring trend, and should represent a condition severe enough to warrant concern.
  42. Without a solid goal, how do you know if an intervention has been effective? Measurable goals are essential
43. Norms and benchmarks = 3 baseline points. Benchmarks: for example, by the end of the year a student should be reading 40 WCPM in 1st grade. Intra-individual framework = at least 8 data points.
44. Reading the same words at a faster rate means continuing to learn and improve in reading; higher levels must be expected in those cases where students are behind in a skill area and must be caught up. The first 2 growth rates are from Fuchs et al. (1993) and the third from Deno, Fuchs, Marston, and Shin (2001). These are good for goal-setting in RTI! Average learning rates (also considered a normative comparison) establish the amount of gain we expect a student to make over time. Be sure to use the growth rates for the grade in which you are instructing the student. Ex: for a 3rd-grade student being instructed in the 2nd-grade curriculum, choose the 2nd-grade growth rates for establishing the goal; once instruction is moved up to the 3rd-grade level, the goal is adjusted to 3rd-grade growth rates. Take the current WCPM and add the average growth rate x the number of weeks until the goal is to be met. Caution: progress norms reflect the quality of instruction; the more intense the instruction, the greater the growth, and we don't know how good or intense the instruction was when the growth rates were established. Students who are the farthest behind need the steepest slope in order to catch up. Rates of progress that can be expected = growth rates. They do not necessarily represent new words learned, but do indicate the average number of words expected to be read per week in order to improve fluency; the student may be reading the same words, only faster, every week. Discussion: Findings indicate that for first graders, an improvement of 2 words per week may represent a realistic slope. On the other hand, given research indicating the importance of ambitious goals to enhance student achievement (e.g., Fuchs, Fuchs, & Hamlett, 1989), an improvement of approximately 3 words per week (i.e., 2.10 plus one standard deviation of .80) may represent an appropriately ambitious standard for weekly growth. This may be especially true for students with disabilities, who must decrease the discrepancies between their performance and that of their peers. As indexed by CBM passage reading, students make their most dramatic growth in the early grades, with slopes of 2 words per week at Grade 1 and slopes between .85 and 1.5 words per week at Grades 2 through 4. By Grades 5 and 6, however, the slope for general education students' oral passage reading drops to one-half word per week and less. This inverse relation between slope and grade for oral passage reading fits within developmental reading theory. Research clearly indicates that the CBM oral passage reading measure can be used as a global indicator of reading, to index student proficiency across the multiple component skills of reading, including comprehension (e.g., Deno, Mirkin, & Chiang, 1982; Fuchs, Fuchs, & Maxwell, 1988; Shinn, 1989). Nevertheless, oral passage reading most directly requires the earlier component skills proposed by developmental reading theorists: decoding and fluency. According to theory, greater growth on tests directly requiring decoding and fluency should be, and indeed was, manifest at earlier reading stages in the earlier grades.
45. When establishing appropriate weekly rates of improvement for monitoring student progress with the maze measurement procedure, however, the student's grade level is of no consequence: a realistic target for weekly improvement appears to be approximately .39 (i.e., the grand mean across grade levels); an ambitious target, .84 (i.e., the grand mean plus one pooled within-group standard deviation). The CBM maze task may more directly require not only decoding and fluency but also comprehension: to score well on the maze (or to index improvement over time), students must decode text, proceed fluently, and understand the content for successful blank restoration. Developmental theory would predict that this more comprehensive set of component skills required by the maze would result in more similar rates of growth across the grades (and smaller growth rates than ORF). This predicted pattern actually was demonstrated in the current study.
46. CLS. No growth rates for WSC. No growth rates for first grade, but one can assume first graders would make progress similar to that of a second-grade student. Norms are available as well; too big to put on the slide.
47. Computation. Fuchs, Fuchs, Hamlett, Walz, and Germann (1993).
48. Multiplication study with 4th-grade students. Utilized intervention "doses" to increase fluency. Results indicated that increased dosage resulted in increased fluency, BUT there was a diminished return. Generated a dosage table. Same fluency acquisition as the previous study. Explicit timing with a goal-setting component.
  49. ORF
50. Uses the student's current level of performance and rate of progress to set end-of-year goals. May underestimate a student's rate of learning and may never catch him/her up if they started behind. Best to use this method when a student's past performance is average or above average. Collect at least 8 data points and order them from smallest to largest. Find the range: subtract the lowest from the highest score. Divide the difference by the number of weeks used to collect the 8 data points (usually 8 if we collect once per week) = baseline rate of growth.
51. Take the baseline rate of growth and multiply it by 1.5 in order to set the weekly progress goal. Multiply that number by the number of weeks left until the goal must be met. Add that number to the median score obtained during baseline = goal.
52. Local norms set goals based on the performance of large numbers of children from the same educational environment. If local norms are not available, you can use national norms. Word of caution: be sure the SES group used for the national norms matches your norm group. Word of caution: norms do not take a criterion into account (if the whole district is doing poorly, the data are not necessarily meaningful). Goals should be realistic, yet ambitious. May aim to have the child at or above the 25th percentile compared to peers in the year-end curriculum (see pp. 241-242 for an example); the ABC book suggests the 50th percentile. If you do not have local norms and do not have time to establish them, you could take a small sample from your district and compare their scores to established norms.
53. Graph of ORF with the goal line across the top; WCPM and accuracy were plotted, with a trendline.
54. What's missing from these graphs? BASELINE!
  55. Visually analyze the performance data during the intervention implementation and compare these data to the baseline data collected prior to the intervention
56. Typically, one might examine the following. Level: the average of a certain portion of the data (if you take it over a very long time, add up all data points and get the average); the most basic technique; compare the level of the data at baseline to the intervention phase, and compare the level of change to the goal for the target behavior to determine intervention effectiveness. Immediacy/latency of change: immediacy reviews the data right after the intervention is introduced; ideally, the intervention would alter the target behavior in such a way that one can observe an immediate "step" in the graph after the intervention is initiated. Latency seeks to determine how long it takes (immediate or delayed) for the intervention to change the target behavior. Variability: the bounce in the data points around the trend line. Put it into perspective relative to the later part of the graph: you may see some variability, and that is to be expected, but huge leaps up and down are signals to question what was happening on the very high or very low days. A lot of bounce in the data may not reflect the difficulty of the passage, but might reflect inconsistency in performance in the classroom, which can be a significant problem in its own right. The objective of intervention may be to reduce the variability of the target behavior rather than establish a completely new level. Presenting a high-low range is a straightforward method of expressing the variability of data; however, the percent of nonoverlapping data can also be utilized to analyze variability. This would be accomplished by observing the amount of data overlap between phases; one would expect to see no overlap in behavior between phases. Trend/slope: the average rate of progress over a period of time. (Note to trainer: go back to slide #33 as needed to illustrate; the pink line represents the slope. How did we get this pink line? Future training will cover how to get this line.) A change in the trend of the outcome data is an indication of satisfactory change. Evaluating the trend in data allows researchers and practitioners to make predictions about the data.
57. When at least 6 scores have been collected and at least 3 weeks have passed since the last goal was set, the 4 most recent scores are examined (although only 4 points are being evaluated, it is recommended that 6-8 data points be collected before making this decision, representing 3-8 weeks of data). You don't need a "trend line" for this. Examine the last 4 data points: if all scores are above the aimline, raise the goal; if 4 consecutive points fall below the aimline, change the intervention.
58. Acronym to remind you of the 4-point rule.