Building Capacity in Functional Behavior Assessment in Schools

  • The empirical and conceptual basis of functional assessment is VERY strong. Multiple methods of FBA exist (with greater or lesser support; the key, however, is the skill of the implementer). Question: Why do we not see (a) widespread use of “good” FBA and (b) effective implementation of support plans in most schools in the country, with all students needing intensive intervention?
  • Scaling work is now ongoing in IL, MS, MD, and FL.
  • The focus here is on the function-based support portion: how we attempted to build capacity, how effective we were, and what we learned.
  • Progress monitoring: it is really hard to move schools away from “admiring the problem” toward data-based decision making. This requires ongoing coaching for about the first 4-6 months.
  • Why match? (a) Review of school-wide discipline and academic data suggests that approximately 20% of students require more than Tier 1. In an elementary school with 400 students that is 80 students; in a middle school with 750 students that is 150 students. That is far too many for individualized assessment and intervention. Alternative: package the intervention and embed it within a multi-tiered prevention system (Gresham, 2004). (b) Intensity of intervention is matched to severity of the problem; in multi-tiered models, prevention is an outcome. Tier I: prevent; Tier II: reverse harm; Tier III: reduce harm (e.g., Walker et al., 1996). Point cards are a great way to progress monitor at Tier III. (A rough caseload calculation is sketched after these notes.)
  • The number of schools was thus limited by our ability to provide follow-up (the grant allowed us to have doctoral students help with this).
  • Completed by outside observers trained in the tool; includes interviews but is primarily a review of permanent products (e.g., meeting minutes, copies of support plans).
  • ISSET: ask the coordinator for an FBA/BSP; score between 1 and 4 (a lot of them were done by district people or, in Eugene, my students!). FBA: operational definition; ABC relation; does the team have the right people? Implementation: intervention components (multicomponent and linked to the hypothesis statement), which most plans missed; consequence for problem behavior. Progress monitoring: is there a plan for assessing fidelity and outcomes?
  • Caveat: this is a small change; we are talking about 1-3% of the population. Looking at the proportion.
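The caseload arithmetic in the notes above can be made concrete with a short calculation. The following Python sketch is illustrative and not part of the original presentation; the ~20% beyond-Tier-1 figure and the 1-3% Tier III figure come from the notes, while the function name and rounding are assumptions.

    # Illustrative only: rough caseload estimates implied by the speaker notes,
    # assuming ~20% of students need more than Tier 1 and ~1-3% need Tier III.
    def estimated_caseload(enrollment, beyond_tier1=0.20, tier3_low=0.01, tier3_high=0.03):
        """Return (students beyond Tier 1, (low, high) Tier III estimates)."""
        beyond = round(enrollment * beyond_tier1)
        tier3 = (round(enrollment * tier3_low), round(enrollment * tier3_high))
        return beyond, tier3

    for label, n in [("elementary", 400), ("middle", 750)]:
        beyond, (t3_lo, t3_hi) = estimated_caseload(n)
        print(f"{label} school of {n}: ~{beyond} students beyond Tier 1, "
              f"~{t3_lo}-{t3_hi} likely needing Tier III supports")
    # elementary school of 400: ~80 students beyond Tier 1, ~4-12 likely needing Tier III supports
    # middle school of 750: ~150 students beyond Tier 1, ~8-22 likely needing Tier III supports

Even the low end of these ranges illustrates the notes' point: individualized assessment and intervention for every student beyond Tier 1 is not feasible, which motivates packaged interventions embedded in a multi-tiered system.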

    1. SCHOOL-BASED FUNCTIONAL ASSESSMENT: CAPACITY DEVELOPMENT & SCALING. Cynthia M. Anderson, R. Justin Boyd, Nadia Sampson, & Anna Marshall, University of Oregon. Funded in part by OSEP.
    2. WHY AREN’T WE “AT SCALE?” Limited resources: expertise, time, funds. Multiple competing initiatives. Research focus on effective interventions, with little work on how to implement.
    3. INTENSIVE POSITIVE BEHAVIOR SUPPORT. Goals: build district and school capacity at Tiers II and III; identify factors affecting implementation, effectiveness, and sustainability. Participants: model districts (2 districts in OR; 8 schools: 4 elementary & 4 middle); scaling districts (5 districts in OR, 28 schools).
    4. IPBS. Systems: technical assistance (district); team-based decision making. Interventions: Tier II; function-based support. Data/progress monitoring: student outcomes; intervention effectiveness overall; fidelity of implementation.
    5. BUILDING CAPACITY IN SCHOOLS. Process: Year 1: progress monitoring team formed; CICO implemented. Year 2: build capacity of 1-2 individuals in FBA/BSP. Years 3 & 4: train-the-trainer model of capacity building in schools.
    6. RTI MODEL. Efficient FBA: school-based staff; interview/observation. Formal FBA: district/school staff; parental involvement; plus structural analysis. Complex FBA: district staff/consultant; plus experimental manipulations.
    7. YEAR 2 FBA TRAINING. Attendees selected by district coach: FTE available to conduct FBA/BSP; access to coaching. FBA training varied by district (1/2 day to 2-day): typical workshop; FBA interview; ABC observations; emphasis on ABC relations within routines; follow-up coaching.
    8. OUTCOMES
    9. SAMPLE. Schools in the Pacific NW: 20 elementary schools; 16 middle schools. All schools implementing Tier I of SWPBS with fidelity. Participants selected by district: interested in scaling up; district personnel with FTE for coaching/TA.
    10. TO WHAT EXTENT CAN SCHOOLS IMPLEMENT IPBS? ISSET
    11. TIER III SUBSCALES [figure: % of features implemented at baseline (BL) and time points T1-T4 for the Tier III subscales: Functional Behavior Assessment, Implementation, and Progress Monitoring]
    12. EVALUATION OF FBA/BSPs. Participants: school district in the Pacific NW; implementing IPBS for 4 years at scale. Sampling: requested copies of all FBAs and support plans; 26 FBA/BSPs*; most schools had only support plans (“we just talked about the support plan”). Scoring: 2 doctoral students; IOA on 24%.
    13. EVALUATION. 12% of support plans had an accompanying FBA. FBA summary statements (competing behavior pathway): 100% identified an ABC relation; 60% operationally defined the problem behavior. Antecedents: 94% technically accurate “setting event” or none; 94% observable & environmental S+. Consequences: 98% observable & environmental variables; 90% identified more than one reinforcer.
    14. SUPPORT PLAN EVALUATION. Components: 75% logical antecedent strategy; 50% strategy for minimizing reinforcement of the target response; 89% reinforcement for the desired/alternative response; 99% contained no contra-indicated strategies. When EVERYTHING is a reinforcer, almost any strategy will be a match.
    15. IMPLEMENTATION PLAN
    16. SUMMARY. Documentation of FBA is a problem; teams may not be accurately identifying function. Support plans: antecedent and differential reinforcement strategies are strong; contingencies for problem behavior are missing. Implementation planning rarely occurs.
    17. EFFECTS ON STUDENT BEHAVIOR
    18. PROPORTION OF POPULATION WITH REFERRALS [figure: mean % change between baseline and Year 4 in the proportion of students with 3, 4, 5, and 6+ ODRs, for implementing versus non-implementing schools; a sketch of how such a metric might be computed appears after the transcript]
    19. LESSONS LEARNED. (1) District involvement is key: district-level expertise in function-based support is crucial; 1.0 FTE for every 4 schools; evidence-based practice; data-based decision making; building & maintaining capacity. (2) Capacity: Tier I and Tier II are feasible; Tier III? (3) Behavior analysis is needed in training programs.
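To make the slide-18 metric concrete, here is a minimal, hypothetical Python sketch of one way a “percent change in the proportion of students at each ODR level” could be computed. The per-student ODR counts, function names, and the treatment of the bins as cumulative (at or above a count) are assumptions for illustration; only the general approach of comparing baseline and Year 4 proportions reflects the slide.

    # Hypothetical sketch of the slide-18 metric: percent change, from baseline
    # to Year 4, in the proportion of enrolled students with >= k ODRs.
    # The example counts below are made up; only the calculation is illustrative.
    def proportion_at_or_above(odr_counts, k, enrollment):
        """Proportion of the student body with at least k office discipline referrals."""
        return sum(1 for c in odr_counts if c >= k) / enrollment

    def percent_change(baseline, year4):
        """Percent change from baseline to Year 4 (negative = reduction)."""
        return 100.0 * (year4 - baseline) / baseline

    baseline_odrs = [0, 0, 1, 3, 5, 7, 2, 0, 4, 6]   # made-up per-student ODR counts
    year4_odrs    = [0, 0, 1, 2, 3, 4, 1, 0, 2, 3]
    enrollment = len(baseline_odrs)

    for k in (3, 4, 5, 6):
        b = proportion_at_or_above(baseline_odrs, k, enrollment)
        y = proportion_at_or_above(year4_odrs, k, enrollment)
        print(f">= {k} ODRs: baseline {b:.0%}, year 4 {y:.0%}, change {percent_change(b, y):+.0f}%")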
