Tools And Resources For Continuous Improvement Of Technology In Schools

  1. Tools and resources for continuous improvement of technology in schools Kevin Oliver, LaTricia Townsend, Rodolfo Argueta, Daniel Stanhope Institute on Leading Innovation: Implementing Effective 1:1 Learning Technology Programs July 8-10, 2009. The Friday Institute, NCSU, Raleigh, NC
  2. Description This interactive session will highlight resources school leaders can utilize to facilitate data-driven decisions to improve the effectiveness of a technology project. Participants will learn about a variety of data collection tools freely available to them, practice using these tools, and discuss ways in which the results may shape practice in their schools.
  3. Defining Evaluation • two common evaluation modes--formative and summative • formative evaluation is concerned with collecting data to help revise an innovation of interest (e.g., one-to-one computing) • summative evaluation is concerned with collecting data to help judge the worth of an innovation and whether to adopt/reject • 1:1 leaders may be involved in both modes
  4. Evaluation vs. Assessment • evaluation is concerned with improving or judging the worth of some innovation • assessment is concerned with how much a student knows • evaluation is the more inclusive term, often making use of assessment data as one data source, in addition to surveys, observations, interviews, and more
  5. Evaluation vs. Assessment • evaluation data sources: questionnaires, surveys, observations, interviews, artifacts • assessment data sources: pre-test/post-test
  6. What is Commonly Evaluated? • instructional materials (e.g., a problem-based learning lesson and resources) • projects (e.g., school professional development or technology plans) • programs (e.g., one-to-one computing in a school district or state) • in project or program evaluation, you may study instructional materials purchased or created, but you will also look at leadership, professional development, collaboration, sustainability, etc.
  7. Where to Begin? • evaluation questions are difficult to write for project or program evaluation with so many factors to consider • evaluation models can help you develop evaluation questions to study the most important elements • for example, Stufflebeam's CIPP Model encourages a close look at Context, Inputs, Processes, and Products
  8. CIPP Model • context questions seek to identify needs of the target population, opportunities to address them, and how well goals address needs • input questions seek to define capabilities, project strategies and designs, types of support useful in reaching goals, and what was required to achieve goals • process questions seek to define deficiencies in the process or implementation, how resources were allocated, and barriers to success • product questions seek to define outcomes and describe lessons learned
  9. Flashlight Model • the Flashlight program is another evaluation model developed by AAHE for technology-based evaluations • Flashlight is based on the concept of "triads," consisting of the relationship between outcomes (goals/objectives), strategies, and technologies, tools, or supporting materials
  10. Flashlight Model
  11. Logic Models • logic models can also help you identify questions for your evaluation; they contain four elements: 1. inputs, or the resources that go into the project 2. activities that take place as part of the project 3. short-term objectives (e.g., "students will use...") 4. long-term goals (e.g., "increase student scores...")
  12. After Generating Evaluation Questions • helpful to generate a table that matches evaluation questions with indicators and benchmarks of success (expectations), and data sources that you will use to check on that success (today's session topic!) • table columns: Questions | Indicators | Benchmarks | Data Sources/Instruments | Results/Changes/Outcomes
  13. Indicators • indicators are a continuous factor used to describe a construct of interest (e.g., unemployment percentage, lots per acre, violent crime per 1000 citizens, etc.) • have some historical/past value, present value, and future value • value is in constant flux and can be changed by projects we are evaluating
  14. Benchmarks • if an indicator states: number of students who use the Internet for research • a benchmark would specify the time interval by which you would expect to see changes • if we know 40% used Internet last year before our project was implemented, we might expect 60% to use it this year, and 80% next year
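The benchmark check described on this slide amounts to comparing an observed indicator value against a target for each time interval. A minimal sketch is below; the function and variable names are invented for illustration, and the 40%/60%/80% Internet-use figures come from the slide (the follow-up numbers are made up).

```python
# Hypothetical benchmark check for the Internet-use indicator described
# above. The helper is illustrative, not part of any evaluation tool.

def benchmark_met(observed_pct, target_pct):
    """Return True when an observed indicator value reaches its benchmark."""
    return observed_pct >= target_pct

# Baseline was 40% last year; benchmarks are 60% this year, 80% next year.
benchmarks = {"this_year": 60, "next_year": 80}
observed = {"this_year": 63, "next_year": 71}   # made-up follow-up data

results = {year: benchmark_met(observed[year], target)
           for year, target in benchmarks.items()}
# results: this_year met (63 >= 60), next_year missed (71 < 80)
```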
  15. Examples • indicator: number of wireless access points; benchmark: 1 in 2006, 2 in 2007, and 3 in 2008 • indicator: mobile labs in the district; benchmark: 5 in 2006, 10 in 2007, and 15 in 2008 • indicator: teacher access to e-mail; benchmark: 80% in 2006, 90% in 2007, and 100% in 2008
  16. A More Detailed Planning Grid • Irving, TX long-range technology plan • different look, similar planning items: goals-questions, objectives-benchmarks, evaluation-data sources
  17. Remainder of Session • focus on tools and resources you can use to collect data in evaluating a 1:1 computing program • table columns: Questions | Indicators | Benchmarks | Data Sources/Instruments | Results/Changes/Outcomes
  18. Data source instruments
  19. School Technology Readiness Measures CDW-G One-to-One Readiness Assessment Online access: http://tinyurl.com/35v305
  20. School Technology Readiness Measures (continued) ISTE/CEO Forum: STaR chart (CEO Forum’s Interactive School Technology and Readiness (STaR) Chart) Online access: http://www.iste.org/inhouse/starchart/index.cfm?Section=ST
  21. School Technology Readiness Measures (continued) Texas Teacher STaR Chart http://starchart.esc12.net/docs/TxTSC.pdf Texas Campus STaR Chart http://starchart.esc12.net/docs/TxCSC.pdf Michigan 2003-04 Freedom to Learn School Readiness Rubric http://school.discoveryeducation.com/schrockguide/pdf/schooltechrubric.pdf State of Washington Technology Essential Conditions Rubric http://www.k12.wa.us/edtech/TechEssCondRubric.aspx or http://www.k12.wa.us/edtech/pubdocs/TECR-WA.doc
  22. School Technology Readiness Measures (continued) SEIR*TEC School Technology Needs Assessment (STNA) Sample of the online survey: http://www.keysurvey.com/survey/134127/1516/ Download Paper-Pencil Versions: http://www.serve.org/evaluation/capacity/EvalFramework/resou
  23. STNA 4.0 - School Technology Needs Assessment • What is STNA? • What data does STNA collect? • What does this data tell us? • How can findings be used? • What is the STNA process? • What data is reported? • How is STNA data interpreted?
  24. What is STNA? • STNA is a valid and reliable instrument allowing effective assessment of educational technology needs to better design and evaluate projects and initiatives • STNA provides a free, user-friendly online tool that allows for planning and formative evaluation of technology projects in educational settings • Helps planners collect and analyze needs data related to implementation of technological innovation aimed at examining technology use in teaching and learning • Guides school- and district-level decisions about professional development for educators
  25. What data does STNA collect? • STNA reports at the school level • Documents respondents’ perceptions and attitudes • Broad areas of school technology – Supportive Environment for Technology Use • Vision, Planning and Budget, Communication, Infrastructure and Staff Support – Professional Development • Professional Development Needs, Professional Development Quality – Teaching and Learning • Teacher Technology Use, Student Technology Use – Impact of Technology • Teacher Impact, Student Impact
  26. What does this data tell us? • STNA reports descriptive data – Item and response set – Frequencies – Percentages • Scales – 1 (Strongly Agree) to 5 (Strongly Disagree) and 6 (Do Not Know) – 1 (Daily) to 5 (Never) and 6 (Do Not Know) • Look at the profile: – Generally positive or negative? – Number of highs and lows? – Height of highs and lows? – Very different than other items? • You might get more questions instead of answers
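Since STNA reports descriptive data, the tallying of a single Likert item amounts to frequencies and percentages per response option. The sketch below uses made-up responses on the 1 (Strongly Agree) to 6 (Do Not Know) scale described above; it is an illustration of the arithmetic, not STNA's own reporting code.

```python
from collections import Counter

# Made-up responses for one STNA-style item:
# 1 = Strongly Agree ... 5 = Strongly Disagree, 6 = Do Not Know
responses = [1, 1, 2, 2, 2, 3, 5, 6]

counts = Counter(responses)                     # frequencies per option
n = len(responses)
percentages = {value: 100 * counts.get(value, 0) / n for value in range(1, 7)}
# e.g. percentages[2] == 37.5, since 3 of 8 respondents chose "Agree"
```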
  27. How is STNA data interpreted? • Each construct examined is theoretically beneficial to successful implementation of technology in teaching and learning settings – Strongly agreeing with an item or indications of daily use is inherently “positive” – “Do not know” is neither positive nor negative • All respondents SA or A: “Needs are being met” • Mostly SA or A: “Not as emphatically positive, perhaps there is room for improvement” • Mostly neutral, D, or SD: “Area for concern” • Large number of DNK: “Lack of awareness in this item” • Mixed responses: “Lack of strong feelings about this item… why?” • Split between D/SD and A/SA: “Much disagreement between staff, and an area of concern”
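The interpretation guidelines above read like a decision procedure, which the sketch below encodes as one possible implementation. The thresholds (what counts as "mostly" or a "large number") are assumptions for illustration; the slide leaves those judgments to the planning team, and this is not part of STNA itself.

```python
# Rough encoding of the STNA interpretation guidelines above.
# The numeric thresholds are illustrative assumptions.

def interpret_item(counts):
    """counts maps 'SA', 'A', 'N', 'D', 'SD', 'DNK' to response tallies."""
    total = sum(counts.values())
    positive = counts.get("SA", 0) + counts.get("A", 0)
    negative = counts.get("D", 0) + counts.get("SD", 0)
    dnk = counts.get("DNK", 0)

    if positive == total:                       # all respondents SA or A
        return "Needs are being met"
    if dnk > total / 2:                         # large number of DNK
        return "Lack of awareness in this item"
    if positive > 0 and negative > 0 and abs(positive - negative) <= 0.2 * total:
        return "Much disagreement between staff; an area of concern"
    if positive > 0.6 * total:                  # mostly SA or A
        return "Not as emphatically positive; perhaps room for improvement"
    if negative + counts.get("N", 0) > 0.6 * total:
        return "Area for concern"
    return "Mixed responses; lack of strong feelings about this item"

interpret_item({"SA": 10, "A": 5})   # "Needs are being met"
```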
  28. How can findings be used? • Plan your technology program implementation – Incorporate into your school improvement plan or technology plan – Define your priorities – Plan professional development – Allocate resources – funding, staffing, infrastructure • Clarify your technology program implementation steps – Provide rationale for your goals and objectives – Connect to your strategies – Make your case with needs data • Repeated uses track changes in the school STNA “profile” over time • Ultimately, uses are driven by the questions you are looking to answer
  29. What is the STNA process? • Currently free, thanks to Friday Institute support • Convene your team, decide if you need STNA data and how it will be used • Communicate with staff for buy-in • Email Danny at daniel.s.stanhope@gmail.com with: – Names of each school participating – Opening and closing dates – An accurate count of expected respondents • Have all staff working with students complete STNA – goal is 100% response rate • Coordinate STNA completion and track who has completed the survey • Reports are distributed shortly after STNA closes • Reconvene team to make decisions based on STNA results
  30. Technology Integration Measures ISTE Classroom Observation Tool (ICOT®) http://icot.craftyspace.com/ - Register (create an account and confirm it). - Install tool (Adobe AIR Runtime needed for installation). - Record observations (online or pencil-and-paper) - Export and analyze data - Help (pdf file) Online tutorial: http://iste.acrobat.com/p92443979/
  31. Technology Integration Measures (continued) SEIR*TEC Technology Integration Progress Gauge (2000) Download PDF version: http://www.serve.org/Evaluation/Capacity/EvalFramework/reso
  32. Technology Integration Measures (continued) North Central Regional Technology in Education Consortium Scoring Guide for Student Products – create scoring guides to evaluate a variety of student products – export scoring guides to pdf Description and directions: http://www.ncrtec.org/tl/sgsp/teachers.htm Tool page: http://goal.learningpt.org/spsg/GetProd.asp
  33. Technology Integration Measures (continued)
  34. Why use the LoFTI Instrument? • Determine how technology is being used school-wide. • Record instances of particular uses of technology, not “how well” it is used. • This is not a teacher evaluation tool.
  35. Who Can Use the LoFTI Instrument? • The observer can be any member of the school staff. • Observers should be trained before conducting observations using the LoFTI instrument.
  36. Using the Protocol • The observer’s presence will undoubtedly influence what they observe. • Be mindful of the observer’s placement and interactions with students and teachers. • Observers should record only what is observed during the actual visit.
  37. Using the Protocol • Visits are not scheduled. • Keep a separate record of visits. • Avoid typical non-instructional times – beginnings, endings, or transitions. • Visit a given room at different times and days of the week.
  38. Using the Protocol • The observer’s presence will undoubtedly influence what they observe. • Be mindful of the observer’s placement and interactions with students and teachers. • Observers should record only what is observed during the actual visit.
  39. Practice Using the LoFTI Online LoFTI Sample Lesson - Arctic Ice
  40. Additional Data Sources • Teacher Lesson Plans • Teacher Reflection Logs • Technology Use Logs • Rubrics
  41. Using the Data to Effectively Plan Professional Development As a whole group, identify professional development opportunities that seem to be a good match for the needs identified using the data from the STNA results.
  42. Effective Professional Development • Fosters a deepening of subject-matter knowledge, understanding of learning, and appreciation of students’ needs • Centers around the critical activities of teaching and learning • Engages educators in professional learning communities • Is sustained, intensive, and woven into the everyday fabric of the teaching profession
  43. Professional Development Questionnaire (PDQ)
  44. Data Source Instruments (Summary)
  45. Ning Join the discussion online Technology Evaluation in K-12 schools http://nc1to1.ning.com/group/technologyevaluationink12schools