Formative evaluation of student learning and course design made simple: Fevatools Jim Julius, ITS


  • Evaluating the cooking environment / experience: organization, tools, ingredients, recipe. How do you know if it’s “good enough”? How do you know what’s possible?
  • Summative and formative assessment are often referred to in a learning context as assessment of learning and assessment for learning, respectively. Assessment of learning is generally summative in nature, intended to measure learning outcomes and report those outcomes to students, parents, and administrators; it typically occurs at the conclusion of a class, course, semester, or academic year. Assessment for learning is generally formative in nature and is used by teachers to consider approaches to teaching and next steps for individual learners and the class. (From the Wikipedia article on assessment.) In reality, formative evaluation of course design could potentially subsume nearly all of both of the other circles. Everything is data, but our attention is limited: we need to know what we want to know.
  • Achievement: what students know and are able to do at the end of the course. Distinguish “enduring understandings,” “important to know and do,” and “worth being familiar with.” Other outcome measures: satisfaction, retention, success, achievement, external proficiencies, real-world performance.
  • Learner characteristics (demographics, prior experience, learning styles/preferences). Design (consider carefully the elements of the design you really want to examine; be sure they are documented). Learning resources (support, interventions, materials of all sorts, and how/when they are used). Context. Faculty development.

    1. Fevatools: A Toolkit to jump-start formative assessment of student learning and course design
       Jim Julius
       SDSU Course Design Institute
       May 26, 2010
    2. Guiding Questions
       • What does it mean to bring an inquiry mindset to course design?
       • How should one decide what kind of feedback to seek?
       • What tools are available to collect feedback?
       • What do I do with the data?
    3. Summative
    4. Formative
    5. What (and why) are you measuring?
       • Outcomes: tell you what you got, not how or why
       • Inputs
       • Processes
       • Seeking continuous improvement
    6. Also formative
    7. What (and why) are you measuring?
    8. Outcomes
    9. Inputs, Processes
    10. Refining type and quality of learning materials & activities
       • Conventional face-to-face classes → optimized hybrid or fully online classes, over Cycle 1 → Cycle 2 → Cycle 3 → Cycle 4
    11. Narrowing Your Inquiry + Selecting Useful Tools
    12. Data from M. Laumakis
       • pICT fellow in 2005
       • Began teaching parallel 500-student sections of PSYCH 101 in 2006, one traditional and one hybrid
       • First fully online PSYCH 101, Summer 2008
       • Grade data, attendance data, IDEA survey, clicker survey, and observational protocols all part of data gathering
    13. SALG
       • How much did the following aspects of the class help your learning?
       • Rated from 1 (no help) to 5 (great help)
    14. Summer 2008 Fully Online: SALG Data
    15. SALG Data over time & format
    16. Community of Inquiry Survey
       • Statements rated from 1 (strongly disagree) to 5 (strongly agree)
       • Based on the Community of Inquiry framework’s three elements: Social Presence, Cognitive Presence, Teaching Presence
    17. Community of Inquiry Survey
    18. What would you like to further explore?
    19. Image sources