Qualitative and quantitative student assessment

Workshop given by Gail Matthews-DeNatale to the 2009 NERCOMP conference on assessment in higher education.

  • Assessment is a form of research, whether it’s an investigation into what students know/understand, how a program is going, etc.
  • The definition of what's being assessed is integral to defining a strategy for assessment. For example, in the 21st century, the meaning of information is made across a range of media and formats. MIT, UC Santa Barbara, and Simmons are all moving in the direction of transliteracy, defined as the ability to find, assess the credibility of, analyze, read, and author across a range of platforms, tools, and media: texts, visualizations, multimedia, video, etc.

    1. Qualitative and Quantitative Student Assessment
       Gail Matthews-DeNatale, Simmons College
    2. Preliminary Caveat
       • Presentation based on a two-day workshop for NERCOMP
       • Developed in collaboration by Brandeis, Simmons, and Bryn Mawr
       • Workshop agenda/files located at http://www.nercomp.org/events/event_single.aspx?id=1763 and http://www.nercomp.org/events/event_single.aspx?id=5734
    3. The Cycle of Assessment … is a cycle of research:
       Identify Issue(s) → Define Research Question → Design, Implement Strategy → Interpret Data → Act Upon Findings
    4. Issues to Consider
       • Formative vs. summative: how do you plan to use the results?
       • To inform future work, OR to evaluate the accomplishments of students
       • To inform program and workshop offerings, OR to evaluate the effectiveness of programs that are in place
       • Be clear, up front, with yourself and with your students
    5. Issues to Consider: The 50,000-Foot View
       • Topic: What is it that you're trying to assess?
       • Purpose: Why is it important to assess?
       • Central Question: What do you want to know?
       For example:
       Topic: Student information literacy
       Purpose: Ensure college preparedness
       Question: In what areas do students need to improve?
    6. Issues to Consider: But It's Not That Simple
       • What do we mean by "information literacy"?
       • What expectations do colleges have for the information literacy of their incoming students? Does this expectation vary depending on the type of school?
       • How do we intend to use the assessment to improve student preparedness? Work with individual students? Change program offerings?
    7. Research Focus & Question
       What, specifically, is being assessed? For example, "information literacy" as convergence culture or as transliteracy.
       • University of California, Santa Barbara, Transliteracies Project, http://transliteracies.english.ucsb.edu
       • Howard Rheingold, 21st Century Literacies, http://www.blip.tv/file/2373937
    8. Assessment Design Strategy: Begin with Outcomes
       Thought experiment: What would an "information literate" person DO with the following?
    9. Assessment = Evidence
       What type of assessment(s) will yield the desired evidence? How will you know what students DID with these examples?
    10. Option: Qualitative Assessment
        • Open-ended question
        • Focus group
        • Think-aloud
        • ePortfolio
        • One-on-one interview
        • Observation or shadowing
        • Role play or simulation
        • Thought experiment
        Used in conjunction with a novice-to-expert rubric.
    11. Option: Quantitative Assessment
        Surveys, T/F and multiple-choice tests. Sounds simple, but the devil is in the details!
        • Is the sample statistically significant?
        • Are the questions "testing" what you need to know?
        • Are the questions clear?
        • Do you have the expertise to analyze results?
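The sample-size question above can be made concrete with a confidence interval around a survey proportion: a wide interval signals that the sample is too small to support firm conclusions. Below is a minimal sketch using the normal-approximation (Wald) interval; the figures (48 of 200 students choosing the credible site) are hypothetical illustrations, not data from the workshop.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """95% normal-approximation (Wald) confidence interval for a survey proportion.

    Returns (point estimate, lower bound, upper bound), clamped to [0, 1].
    """
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)  # standard error times z
    return p, max(0.0, p - margin), min(1.0, p + margin)

# Hypothetical survey: 48 of 200 students chose the most credible site
p, lo, hi = proportion_ci(48, 200)
print(f"p = {p:.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")
```

A result of 24% with a 200-student sample carries a margin of error of about six percentage points either way; with only 50 respondents the interval roughly doubles, which is often the difference between an actionable finding and a suggestive one.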
    12. Qualitative vs. Quantitative: A False Dichotomy
        Quantitative tells you "what"; qualitative tells you "why."
    13. For Example
        Quantitative (multiple choice): You are doing background research for a science project. In addition to perusing library sources, you decide to go online and see what's available. You come across two web sites. Which site do you think is more credible?
        Qualitative (open-ended): Why?
    14. Results
        Only 24% selected the most credible site for the appropriate reasons.
    15. Results
        Students reasoned that a website is more credible if it …
        • has a .org domain name (more credible than .gov)
        • has lots of "information"/links
        • has a "shorter," "easy," "straightforward" URL
        • includes information on how to "send feedback" and/or contact the webmaster/designer
        • "comes from the United States"
        • "claims to be unbiased"
        • is "official," or has a name that "sounds official" (such as "Centers" and "Organizations")
        • links to other "well-known organizations"
        • includes advertising … a search option … press releases … has a "professional" look
    16. Use Results to Guide Teaching and Inform Program Development
        "A Comparison of Student Perceptions of their Computer Skills to their Actual Abilities" (NC Central University)
        Focus: Difference between perception and performance
        Results: Some areas of self-perception vs. ability were accurate; others were significantly divergent.
        Action: As a result of this research, the curriculum for the introductory course was redesigned to focus on the areas of divergence.
        http://jite.org/documents/Vol8/JITEv8p141-160Grant428.pdf
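A perception-vs-performance comparison like the one above is typically analyzed as paired data, since each student supplies both a self-rating and a measured score. The sketch below computes the mean difference and a paired t-statistic from scratch; the five-student scores (on a shared 0-100 scale) are hypothetical, and the linked NCCU study used its own instruments and analysis.

```python
import math
import statistics

def paired_t(perceived, actual):
    """Mean difference and paired t-statistic for matched score pairs.

    A large positive t suggests students systematically over-rate their skills.
    """
    diffs = [p - a for p, a in zip(perceived, actual)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample standard deviation of differences
    t = mean_d / (sd_d / math.sqrt(len(diffs)))
    return mean_d, t

# Hypothetical data: self-rated skill vs. scored performance, same 0-100 scale
perceived = [80, 75, 90, 60, 85]
actual = [65, 70, 72, 58, 80]
mean_d, t = paired_t(perceived, actual)
print(f"mean over-estimate = {mean_d:.1f} points, t = {t:.2f}")
```

The t-statistic would then be compared against a t-distribution with n-1 degrees of freedom; for program decisions, the per-topic direction and size of `mean_d` is usually the more actionable number, since it points at the specific skills to target in a redesigned course.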
    17. Cycle of Assessment, Full Circle!
        Identify Issue → Define Research Question → Design, Implement Strategy → Interpret Data → Act Upon Findings
    18. Gail Matthews-DeNatale, Simmons College, [email_address]
