Name
School
Department
ACTIVE ENGAGEMENT USING
CLASSROOM RESPONSE SYSTEMS
@ CSU PUEBLO FACULTY DEVELOPMENT WORKSHOP
DR. JEFF LOATS
DEPARTMENT OF PHYSICS
THE TECHNOLOGY CHALLENGE
2
“The challenge is not simply to incorporate
learning technologies into current institutional
approaches, but rather to change our
fundamental views about effective teaching and
learning and to use technology to do so.”
(Higher Education in an Era of Digital Competition, Donald E. Hanna)
GUIDING PRINCIPLES
3
Technology is not an educational panacea
Seek tools that offer new approaches
As always, let evidence guide our attention
OVERVIEW
4
1. Motivation for change
2. Peer Instruction
3. Technology options
4. Question types
5. Practice writing questions
6. Evidence for effectiveness
7. Summaries
5
In (roughly) what area do you teach?
A) Humanities
B) Natural sciences & mathematics
C) Professions & applied sciences
D) Social sciences
E) Teacher education
6
Are you currently using clickers or another
classroom response system in your courses?
A) I have never used them.
B) I have used them before, but don’t currently.
C) I use them currently in at least one class.
PHYSICS EDUCATION REVOLUTION
Eric Mazur, Physicist at Harvard:
7
8
“ALL SIMILARLY (IN)EFFECTIVE…”
9
University of Washington
University of Colorado
University of Illinois
at Urbana-Champaign
FEEDBACK THAT WORKS
10
“Improvement of performance is actually a
function of two perceptual processes. The
individual’s perception of the standards of
performance, and her/his perception of his/her
own performance.”
The Feedback Fallacy – Steve Falkenberg (via Linda Nilson)
TECHNOLOGIES VS. TECHNIQUES
11
Clickers
Colored cards
Hands
Virtual response tools
Peer Instruction
Factual recall
Polling/survey
Poll-Teach-Poll
Thought Questions
Teach-Test-Retest
… adding metacognition
MAZUR’S PERSONAL REVOLUTION
12
(added) Pre-class reading, enforced
(removed) Most sample problems
(removed) Derivations
(modified) Lecture broken up into small bites
(added) Depth over coverage
(added) ConcepTests with Peer Instruction
PEER INSTRUCTION
13
Multiple choice questions
–Conceptual
–Hard
1. Students answer Individually
2. Discussion with peers
3. Students answer post-discussion
4. Class-wide discussion
Students have developed a robot dog
and a robot cat, both of which can
run at 8 mph and walk at 4 mph.
At the end of the term, there is a race!
The robot cat must run for half of its
racing time, then walk.
The robot dog must run for half the
racing distance, then walk.
Which one wins the race?
A) Robot cat B) Robot dog C) They tie
14
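The arithmetic behind this ConcepTest can be checked with a short sketch (not part of the original slides; the race distance is arbitrary, and the winner does not depend on it):

```python
# Both robots run at 8 mph and walk at 4 mph.
RUN, WALK = 8.0, 4.0
distance = 1.0  # miles; any positive value gives the same winner

# Cat: runs for half its total racing time, walks the other half,
# so its average speed is the simple mean of the two speeds.
cat_speed = (RUN + WALK) / 2            # 6.0 mph
cat_time = distance / cat_speed

# Dog: runs the first half of the distance, walks the second half.
dog_time = (distance / 2) / RUN + (distance / 2) / WALK
dog_speed = distance / dog_time         # 16/3 ≈ 5.33 mph

print(f"cat: {cat_time:.4f} h  dog: {dog_time:.4f} h")
# The cat finishes first: equal *time* at each speed beats
# equal *distance* at each speed, because the slow leg eats more time.
```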
MAZUR AFTER 1 YEAR
15
ELSEWHERE?
16
WHY CLICKERS?
17
Alternatives:
–Hand raising
–Numbered/colored cards
Anonymity + secrecy → honesty
Inclusive
Fast
Credit for learning
STILL CLICKERS?
18
Hardware clickers are (basically) obsolete
Good options:
• Poll Everywhere
• Top Hat Monocle
• Learning Catalytics
TECHNOLOGIES VS. TECHNIQUES
19
Clickers
Colored cards
Hands
Virtual response tools
Peer Instruction
Factual recall
Polling/survey
Poll-Teach-Poll
Thought Questions
Teach-Test-Retest
… adding metacognition
FACTUAL RECALL
20
Rated poorly by students
Usually requires high stakes
Good uses: Reading quiz or diagnostic?
What is the correct expression for the area of a
circle?
A) e ∙ r
B) e ∙ r²
C) π ∙ d
D) π ∙ r²
E) π ∙ r
21
POLLING/SURVEY
22
Share without risk
Comparison statistics
Controversial topics are engaging
Do you feel you were treated fairly at all
levels of review when you had your most
recent professional review (renewal, tenure,
promotion, etc.)?
A) Yes
B) No
First: Women only; Second: Men only
23
How large of an effect does bias have in the social
sciences? [Measurement was of faculty
responsiveness to prospective student emails.]
A) Women/minorities do worse by ~11%
B) Women/minorities do worse by ~3%
C) No difference across gender/ethnicities
D) Caucasian males do worse by ~3%
E) Caucasian males do worse by ~11%
24
POLL-TEACH-POLL
25
1. Poll but don’t show results
2. Teach
3. Poll again (explore shifts in attitude)
Peer sharing for added metacognition
Insightful results for instructor
Which best describes your feelings about female
circumcision/female genital mutilation?
A) I am writing letters to the WHO to protest.
B) To each their own… we shouldn’t interfere
with another culture.
C) What is the big deal… males around the world
are circumcised.
D) I don’t know anything about it.
26
THOUGHT QUESTIONS
27
• Choose a relevant open-ended question.
• Small group discussion
• Presentation & defense by a single group
• Class votes: Agree/Disagree/Don’t know
• If threshold isn’t met… next group presents!
Repeat until majority agrees
Created by Teresa Foley & Pei-San Tai
from the CU Integrative Physiology Department
Endocrinology:
What would you predict would happen to the
ovulatory frequency if one ovary were removed?
Immunology:
Given that all blood cell types derive from the
pluripotent hemopoietic stem cell, why are there
so many different types of cells in the immune
system?
28
TEACH-TEST-RETEST
29
Skill focused questions
Diagnostic and formative assessment
Repeated testing beats repeated studying!
ADDING METACOGNITION
30
“… and I can explain why”
“… but I don’t know why”
Good for two-choice questions
Adds to formative assessment value
GOOD QUESTIONS
31
WRITE A QUESTION AND SHARE...
32
Imagine an introductory course and a topic early
in that course.
Write a question, then share
Peer Instruction
Factual recall (add metacognition?)
Polling/survey
Poll-Teach-Poll
Thought Questions
Teach-Test-Retest
THE EVIDENCE STANDARD
33
Quick/easy attendance in large class sizes.
Provides anonymity (Banks, 2006).
Every student participates (Banks, 2006).
Encourages active learning (Martyn, 2007).
THE EVIDENCE STANDARD
34
Improved concentration (Hinde & Hunt, 2006).
Improved learning and retention (Moreau, 2010).
Improved exam scores (Poirier & Feldman, 2007).
Efficient use of class time (Anderson et al., 2011).
STUDENT FEEDBACK ON CLICKERS
315 students in 7 classes over 4 terms (roughly ±6%)
Rated on a 5-point scale (strongly disagree to strongly agree)
The use of iClickers, and activities
that used them have…
Agreed or
Strongly Agreed
…helped me to stay more engaged
in class than I would otherwise be.
93%
…helped me to learn the material
better than I otherwise would
83%
…been worth the cost to buy them 78%
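The “roughly ±6%” above is consistent with a standard 95% binomial margin of error for a sample of 315 students (a sketch, not from the slides):

```python
import math

n = 315      # students surveyed
p = 0.5      # worst-case proportion, gives the widest margin
z = 1.96     # z-score for 95% confidence

margin = z * math.sqrt(p * (1 - p) / n)
print(f"±{margin:.1%}")  # ≈ ±5.5%, i.e. "roughly ±6%"
```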
WHAT MIGHT STOP YOU?
36
In terms of the technique:
Time, coverage, not doing your part, pushback…
In terms of the technology:
Learning curve, tech. failures, perfectionism…
In any reform of your teaching:
Reinventing, no support, too much at once…
BEST PRACTICES
37
Start small – 5 min of each hour of class
Sell it – Be explicit about why
Be consistent – Nearly every class
Engage students – Wait for explanation
Demonstrate value – Focus on wrong answers
Follow up – Assessments must change
Credit – 2%-15% for participation… mostly.
MY SUMMARY
38
Classroom response systems can be integrated
into most teaching styles and disciplines to good
effect.
From an evidence-based perspective, classroom
response systems address often-neglected
areas.
As with all reforms, be prepared to find that
students know less than we might hope.
YOUR SUMMARY
39
For yourself… or to share?
What part of using a classroom response system
is the fuzziest for you after this?
What is the biggest reason you think trying a
classroom response system might work well?
Contact Jeff: Jeff.Loats@gmail.com
Today’s slides: www.slideshare.net/JeffLoats


Editor's Notes

  • #2 “Learning technologies should be designed to increase, and not to reduce, the amount of personal contact between students and faculty on intellectual issues.” (Study Group on the Conditions of Excellence in American Higher Education, 1984)
  • #4 “Learning technologies should be designed to increase, and not to reduce, the amount of personal contact between students and faculty on intellectual issues.” (Study Group on the Conditions of Excellence in American Higher Education, 1984)
  • #6 Asking
  • #7 Asking
  • #8 About 20 years ago, physics teachers began treating education as a research topic! Their findings were pretty grim: “But the students do fine on my exams!” It appeared that students had been engaging in “surface learning,” allowing them to solve problems algorithmically without actually understanding the concepts.
  • #9 Was this just at Harvard? (Silly question!) Data came from high schools, 2-year and 4-year colleges, universities, etc. A 0.23 Hake gain on the FCI means that of the Newtonian physics they could have learned in physics class, students learned 23% of it. Conclusion: Traditional physics lectures are all similarly (in)effective at improving conceptual understanding.
  • #10 Enter Physics Education Research: an effort to find empirically tested ways to improve the situation.
  • #13 Students had fragile knowledge from engaging in surface learning. Obviously this isn't what he thought he was teaching. A very “learner centered” move. Class time is now almost entirely focused on tackling tough conceptual questions. He later shifted to JiTT instead of reading quizzes.
  • #14 Carefully chosen questions. Ideally, 30–70% answer correctly on the first try. In 95% of cases, students “migrate” toward the correct answer, often dramatically. Explanation and discussion by the instructor follows the second “vote,” as necessary. In my class, participation is required (5% of final grade) but correctness is not.
  • #16 Is this just about new energy being put into an old class? (This is a difficult confounding factor in assessing new teaching techniques.)
  • #17 Is this just about new energy being put into an old class?
  • #21 Can occasionally serve as a reading quiz or as a lecture diagnostic. Students rate these questions as much less valuable compared to harder, deeper questions. If these are “high-stakes,” technical difficulties become even more problematic.
  • #23 Polls or surveys provide a way for students to express their opinion on topics when they otherwise might remain quiet. Polls or surveys can be used as a comparison with statistics in the text. Students enjoy voting on controversial topics. The classroom can quickly become more animated after viewing the results.
  • #25 Actual results: Social science faculty responded 2.5% more to white men (71% vs. 68.5%). Natural/physical sciences and math faculty responded 5.9% more (69.8% vs. 63.9%).
  • #26 Polling first (without showing the results), then teaching, then polling again allows tracking of changes in student attitudes or opinions. Can enhance critical thinking (analyzing, evaluating). Increases metacognition (partner/peer share). Provides quick checks on knowledge and understanding of material.
  • #28 Start by choosing a learning goal to assess. Develop an open-ended application/prediction question for the goal. Present the question, organize groups of 3–4 students, and allow 5–7 minutes for discussion. One group presents their answer and rationale. The class votes on the rationale: Agree/Disagree/Don’t know. If the majority of the class disagrees, another group gets to offer their answer and rationale. Repeat the last two steps until the majority agrees.
  • #30 Reference for repeated testing beating out repeated studying is: Roediger & Karpicke, 2006
  • #34 Major caveat: Using a classroom response system does not automatically bring these benefits. The method matters much more than the means. Hinde & Hunt: We survey 219 first-year business studies students tackling introductory economics, and find that the technology enhances learning in lectures because, among other things, it improves concentration, provides instantaneous and more effective student feedback, and allows students to make comparisons on how well they fare relative to their peers. Moreau, 2010: Overall, the experimental group scored higher on the posttest than the control group, and weak students in the experimental group made more improvement as measured by the posttest than similar-ability students in the control group. Poirier & Feldman, 2007: There are reports of modest increases in exam grades when instructors use clickers to test concepts and probe opinions in large sections of introductory psychology. Anderson et al., 2011: Compression (dropping topics that are well understood) based on group- or individual-level performance caused no decrease in learning compared to no compression.
  • #38 The 10% rule, or 5 min/hour. Clicker questions should be worth credit (2%–15%). Give 2–3 “free days” to reduce complaints/excuses. Grading based on correctness: distorts student discussion and learning strategies; limits question types; leads to technology/human issues (“I meant to hit C!”). Grade based on participation (mostly?): immediate feedback without penalty seems key to learning; allows for “bad” questions that are still great for learning. Occasional “must be correct” questions can motivate.
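The 0.23 Hake gain mentioned in note #9 is the normalized gain g = (post − pre) / (100 − pre): the fraction of the available improvement a class actually achieved. A minimal sketch (the pre/post scores below are hypothetical, chosen only to reproduce the quoted 0.23):

```python
def hake_gain(pre_pct: float, post_pct: float) -> float:
    """Normalized (Hake) gain: fraction of the possible improvement
    achieved, with scores given as percentages (e.g. on the FCI)."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical FCI class averages that yield the quoted gain of 0.23:
g = hake_gain(40.0, 53.8)
print(round(g, 2))  # 0.23
```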