Faculty college_Gurung 2013
The complete set of slides for my workshop. Please do not distribute/share.

Speaker notes

  • You don’t really have to be able to read it : ) A LOT OF FACTORS!! This model is from Noel Entwistle in England.
  • An overview of the construct validity concepts.
  • Headings for this section.
  • Screenshot from p. 147
  • I often cover question wording on the same day I go over the first exam. That leaves less time for the topic of question wording. Most students find this material fairly easy to understand so I don’t lecture extensively on it. Instead, I use the activity, “How Amazing My Class Is”, discussed in the Instructor’s Manual, p. 60. I present the survey first, and ask students to nominate the problems with it. (I take out the bold headings that are shown in the sample in the IM). As students suggest different problems, I point to the heading on this slide or the next. We might suggest alternative ways of wording the questions to improve them. The activity takes 10 to 15 minutes.
  • I use this figure from the text (Figure 6.2) as a way to review key elements of a good survey. I might discuss here how the questions are simple and the rating system is straightforward. And I might mention that there is no reverse-worded item in this scale, and what that might mean for construct validity. (Students may see that in the context of the other evidence supporting the construct validity of this scale—reviewed in Chapter 5—the lack of a reverse-worded item is not that much of a problem.)
  • Chapter 10 covers nine new threats to internal validity; combined with the three from Chapter 9 (design confounds, selection effects, and order effects) we get an even dozen. The threats in bold above are especially relevant to the Really Bad Experiment.
  • This article documents a scholarship of teaching and learning project designed to help literature students cultivate the core disciplinary skill of reading for complexity. We offer a close reading of student responses from a collaboratively designed lesson to understand what happens when students read complex texts in introductory literature courses. “Pressing an Ear against the Hive”: Reading Literature for Complexity. Pedagogy, 2009, Volume 9, Number 3: 399-422.

Transcript

  • 1. 1. Name 2. Discipline 3. Your biggest classroom frustration. 4. What do YOU want from this workshop?
  • 2. “The essence of skillful teaching lies in the teacher constantly researching how her students are experiencing learning and then making pedagogical decisions informed by the insights she gains from the students’ responses.” —Stephen D. Brookfield
  • 3. Contemporary Issues in SoTL; The Big Picture; Major Research Designs; Threats to Validity of SoTL; Key Variables in the Study of Learning; Evidence-Based Teaching in Higher Education
  • 4. SoTL; Going Through Motions; Sincere Teaching; Scholarly Teaching
  • 5. Gurung & Schwartz (2009) adapted from Richlin (1993)Systematic, IntentionalModificationsKNOWLEDGE BASEABOUTTEACHING/LEARNINGAssess SuccessPublicationPeer ReviewScholarly TeachingReflect onTeaching/LearningShare resultsPresentation
  • 6. Aubrey Stoll: http://500px.com/photo/9827809
  • 7. “…the systematic study of teaching and learning, using established or validated criteria of scholarship, to understand how teaching (beliefs, behaviours, attitudes, and values) can maximize learning, and/or develop a more accurate understanding of learning, resulting in products that are publicly shared for critique and use by an appropriate community.” (Potter and Kustra, 2011, p. 2)
  • 8. Pedagogical research is scholarship too! From the Greeks to James … to … Boyer; Hutchings & Shulman (1999)
  • 9. Learning To Think ▪ Donald (2002). Decoding Disciplines ▪ Identify bottlenecks ▪ Pace & Middendorf (2004). Signature Pedagogies ▪ Teach students your discipline’s habits of mind ▪ Gurung, Chick, & Haynie (2009); Shulman (2005). Threshold Concepts ▪ Teach students fundamental/troublesome concepts ▪ Land, Meyer & Smith (2008); Meyer & Land (2003)
  • 10. Infiltrate the Mainstream; Run Interference; Look at the Big Picture; Catalyze SoTL Use
  • 11. Goal: Make SoTL even more visible. Where do you publish your SoTL? ▪ See Weimer (2008). Where can you publish it? Where is your ‘Commons’? Break new ground ▪ SoTL tracks ▪ SoTL journals
  • 12. Goal: Facilitate more SoTL; Connect with higher administration; Become higher administration; Network of department chairs; Mentoring & Advocacy
  • 13. Goal: Partition out influences on learning; Models of Teaching and Learning ▪ What’s on your notepad?; Empirical data: Meta-meta-analyses
  • 14. Traditional lecture; Active learning; Service learning; Problem-based learning; Group learning; Mentoring; Cooperative learning; Discovery learning; Inductive learning; Learning by example; Inter-teaching; Desirable difficulty; Learner centered; Curriculum centered; On-line teaching; Clickers; PowerPoint; Overheads; Chalk talks; Teachable moments; Universal design of instruction; CAP Model; Multiple intelligences; Kolb’s learning styles; Journaling; Reflective practice; Reciprocal teaching; Uncoverage; Concept maps; Question generation; Film strips; Laboratory-based instruction; Video clips; Role playing; Modeling; Programmed instruction; Keller method; Skill practice; Guided practice; Collaborative learning; Apprenticeship; Situated learning; Authentic assessment; Formative assessment; Classroom research techniques; Book reports; Class discussion; Small group discussion; Think-pair-share; Peer instruction; ConcepTests; Panel of experts; Brainstorming; Case studies; Worksheets; Guest speakers; Student debates; Jeopardy; Portfolios; Posters or bulletin boards; Flashcards; Research papers; Interviewing; Lecture with discussion; Oral reports; Study abroad; Mock convention; Textbook assignments; Just-in-time teaching; Jigsaw method; Wikis; Team teaching; Socratic method; Modules; Podcasts; Internships or practicums
  • 15. (Repeats the full list of pedagogies from slide 14.)
  • 16. Entwistle (2009)
  • 17. Knowledge; Course Design; Teacher-Student Interaction; Course Management; Beginning of the Course. Fink, L. D. (2003).
  • 18. Dimension 1: Intellectual Excitement; Clarity of Presentations (what is presented); Emotional Impact on Students (way material is presented). Dimension 2: Interpersonal Rapport; Awareness of the Interpersonal Nature of the Classroom; Communication Skills that Enhance Motivation and Enjoyment of Learning and that Foster Independent Learning. ~Lowman, J. (1995).
  • 19. [Model diagram: Student, Teacher, Knowledge, Learning Technique, Social Context (rapport), Intellectual Excitement]
  • 20. [Model diagram: Topic, Content, and Learning Goals; Level of Student Understanding; Characteristics of the Teacher; Characteristics of the Learner; Teaching Strategies; Learning Strategies; Form of Assessment; Student-Teacher Rapport and Classroom Atmosphere; Pre-event, In-the-Moment, and Post-event Reflection; Monitor, Manage, Manipulate]
  • 21. Groccia’s (2012) 7-Component Model: Background, preparation, and individual characteristics; Understanding the ways that humans learn; Classroom design, technology, and institutional priorities; Content difficulty, relevance, organization, and accuracy; Teaching technique, teacher behaviors, and student learning activities; Desired results of teaching, short- and long-term goals, and assessment practices. From St. Clair, K. L., & Groccia, J. E. (2012). Change to social justice education: A higher education strategy. In Skubikowski, K., Wright, C., & Graf, R. (Eds.), Social justice education: Inviting faculty to transform their institutions. Sterling, VA: Stylus.
  • 22. Making SoTL accessible; National SoTL infrastructure ▪ (Poole, Taylor, & Thompson, 2007); Building on MERLOT; SoTL Electronic Repository; The psychology of teaching: An empirically based guide to picking, choosing, & using pedagogy; The MetaSearch Project; Tackle cross-cutting questions ▪ What are the processes most linked to learning?
  • 23. Making Learning More Visible: Lesson Study; Blooming Biology Tool (Crowe, Dirks, & Wenderoth, 2008); Question-Eliciting-Questions (Dickman, 2009)
  • 24. HOW do you use SoTL? Change course design? Modify assessments? Tell students about SoTL results? Main uses (McKinney & Jarvis, 2009; Meyers, 2009)
  • 25. Reflect & Review: What are students learning? How can you do better? What’s been done? Focus & Change: What’s YOUR question? What will YOU do? Assess & Evaluate: How will you do it? Did it work? Share & Respond: Present; Publish.
  • 26. Reflect & Review: How are students learning? What can you do better? What’s been done?
  • 27. [Diagram: Course; Goal; SLO; Assessment (each course goal maps to SLOs, and each SLO to an assessment)]
  • 28. Pedagogy; Content; Text; Readings; Methodology; Lecture; Discussion; Assignments; Papers; Service Learning; Oral Presentation; Debate; Lab
  • 29. Backward Design (Fink, 2003; Wiggins & McTighe, 1998): 1. Articulate learning goals first / determine learning objectives (outcomes); 2. Determine assessment methods/techniques; 3. Select pedagogical/teaching strategies. Action Research.
  • 30. Landrum, 2012
  • 31. Landrum, 2012
  • 32. Landrum, 2012
  • 33. (Gurung & Landrum, 2012)
  • 34. Focus & Change: What’s YOUR question? What will YOU do?
  • 35. Someone teaches Something to Someone else Somewhere (Schwab, 1973). Teacher: Scrutinize your assignments. Material: Textbook evaluations. Students: How do students study? Context: Online, hybrid, face to face.
  • 36. One problem you encounter in your courses, such as: a student behavior you would like to change; a learning objective you want to better achieve.
  • 37. Flip your classroom; Use problem-based learning in a class; Add a case study approach; Introduce service-learning components; Teach without a textbook; Have students construct learning portfolios; Increase the amount of writing, music, visuals, or reflection used in class.
  • 38. One problem you encounter in your courses: What solution might you use to address the problem?
  • 39. Learning To Think ▪ Donald (2002). Signature Pedagogies ▪ Teach students your discipline’s habits of mind ▪ Chick, Haynie, & Gurung (2012) ▪ Gurung, Chick, & Haynie (2009)
  • 40. One problem you encounter in your courses: What solution might you try to address one of these problems? How will you assess the success of your solution? What evidence will you collect?
  • 41. Quantitative vs. Qualitative; Methodology vs. Analysis
  • 42. [Research design tree: Describe (Qualitative, Quantitative); Correlate; Compare (Pre-Post, Groups: across semesters, within semesters, within classes)]
  • 43.  Watch Classroom Observation Content Analysis▪ Develop a coding scheme (categories, rubrics)▪ Units of analysis (words, turns-at-talk)▪ RaterTraining & Reliability Ask Survey Focus Group Protocol Analysis
  • 44. Retention over the term; Journal evidence; Student discussion increases; Student preparation improves; Student evaluations improve; Portfolio showcasing student work; Classroom assessment techniques
  • 45. Quantitative survey scores course exam, project,paper scores frequencies of multiplechoice test itemresponses standardized scales andtests counts (participation,web requests, officevisits) measures of time use institutional researchdataQualitative performances interviews focus groups student projects term papers essay items exams reflective statements journals reports of others
  • 46. Assess & Evaluate: How will you do it? Did it work?
  • 47. Statistical significance (SPSS; Excel); Correlational r (ranges from -1 to 1); t-tests; Analysis of Variance (ANOVA). (See the correlation and t-test sketch at the end of the transcript.)
  • 48. [Bar chart: textbook learning aids (Outlines, Summaries, Bold, Italics, Key terms, Review) rated on a 0-4 scale; Gurung, 2003, 2004]
  • 49. [Bar chart: correlations (axis from -0.15 to 0.15) for Outlines, Boldface, Review, Key Terms; Gurung, 2004]
  • 50. [Chart: correlations to exam score (.18, .15, .19) for the online tools Connect, MyPsychLab, and Psychportal]
  • 51. Comparing group differences? [Bar chart: exam scores (axis 65-80) for Guess/Open vs. Read First]
  • 52. Compare classes; ethical counterbalance; intervention; random assignment. Class: Section 1: Novelty, Test, Nothing, Test; Section 2: Nothing, Test, Novelty, Test.
  • 53. Compare groups; ethical counterbalance; intervention; random assignment. Class: Group 1: Novelty, Test, Nothing, Test; Group 2: Nothing, Test, Novelty, Test. (See the counterbalancing sketch at the end of the transcript.)
  • 54. Choosing question formats; Writing well-worded questions; Encouraging accurate responses
  • 55. Open-ended questions; Forced-choice format; Likert scale; Semantic differential format. (See the Likert-scoring sketch at the end of the transcript.)
  • 56. Leading questions; Double-barreled questions; Double negatives (vs. negatively worded items); Question order
  • 57. Response sets (three types): Yea-saying/nay-saying; Fence sitting; Faking good or bad. Saying more than we can know; Measuring subjectivity vs. objectivity.
  • 58. Your Name: What do you look like? [anonymity] Yesterday we talked about validity. Did I do a good job of explaining it? a. Absolutely! b. Completely c. Very much yes d. Yes e. Mostly yes [scale choice] Is this your favorite workshop and do you have two legs? Yes/No [double barrel] Do you favor reducing the overwhelming amount of homework you are forced against your will to do? Yes/No [loaded] If you weren’t to advocate not doing your homework, would you also not advocate not increasing the amount of reading you don’t feel is too much already? A. No B. Not no [double negative] Do you enjoy being alive? Yes/No Do you think humans should keep having children? Yes/No [yea-saying response set] Will you give me a chili pepper on ratemyprofessors.com? Yes/No
  • 59. Fence sitting possible; No labels on all items; Too many choices
  • 60. Design confound; Selection effect; Order effect; Maturation; History; Regression to the mean; Attrition; Testing; Instrumentation; Observer bias; Demand characteristics; Placebo effects
  • 61. Individual results may vary.
  • 62. To publish or present: get Institutional Review Board (IRB) clearance ▪ Gurung (2012)
  • 63. Share & Respond: Present; Publish.
  • 64. John Hattie (2009): 800+ meta-analyses; 50,000 studies; 240+ million students; elementary, secondary, & tertiary
  • 65. [Chart: percentage of learning variance attributed to Students, Lecturers, Home, Peers, TEOs, Others]
  • 66. [Hattie’s effect-size barometer: Decreased, Zero, and Enhanced zones on a 0 to 1.0 scale, with markers at .22 and .40 (typical effect size)] (See the effect-size sketch at the end of the transcript.)
  • 67. Rank / Influence / Studies / Effects / ES:
    1. Self-reported grades / self-concept of ability: 209, 305, 1.44
    2. Piagetian programs: 51, 65, 1.28
    3. Formative evaluation of own teaching: 30, 78, .90
    4. Micro teaching: 402, 439, .88
    5. Acceleration: 37, 24, .88
    6. Classroom behavioral: 160, 942, .80
    7. Comprehensive interventions for learning disabled students: 343, 2654, .77
    8. Teacher clarity: na, na, .75
    9. Reciprocal teaching: 38, 53, .74
    10. Feedback: 1287, 2050, .73
  • 68. Strategy / Example / ES:
    Organizing & transforming: Making an outline before writing a paper (.85)
    Self-consequences: Putting off pleasurable events until work is completed (.70)
    Self-instruction: Self-verbalizing the steps to complete a given task (.62)
    Self-evaluation: Checking work before handing in to teacher (.62)
    Help-seeking: Using a study partner (.60)
    Keeping records: Recording of information related to study tasks (.59)
    Rehearsing and memorizing: Writing a mathematics formula down until it is remembered (.57)
    Goal-setting/planning: Making lists to accomplish during studying (.49)
    Reviewing records: Reviewing class textbook before going to lecture (.49)
    Self-monitoring: Observing and tracking one’s own performance and outcomes (.45)
    Task strategies: Creating mnemonics to remember facts (.45)
    Imagery: Creating or recalling vivid mental images to assist learning (.44)
    Time management: Scheduling daily studying and homework time (.44)
    Environmental restructuring: Efforts to select or arrange the physical setting to make learning easier (.22)
  • 69. Individual results may vary. [Bar chart (axis 0 to 0.6): ACT/SAT, High school, SES, Skills, Self-efficacy, Commitment, Goals, Motivation; Robbins, Lauver, Le, Davis, Langley, & Carlstrom (2004)]
  • 70. Cognitive: ACT, SAT, high school GPA. Non-cognitive: Conscientiousness (Noftle & Robins, 2007); Intrinsic motivation (Komarraju, Karau, & Schmeck, 2009); Self-efficacy; Academic discipline.
  • 71. [Diagram: Learning = Instruction x Effort]
  • 72. Mental learning models; Learning orientations; Context. Richardson, 2011.
  • 73. [Screenshot of article first page] Improving Students’ Learning With Effective Learning Techniques: Promising Directions From Cognitive and Educational Psychology. John Dunlosky, Katherine A. Rawson, Elizabeth J. Marsh, Mitchell J. Nathan, and Daniel T. Willingham. Psychological Science in the Public Interest, 14(1), 4-58 (2013). doi:10.1177/1529100612453266. Summary: Many students are being left behind by an educational system that some people believe is in crisis. Improving educational outcomes will require efforts on many fronts, but a central premise of this monograph is that one part of a solution involves helping students to better regulate their learning through the use of effective learning techniques. Fortunately, cognitive and educational psychologists have been developing and evaluating easy-to-use learning techniques that could help students achieve their learning goals. In this monograph, we discuss 10 learning techniques in detail and offer recommendations about their …
  • 74. Dunlosky et al., Table 1. Learning Techniques (Technique: Description):
    1. Elaborative interrogation: Generating an explanation for why an explicitly stated fact or concept is true
    2. Self-explanation: Explaining how new information is related to known information, or explaining steps taken during problem solving
    3. Summarization: Writing summaries (of various lengths) of to-be-learned texts
    4. Highlighting/underlining: Marking potentially important portions of to-be-learned materials while reading
    5. Keyword mnemonic: Using keywords and mental imagery to associate verbal materials
    6. Imagery for text: Attempting to form mental images of text materials while reading or listening
    7. Rereading: Restudying text material again after an initial reading
    8. Practice testing: Self-testing or taking practice tests over to-be-learned material
    9. Distributed practice: Implementing a schedule of practice that spreads out study activities over time
    10. Interleaved practice: Implementing a schedule of practice that mixes different kinds of problems, or a schedule of study that mixes different kinds of material, within a single study session
    Note: See text for a detailed description of each learning technique and relevant examples of their use.
    Table 2. Examples of the Four Categories of Variables for Generalizability: Materials; Learning conditions; Student characteristics; Criterion tasks.
  • 75. Dunlosky et al. (2013), Table 4. Utility Assessment and Ratings of Generalizability for Each of the Learning Techniques (Technique / Utility / Learners / Materials / Criterion tasks / Issues for implementation / Educational contexts):
    Elaborative interrogation: Moderate; P-I; P; I; P; I
    Self-explanation: Moderate; P-I; P; P-I; Q; I
    Summarization: Low; Q; P-I; Q; Q; I
    Highlighting: Low; Q; Q; N; P; N
    The keyword mnemonic: Low; Q; Q; Q-I; Q; Q-I
    Imagery use for text learning: Low; Q; Q; Q-I; P; I
    Rereading: Low; I; P; Q-I; P; I
    Practice testing: High; P-I; P; P; P; P
    Distributed practice: High; P-I; P; P-I; P; P-I
    Interleaved practice: Moderate; I; Q; P-I; P; P-I
    Note: A positive (P) rating indicates that available evidence demonstrates efficacy of a learning technique with respect to a given variable or issue. A negative (N) rating indicates that a technique is largely ineffective for a given variable. A qualified (Q) rating indicates that the technique yielded positive effects under some conditions (or in some groups) but not others. An insufficient (I) rating indicates that there is insufficient evidence to support a definitive assessment for one or more factors for a given variable or issue.
  • 76. “The growth of any craft depends on shared practice and honest dialogue among the people who do it. We grow by private trial and error, to be sure – but our willingness to try, and fail, as individuals is severely limited when we are not supported by a community that encourages such risks.” —Parker J. Palmer
  • 77. Helen Regueiro Elam explains in “The Difficulty of Reading” (1991: 73): American culture does not take well to the idea of difficulty. Our penchant is for one-step, one-stop solutions to problems, and we expect and demand in all areas of life, including reading, an ease of achievement that is antithetical to thought itself. . . . Difficulty is there to be overcome, disposed of, certainly not to become the invisible partner of our daily lives.
  • 78. Chick, Hassel, & Haynie (2009)
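
Analysis sketches

Slide 43 mentions developing a coding scheme and checking rater training and reliability for content analysis. As a minimal sketch that is not from the workshop itself, assuming two raters have coded the same ten turns-at-talk into made-up categories, Cohen's kappa can be computed in Python like this:

    # Hedged illustration for slide 43: checking inter-rater reliability for a
    # content-analysis coding scheme with Cohen's kappa. Categories and codes
    # below are invented for the example.
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters coding the same items into categories."""
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        # Chance agreement: probability both raters pick category k, summed over k.
        expected = sum((freq_a[k] / n) * (freq_b[k] / n) for k in freq_a | freq_b)
        return (observed - expected) / (1 - expected)

    # Hypothetical codes for ten turns-at-talk ("Q" = question, "E" = explanation, "O" = off task).
    rater_1 = ["Q", "E", "E", "O", "Q", "E", "Q", "O", "E", "E"]
    rater_2 = ["Q", "E", "O", "O", "Q", "E", "Q", "E", "E", "E"]
    print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")

Values of kappa near 1 indicate strong agreement beyond chance; low values usually mean the coding scheme or rater training needs revision.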
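
Slide 47 names statistical significance, the correlation coefficient r, t-tests, and ANOVA, with SPSS and Excel as the tools. As a hedged illustration only, the same two basic analyses look like this in Python with SciPy; all data values are invented:

    # Hedged illustration for slide 47: Pearson r and an independent-samples t-test.
    from scipy import stats

    # Hypothetical data: minutes spent with an online study tool vs. exam score.
    study_minutes = [30, 45, 60, 20, 90, 75, 50, 10, 65, 80]
    exam_scores   = [72, 78, 85, 70, 92, 88, 80, 65, 84, 90]
    r, p_r = stats.pearsonr(study_minutes, exam_scores)
    print(f"Pearson r = {r:.2f}, p = {p_r:.3f}")

    # Hypothetical comparison of two class sections (e.g., guess/open vs. read first).
    section_a = [74, 70, 78, 69, 81, 77, 72]
    section_b = [80, 83, 76, 88, 79, 85, 82]
    t, p_t = stats.ttest_ind(section_a, section_b)
    print(f"t = {t:.2f}, p = {p_t:.3f}")

With more than two conditions, stats.f_oneway would give the one-way ANOVA mentioned on the slide.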
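
Slides 52-53 describe an ethically counterbalanced comparison in which each section or group receives the intervention ("novelty") during one half of the term and nothing during the other, with a test after each phase. A minimal sketch of randomly assigning groups to the two orders, using hypothetical group names, might look like this:

    # Hedged illustration for slides 52-53: an ethical counterbalance (crossover),
    # so every group eventually experiences the intervention. Group names are invented.
    import random

    random.seed(1)  # reproducible example

    def counterbalanced_schedule(groups):
        """Randomly assign each group to one of the two counterbalanced orders."""
        orders = [["novelty", "test", "nothing", "test"],
                  ["nothing", "test", "novelty", "test"]]
        shuffled = groups[:]
        random.shuffle(shuffled)
        half = len(shuffled) // 2
        return {group: (orders[0] if i < half else orders[1])
                for i, group in enumerate(shuffled)}

    for group, order in counterbalanced_schedule(["Section 1", "Section 2"]).items():
        print(group, "->", " -> ".join(order))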
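
Slides 54-57, together with the earlier note about the missing reverse-worded item in Figure 6.2, cover question formats and Likert scales. A minimal scoring sketch, with invented items, responses, and a 1-5 response scale, showing how a reverse-worded item is recoded before averaging:

    # Hedged illustration for slides 54-57: scoring a short Likert-type scale
    # that includes one reverse-worded item. Items and responses are invented.
    SCALE_MAX = 5  # 1 = strongly disagree ... 5 = strongly agree

    items = {
        "clear":   {"text": "The instructor explained concepts clearly.", "reverse": False},
        "rapport": {"text": "The instructor seemed approachable.",        "reverse": False},
        "confuse": {"text": "I often left class confused.",               "reverse": True},
    }

    def scale_score(responses):
        """Average the item responses after reverse-coding negatively worded items."""
        total = 0
        for key, value in responses.items():
            if items[key]["reverse"]:
                value = (SCALE_MAX + 1) - value  # 5 becomes 1, 4 becomes 2, ...
            total += value
        return total / len(responses)

    one_student = {"clear": 4, "rapport": 5, "confuse": 2}
    print(f"scale score = {scale_score(one_student):.2f}")  # higher = more positive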
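
Slide 66 shows Hattie's barometer, on which influences are expressed as effect sizes and d = .40 is treated as the typical effect. As an illustration with invented scores for two hypothetical groups, Cohen's d is the standardized mean difference:

    # Hedged illustration for slide 66: computing Cohen's d and comparing it
    # to the .40 hinge point. Both groups of scores are invented.
    from statistics import mean, stdev
    from math import sqrt

    def cohens_d(group_1, group_2):
        """Standardized mean difference using the pooled standard deviation."""
        n1, n2 = len(group_1), len(group_2)
        s1, s2 = stdev(group_1), stdev(group_2)
        pooled_sd = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
        return (mean(group_1) - mean(group_2)) / pooled_sd

    with_feedback    = [78, 82, 75, 88, 84, 79, 90, 81]
    without_feedback = [72, 76, 70, 80, 77, 73, 82, 75]
    d = cohens_d(with_feedback, without_feedback)
    print(f"d = {d:.2f} ({'above' if d >= 0.40 else 'below'} the .40 hinge point)")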