Anatomy of Course Redesign (TAMU presentation)
Presentation by Dr. Mike Simmons and Dr. Ronald Carriveau at the Texas A&M Assessment Conference, February 20, 2012.

Speaker Notes

  • Introduce Ron and Mike. Mike Simmons is Senior Associate Director in CLEAR, the Center for Learning Enhancement, Assessment and Redesign at the University of North Texas. Ronald Carriveau is an Outcomes, Assessment, and Measurement Specialist and QEP Assistant Director, also a member of the CLEAR team at the University of North Texas. Often when you submit a conference presentation proposal it seems brilliant in your mind. Then you show up at the conference and you realize both how much you don't know and, sometimes, just how much you do know. So far the sessions at this conference have proven more of the former than the latter (not speaking for Ron, just for me). Hopefully, though, the value we add will be in leading a discussion by planting ideas in your head and giving you sufficient time for feedback and commentary.
  • We are committed to moving quickly through the material so that there is plenty of time for discussion and questions at the end of our session. As a result, we ask you to save your questions for the end of the presentation. Slides are numbered, so if you want to come back to any specific slide, just make a note of it. What is your primary responsibility? Instructor, administrator, staff, student, other? There are many roles in the room, and in the end probably the best way to look at this is to view us as fellow beggars looking for food. Our conversation is about our efforts to provide information and tools that allow faculty to make valid instructional and course redesign decisions. Our specific focus is on providing information that is of value to the faculty without overwhelming them. We expect that most of you have similar aspirations or may be ahead of us, but there's no doubt we are all moving in this same direction. The graphic you see is one of UNT's Next Generation course redesign posters that market these courses to students. We'll use our experience with NextGen as the example for our conversation today.
  • We need to take just a minute to give you some context and background so that our current directions and ideas will make some sense. First, let me share why we did what we did. At UNT, we found ourselves in the same spot as most other institutions that endeavor to redesign courses. There are many reasons to systematically redesign courses. In our case there were three main concerns that we wanted to address: linking student learning to Student Learning Outcomes; instruction and assessment providing evidence of student outcome attainment; and demonstrating how assessment results are used to make instructional changes. As Jane Wellman and George Kuh said in the morning sessions, "intentionality matters."
  • Here's what we did: we made course redesign the focus of our QEP. We did not redesign for redesign's sake, but we chose redesign as a vehicle by which we could get at the concerns I just listed. Many of you have chosen other vehicles to reach similar destinations. We are at year 5; we're just about to submit our SACS 5th year report. We saw early on, and continue to confirm, that we needed something other than grades to provide the evidence of student outcome attainment. So we built NextGen: large undergraduate core courses, "Making Big Classes Better," using engaged learning to create higher-level learning experiences.
  • Outcome attainment measures are used to evaluate how well the class as a whole did on achieving specific learning outcomes and how well particular instructional formats worked in terms of outcome attainment. The instructional methods include combinations of in-class lecture, online instruction, and small group experiential activities. The foundation of this approach is the relationship of the item to the intended student learning outcomes and to the instructional methods used.
  • I'd like to show you the model, but as you will see, this is something that all of you are doing in one way or another, whether formally or informally. The real focus for the rest of our conversation today is to discuss how we use assessment results to inform and improve instruction.
  • In order to talk about the information needed to improve instruction, we must first spend a little time setting the scene for how we created our ruler or yardstick. We use a three-level model, and Ron will be discussing this in more detail at his session on Tuesday. For our purposes today, we just want to show you how it works in terms of the course. You might be saying to yourself, "Wait, there are four levels on that diagram." And you'd be right, but as Ron will soon discuss, the item column simply allows the specific assessments to be fully mapped to the goal levels. More on this in a moment. For now, please notice that the arrows point in different directions because, as you all know, we work in different directions at various times. Sometimes we have external or higher-level goals that are prescribed, suggested, or mandated for us, so that's where we begin. Other times, the instructor has specific outcomes that are needed for the course. Most times, this process is a combination of the two scenarios: some goals mandated and some sLOs in place. The point of the three-level model is to provide a framework for this to happen in an organized manner. Dr. Carriveau will now spend a few minutes talking about how the three-level model allows us to calculate the attainment values that serve as our yardstick in determining what can actually be improved in instruction. Dr. Kuh challenged us this morning to "show the data," so that is where we started: with a yardstick.
  • We now have the ruler: attainment values. We can begin to understand how changes in instruction have an impact on learning. We are a campus with lots of high-impact practices; NextGen contains a number of these practices, plus integrative learning. We are here to explore.
  • Let's start with what we all have in common. The big ideas are great, but most of us need to pick a simple starting point. Formative information: monitoring learning; measuring achievement on the construct scale; determining the degree to which the student is learning; determining the degree to which the student meets outcome expectations. Pre-post information: measuring gains; measuring growth (only for those with both pre and post scores); how far did the individual move; how far did the group move. Summative information: making judgments; assigning a proficiency level; assigning a grade; determining if a retake is allowed.
  • What are some other types of information that institutions will typically have? Demographic information. We are also finding that other course demographics, like size of course, time of day, room type, and student demographics, have an impact on the data. Dr. Kuh said today that when we control for variables these types of information have less impact than we think. Or do they? Perhaps it's impolite to disagree with the plenary speaker, so I won't. As Dr. Kuh said, what we do in the instruction matters more.
  • Dashboarding and visual presentation are more user-friendly and support quicker decisions.
  • The source is institutional data, in our case Institutional Research and Effectiveness. At this point, institutions are not likely to produce dashboards.
  • NextGen vs. non-NextGen.
  • Additional, specific information that we collected and use for NextGen. Again, these are not terribly unusual.
  • Pre and post
  • 1. What to Learn. A student who prefers this approach to learning focuses on facts and primarily wants only to be given "right" answers to specific questions. The teacher is viewed as an authority and the only source of information. Learning is viewed as an exchange of information and content. A clear-cut, objective testing/grading method is preferred. A straight-lecture format is preferred, with the teacher in complete control of the classroom environment. Uncertainty and fuzziness in the learning process are not acceptable.
    2. How to Learn. A student who prefers this approach to learning focuses on the methods and processes of learning, like problem solving. This person needs to have a variety of activities/methods used in class. Learning is seen as the quantity of facts, subjects, and/or methods that can be learned. Class discussions in which the opinions of others are expressed are desirable. The teacher needs to be more than just a source of facts. Challenges are enjoyable. Hard work is viewed as the primary factor in success.
    3. How to Think. A student who prefers this approach to learning focuses on independent thinking as the primary purpose of learning. The tendency is to integrate the process of learning with content. Making connections between classes and across disciplines is important. The need for presenting reasoned arguments is understood. There is a tendency to reject rote learning and memorization. Essay tests are preferred and viewed as opportunities to demonstrate thinking. Learning about self is viewed as an important part of education.
    4. How to Judge. A student who prefers this approach to learning focuses on the synthesis of different ideas and viewpoints in specific disciplines or areas. Peers in class are viewed as genuine sources of learning (in addition to the teacher). The teacher is valued as the expert, but the preference is that the teacher assume the role of facilitator and co-participant in the learning. Judging ideas and arguments begins to be based on the quality of the evidence presented. There is a tendency to be a self-directed learner, seeking new challenges on one's own. Evaluation is viewed as constructive criticism and as an opportunity for learning.
  • Categories; differences over time in the course.
  • As discussed in the plenary session earlier today, we are all chasing the holy grail of "engagement," but the classroom level seems to be where we need to know more in order to make valid instructional decisions. We are working in an active partnership with Dr. Bob Smallwood and the CLASSE team to move this conversation forward. Lots of questions, but this is as good a place to start as any.
  • Student academic behaviors and other performance data are increasingly available to institutions.
  • So what? Well, I don't know yet. But I want faculty to consider things like this regularly within the context of their own discipline. And I want the information to be easily available, without faculty overhead.
  • You just got your end-of-course results on student attainment of outcomes. For a particular online activity that was designed to help students achieve a high degree of attainment on a particular sLO, five test items were used. Four of the test items showed above 80 percent correct and one was at 45 percent correct, which of course considerably lowered the average attainment value for the sLO. Your attitude-toward-the-course-topic survey showed a slight positive result, and 70% of the students who took the format preference survey said they prefer NextGen to traditional FTF. There was also high positive agreement on the engagement survey between what you thought was important and what students saw happening in the class. Which of the following statements best represents your conclusion about the particular online activity? A. I created a great learning environment, but I need to fix that dang 45% correct item that didn't work last year either. B. I need to consider redesigning the online activity so that students will do better on attaining the sLO. C. I will make sure that next semester I spend more review time on what the 45% test item was covering.
  • Mike: Implementation of this metrics-driven approach includes constant exploration of what form data should take to be most useful, especially given the many workload and administrative challenges that a faculty member faces in an education system with ever-increasing demands on instructors. Our overall intent is to find the best ways to support faculty as they endeavor to make good instructional decisions.

Presentation Transcript

  • Anatomy of Course Redesign: How to know what works by removing the guesswork. Dr. Mike Simmons | Dr. Ron Carriveau 1
  • INTRODUCTION. This session focuses on the procedures and data collection used to provide meaningful information for use by faculty in making valid instructional and course redesign decisions. 2
  • Why We Did What We Did: • Course redesign goals and challenges with which institutions have difficulty – linking student learning to Student Learning Outcomes; instruction and assessment providing evidence of student outcome attainment; demonstrating how assessment results are used to make instructional changes. 3
  • Background – Five years of QEP implementation; providing evidence of student outcome attainment (NOT GRADES); the UNT process and methodology used to address these goals and challenges. 4
  • Next Generation Course Redesign™ • 350+ sections • 16,000+ students • 30+ courses 5
  • NextGen Outcome Based Model • Outcome attainment measures • Instructional methods • Foundation – assessment, intended student learning outcomes, and instructional methods used 6
  • The NextGen outcome-based assessment model (a cycle): develop outcome statements that tell what students are to achieve; use outcome statements to develop test items; develop assessments to measure the degree to which students are achieving and have achieved; develop instructional strategies that tell how opportunities will be provided to help students achieve; align instruction to outcome statements and to assessment; and use assessment results to inform and improve instruction, assessment, and outcomes. Source: Carriveau, R.S. (2011). Connecting The Dots: Writing Student Learning Outcomes and Outcome Based Assessments. Fancy Fox Publications, Denton, TX.
  • The Three Level Model: test items (1-20) map to specific outcomes (sLO 1.1.1, 1.1.2, 1.1.3 and sLO 1.2.1, 1.2.2); specific outcomes roll up to general outcomes (GLO 1.1 and GLO 1.2); and general outcomes roll up to Goal 1. Source: Carriveau, R.S. (2011). Connecting The Dots: Writing Student Learning Outcomes and Outcome Based Assessments. Fancy Fox Publications, Denton, TX.
  • Anatomy of Course Redesign: How to know what works by removing the guesswork. RATIONALE FOR ATTAINMENT 9
  • Why Use Attainment Values? • A valid measure of student attainment of learning outcomes as the basis for instructional changes and course redesign • Provides superior measures to grade distributions, percentages, and other traditional achievement measures, which lack the ability to address student learning outcomes at the course level and provide little to no evidence for course improvement 10
  • Calculating Outcome Attainment Values (p. 83). Percent-correct values for 20 test items are averaged up the three-level model: item averages give each specific outcome's attainment value (sLO 1.1.1 = 81, sLO 1.1.2 = 60, sLO 1.1.3 = 88, sLO 1.2.1 = 85, sLO 1.2.2 = 85); sLO values average to the general outcomes (GLO 1.1 = 81, GLO 1.2 = 85); and GLO values average to the goal (Goal 1 = 83). For example, items 1-3 (87, 90, and 65 percent correct) average to 81 for sLO 1.1.1. Source: Carriveau, R.S. (2011). Connecting The Dots: Writing Student Learning Outcomes and Outcome Based Assessments. Fancy Fox Publications, Denton, TX, p. 83.
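The roll-up arithmetic on this slide is easy to reproduce. Below is a minimal sketch in Python, not the authors' implementation: the item-to-sLO mapping and the item values are hypothetical, echoing the first two outcomes above.

```python
# Minimal sketch of the attainment roll-up: item percent-correct values
# average to sLO values, sLO values average to GLO values, and GLO values
# average to the goal. Mapping and data below are illustrative assumptions.
from statistics import mean

item_pct = {1: 87, 2: 90, 3: 65, 4: 58, 5: 63, 6: 52, 7: 66}  # item -> % correct

slo_items = {"1.1.1": [1, 2, 3], "1.1.2": [4, 5, 6, 7]}  # sLO -> items measuring it
glo_slos = {"1.1": ["1.1.1", "1.1.2"]}                   # GLO -> its sLOs

slo_av = {slo: mean(item_pct[i] for i in items) for slo, items in slo_items.items()}
glo_av = {glo: mean(slo_av[s] for s in slos) for glo, slos in glo_slos.items()}
goal_av = mean(glo_av.values())

print(slo_av)            # sLO 1.1.1 ~ 80.7 (reported as 81); sLO 1.1.2 = 59.75 (60)
print(glo_av, goal_av)   # GLO and goal attainment values
```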
  • Outcome Attainment: Fixing Items. 11. [Table: item-level percent correct tracked across three semesters (09 Fall, 10 Fall, Spring) against the attainment values for each goal, GLO, and sLO, e.g., Goal 1 = 81 with GLO 1.1 = 81 and sLOs 1.1.1 = 66, 1.1.2 = 87, 1.1.3 = 80, 1.1.4 = 75.] Item difficulty summary for the N = 50 items: Low 15 (30%), Medium 25 (50%), High 10 (20%). Suggested difficulty scale: High = below 60% correct, Medium = 60-79%, Low = 80-100%.
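The suggested difficulty scale is a simple banding of percent correct. Here is a hedged sketch of how one might flag items for review; the thresholds come from the slide, while the item data are hypothetical.

```python
# Classify items by the slide's suggested difficulty scale:
# High = below 60% correct, Medium = 60-79%, Low = 80-100%.
def difficulty(pct_correct: float) -> str:
    if pct_correct < 60:
        return "H"
    if pct_correct < 80:
        return "M"
    return "L"

items = {9: 33, 15: 25, 36: 65, 29: 97}   # item number -> % correct (hypothetical)
summary = {"H": 0, "M": 0, "L": 0}
for number, pct in items.items():
    band = difficulty(pct)
    summary[band] += 1
    if band == "H":
        print(f"item {number}: {pct}% correct -> review against its sLO")
print(summary)                             # counts per difficulty band
```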
  • Applying the Attainment Measure to a Marketing Class Learning Activity. Experiential Activity: OBSERVING MARKETING AND GENDER STEREOTYPING. The purpose of this assignment is to examine the marketing of toys and sports equipment as well as advertising images of boys and girls in play and sports contexts. The focus will be on memory capabilities of adults compared to the memory capabilities of children. Class will be divided into three groups. Each group will be assigned a specific task to research and will post its findings online. The attainment table (n = 10) maps items to outcomes: sLO 1.2.3 (students will be able to detect differences between sub-cultural market segments' attitudes toward brands) is measured by items 23 (83%), 18 (67%), 22 (88%), and 4 (38%), for an sLO attainment value of 69 and a GLO 1.2 value of 69; sLO 1.3.1 (students will be able to recognize consumer sub-cultural market segments' VALs) is measured by item 2 (94%), for an sLO value of 92; sLO 1.3.2 (students will be able to detect differences between consumer sub-cultural market segments' attitudes toward brands) is measured by items 13 (98%), 14 (86%), and 16 (85%), for an sLO value of 91 and a GLO 1.3 value of 91; and sLO 2.1.2 (students will be able to relate how self-identity may impact consumers' consumption choices) is measured by items 42 (56%) and 32 (50%), for an sLO value of 53 and a GLO 2.1 value of 53.
  • Examples of Statements That Can Be Made Using the Three Level Model • The class as a whole met the criterion on four out of five specific learning outcomes (sLOs). • The class met the criterion on both general outcomes (GLO level) and the associated Goal. • An improvement goal is to raise the general outcome (GLO level) criteria to 82%. • Our improvement goal is to increase Goal 1 attainment by 5 points within a year.
  • Anatomy of Course Redesign: How to know what works by removing the guesswork. INFORMATION FOR INSTRUCTIONAL DECISIONS 15
  • Information for instructional decisions • We need an analytics (many sources of data) approach to make good instructional and course redesign decisions – "Analytics is quickly becoming a term that gets slapped onto any existing product." (G. Siemens) • This is academic (course-level) analytics – a key difference from learning (SIS or system) analytics 16
  • Learning Analytics. Penetrating the Fog: Analytics in Learning and Education – Educause. "Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs." – 1st International Conference on Learning Analytics and Knowledge 17
  • Source: George Siemens, http://www.learninganalytics.net/ 18
  • Obtaining Information About Students. Formative information: • Monitoring learning • Measuring achievement on the construct scale • Determining the degree to which the student is learning • Determining the degree to which the student meets outcome expectations. Pre-post information: • Measuring gains • Measuring growth • How far did the individual move • How far did the group move. Summative information: • Making judgments • Assigning a proficiency level
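As a concrete illustration of the pre-post bullets (gains computed only for students who have both scores), here is a minimal sketch; the student IDs and scores are made up.

```python
# Gains are computed only for students with BOTH a pre and a post score;
# each gain shows how far the individual moved, the mean how far the group moved.
from statistics import mean

pre  = {"s01": 52, "s02": 61, "s03": 70}   # student id -> pre score (hypothetical)
post = {"s01": 68, "s02": 75, "s04": 80}   # s03 lacks a post, s04 lacks a pre

matched = pre.keys() & post.keys()         # drop students missing either score
gains = {s: post[s] - pre[s] for s in sorted(matched)}

print(gains)                   # {'s01': 16, 's02': 14} -> individual movement
print(mean(gains.values()))    # 15 -> group movement
```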
  • PRE-COURSE DEMOGRAPHIC PROFILE.
    1. Gender: Male; Female.
    2. Race: African American; Asian; Caucasian; Hispanic; Indian; Non-Resident Alien.
    3. Classification: Freshman; Sophomore; Junior; Senior; Post Bac; Masters; Doctoral.
    4. Hours Worked per Week: Currently do not work; Less than 10 hrs per week; 10 to 20 hrs per week; 21 to 30 hrs per week; Greater than 30 hrs per week.
    5. Residence: Resident Hall; City of Denton; Denton County; Outside of Denton County.
    6. Age: Below 19; 19 to 20; 21 to 22; 23 to 25; 26 to 30; 31 to 40; 41 to 50; Greater than 50.
    Decision questions: Does this year differ significantly from last year? Would seeing this demographic information prior to starting the class cause you to want to make changes to the instruction and course design? How might you make changes in instructional design to accommodate the profile of the class you are getting?
  • [Chart: pre-course profiles – classification counts with percentages for Freshman, Sophomore, Junior, and Senior]
  • [Charts: residence counts (Resident Hall, City of Denton, Denton County, Outside Denton County) and age-of-students distribution (Below 19 through 50+)] 19
  • Types of Normed Data Used by UNT • Entrance exams – SAT, ACT, TOEFL, Accuplacer Math Placement • Core curriculum evaluation – CAAP, CLA, CBase
  • Traditional Measure: Student Success Rates for a MGMT Class (S = successful, U = unsuccessful; counts with proportions in parentheses).
    FTF format:
    Instructor 1 – 08-Fall: 213 S (0.87), 32 U (0.13), 245 total; 09-Spg: 238 S (0.90), 25 U (0.10), 263; 09-Fall: 267 S (0.91), 28 U (0.09), 295; 10-Spg: 241 S (0.90), 26 U (0.10), 267.
    Instructor 2 – 10-Spg: 0 S (0.00), 1 U (1.00), 1.
    Instructor 3 – 05-Fall: 147 S (0.75), 48 U (0.25), 195; 06-Fall: 135 S (0.70), 59 U (0.30), 194; 06-Spg: 155 S (0.71), 62 U (0.29), 217; 07-Fall: 191 S (0.77), 56 U (0.23), 247; 07-Spg: 159 S (0.73), 59 U (0.27), 218; 08-Spg: 175 S (0.80), 44 U (0.20), 219; 09-Fall: 215 S (0.81), 51 U (0.19), 266.
    Instructor 4 – 05-Fall: 160 S (0.76), 50 U (0.24), 210; 06-Fall: 194 S (0.89), 25 U (0.11), 219; 06-Spg: 155 S (0.77), 47 U (0.23), 202; 07-Spg: 170 S (0.78), 48 U (0.22), 218. FTF total: 2815 (0.75).
    NextGen format:
    Instructor 5 – 07-Fall: 173 S (0.82), 39 U (0.18), 212; 08-Fall: 188 S (0.87), 29 U (0.13), 217; 08-Spg: 208 S (0.83), 43 U (0.17), 251; 09-Spg: 234 S (0.91), 25 U (0.10), 256; 10-Spg: 253 S (0.94), 15 U (0.06), 268.
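The arithmetic behind this table is simply successful count over total enrolled, per semester. A small sketch using a few of the Instructor 1 FTF rows above as inputs:

```python
# Traditional success measure: success rate = S / (S + U) per semester.
sections = {
    "08-Fall": (213, 32),   # (successful, unsuccessful) counts from the table
    "09-Spg":  (238, 25),
    "09-Fall": (267, 28),
}
for term, (s, u) in sections.items():
    total = s + u
    print(f"{term}: {s}/{total} = {s / total:.2f} success rate")  # 0.87, 0.90, 0.91
```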
  • NextGen Information• Attitude Toward Course Survey• Learning Environment Preference Survey• Course Format Preference Survey• Student Evaluation of Teaching Effectiveness• Tests• Engagement 25
  • Student Preferences for N-Gen Course Format Versus Traditional FTF Format for a COMM Course. Table 1 shows the results of the Student Format Preference surveys administered at the end of each semester. Provided are the counts and percentages of student preferences for course format, plus the preferences by the categories of Successful (grade of A, B, C) and Unsuccessful (D, F, W, I). The values in parentheses are percentages.
    COMM 1010, 2009 Spring (n=595): preferred N-Gen 334 (.56), preferred FTF 261 (.44); Successful 564 (.95), Unsuccessful 31 (.05); Successful preferring N-Gen 318 (.56), FTF 246 (.44); Unsuccessful preferring N-Gen 16 (.52), FTF 15 (.48).
    2009 Fall (n=507): preferred N-Gen 232 (.46), FTF 275 (.54); Successful 472 (.93), Unsuccessful 35 (.07); Successful preferring N-Gen 216 (.46), FTF 256 (.54); Unsuccessful preferring N-Gen 16 (.46), FTF 19 (.54).
    2010 Spring (n=386): preferred N-Gen 181 (.47), FTF 205 (.53); Successful 380 (.98), Unsuccessful 6 (.02); Successful preferring N-Gen 177 (.47), FTF 203 (.53); Unsuccessful preferring N-Gen 4 (.67), FTF 2 (.33).
    Student Comments: student comments as to why they preferred a NextGen format versus an FTF format were also collected, and the current semester comments are sent to you in a separate email.
  • Student Preferences for NextGen Course Format Versus Traditional FTF Format – Student Comments. Student comments as to why they preferred a NextGen format versus an FTF format were also collected. After a study of student responses over a two-year period, the following categories emerged.
    Categories with descriptions for reasons why students chose NextGen or traditional FTF:
    NextGen – Pace: students liked that they could control the rate at which they absorbed information. Flexibility: students liked that they could do assignments whenever and wherever they wanted. Learning: students found it easier to learn content when it is internet based. Practice: students liked that there were more opportunities to practice and learn.
    FTF – Manage: students needed a structure so that they wouldn't procrastinate. Learning: students found it easier to learn content when the format is FTF. People: students felt that they needed the face-to-face (lecture) interaction. Technical: students had difficulties with the computers, network, and technology used.
  • SURVEY RESULTS: STUDENT ATTITUDE TOWARD COURSE SUBJECT (SATCS). N = 61, class enrollment = 99. Columns: statement (+/- indicates item direction), Pre (SD), Post (SD), Diff (SD; * is sig at +/- .05).
    1. This subject is worth knowing. + 4.29 (0.56), 4.33 (0.66), +0.05 (0.67)
    2. I like this subject. + 4.00 (0.55), 4.14 (0.73), +0.14 (0.57)*
    3. Knowing this subject makes me more employable. + 3.90 (0.70), 3.81 (0.98), -0.10 (1.09)
    4. This subject is easy to learn. + 3.71 (0.64), 3.95 (0.67), +0.24 (0.70)
    5. This subject should be required for all students. + 3.29 (1.00), 3.48 (0.93), +0.19 (0.87)*
    6. This is a difficult subject for me. - 2.35 (0.67), 1.95 (0.76), -0.40 (0.82)
    7. Learning this subject requires a lot of hard work. - 2.86 (0.79), 2.33 (0.86), -0.52 (1.08)
    8. Knowing this subject is valuable to me. + 4.14 (0.66), 3.81 (0.81), -0.33 (0.91)
    9. This subject makes me feel anxious or uncomfortable. - 1.81 (1.81), 1.52 (0.51), -0.29 (0.64)
    10. This subject does not fit into my overall educational needs. - 1.90 (0.77), 2.10 (0.89), +0.19 (0.51)*
    11. This subject is interesting. + 4.25 (0.55), 4.05 (0.61), -0.20 (0.70)
    12. This subject is difficult to understand. - 2.24 (0.70), 2.00 (0.89), -0.24 (0.89)
    13. This is a complicated subject. - 2.45 (0.89), 2.15 (0.88), -0.30 (1.08)
    14. I know a lot about this subject. + 3.00 (0.80), 3.35 (0.81), +0.35 (0.81)*
    15. This subject is relevant to my personal goals. + 3.80 (0.83), 3.75 (0.97), -0.05 (0.89)*
    16. I can learn this subject. + 4.25 (0.44), 4.25 (0.55), 0.00 (0.65)
    17. This subject is useful to my everyday life. + 4.00 (0.55), 3.90 (0.77), -0.10 (0.70)*
    18. I will have no application of this subject in my profession. - 2.05 (0.87), 1.95 (0.87), -0.10 (1.00)
    19. I am scared by this subject. - 1.48 (0.51), 1.52 (0.60), +0.05 (0.59)*
    20. I want to learn more about this subject. + 3.95 (0.74), 3.90 (0.70), -0.05 (0.74)*
    21. This is a fun subject. + 3.90 (0.77), 4.05 (0.67), +0.14 (0.66)*
  • Learning Environment Preferences Survey (LEP) Report (by course and semester). Categories: 1. What to Learn; 2. How to Learn; 3. How to Think; 4. How to Judge.
  • LEP Summary Table (Categories 1-4 plus CCI; SDs in parentheses):
    Spg 09 (n=35): Pre 48%, 27%, 13%, 12%, CCI 288 (36.6); Post 44%, 29%, 14%, 13%, CCI 292 (44.7); Difference -4%, +2%, +1%, +1%, CCI +6.
    Spg 10 (n=57): Pre 41.2 (20.2), 25.6 (11.1), 18.1 (15.3), 15.1 (10.9), CCI 307.0 (47.0); Post 44.2 (20.5), 24.4 (11.6), 16.0 (12.3), 15.4 (10.9), CCI 303.0 (45.0); Difference +3.0, -1.2, -2.1, +0.3, CCI -4.0.
    Fall 10 (n=35): Pre 43.21 (17.42), 28.18 (12.63), 12.91 (10.99), 15.64 (10.99), CCI 301.05 (40.38); Post 38.03 (18.68), 37.77 (10.91), 16.85 (12.77), 13.48 (8.97), CCI 305.71 (42.65); Difference -5.18 (1.26), 9.59 (-0.08), 3.94 (1.78), -2.16 (-2.02), CCI 4.66 (2.27).
    Spg 11 (n=46): Pre 43.85 (22.24), 24.96 (12.25), 18.39 (17.68), 12.78 (10.82), CCI 300.13 (52.16); Post 42.22 (25.50), 24.20 (14.02), 17.96 (15.97), 15.72 (11.52), CCI 307.13 (55.52); Difference -1.63, -0.76, -0.43, 2.93, CCI 7.00.
    Fall 11 (n=24): Pre 44.71 (24.68), 26.54 (16.41), 16.08 (17.30), 12.75 (9.72), CCI 297.00 (51.80); Post 41.54 (25.25), 28.63 (13.92), 12.79 (14.15), 17.08 (12.05), CCI 305.25 (55.08); Difference -3.17, 2.08, -3.29, 4.33, CCI 8.25.
  • Engagement – Course Level • CLASSE is a pair of survey instruments that enable one to compare what engagement practices faculty particularly value and perceive as important in a designated class with how frequently students report these practices occurring in that class. • CLASSE Student is the survey instrument completed by each student enrolled in the designated class, while CLASSE Faculty is the survey instrument completed by the faculty instructor of the designated class. http://assessment.ua.edu/CLASSE/Overview.htm 31
  • Exploring Other Analytics• Big Data – McKinsey Global Institute defines big data as “datasets whose size is beyond the ability of typical database software tools to capture, store, manage and analyze.” (James Manyika, “Big Data: The Next Frontier for Innovation, Competition, and Productivity,” Executive Summary, McKinsey Global Institute, May 2011)• Learning management system analytics 32
  • [Image slide] 33
  • http://research.uow.edu.au/learningnetworks/seeing/snapp/index.html 34
  • Using information to improve instruction. INSTRUCTIONAL ANALYTICS 35
  • Scenario: You just got your end-of-course results on student attainment of outcomes. For a particular online activity that was designed to help students achieve a high degree of attainment on a particular sLO, five test items were used. Four of the test items showed above 80 percent correct and one was at 45 percent correct, which of course considerably lowered the average attainment value for the sLO. Your attitude-toward-the-course-topic survey showed a slight positive result. 70% of the students who took the format preference survey said they prefer NextGen to traditional FTF. There was also high positive agreement on the engagement survey between what you thought was important and what students saw happening in the class. Which of the following statements best represents your conclusion about the particular online activity? A. I created a great learning environment, but I need to fix that dang 45% correct item that didn't work last year either. B. I need to consider redesigning the online activity so that students will do better on attaining the sLO. C. I will make sure that next semester I spend more review time on what the 45% test item was covering. 36
  • Information and Contacts• Mike Simmons, Ph.D. – mike.simmons@unt.edu• Ron Carriveau, Ph.D. – ronald.carriveau@unt.edu http://nextgen.unt.edu http://clear.unt.edu/ Twitter: @nextgeneducate http://www.slideshare.net/simmonsweb 37