Using Analytics to Intervene with Underperforming College Students


Abstract: Data mining is typically associated with business and marketing. For example, Amazon uses people's past purchases to suggest books they might be interested in buying. Similarly, academic analytics can identify and predict which students might be at risk by analyzing demographic and performance data of former students. However, there is no clear consensus on how to intervene with current students in a way they will accept and not associate with academic "profiling." Why should students think they are exceptions to our rules? This panel presentation will share how three institutions are approaching this problem and provide an overview of related issues.

Published in: Education

Speaker notes
  • SIS data = admissions data such as HS GPA, HS rank, highest science course taken in HS, and SAT scores. CMS data = time spent in content files, time in discussion boards, time spent doing practice quizzes, etc., plus other technologies such as level of engagement in Hotseat, online tutoring sessions, blogs, wikis, etc. Other data = help-seeking behavior (office hours, help centers), work-study status, on campus/off campus. By combining all these data and building various predictive models, we can customize the algorithms at three levels. Institution level: PU students are different from UMBC students, and UMBC students are different from GRCC students, etc. College level: ENGR students differ from LA students, so (a) the data points are different and (b) different data elements are more predictive for certain colleges/programs. Course level: each course has different characteristics (some are for majors, some are survey-level, there are instructional differences with TAs, etc.) and a different level of "success" as defined by the instructor.
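The note above describes combining SIS, CMS, and other data into predictive models with institution-, college-, and course-specific weights. A minimal sketch of such a risk model follows; the feature names and weights are illustrative assumptions, not Signals' actual model, which is fit from historical student data.

```python
import math

# Hypothetical feature weights. In practice these would be learned per
# institution, college, and course from data on former students.
WEIGHTS = {
    "hs_gpa": -1.2,           # SIS: higher HS GPA lowers risk
    "sat_pct": -0.8,          # SIS: SAT percentile (0-1)
    "cms_logins_z": -0.6,     # CMS: logins relative to course mean (z-score)
    "practice_quiz_z": -0.9,  # CMS: practice-quiz use relative to peers
    "help_visits": -0.4,      # Other: office-hour / help-center visits
}
BIAS = 2.5

def risk_score(student: dict) -> float:
    """Return a 0-1 'at risk' probability from a logistic model."""
    z = BIAS + sum(WEIGHTS[k] * student.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# A disengaged student should score a higher risk than an engaged one.
engaged = {"hs_gpa": 3.8, "sat_pct": 0.9, "cms_logins_z": 1.0,
           "practice_quiz_z": 0.8, "help_visits": 2.0}
disengaged = {"hs_gpa": 2.4, "sat_pct": 0.3, "cms_logins_z": -1.5,
              "practice_quiz_z": -1.2, "help_visits": 0.0}
```

Customizing the algorithm per course, as the note describes, amounts to swapping in a different `WEIGHTS` table (and possibly different features) for each course or program.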
  • Customizable: each instructor creates their own intervention based on the multiple data points. Real-time: this can be updated instantly. Specific: faculty AND students know exactly where they should be focusing (e.g., "you are not utilizing the practice quizzes as much as your peers, and you are not going to online tutoring as recommended; please try, we want you to be successful"). Actionable: tell the students EXACTLY what to do.
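The specific, actionable messaging described above can be sketched as a comparison of each student's activity against course medians. The metric names and message wording here are illustrative assumptions, not Signals' actual data model.

```python
def intervention_message(student: dict, course_medians: dict) -> str:
    """Name exactly what a student should work on, based on CMS
    activity gaps relative to course medians (illustrative metrics)."""
    gaps = []
    if student["practice_quizzes"] < course_medians["practice_quizzes"]:
        gaps.append("you are not using the practice quizzes as much as your peers")
    if student["tutoring_sessions"] < course_medians["tutoring_sessions"]:
        gaps.append("you are not going to online tutoring as recommended")
    if not gaps:
        return "You are on track. Keep it up!"
    return ("Please try the following; we want you to be successful: "
            + "; ".join(gaps) + ".")
```

Because instructors define the metrics and thresholds, each course can plug in its own checks, which is what makes the intervention customizable.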
  • Academic analytics requires patience! We have been doing research for 7 years. When you are talking about wrangling all these data sources—some dynamic, some static—it takes time! And we have to be mindful of privacy issues as well.
  • 1) The big idea: evaluating and measuring the use of Blackboard on your campus is becoming more and more important, given budget cuts and challenges, along with the need to continue transforming education by leveraging technology and the tools therein. 2) Focus of the presentation: a review of Project ASTRO and its capabilities. 3) Themes: tracking, reporting, discovery, sharing, acting. How many faculty and students are using Bb? What tools are used in our system? How can we improve and promote new tools? Who do we need to communicate with for the next upgrade? Who are our innovative faculty? How does faculty use compare to student use? Which departments are using Bb, and to what degree? How can we use these data to encourage deeper use and create awareness of Bb? What is the impact of Bb on teaching and learning? Goal: to build a community-based advanced reporting tool and Blackboard Building Block that will empower institutions to become more accountable and to use data-driven decision making to enhance, optimize, and advance Blackboard Academic Suite usage in teaching and learning. Awarded in 2007; code name: Project ASTRO.
  • Identify = detect. It costs more to recruit students than it does to retain them. LEVERAGE CMS data.
  • Push
  • Pull
  • Auto

    1. USING ANALYTICS TO INTERVENE WITH UNDERPERFORMING COLLEGE STUDENTS
       Kimberly Arnold (Purdue University), John Fritz (University of Maryland, Baltimore County), Eric Kunnen (Grand Rapids Community College). January 20, 2010
    2. OVERVIEW
       • Analytics 101
       • Five Minutes of Fame
          • Purdue University’s “Signals”
          • UMBC’s “Check My Activity” (CMA)
          • GRCC’s & Seneca College’s Project ASTRO
       • More Demos (time permitting)
       • Q&A
    3. ANALYTICS 101
    4. WHAT IF . . .
       • Can performance and/or backgrounds of past students predict success of future students?
       • How would we know?
       • If so, how would we communicate (intervene?) with students? With teachers?
       • How would this change teaching & learning?
    5. CLASSROOM WALLS THAT TALK
       • Course or Learning Management Systems are NOT just content delivery or interactive learning environments.
       • The record or “residue” of online learning is a potentially rich data source that needs to be studied further.
       • How are schools thinking about this?
    6. FIVE STAGES OF ANALYTICS ON CAMPUS
       1. Extraction and reporting of transaction-level data
       2. Analysis and monitoring of operational performance
       3. What-if decision support (such as scenario building)
       4. Predictive modeling and simulation
       5. Automatic triggers of business processes (such as alerts)
       Source: ECAR Study on Analytics (2005).
    8. PURDUE’S “SIGNALS”
    10. [Diagram: SIS data (historic), CMS and other technologies (real time), and other data feed a prediction model that drives specific and customizable interventions.]
    11. INTERVENTIONS
       • Customizable
       • Real-time
       • Specific
       • Actionable
    12. FIVE MINUTES OF FAME: RESULTS
       • More Bs and Cs
       • Fewer Ds and Fs
       • Students are getting more help, earlier and more frequently
       • Faculty like that they can give feedback to large courses (150-1,200 students)
       • Students
          • Direct contact with faculty
          • Motivation
          • 60% say they got a better grade
    19. FUTURE VERSION
    20. UMBC BLACKBOARD ACTIVITY BY GRADE DISTRIBUTION (2007-2009)

        SEMESTER   COURSES   D/F Avg   >=C Avg   % Diff
        FA2009        29      189.50    302.33    37.32
        SU2009         9      212.00    275.67    23.10
        SP2009        11       92.50    175.00    47.14
        FA2008        13      101.50    166.33    38.98
        SU2008         7       65.00    167.33    61.16
        SP2008        26      136.00    199.67    31.89
        FA2007        15      135.00    211.33    36.12
        Total/Avg    110      112.29    213.95    39.39
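The % Diff column above measures how far D/F students' average Blackboard activity trails that of students earning a C or better. A sketch of the computation (assuming this is the formula behind the reported figures; small rounding differences in the source data are possible):

```python
def pct_diff(df_avg: float, c_avg: float) -> float:
    """Percent by which D/F students' average Bb activity trails that
    of C-or-better students: (C_avg - DF_avg) / C_avg * 100."""
    return round((c_avg - df_avg) / c_avg * 100, 2)
```

For example, the FA2009 row gives `pct_diff(189.50, 302.33)`, which reproduces the reported 37.32.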
    22. FA2008 SCI100 FINDINGS
       • How would you describe the CMA’s view of your Bb activity compared to your peers?
          • 28% “I was surprised by what it showed me”
          • 12% “It confirmed what I already knew”
          • 42% “I’d have to use it more to determine its usefulness”
          • 16% “I haven’t used it.”
          • 2% did not respond to this question
    23. FA2008 SCI100 FINDINGS
       • If your instructor published a GDR for past assignments, would you be more or less inclined to use the CMA before future assignments are due?
          • 54% “More inclined”
          • 10% “Less inclined”
          • 36% “Not sure”
    24. FA2009 STUDENT USE
    25. GRADE DISTRIBUTION
       • Part of our public Blackboard Reports, run after the last day of classes every semester.
       • Final GDR run after final grades are submitted.
       • Faculty “opt in” by including a final letter grade in their Bb grade book with the column heading “GRADE.”
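The opt-in mechanism above suggests a straightforward tally over the grade book. A minimal sketch, assuming a hypothetical row format for the exported grade book (this is not UMBC's actual report code):

```python
from collections import Counter

def grade_distribution(gradebook_rows: list) -> dict:
    """Percentage of students earning each letter grade, read from a
    hypothetical grade-book export where instructors opted in by
    providing a "GRADE" column."""
    counts = Counter(row["GRADE"] for row in gradebook_rows if row.get("GRADE"))
    total = sum(counts.values())
    return {grade: round(100 * n / total, 1)
            for grade, n in sorted(counts.items())}
```

Courses without a "GRADE" column simply contribute no rows, which mirrors the opt-in behavior described above.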
    26. “PROJECT ASTRO” BLACKBOARD GREENHOUSE GRANT
       • Eric Kunnen, Coordinator of Instructional Technologies, Grand Rapids Community College, [email_address]
       • Santo Nucifora, Manager of Systems Development and Innovation, Seneca College, [email_address]
    27. OVERVIEW
       • Evaluating Blackboard Use on Your Campus: a review of the “Project ASTRO” Greenhouse Grant. Session description:
          • Collecting and reporting on system activity information from Blackboard is often a challenge. Come and learn how to easily access reports on how Blackboard is being used by faculty, staff, and students, which will help you inform stakeholders, improve the engagement of end users, increase adoption, and encourage deeper use of Blackboard.
       • Key functions of Project ASTRO:
          • Tracking: automatic tracking (via Building Block) of courses, organizations, users, and tools.
          • Reporting: easy point-and-click access to advanced reports.
          • Discovery: ability to measure trends and analyze usage.
          • Sharing: inform key stakeholders of usage levels.
          • Acting: identify, target, and engage users using reports.
    28. THE POTENTIAL OF REPORTING
       • Usage statistics: ability to monitor and better determine which parts of the system are the most and least frequently used, thereby enabling targeted promotion and training of specific tools or features that would benefit both faculty and students. (Student/faculty engagement, adoption, retention, and satisfaction)
       • Accountability: ability to provide metrics and trend data from the system to determine accurate usage statistics for stakeholders requiring data-driven decisions and measures. (Department action plans, continuous quality improvement)
       • Planning: data obtained will enable institutions to better prepare for and optimize system configuration, change management, upgrades, performance monitoring, rollout of new tools, communication of issues, and infrastructure hardware purchases for future growth. (Upgrade management and system growth)
       • Return on investment: data gathered can help determine and maximize Blackboard use by faculty, staff, and students. (ROI and accountability)
    29. DASHBOARD
       • At a Glance Views
       • Active vs Inactive Courses, including Departments
       • Top Courses and Organizations for the Week
       • Top Tools Used by Instructors and Students
       • Top Portal Modules Used
    30. ACTIVE COURSES
       • Department vs Sub-Department
       • Courses, Percentage, Instructors, Students
       • Semester Trends
    31. ACTIVITY IN ACTIVE COURSES
       • Page Views (Hits) by Instructor and Student
       • Courses with Page Views vs Total Page Views
       • Breakdown by Day, Week, Month, All
    32. TOP WEEKLY TOOL PAGE VIEWS
       • Student vs Instructor
    33. COURSE TOOL ITEMS
       • Building Block Tool Tracking
       • Drill Down by Courses and Instructors Using Tools
       • Trends
    35. ACTIVITY BY USER
    36. GRCC – STARFISH EARLY ALERT
       • Identify & Detect
          • Manual Flags
          • Automatic Flags
          • Attendance
       • Intervene & Track
          • Instructor
          • Advisor
          • Groups of Courses and Students
       • Improve & Retain
          • Student Communication and 360 Closed Loop
       More info:
    37. GRCC – STARFISH EXAMPLE: INSTRUCTOR MANUALLY RAISES A FLAG
       The instructor can select one or more students from the student list and manually raise a flag on a student. When raising a flag, the instructor writes a description of the flagged issue. This information is forwarded to someone who can help the student, as determined by the flag rule set up by the administrator.
    38. GRCC – STARFISH EXAMPLE: INSTRUCTOR RESPONDS TO A “FLAG SURVEY” EMAIL
       Administrators can email survey requests to instructors. Clicking the request takes the instructor to a flag survey, where they are prompted to flag any students experiencing specified problems.
    39. GRCC – STARFISH EXAMPLE: AUTOMATIC FLAGS BASED ON BLACKBOARD GRADEBOOK/COURSE ACCESS
       Administrators can set up auto-generated flags. The system can raise flags based on grades, average scores, and specific gradebook columns in Blackboard. Flags can also be raised based on students’ access to their courses in Blackboard. Additional customization is available through APIs.
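The auto-flag rules described on this slide can be sketched as simple threshold checks. The field names, thresholds, and rule shapes here are illustrative assumptions, not Starfish's actual API.

```python
from datetime import datetime, timedelta

def auto_flags(student: dict, now: datetime,
               grade_floor: float = 70.0, max_inactive_days: int = 7) -> list:
    """Raise flags for low gradebook scores or prolonged course
    inactivity (hypothetical rule set, not Starfish's API)."""
    flags = []
    # Grade-based rules: flag any gradebook column below the floor.
    for column, score in student["grades"].items():
        if score < grade_floor:
            flags.append(f"low score in {column} ({score})")
    # Access-based rule: flag students absent from the Bb course too long.
    if now - student["last_access"] > timedelta(days=max_inactive_days):
        flags.append(f"no course access in over {max_inactive_days} days")
    return flags
```

Each raised flag would then be routed to an instructor or advisor per the administrator's flag rules, as the preceding slides describe.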
    40. MORE INFORMATION
       • Purdue University Signals project site
       • UMBC’s Blackboard Reports & CMA
       • GRCC & Seneca College: Project ASTRO
       • GRCC Starfish Early Alert project site
    41. THANK YOU. Questions?