Using Analytics to Improve Student Success

Preconference presentation for the Conference on Gateway Course Excellence, March 23, 2014

Published in: Education
Transcript

  • 1. USING ANALYTICS TO IMPROVE STUDENT SUCCESS: A PRIMER ON LEVERAGING DATA TO ENHANCE STUDENT PERFORMANCE March 23, 2014 Matthew D. Pistilli, PhD
  • 2. Plan for the day  Introductions and Purpose  Conceptual Overview  Other Institutions’ Analytics  Five Components of Analytics  Individual/Group Work & Planning  Managing Expectations in Next Steps
  • 3. Who are we? Where are we from? Why are we here? Introductions and Purpose
  • 4. Definitions Student Involvement Theory: Astin’s Inputs-Environment-Output Model Conceptual Overview
  • 5. Definitions
  • 6. Definitions of Learning Analytics  The measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs (SoLAR)  Evaluating large data sets to provide decision makers with information that can help determine the best course of action for an organization, with a specific goal of improving learning outcomes (EDUCAUSE, 2011)
  • 7. Definitions Continued  Using analytic techniques to help target instructional, curricular, and support resources to support the achievement of specific learning goals (van Barneveld, Arnold, & Campbell, 2012)  The process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data (Cooper, 2012)
  • 8. Definitions Continued  Using data to inform decision-making; leveraging data to identify students in need of academic support; and allowing direct user interaction with a tool to engage in some form of sensemaking that supports a subsequent action (Krumm, Waddington, Lonn, & Teasley)  The use of data, statistical analysis, and explanatory and predictive models to gain insights and act on complex issues (Bichsel, 2012)
  • 9. Common Themes
  • 10. Challenge: How do you find the student at risk? http://www.youthareawesome.com/wp-content/uploads/2010/10/wheres-waldo1.jpg
  • 11. http://www.youthareawesome.com/wp-content/uploads/2010/10/wheres-waldo1.jpg Challenge: How do you find the student at risk?
  • 12. Analytics is about…  Actionable intelligence  Moving research to practice  Basis for design, pedagogy, self-awareness  Changing institutional culture  Understanding the limitations and risks
  • 13. Inputs-Environment-Output Student Involvement Theory
  • 14. Student Involvement Theory  Alexander Astin - UCLA  Involvement: The amount of physical and psychological energy that the student devotes to the academic experience. (1985, p. 134)  Exists on a continuum, with students investing varying levels of energy  Is both quantitative and qualitative  Direct relationship between student learning and student involvement  Effectiveness of a policy or practice is directly related to its capacity to increase student learning (Astin, 1999)
  • 15. Inputs-Environment-Output Model Inputs Output Environment
  • 16. Inputs  The personal, background, and educational characteristics that students bring with them to postsecondary education that can influence educational outcomes (Astin, 1984).
  • 17. Inputs  Astin (1993) identified 146 characteristics, including  Demographics  Citizenship  Ethnicity  Residency  Sex  Socioeconomic status  High school academic achievement  Standardized test scores  GPA  Grades in specific courses  Previous experiences & self-perceptions  Reasons for attending college  Expectations  Perceived ability
  • 18. Outcomes  Basic level  Academic Achievement  Retention  Graduation  More abstractly  Skills  Behaviors  Knowledge The things we are attempting to develop in students
  • 19. Environment  Where we have the most control  Factors related to students’ experience while in college  Astin (1993) identified 192 variables across 8 overarching classifications: Institutional characteristics; Financial Aid; Peer group characteristics; Major Field Choice; Faculty characteristics; Place of residence; Curriculum; Student involvement
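To make the I-E-O logic concrete: in practice it is often operationalized by modeling an outcome as a function of environment variables while controlling for input characteristics, so that environmental effects are estimated net of what students brought with them. The sketch below is illustrative only; the column names, toy values, and use of scikit-learn are assumptions, not part of the presentation.

```python
# Illustrative I-E-O sketch (assumed column names and toy data): estimate how
# environment variables relate to an outcome after controlling for inputs.
import pandas as pd
from sklearn.linear_model import LinearRegression

students = pd.DataFrame({
    "hs_gpa":       [3.2, 2.8, 3.9, 3.5, 2.5, 3.0],     # Input: prior achievement
    "test_score":   [1150, 980, 1320, 1210, 900, 1050],  # Input: standardized test
    "lms_logins":   [42, 10, 77, 55, 8, 30],              # Environment: involvement proxy
    "credit_hours": [15, 12, 16, 14, 12, 13],             # Environment: course load
    "term_gpa":     [3.4, 2.1, 3.8, 3.3, 1.9, 2.7],       # Output: academic achievement
})

X = students[["hs_gpa", "test_score", "lms_logins", "credit_hours"]]
y = students["term_gpa"]

model = LinearRegression().fit(X, y)

# Coefficients on the environment columns suggest where intervention may matter,
# net of the pre-college (input) characteristics.
print(dict(zip(X.columns, model.coef_.round(3))))
```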
  • 20. All this data… requires a shift in thought.
  • 21. Moving from… Data Describes to… Data Decides
  • 22. Other Institutions’ Analytics
  • 23. Austin Peay State University Degree Compass
  • 24. Rio Salado College Student Support Model
  • 25. Open Learning Initiative SNAPP
  • 26. UMBC Check My Activity; Purdue University
  • 27. Analytics 5-Component Model (Campbell & Pistilli, 2012)
  • 28. Five Components of the Analytic Model: Gather, Predict, Act, Monitor, Refine. Components are cyclical, starting with Gather, but can be drawn upon at any point in the cycle.
  • 29. Analytic Component 1: Gather
  • 30. Gather  Data  In multiple formats  From multiple sources  With insights into students & their success  That can be analyzed & manipulated into formulae Data is the foundation for this work, and without good data, the effort may be for naught.
  • 31. Gather  Before gathering, determine what will be gathered.  What question are you trying to answer?  To do so, consider…  Where will your focus be?  What data do you already have (or have access to)?  What else do you need to collect?  Who owns that data?  What will it take to get access to it?  What are the challenges associated with assembling all the data?  What are the funding implications for data collection and assembly?
  • 32. Gather Ultimately, answer the following questions: 1. How will you describe this analytics area to interested parties? 2. Who are the key stakeholders that need to be included in discussions? 3. Who should serve as the lead for this area at your institution? 4. What other considerations are there?
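A minimal sketch of the assembly work these Gather questions point to: pulling extracts from separate systems and joining them on a shared student identifier. The file names and columns below are hypothetical placeholders for whatever sources your institution identifies.

```python
# Hypothetical sketch: combine SIS, LMS, and gradebook extracts into one
# analysis table keyed on a shared student ID.
import pandas as pd

sis = pd.read_csv("sis_demographics.csv")      # e.g., student_id, hs_gpa, residency
lms = pd.read_csv("lms_activity.csv")          # e.g., student_id, lms_logins, forum_posts
grades = pd.read_csv("gradebook_extract.csv")  # e.g., student_id, course, midterm_pct

combined = (
    sis.merge(lms, on="student_id", how="left")
       .merge(grades, on="student_id", how="left")
)

# Gaps left by the joins are themselves a finding worth raising with the data
# owners identified during the Gather discussions.
print(combined.isna().sum())
```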
  • 33. Analytic Component 2: Predict
  • 34. Predict  Begins with the question asked in Gather:  What do you want to predict?  How do you identify this as a focus area?  Prediction models built will be driven by  Types of data gathered  Question being answered  What’s currently being predicted?  How?  By whom?  In what realms? Student success?  How can you involve those persons in this effort?
  • 35. Predict  What makes a good model?  Correlation vs. Causation  Expertise required  Data analysis  Statistical  Content  Reliability & Validity  Frequency of updating  Challenges & obstacles
  • 36. Predict Ultimately, answer the following questions: 1. How will you describe this analytics area to interested parties? 2. Who are the key stakeholders that need to be included in discussions? 3. Who should serve as the lead for this area at your institution? 4. What other considerations are there?
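One common way the Predict component is implemented is a classification model that estimates each student's probability of an unsuccessful outcome from data gathered in prior terms. The sketch below uses logistic regression with assumed feature and file names; it illustrates the general approach rather than any specific model from the presentation.

```python
# Illustrative sketch: estimate the probability that a student earns below a C,
# trained on a prior term's combined extract (hypothetical file and columns).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

data = pd.read_csv("prior_term_combined.csv")
features = ["hs_gpa", "lms_logins", "forum_posts", "midterm_pct"]

X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["below_c"], test_size=0.25, random_state=0
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Probabilities rather than labels, so advisors can rank and triage students.
risk = clf.predict_proba(X_test)[:, 1]
print(risk[:10].round(2))
```

Note that the correlation-versus-causation caution from the preceding slide still applies: a strong predictor in a model like this is not necessarily a lever for intervention.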
  • 37. Analytic Component 3: Act
  • 38. Act  Harken back to journalism class…  Who?  What?  Where?  When?  Why?  How?  Add:  Available resources?  Timing
  • 39. Act  Frequency – more is always better  Funding the action  Assessing the impact  What are you assessing?  Were behaviors changed?  How do you know?  Do different actions need to be:  Taken (on your end)?  Suggested (on the students’ end)?
  • 40. Act Ultimately, answer the following questions: 1. How will you describe this analytics area to interested parties? 2. Who are the key stakeholders that need to be included in discussions? 3. Who should serve as the lead for this area at your institution? 4. What other considerations are there?
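Deployments in the Course Signals mold typically translate a predicted risk score into a small set of signals, each tied to a concrete action. The thresholds and messages below are placeholders meant only to show the shape of that mapping; actual cut points and interventions would be set locally.

```python
# Placeholder sketch: map a predicted risk probability to a signal and a
# suggested action. Thresholds are arbitrary and would be tuned locally.
def signal_for(risk):
    if risk >= 0.7:
        return "red", "Advisor outreach within 48 hours; refer to tutoring"
    if risk >= 0.4:
        return "yellow", "Automated nudge email with study resources"
    return "green", "No action; continue monitoring"

for student_id, risk in [("A123", 0.82), ("B456", 0.45), ("C789", 0.12)]:
    color, action = signal_for(risk)
    print(student_id, color, action)
```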
  • 41. Analytic Component 4: Monitor
  • 42. Monitor  Formative & summative in nature  Can present challenges and obstacles  It’s a process  Current process must be understood  New/parallel processes developed as necessary  Involving others… to some extent, the more the merrier  Availability of resources (time, money, people)  Timing of monitoring  Ability to react
  • 43. Monitor  Review  Data collected and used… was it  Necessary?  Correct?  Sufficient?  Predictions made… were they  Accurate?  Meaningful?  Actions taken… were they  Useful?  Sustainable?  Feedback received to date
  • 44. Monitor Ultimately, answer the following questions: 1. How will you describe this analytics area to interested parties? 2. Who are the key stakeholders that need to be included in discussions? 3. Who should serve as the lead for this area at your institution? 4. What other considerations are there?
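Part of the Monitor review is checking whether earlier predictions held up once outcomes are known. A minimal sketch, assuming the flags issued last term and the observed end-of-term outcomes were both retained:

```python
# Minimal sketch: compare stored at-risk flags with observed outcomes.
from sklearn.metrics import classification_report

predicted = [1, 0, 1, 1, 0, 0, 1, 0]  # flagged at-risk last term (assumed)
observed  = [1, 0, 0, 1, 0, 1, 1, 0]  # actually earned below a C (assumed)

# Recall on the at-risk class matters most: students the model missed never
# received an intervention at all.
print(classification_report(observed, predicted, target_names=["ok", "at-risk"]))
```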
  • 45. Analytic Component 5: Refine
  • 46. Refine  Self-improvement process for  Analytics at the institution  The institution  Enrolled students  Continual monitoring  Small tweaks here and there  Major changes after periods of time  Updating of algorithms and statistical models  Outcome data important as  Assessment  Additional components for inclusion in the model
  • 47. Refine  What was learned from this effort?  Where are the positives?  Where are the deficiencies?  Was the goal realized?  How does the goal/involvement in the project help meet institutional goals?  Who else needs to be involved to improve/enhance the process, actions, and outcomes?  How can lessons learned be applied for future use?
  • 48. Refine Ultimately, answer the following questions: 1. How will you describe this analytics area to interested parties? 2. Who are the key stakeholders that need to be included in discussions? 3. Who should serve as the lead for this area at your institution? 4. What other considerations are there?
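Refine often comes down to retraining on data that now includes the latest term and checking the updated model against the one in production before promoting it. A sketch under those assumptions, with hypothetical file and column names:

```python
# Illustrative sketch: retrain on data through the latest term and promote the
# candidate model only if held-out performance does not degrade.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

data = pd.read_csv("combined_through_latest_term.csv")
features = ["hs_gpa", "lms_logins", "forum_posts", "midterm_pct"]

candidate = LogisticRegression(max_iter=1000)
score = cross_val_score(candidate, data[features], data["below_c"], cv=5).mean()

# Small tweaks each term; larger rebuilds when new data sources or outcome
# measures become available.
print(f"Candidate model mean CV accuracy: {score:.3f}")
```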
  • 49. Elevator Speech for Project Determine/solidify Institutional Goal Work on Component Templates Individual/Group Work
  • 50. What is your goal for this project? What have you learned? What are your next steps? What questions do you still have? Institution Reporting & Town Hall
  • 51. Managing Expectations in Next Steps
  • 52. http://i.imgur.com/nZArTnc.jpg
  • 53. Expectations vs. Reality  Expectations: Plug and Play; Immediate results; Solve every problem – ever!; Universal adoption; Everyone would love it!  Reality: Fits, starts, reboots; Mostly long-term outcomes; Solve some problems, create some new problems; Lackluster use; Not everyone loved it
  • 54. Institutional Challenges  Data in many places, “owned” by many people/organizations  Different processes, procedures, and regulations depending on data owner  Everyone can see potential, but all want something slightly different  Sustainability – “can’t you just…”  Faculty participation is essential  Staffing is a challenge
  • 55. New Possibilities  Using data that exists on campus  Taking advantage of existing programs  Bringing a “complete picture” beyond academics  Focusing on the “Action” in “Actionable Intelligence”
  • 56. Contact Information  Email: mdpistilli@purdue.edu  Phone: 765-494-6746  Twitter: @mdpistilli – twitter.com/mdpistilli
  • 57. References
    Astin, A. W. (1984). Student involvement: A developmental theory for higher education. Journal of College Student Development, 24, 297-308.
    Astin, A. W. (1993). What matters in college? Liberal Education, 79(4).
    Astin, A. W. (1994). What matters in college: Four critical years revisited. San Francisco: Jossey-Bass.
    Bichsel, J. (2012, August). Analytics in higher education: Benefits, barriers, progress, and recommendations (Research Report). Louisville, CO: EDUCAUSE Center for Applied Research. Available: http://net.educause.edu/ir/library/pdf/ERS1207/ers1207.pdf
    Cooper, A. (2012). What is analytics? Definition and essential characteristics. CETIS Analytics Series, 1(5). Available: http://publications.cetis.ac.uk/2012/521
    EDUCAUSE Learning Initiative. (2011). 7 things you should know about first-generation learning analytics. Louisville, CO: EDUCAUSE. Available: http://www.educause.edu/library/resources/7-things-you-should-know-about-first-generation-learning-analytics
    Krumm, A. E., Waddington, R. J., Lonn, S., & Teasley, S. D. (n.d.). Increasing academic success in undergraduate engineering education using learning analytics: A design-based research project. Available: https://ctools.umich.edu/access/content/group/research/papers/aera2012_krumm_learning_analytics.pdf
    Oblinger, D. G., & Campbell, J. P. (2007). Academic analytics. EDUCAUSE White Paper.
    Society for Learning Analytics Research. (n.d.). About. [Webpage] Available: http://www.solaresearch.org/mission/about/
