Ellen Wagner: Putting Data to Work

Ellen Wagner, Executive Director, WCET.
Putting Data to Work
This session explores changing data sensibilities at US post-secondary institutions with particular attention paid to how predictive analytics are changing expectations for institutional accountability and student success. Results from the Predictive Analytics Reporting Framework show that predictive modeling can identify students at risk and that linking behavioral predictions of risk with interventions to mitigate those risks at the point of need is a powerful strategy for increasing rates of student retention, academic progress and completion.
Presentation at the 15th annual SLN SOLsummit, February 27, 2014.
http://slnsolsummit2014.edublogs.org/

  1. From Reporting to Insight to Action: How Data Are Changing Post-secondary Education. Ellen D. Wagner, Chief Strategy Officer, PAR Framework
  2. Session Overview • This session explores changing data sensibilities at US postsecondary institutions. Particular attention is paid to how predictive analytics are changing expectations for institutional accountability and student success. • Results from current work in postsecondary education show that predictive modeling can effectively identify students at risk. • Is predicting risk enough to move the needle on risk mitigation to improve student success? • What does this mean for online learning?
  3. Setting the Context: Data Are Changing Everything
  4. “Meh… education researchers have always worked with data.” • We do qualitative research with data • We do quantitative research with data • We do evaluations with data • We develop surveys and instruments and experiments to collect more data • We pull data from LMSs, SISs, ERPs, CRMs… • We write reports and summaries, make presentations, and develop articles, books, and webcasts.
  5. What is the one thing we don’t do??? Data mining.
  6. Data Optimize the Online Experience. The “digital breadcrumbs” that online technology users leave about viewing, engagement, behaviors, interests, and preferences provide massive amounts of information that can be mined to better optimize the online experience. It’s about convenience, personalization, recommendations, just-in-time, just-the-right-device.
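As a toy illustration of what mining those breadcrumbs can look like in practice (all session data and item names below are invented, not from the talk), a naive “users who viewed X also viewed Y” recommender just counts co-occurrences:

    from collections import Counter
    from itertools import combinations

    # Invented viewing logs: one list of viewed items per user session.
    sessions = [
        ["intro-video", "quiz-1", "forum"],
        ["intro-video", "quiz-1"],
        ["forum", "quiz-1", "readings"],
    ]

    # Count how often each pair of items is viewed together.
    pair_counts = Counter()
    for items in sessions:
        for a, b in combinations(sorted(set(items)), 2):
            pair_counts[(a, b)] += 1

    # "Users who viewed 'intro-video' also viewed..."
    target = "intro-video"
    related = {(a if b == target else b): n
               for (a, b), n in pair_counts.items() if target in (a, b)}
    print(sorted(related.items(), key=lambda kv: -kv[1]))
    # -> [('quiz-1', 2), ('forum', 1)]

Real recommenders are far more sophisticated, but the principle is the same: behavior leaves a trail, and the trail is countable.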
  7. What do we want? The RIGHT Answers!! When do we want them? NOW!!
  8. Getting to the right answer takes work • Analysis and model building is an iterative process • Around 70-80% of the effort is spent on data exploration and understanding. (Figure: SAS analysis/modeling process)
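For a concrete sense of where that 70-80% goes, here is a minimal pandas sketch over a small made-up student extract (all fields and values are hypothetical):

    import pandas as pd

    # A tiny, invented student extract; real exploration runs checks
    # like these over millions of rows before any model is built.
    df = pd.DataFrame({
        "student_id": ["S1", "S2", "S3", "S4"],
        "prior_gpa": [3.1, None, 2.4, 3.8],
        "credits": [15, 12, 12, 9],
        "grade": ["B", "C", None, "A"],
    })
    print(df.dtypes)                   # are the types what we expect?
    print(df.isna().mean())            # share of missing values per field
    print(df.describe(include="all"))  # ranges, counts, possible outliers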
  9. Three Opportunities on Our Horizons • Linking predictions to action • Creating new insights and opportunities with data • Delivering on the promise of what online learning can be
  10. Link Predictions to Action • Predictive analytics refers to a wide variety of methodologies. There is no single “best” way of doing predictive analytics; you need to know what you are looking for. • Simply knowing who is at risk is not enough. Predictions have value when they are tied to what you can do about them. • Linking behavioral predictions of risk with interventions at the best points of fit offers a powerful strategy for increasing rates of student retention, academic progress, and completion.
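A minimal sketch of that prediction-to-action loop, using scikit-learn with entirely hypothetical features, data, and threshold (this is not the PAR Framework’s actual model, which is institution-specific):

    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    FEATURES = ["prior_gpa", "lms_logins", "credits_attempted"]

    # Hypothetical history: one row per student, with a known outcome.
    train = pd.DataFrame({
        "prior_gpa":         [3.2, 1.9, 2.8, 3.6, 2.1, 2.5],
        "lms_logins":        [42, 5, 30, 55, 8, 18],
        "credits_attempted": [15, 12, 9, 15, 6, 12],
        "retained":          [1, 0, 1, 1, 0, 0],
    })
    model = LogisticRegression().fit(train[FEATURES], train["retained"])

    # Score current students, then tie the prediction to an action:
    # anyone below the (hypothetical) threshold is routed to outreach.
    current = pd.DataFrame({
        "student_id":        ["S001", "S002"],
        "prior_gpa":         [2.0, 3.4],
        "lms_logins":        [6, 48],
        "credits_attempted": [12, 15],
    })
    probs = model.predict_proba(current[FEATURES])[:, 1]
    for sid in current.loc[probs < 0.5, "student_id"]:
        print(f"{sid}: flag for advisor outreach")   # the action step

The modeling is the easy half; the design work is in choosing interventions that fit the predicted risk and the point of need.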
  11. Create new insights and opportunities for data in our practices • Enrollment management • Student services • Program and learning experience design • Content creation • Retention, completion • Gainful employment • Institutional culture
  12. Delivering on the promises of what Online Learning can be • Online learning is more than MOOCs, but that has now become the public perception of what we do. • We are watching our practice being disaggregated into 3rd-party platforms and apps. • We find ourselves at the center of strategic conversations, but the ways in which we are evaluated continue to miss the mark. • We need to get out of the way of what we’ve been and embrace where we need to go.
  13. Online Learning as a Catalyst for Changing Data Sensibilities (Sloan-C/Babson, 2013)
  14. And yet… (Sloan-C/Babson, 2013)
  15. Faculty still don’t trust it (Sloan-C/Babson, 2013)
  17. How Are We Doing So Far? • Data analytics are still emerging. Many organizations still rely on traditional technology (e.g. spreadsheets) and methods (e.g. inferential statistics). • Analytics tend to be used narrowly within departments and business units, not integrated across the institution. • Intuition based on experience is still the driving factor in data-driven decision-making; analytics are used as a part of the process. (Bloomberg BusinessWeek Research Services, Analytics Insights, 2014)
  18. How Are We Doing So Far? (continued) • Data is the number 1 challenge in the adoption and use of analytics. Organizations continue to struggle with data accuracy, consistency, and access. • Analytics are used to solve big issues, with the primary focus on reducing costs, improving the bottom line, and managing risk. • Many organizations lack the proper analytical talent. Organizations that struggle with making good use of analytics often don’t know how to apply the results. • Culture plays a critical role in the effective use of data analytics.
  19. National Non-profit, Multi-institutional Collaborative: Institutional Effectiveness + Student Success
  20. About PAR Framework • Established, growing non-profit collaborative focused on using existing institutional data to improve institutional effectiveness and student outcomes • Funded by the Bill & Melinda Gates Foundation (2011, 2012, 2013) • Managed at the Western Interstate Commission for Higher Education • Engagement with more than 39 forward-thinking US institutions • Small, high-functioning team with partner, subject, and domain expertise • In-kind donations to date: ▫ IBM Tableau ▫ Blackboard iData ▫ Starfish
  21. DATA STATISTICS • Time frame: August 2009 – May 2013 • Total counts: 13,090,351 course records; 1,842,917 student records
  22. PAR Objectives: Creating scalable solutions for institutional effectiveness and student success through • common data definitions • common measures • institutional collaboration
  23. Structured, Readily Available Data • Common data definitions = reusable predictive models and meaningful comparisons. • Openly published via a CC license at https://public.datacookbook.com/public/institutions/par
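One way to picture what a common data definition buys you: every contributing institution can validate an extract against the same contract before records are pooled. A sketch with illustrative field names (the real definitions are the ones published at the URL above):

    # Illustrative common definition: required fields and their types.
    COURSE_RECORD = {
        "student_id": str,
        "course_id": str,
        "term_start": str,    # e.g. "2012-08-27"
        "final_grade": str,
        "credits": float,
    }

    def validate(record: dict) -> list:
        """Return a list of problems; an empty list means it conforms."""
        problems = []
        for field, ftype in COURSE_RECORD.items():
            if field not in record:
                problems.append(f"missing field: {field}")
            elif not isinstance(record[field], ftype):
                problems.append(f"{field} should be {ftype.__name__}")
        return problems

    print(validate({"student_id": "S001", "course_id": "ENG101",
                    "term_start": "2012-08-27", "final_grade": "B",
                    "credits": 3.0}))   # -> []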
  24. Speak the same language
  25. PAR Core Competencies: helping members • IDENTIFY (Benchmarks): Provide insight into how institutions compare to their peers through common measures, by scaling multi-institutional datasets for benchmarking and research purposes. • TARGET (Models): Identify which students need assistance by using in-depth, institution-specific predictive models. Models are unique to the needs and priorities of our member institutions, based on their specific data. • TREAT (Interventions): Determine the best way to address areas of weakness identified in benchmarks and models, by scaling and leveraging a member- and literature-validated framework for examining interventions within and across institutions (SSMx).
  26. Different Levels of Insight • PAR Benchmarks (descriptive analytics): cross-institutional student/degree/major-level insight into: 1. What did retention look like for students entering in the same cohort? 2. How does your institution compare to peer institutions / institutions in other sectors? 3. What was the relationship of student attributes? 4. What were the attributes and performance outcomes? • PAR Models (predictive analytics): institution-specific insight into: 1. Which students are being retained over time? 2. Which students are currently at risk for completing, and why? 3. Which factors are directly correlated to student success?
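The two levels can be shown in miniature with a hypothetical cohort table: the descriptive (benchmark) question summarizes what already happened, while the predictive (model) question scores students whose outcomes are still open, as in the earlier scikit-learn sketch:

    import pandas as pd

    # Invented cohort data: one row per student, outcome already known.
    cohort = pd.DataFrame({
        "institution": ["A", "A", "B", "B", "B"],
        "entry_year":  [2011, 2011, 2011, 2011, 2011],
        "retained":    [1, 0, 1, 1, 0],
    })

    # Descriptive / benchmark-style: what DID retention look like,
    # and how do institutions compare?
    print(cohort.groupby("institution")["retained"].mean())
    # A -> 0.50, B -> 0.67 (invented numbers, of course)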
  27. DATA DELIVERY AND QA TOOLS • Automated response and self-service (Q1 2014) • 300 automated tests
  28. BENCHMARKS • Available now • Unlimited institutional users • Released November / May • Member-driven report development • Expanded report sets • Releasing on tablets/devices • Dynamic institution
  29. INSTITUTIONAL PREDICTIVE MODELS • Delivered as a limited beta • Member-driven institutional model target selection • Migrating to SAS Visual Analytics delivery with next data set • Unlimited number of institutional logins • Delivered up to 3x a year, within 21 days of data acceptance • Rapid-turn model publications for watch lists
  30. INTERVENTION INVENTORY TOOLS • Online application built on SSMx • Lays groundwork for intervention benchmarks • Enables institutional snapshot on expanded intervention dataset • Ver. 1 launches Q1 ‘14 • Ver. 1.1 with reporting Q3 ‘14 • Designed for integration with benchmarks and models • Full integration of reporting Q3 ‘14
  31. Thank you for your interest. Ellen D. Wagner, Ph.D. • edwsonoma@gmail.com • ellen.wagner@parframework.org • http://parframework.org • http://twitter.com/edwsonoma • +1.415.613.2690 (mobile)
