Retention and success
• Retention and success are distinct, but linked. Qualitative vs binary.
• Applications: quick/early drop-out, adaptive learning.
• Ethical issues.
• Mediating feedback: using analytics to present the model along with its rationale, used as the basis for a personalised conversation.
Photo (CC) Trey Ratcliff http://www.flickr.com/photos/stuckincustoms/4622806283/
Multiple purposes
• Aggregation
• Intervention
• Motivation
• Informed decision making
• 'De-modularisation' (holistic information)
• Ipsative vs norm information
Ethics
• Emotions
• Anxiety
• Surveillance
• Privacy
• Transparency
Operationalisation
• Selecting data sets
• Timeliness and efficacy evaluation
• Granularity
• Interactivity
• Training and sense making
• Pedagogical drivers
Multiple audiences
• Different purposes
• Same data sets
• Interpretation and clarity
• Proprietary tool providers: dashboards pre-empting our needs/wants
Dashboard examples
• Student: How am I doing compared to the cohort?
• Tutor: Is what I'm doing with my students working?
• Institution: Which students are most likely to drop out? (A minimal risk-ranking sketch follows below.)
• PSRB: Are any students graduating from this institution without all of the required learning outcomes?
• Researchers: Across the sector, which institutions produce the best graduates in each discipline?
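The institution-level question above is typically operationalised as a predictive model trained on past cohorts. A minimal sketch, assuming a scikit-learn setup; the CSV files and feature names (vle_logins_per_week, assignments_submitted, avg_grade) are hypothetical placeholders, not from the slides:

```python
# Hedged sketch: rank students by estimated drop-out risk using
# logistic regression trained on past cohorts. All file and column
# names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

past = pd.read_csv("past_cohorts.csv")  # hypothetical data
features = ["vle_logins_per_week", "assignments_submitted", "avg_grade"]

X_train, X_test, y_train, y_test = train_test_split(
    past[features], past["dropped_out"], test_size=0.2, random_state=0
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Rank current students by predicted risk, highest first, so any
# intervention can be prioritised.
current = pd.read_csv("current_students.csv")  # hypothetical data
current["risk"] = model.predict_proba(current[features])[:, 1]
print(current.sort_values("risk", ascending=False).head(10))
```

Note that such a ranking is exactly the 'pre-fail' scenario whose ethics the next slide raises.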
Analytics for Student Success & Retention: Issues
• Pre-fail: dangers of a 'Pre-Crime Unit'.
• Ethics of intervention: just for those who are failing? What about the rest?
• Beware self-fulfilling failure prophecies!
• "Dear <field1>…": beware back-firing personalisation expectations: "So I really am just a number."
• Informed interventions hopefully change learners' futures for the better… but what does that do to datasets and historical comparison?
• Important to collect data about interventions to assess their impact amongst other variables.
• Beware "can't count, doesn't count": we're in a complex people business!
Issues
• How do we measure learning (rather than 'success' in assessments)?
• Approximate proxies for learning…
• Shouldn't assessment be our 'best measure' of learning? Well, perhaps it should be a suite of analytics.
• What 'knowledge' do we want from our graduates?
• The 'recipe' issue of LA: we have to make sure we're looking for the 'right' processes.
• Assessment/analytics: snapshots, continuity, and change metrics; how can they be used?
• Analytics driven by what we want to achieve rather than by what data is available.
Examples
• Dialogue analysis, perhaps analysis of the use of social networks.
• LA as pedagogy vs LA for pedagogy: LA which feeds back into 'improving'/adapting. LA can help us challenge our assumptions about how the learning is taking place. Can LA let us, as teachers, hypothesis-test our assumptions about learning?
• Pass rate and online activity are correlated: an effective 'proxy'? (A minimal correlation check is sketched below.)
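One way to test the proxy claim in the last bullet is a simple correlation check. A minimal sketch, assuming hypothetical column names (passed, vle_minutes) in an institutional results file that is not part of the slides:

```python
# Hedged sketch: is online activity an effective proxy for passing?
# The CSV file and column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import pointbiserialr

df = pd.read_csv("module_results.csv")  # one row per student (hypothetical)

# `passed` is binary (0/1) and `vle_minutes` is continuous, so the
# point-biserial coefficient (a special case of Pearson's r) applies.
r, p = pointbiserialr(df["passed"], df["vle_minutes"])
print(f"r = {r:.2f}, p = {p:.3g}")
```

A significant r would support online activity as a proxy, but correlation alone cannot distinguish genuine engagement from simply spending time online.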
Issues
• Availability
• Quality
• Enrich (combining data)
• Private
• Infrastructure
• Planning in a rapidly evolving area (iterations)
• Granularity (nano)
• Not everything is online: no footprint (overall visibility of interactions)
• Volume
• Awareness of data collection
• Sharing (ethics, commercially sensitive)
• Paying to access your own data
• Need?
• Data ownership
• Purpose
• Culture change
Examples
• Tin Can API
• IBM: data ('don't ask, don't get')
• midata
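The Tin Can API (now usually called the Experience API, xAPI) records learning activity as "actor verb object" statements sent to a Learning Record Store (LRS). A minimal sketch of posting one statement; the LRS endpoint, credentials, and activity IDs are placeholders, not from the slides:

```python
# Hedged sketch: send one xAPI ("Tin Can") statement to a Learning
# Record Store. The LRS URL and credentials are placeholders.
import requests

statement = {
    "actor": {"mbox": "mailto:student@example.ac.uk", "name": "A Student"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-GB": "completed"},
    },
    "object": {
        "id": "http://example.ac.uk/activities/week-3-quiz",
        "definition": {"name": {"en-GB": "Week 3 quiz"}},
    },
}

resp = requests.post(
    "https://lrs.example.ac.uk/xapi/statements",  # placeholder LRS
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_user", "lrs_password"),  # placeholder credentials
)
resp.raise_for_status()
print("stored statement id:", resp.json()[0])  # LRS returns a list of IDs
```

Because statements can describe activity anywhere, not just inside the VLE, xAPI speaks directly to the 'no footprint' issue on the previous slide.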