The Connected Intelligence Centre:
Human-Centered Analytics for UTS Data Challenges
Simon Buckingham Shum & CIC Team
Director, Connected Intelligence Centre
Professor of Learning Informatics
@sbuckshum #LearningAnalytics
cic.uts.edu.au
h"ps://twi"er.com/Wiswijzer2/status/414055472451575808	
  
“Note: check the
huge difference
between knowing
and measuring…”
#LearningAnalytics
SoLAResearch.org
#AlgorithmicAccountability
GoverningAlgorithms.org
Blackfriars Campus, Ground Floor, Building 22
Georgia Markakis, Senior Manager
Theresa Anderson, Senior Lecturer, Fac. Arts & Social Sciences
Roberto Martinez-Maldonado, Research Fellow
Xiaolong (Shawn) Wang, Web Developer
(= at today's Data Science Symposium)
Ruth Crick, Professor of Learning Analytics & Educational Leadership, School of Education
Simon Knight, Research Fellow
cic.uts.edu.au
  
CIC works as a creative incubator to catalyse
thinking about the use of data and analytics
among students, educators, researchers
and leaders. We do this through research,
prototyping, evaluation, embedding and
education.
So we work with faculties
Business
Communication
Education
Engineering
Health
Information Technology
Law
Science
and support units
Careers
HELPS
Library
IML
ITD
Jumbunna IHL
U:PASS
Student Services
http://uts.edu.au/future-students/analytics-and-data-science
UTS Data Challenges
Real time analytics for
student writing
Real time prediction of ‘at
risk’ course dropouts
Predictive analytics for
student employment
Analytics for UTS learning.futures
Real time analytics for
student collaboration
Values, Data Infrastructure,
Algorithmic Accountability
Analytics for student
resilience and agency
What would it mean to convene internal "Data Challenges"?
Competitive Teams, Taskforces, or NICs?
https://www.kaggle.com/c/predict-west-nile-virus
Or a multi-party taskforce, coordinated and collaborating?
CIC is currently coordinating a pilot Data Challenge on a
UTS Graduate Employability dataset
CIC is currently coordinating a pilot Data Challenge on a
UTS Graduate Employability dataset
AGS Dataset
Can new analytics
approaches add insight?
Planning & Quality Unit
Student Services/Careers
The ‘clients’ owning the problem
DVC (Education)
CIC
‘Knowledge
Broker’
School of Math. Sciences
FEIT School of Software
Advanced Analytics Institute
Analytics Experts
Clarify dataset, potential contributions, rules of the game,
schedule, and operationalise the challenge in
quantifiable terms
CIC is currently coordinating a pilot Data Challenge on a
UTS Graduate Employability dataset
Textual data (AGS survey comments)
Quantitative data (AGS survey item responses)
•  How accurately can we predict the employed status of a student, given the
AGS data?
•  Do text analytics on student comments add predictive value or other
insights?
•  What are the pros/cons of different statistical and machine learning
approaches?
•  What effort was required, and what obstacles were encountered?
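To make the first two questions concrete, here is a minimal sketch (Python, pandas and scikit-learn) of how quantitative AGS item responses and free-text comments could be combined in a single predictive model. The file name, column names (employed, comment, q1–q5) and model choice are illustrative assumptions, not the actual AGS schema or the pilot's method.

```python
# Hedged sketch with an assumed schema: predict employed status from
# AGS item responses plus free-text survey comments.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("ags_responses.csv")        # hypothetical extract
df["comment"] = df["comment"].fillna("")     # TF-IDF needs string input
item_cols = ["q1", "q2", "q3", "q4", "q5"]   # Likert-style items (assumed)

features = ColumnTransformer([
    ("items", StandardScaler(), item_cols),                              # quantitative items
    ("text", TfidfVectorizer(min_df=5, ngram_range=(1, 2)), "comment"),  # comment text
])

model = Pipeline([("features", features),
                  ("clf", LogisticRegression(max_iter=1000))])

# Does adding the comment text improve prediction? Compare this pipeline
# against one built from item_cols alone.
scores = cross_val_score(model, df, df["employed"], cv=5, scoring="roc_auc")
print("ROC AUC: %.3f (+/- %.3f)" % (scores.mean(), scores.std()))
```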
Or a Networked Improvement Community sharing data and
building evidence on a wicked problem?
(not acute emergency response • using Improvement Science methodologies for practice-based evidence)
CIC is working on Networked Improvement Community
platforms in education globally, and within UTS
http://carnegiefoundationsummit.org
http://cdn.carnegiefoundation.org/wp-content/uploads/2014/09/bryk-gomez_building-nics-education.pdf
Evidence-based Educational Improvement Cycle
Picturing the
learning
system…
Human + Automated
Analysis
Data
Data
Data
Hypotheses, Evidence,
Arguments, Insights
Prototype
Learning
Designs
Assumptions: what
works when and why?
An Evidence Hub shows who in the community
is tackling which parts of the problem
People / Organizations / Projects / Claims / Evidence
Evidence Hub for Research by Children & Young People: http://rcyp.evidence-hub.net
Impact Map: how much evidence is there to
support an improvement hypothesis?
http://oermap.org/hypothesis/578/hypothesis-a-performance/
How do we design and test data science and analytics for UTS learning.futures?
http://www.uts.edu.au/research-and-teaching/teaching-and-learning/learningfutures/overview
http://www.uts.edu.au/research-and-teaching/teaching-and-learning/learningfutures
group collaboration analytics
Roberto Martinez-Maldonado
Design drivers:
orchestration
and
awareness
MTClassroom: learning analytics in a multi-tabletop classroom
MTClassroom architecture: enabling collaborative classroom learning analytics
cic.uts.edu.au
In the future, might UTS give
collaboration feedback like this?
R. Martinez, K. Yacef, J. Kay, and B. Schwendimann.
An interactive teacher’s dashboard for monitoring
multiple groups in a multi-tabletop learning
environment. Proceedings of Intelligent Tutoring
Systems, pages 482-492. Springer, 2012.
Automatic student tracking
from multiple digital
tabletops in classroom
Analyse the students’ activity
traces for significant
patterns
Visualise on a teacher’s
dashboard
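As one illustration of the middle step (analysing activity traces for significant patterns), the sketch below summarises per-group touch counts from tabletop logs and flags groups where one student dominates the interaction. The event fields and the dominance threshold are assumptions for illustration, not MTClassroom's actual data model.

```python
# Hedged sketch of the analysis step: summarise participation per group
# from per-touch tabletop events and flag imbalanced groups.
from collections import Counter, defaultdict
from dataclasses import dataclass

@dataclass
class TouchEvent:
    group: str        # tabletop / group id
    student: str      # anonymised student id
    timestamp: float  # seconds since session start

def participation_summary(events, dominance_threshold=0.6):
    """Return per-group touch counts plus a flag for imbalanced groups."""
    per_group = defaultdict(Counter)
    for e in events:
        per_group[e.group][e.student] += 1
    summary = {}
    for group, counts in per_group.items():
        total = sum(counts.values())
        top_share = max(counts.values()) / total
        summary[group] = {
            "touches": dict(counts),
            "flag_imbalanced": top_share >= dominance_threshold,
        }
    return summary  # e.g. feed this to a teacher-dashboard visualisation
```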
cic.uts.edu.au
R. Martinez, K. Yacef, J. Kay, and B. Schwendimann.
An interactive teacher’s dashboard for monitoring
multiple groups in a multi-tabletop learning
environment. Proceedings of Intelligent Tutoring
Systems, pages 482-492. Springer, 2012.
writing analytics
Ágnes Sándor, Xiaolong Wang, Simon Knight
Writing Analytics product space is growing…
http://turnitin.com/en_us/features/turnitin-scoring-engine
http://www.pearsonassessments.com/products/100000681/writing-coach.html
AWA: Academic Writing Analytics tool
The Tag Clouds invite you to reflect on the
concepts, people and places in the text
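As an illustration of how such tag clouds could be derived, the sketch below pulls candidate people, places and concept terms from a text with an off-the-shelf NLP pipeline. This is not AWA's implementation; it assumes spaCy and its small English model are installed (pip install spacy; python -m spacy download en_core_web_sm).

```python
# Hedged sketch: frequency counts of people, places and concept terms,
# suitable for rendering as tag clouds.
from collections import Counter
import spacy

nlp = spacy.load("en_core_web_sm")

def tag_cloud_terms(text, top_n=20):
    doc = nlp(text)
    people = Counter(e.text for e in doc.ents if e.label_ == "PERSON")
    places = Counter(e.text for e in doc.ents if e.label_ in ("GPE", "LOC"))
    concepts = Counter(chunk.root.lemma_.lower() for chunk in doc.noun_chunks
                       if not chunk.root.is_stop)
    return {"people": people.most_common(top_n),
            "places": places.most_common(top_n),
            "concepts": concepts.most_common(top_n)}
```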
Visualizations of writing cohesion
Whitelock, D., D. Field, J. T. E. Richardson, N. V. Labeke and S. Pulman (2014). Designing and Testing Visual Representations of Draft Essays for Higher
Education Students. 2nd International Workshop on Discourse-Centric Learning Analytics, Fourth International Conference on Learning Analytics and
Knowledge, Indianapolis, Indiana, USA. https://dcla14.files.wordpress.com/2014/03/dcla14_whitelock_etal.pdf
Rhetorical functions of metadiscourse
identified by the Xerox Incremental Parser (XIP)
BACKGROUND KNOWLEDGE
Recent studies indicate …
… the previously proposed …
… is universally accepted ...
NOVELTY
... new insights provide direct evidence ...
... we suggest a new ... approach ...
... results define a novel role ...
OPEN QUESTION
… little is known …
… role … has been elusive
Current data is insufficient …
GENERALIZING
... emerging as a promising approach
Our understanding ... has grown
exponentially ...
... growing recognition of the importance ...
CONTRASTING IDEAS
… unorthodox view resolves …
paradoxes …
In contrast with previous
hypotheses ...
... inconsistent with past
findings ...
SIGNIFICANCE
studies ... have provided important
advances
Knowledge ... is crucial for ... understanding
valuable information ... from studies
SURPRISE
We have recently observed ... surprisingly
We have identified ... unusual
The recent discovery ... suggests intriguing roles
SUMMARIZING
The goal of this study ...
Here, we show ...
Altogether, our results ... indicate
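For readers wondering how such rhetorical functions might be detected automatically, here is a toy cue-phrase matcher over a handful of the moves above. XIP is a far richer proprietary parser; the cue lists, labels and example sentence here are illustrative only.

```python
# Hedged sketch: keyword-style detection of a few rhetorical moves.
import re

CUES = {
    "NOVELTY":       [r"\bnew insights?\b", r"\bnovel\b", r"\bwe suggest a new\b"],
    "OPEN_QUESTION": [r"\blittle is known\b", r"\bremains? elusive\b",
                      r"\bdata (is|are) insufficient\b"],
    "CONTRAST":      [r"\bin contrast with\b", r"\binconsistent with\b"],
    "SURPRISE":      [r"\bsurprisingly\b", r"\bunusual\b", r"\bintriguing\b"],
    "SUMMARY":       [r"\bthe goal of this study\b", r"\bhere,? we show\b",
                      r"\baltogether\b"],
}

def tag_sentence(sentence):
    """Return the rhetorical labels whose cue phrases appear in the sentence."""
    found = []
    for label, patterns in CUES.items():
        if any(re.search(p, sentence, re.IGNORECASE) for p in patterns):
            found.append(label)
    return found

print(tag_sentence("Surprisingly, little is known about this mechanism."))
# -> ['OPEN_QUESTION', 'SURPRISE']
```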
AWA: Academic Writing Analytics tool
Highlighted sentences are colour-coded according to their broad type
Sentences with Function Keys have more precise functions (e.g. Novelty)
http://bit.ly/utscicawa
AWA: Academic Writing Analytics tool
Roll over sentences with Function Keys for a popup reminding you of their meaning
http://bit.ly/utscicawa
Could this work for reflective writing?
A “reflection detection NLP architecture”
Ullmann, T. D. (2011). An Architecture for the Automated Detection of Textual Indicators of Reflection. In W. Reinhardt, T. D. Ullmann, P. Scott, V. Pammer, O. Conlan, & A. Berlanga (Eds.), Proc. 1st European Workshop on Awareness & Reflection in Learning Networks (pp. 138–151). Palermo, IT. http://ceur-ws.org/Vol-790/paper14.pdf
Demo: http://reflectr.eduinf.eu/reflectr
Based on UIMA http://uima.apache.org
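A minimal sketch of the underlying idea, much simpler than Ullmann's UIMA-based architecture: count how many broad categories of reflective-language indicators appear in a text. The indicator lists are illustrative assumptions, not the published feature set.

```python
# Hedged sketch: keyword-based indicators of reflective writing.
import re

INDICATORS = {
    "self_reference":   r"\b(I|my|myself|me)\b",
    "reflective_verbs": r"\b(realised?|realized?|learned|felt|noticed|wondered)\b",
    "causal_reasoning": r"\b(because|therefore|as a result|which meant)\b",
    "future_intent":    r"\b(next time|in future|I will|I plan to)\b",
}

def reflection_score(text):
    """Fraction of indicator categories present in the text (0.0 to 1.0)."""
    hits = sum(bool(re.search(p, text, re.IGNORECASE))
               for p in INDICATORS.values())
    return hits / len(INDICATORS)

sample = ("I realised the design failed because I skipped user testing; "
          "next time I will prototype earlier.")
print(reflection_score(sample))  # -> 1.0 : all four indicator types present
```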
Quantifying learners’
dispositions?
Ruth Crick
“It’s more than knowledge and skills. For the
innovation economy, dispositions
come into play:
readiness to collaborate;
attention to multiple perspectives;
initiative;
persistence;
curiosity.”
Larry Rosenstock
High Tech High
San Diego
hightechhigh.org
LearningREimagined project: http://learning-reimagined.com
Larry Rosenstock: http://audioboo.fm/boos/1669375-50-seconds-of-larry-rosenstock-ceo-of-hightechhigh-on-how-he-would-re-imagine-learning
Knowledge, Skills & Dispositions
Deakin Crick, R., S. Huang, A. Ahmed Shafi and C. Goldspink (2015). Developing Resilient Agency in Learning: The Internal Structure of Learning Power. British Journal of Educational Studies. Published online: 24 Mar 2015. http://dx.doi.org/10.1080/00071005.2015.1006574
Evidencing learning dispositions: CLARA survey
(Ruth Deakin Crick, UTS)
Mindful Agency
Sense making
Creativity
Curiosity
Belonging
Collaboration
Hope and optimism
Immediate visual analytic generated by CLARA
(Individual and cohort profiles + detailed reports + spreadsheets enabling further analysis)
Rapid Visual Feedback to
Stimulate Self-Directed Change
A framework for a coaching conversation
x 50,000 profiles => analytics potential
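A minimal sketch of how survey responses could be rolled up into the individual and cohort profiles mentioned above, assuming a hypothetical item-to-dimension mapping and response layout; this is not the CLARA instrument or its scoring model.

```python
# Hedged sketch with an assumed layout: average item responses into
# one score per dimension, per student and per cohort.
import pandas as pd

def dimension_scores(responses: pd.DataFrame, item_map: dict) -> pd.DataFrame:
    """responses: one row per student, one column per survey item.
    item_map: {dimension name -> list of item columns} (hypothetical mapping)."""
    return pd.DataFrame({dim: responses[items].mean(axis=1)
                         for dim, items in item_map.items()})

# Example with a toy mapping of two items per dimension:
# item_map = {"Mindful Agency": ["q1", "q2"], "Curiosity": ["q3", "q4"], ...}
# scores = dimension_scores(responses, item_map)   # individual profiles
# cohort_profile = scores.mean()                   # cohort profile for a radar visual
```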
Structural Equation
Model underpinning
CLARA
Deakin Crick, R., S. Huang, A. Ahmed Shafi and C. Goldspink (2015).
Developing Resilient Agency in Learning: The Internal Structure of Learning
Power. British Journal of Educational Studies: Published online: 24 Mar 2015.
http://dx.doi.org/10.1080/00071005.2015.1006574
From self-report to
activity analytics for
dispositions?
Proxies for
“Conscientiousness”?
Shute, V. J. and M. Ventura (2013). Stealth Assessment: Measuring
and supporting learning in video games. Cambridge, MA, MIT Press.
Figure 5 from report to The John D. and Catherine T. MacArthur
Foundation Reports on Digital Media and Learning
http://myweb.fsu.edu/vshute/pdf/Stealth_Assessment.pdf
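As a sketch of what an activity-based disposition proxy could look like, the function below estimates "persistence" as the rate at which a student re-attempts a task after failing it. The event schema and the proxy definition are illustrative assumptions, not Shute and Ventura's stealth-assessment indicators.

```python
# Hedged sketch: a persistence proxy computed from time-ordered attempt logs.
from collections import defaultdict

def persistence_proxy(events):
    """events: iterable of (student_id, task_id, outcome) tuples in time order,
    with outcome in {"fail", "pass"}. Returns retry-after-failure rate per student."""
    failures = defaultdict(int)  # failures seen so far, per (student, task)
    retries = defaultdict(int)   # attempts made after a prior failure
    totals = defaultdict(int)    # failures per student
    for student, task, outcome in events:
        if failures[(student, task)] > 0:
            retries[student] += 1
        if outcome == "fail":
            failures[(student, task)] += 1
            totals[student] += 1
    return {s: retries[s] / totals[s] for s in totals if totals[s]}

log = [("s1", "t1", "fail"), ("s1", "t1", "fail"), ("s1", "t1", "pass"),
       ("s2", "t1", "fail")]
print(persistence_proxy(log))  # -> {'s1': 1.0, 's2': 0.0}
```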
But personal informatics require personal sensemaking
Students need to be placed in control of this kind of analytics, not inside a big data panopticon
Quantified Self
Qualified Self
I am more than my digital profile.
I have ultimate authority over its meaning
— and who sees it.
Cf. Student Privacy Self-Management: Implications for Learning Analytics, Paul Prinsloo and Sharon Slade, LAK15
big data, values and
algorithmic accountability
Theresa Anderson
cic.uts.edu.au
“We must ask difficult
questions of Big Data’s
models of intelligibility
before they crystallize
into new orthodoxies.”
boyd & Crawford, 2012
cic.uts.edu.au
“Raw data is both an oxymoron and a bad idea;
to the contrary, data should be cooked with care.”
Geof Bowker, 2005
cic.uts.edu.au
engaging with the bright & dark
Where is the ‘human’ in analytics –
how can we ensure a humanist
element remains present?
What human/machine partnerships
can/should we enable in
computationally intensive work?
@uts_mdsi #bright_dark
Invitation to think out of the box and play!...
Multimodal expressions of Big Data
@uts_mdsi #bright_dark
