This document summarizes a talk about what we are learning from implementing learning analytics (LA) in higher education. It discusses the drivers for interest in LA, perspectives from industry and research, benchmarks of current LA adoption, and emerging models. While industry rhetoric portrays LA as providing easy answers, the reality is more complex. Most universities are still in early stages of basic reporting rather than advanced applications. For LA to meet its potential and have long term impact, a process-focused model is needed that builds organizational capacity, is adaptive, and takes a broad view of LA beyond just retention.
Using learning analytics to improve student transition into and support throughout the 1st year - Tinne De Laet
Presentation supporting the ABLE and STELA workshop titled "Using learning analytics to improve student transition into and support throughout the 1st year" delivered at the EFYE 2016 conference in Gent, Belgium
Learning Analytics: Seeking new insights from educational data - Andrew Deacon
CPUT Fundani TWT - 22 May 2014
Analytics is a buzzword that encompasses the analysis and visualisation of big data. Current interest results from the growing access to data and the many software tools now available to analyse this data in Higher Education, through platforms such as Learning Management Systems. This seminar provides an overview of current applications and uses of learning analytics and how it can help institutions of learning better support their learners. The illustrative examples look at institutional and social media data that together provide rich insights into institutional, teaching and learning issues. A few simple ways to perform such analytics in a context of Higher Education will be introduced.
Assessment Analytics - EUNIS 2015 E-Learning Task Force Workshop - LACE Project
This presentation introduces a discussion session for the E-Learning Task Force workshop at the 2015 EUNIS Congress. The LACE Project is very briefly introduced, followed by an explanation of the presenter's view of learning analytics and a critique of some common themes. Assessment Analytics is presented as an antithesis to these themes, and an assessment lifecycle model (used in the Jisc Electronic Management of Assessment Programme) is used to outline some ways in which assessment analytics can be realised, as a stimulus for discussion.
Online Educa Berlin conference: Big Data in Education - theory and practice - Mike Moore
Online Educa Berlin Conference presentation, Big Data in Education - Theory and Practice. Presented December 6, 2013 by Mike Moore, Sr. Advisory Consultant - Analytics, Desire2Learn, Inc.
Putting Data to Work - Ellen Wagner, Executive Director, WCET
This session explores changing data sensibilities at US post-secondary institutions with particular attention paid to how predictive analytics are changing expectations for institutional accountability and student success. Results from the Predictive Analytics Reporting Framework show that predictive modeling can identify students at risk and that linking behavioral predictions of risk with interventions to mitigate those risks at the point of need is a powerful strategy for increasing rates of student retention, academic progress and completion.
Presentation at the 15th annual SLN SOLsummit, February 27, 2014
http://slnsolsummit2014.edublogs.org/
Open Learning Analytics panel at Open Education Conference 2014 - Stian Håklev
The past five years have seen a dramatic growth in interest in the emerging field of Learning Analytics (LA), and particularly in the potential the field holds to address major challenges facing education. However, much of the work in the learning analytics landscape today is closed in nature, small in scale, tool- or software-centric, and relatively disconnected from other LA initiatives. This lack of collaboration, openness, and system integration often leads to fragmentation where learning data cannot be aggregated across different sources, institutions only have the option to implement "closed" systems, and cross-disciplinary research opportunities are limited. Beyond the immediate concerns this fragmentation creates for educators and learners, a closed approach dramatically limits our ability to build upon successes, learn from failures and move beyond the "pockets of excellence (and failures)" approach that typifies much of the educational technology landscape.
The potential benefits of openness as a core value within the learning analytics community are numerous. Learning initiatives could be informed by large-scale research projects. Open-source software, such as dashboards and analytics engines, could be available free of licensing costs and easily enhanced by others, and OERs could become more personalized to match learners' needs. Open data sets and reproducible papers could rapidly spread understanding of analytical approaches, enabling secondary analysis and comparison across research projects. To realize this future, leaders within the learning analytics, open technologies (software, standards, etc.), open research (open data, open predictive models, etc.) and open learning (OER, MOOCs, etc.) fields have established a "network of practice" aimed at connecting subject matter experts, projects, organizations and companies working in these domains. As an initial organizing event, these leaders organized an Open Learning Analytics (OLA) Summit directly following the 2014 Learning Analytics and Knowledge (LAK) conference this past March as a means to further the goal of establishing "openness" as a core value of the larger learning analytics movement. Additional details on the Summit and those involved can be found at: http://www.prweb.com/releases/2014/04/prweb11754343.htm.
This panel session will bring together several thought leaders from the Open Learning Analytics community who participated in the Summit to facilitate an interactive dialog with attendees on the intersection of learning analytics and open learning, open technologies, open data, and open research. The presenters represent a broad range of experience with institutional analytics projects, an open source development consortium, the sharing of open learner data, and academic research on open learning environments.
Presentation by Russ Little. Provides an overview of Integrated Planning and Advising Systems (IPAS). Demonstrates how the Student Success Plan software and My Academic Plan (MAP) function, and presents evidence of their effectiveness.
Learning analytics and Moodle: So much we could measure, but what do we want to measure? A presentation to the USQ Math and Sciences Community of Practice May 2013
Speakers:
David Lewis, senior analytics consultant, Jisc
An opportunity to find out how an institution has been implementing learning analytics to support the student journey, with an opportunity to discuss issues and possibilities that the use of learning analytics may create.
Speakers:
David Lewis, senior analytics consultant, Jisc
Martin Lynch, learning systems manager, University of South Wales
An opportunity to find out how an institution has been implementing learning analytics to support the student journey, with an opportunity to discuss issues and possibilities that the use of learning analytics may create.
Handout of my presentation on the student perspective of Learning Analytics. Most slides contain a few sentences in the speaker notes (in English) to describe the point I was making there.
Open Learning Analytics Strategy for Student Success: The North Carolina Stat... - Joshua
The open learning analytics process is gaining traction in higher education as institutions consider how to leverage the power of predictive learning analytics to impact student success. Institutions are embracing open source options as viable alternatives to the cost of proprietary solutions. This presentation was from a September 2015 webinar in which participants learned how North Carolina State University (NC State) is pioneering the implementation of an open strategy for student success. The webinar also featured Marist College and was hosted by Unicon, Inc. The webinar was recorded and is available at: https://youtu.be/ODPTjNcqNuo
Educational Data Mining/Learning Analytics issue brief overview - Marie Bienkowski
An overview of the Draft Issue Brief prepared by SRI International for the US Department of Education on Educational Data Mining and Learning Analytics
Advances in Learning Analytics and Educational Data Mining - MehrnooshV
This presentation covers the state of the art in Learning Analytics and Educational Data Mining. It was presented by Mehrnoosh Vahdat as the introductory tutorial of the Special Session 'Advances in Learning Analytics and Educational Data Mining' at the ESANN 2015 conference.
SDAL AIR Education Workforce Analytics Workshop, Jan. 7, 2014 - kimlyman
The American Institutes for Research (AIR) and Virginia Tech are collaborating to explore and develop new approaches to combining, manipulating and understanding big data. The two are also looking at how big data analytics can help answer questions critical to solving issues in education, workforce, health, and human and social development. They held two workshops, on January 7 and 27, 2014: the first on Education and Workforce Analytics and the second on Health and Social Development Analytics.
June presentations: org_adoption_learning_analytics - Shane Dawson
Learning analytics (LA) has been touted as a game changer for education. The rapidly growing literature associated with the field serves to promote this fervour in citing the vast impact LA can and will play in the education space. From the detection of at-risk students to address retention and performance, building self-regulated learning, development and identification of 21st Century literacies to the realisation of personalised learning, there appears little that LA cannot contribute to within learning and teaching practice. However, if LA is such an impactful, desirable and worthy endeavour that can effectively improve learning, and our understanding of the learning process, why are there so few examples of institutional LA adoption?
Presentation exploring the relationship between policy and practice in the development of e-assessment in higher education and the importance of establishing a policy framework - developed in collaboration with all key stakeholders - to support wider uptake among academic staff.
Tableau together with analytics: an introduction to simple examples of using data visualisation, and how to bridge the gap in using data for education. Covers strategies, data, and analytics.
UCISA Learning Analytics Pre-Conference Workshop - Mike Moore
UCISA Learning Analytics Pre-Conference Workshop
Mike Moore - Sr. Advisory Consultant - Analytics
Desire2Learn, Inc.
UCISA Conference 2014, Brighton, UK
Presented Mar 26, 2014
Australian university teacher's engagement with learning analytics: Still ea... - Blackboard APAC
This session reports the results of a recent OLT-funded national exploratory study addressing the relevant factors, and their impact, when implementing learning analytics for student retention purposes. The project utilised a mixed-method research design and yielded a series of outputs, including a non-technical overview of learning analytics linking the fields of student retention and learning analytics, and an institution-level survey focusing on sector readiness and decision making relating to utilising learning analytics for retention purposes. An academic-level survey was administered to academic staff exploring their progress, aspirations and support needs relating to learning analytics. Follow-up interviews expanded on their experiences with learning analytics to date. An evidence-based framework was developed, mapping important factors affecting learning analytics decision making and implementation. This was illustrated by a suite of five case studies, developed by each of the research partner institutions, detailing their experiences with learning analytics and demonstrating why elements in the framework are important. These findings were shared and tested at a National Forum in April 2015.
Delivered at Innovate and Educate: Teaching and Learning Conference by Blackboard, 24-27 August 2015, Adelaide, Australia.
Developing an Effective IT Governance Structure from the Ground Up - Frank Cervone
Effective use of information technology requires more than just a solid technological infrastructure. Broad campus-wide participation and engagement are critical to the success of IT. In this session, we will explore how an effective infrastructure can be developed by looking at how Purdue University Calumet developed a model based on a combination of industry best practices and EDUCAUSE resources.
Rafael Hidalgo from The Open University, UK gave a presentation about Learning Analytics for Student Support as part of the online events run by the Student Support expert pool within EMPOWER.
What are we learning from learning analytics: Rhetoric to reality (escalate 2014)
1. What are we learning from learning analytics?
Shane Dawson
Shane.dawson@unisa.edu.au
Twitter: @shaned07
2. Introduction
• Student from Shanghai-based East China Normal University
• "Last month, you spent less on meals. Are you in financial difficulty? If so, please contact me via phone, text message or e-mail."
http://www.bjreview.com.cn/nation/txt/2014-06/23/content_625466.htm
3. Introduction
• Automatically track students' meal card spending.
• If spending falls under a threshold level, a designated faculty member sends the student a short message to check whether they are in financial difficulty (a minimal sketch of this kind of rule follows below).
http://www.bjreview.com.cn/nation/txt/2014-06/23/content_625466.htm
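As a rough illustration of the kind of automated trigger described on this slide, the sketch below flags students whose monthly meal-card spend falls under a threshold and drafts the check-in message. The threshold value, the field names, and the notify() stub are assumptions made for illustration; they are not details of the Shanghai system.

```python
# Illustrative sketch only: threshold, field names and notify() are assumed.
MONTHLY_SPEND_THRESHOLD = 200.0  # assumed cut-off in local currency


def notify(student_id, message):
    """Stand-in for the SMS/e-mail channel a designated staff member would use."""
    print(f"To {student_id}: {message}")


def check_meal_card_spending(monthly_spend):
    """Flag students whose meal-card spending fell below the threshold."""
    flagged = []
    for student_id, spend in monthly_spend.items():
        if spend < MONTHLY_SPEND_THRESHOLD:
            flagged.append(student_id)
            notify(
                student_id,
                "Last month you spent less on meals. Are you in financial "
                "difficulty? If so, please contact me via phone, text or e-mail.",
            )
    return flagged


# Example usage with made-up figures.
if __name__ == "__main__":
    check_meal_card_spending({"s001": 150.0, "s002": 420.0})
```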
4. Introduction
• Highlights the rapidly growing list of applications of student data:
• Academic
• Social
• Pastoral
5. Introduction
This talk:
• What are we learning from the implementation of LA into HE?
• What are the conversations, expectations and reactions to this nascent field?
• What are the emerging models for institutional implementation?
8. Drivers
• 1926 - Pressey built an instructional machine to provide multiple-choice questions
• "…with the addition of a simple attachment the apparatus will present the subject with a piece of candy or other reward upon his making any given score for which the experimenter may have set the device…"
Shute, V. J., & Psotka, J. (1994). Intelligent Tutoring Systems: Past, Present, and Future (No. AL/HR-TP-1994-0005). Armstrong Laboratory, Brooks AFB, TX: Human Resources Directorate.
9. Data
• Scale, access and application
• Ease of access to learner data – LMS, SIS, mobile
• Growth in adoption of technical devices
• Huge investment in analytics – industry & government
10. Learning Analytics
• Learning Analytics: a "game changer" for education
• …is the collection, collation, analysis and reporting of data about learners and their contexts, for the purposes of understanding and optimizing learning
(A small sketch of that collection-to-reporting cycle follows below.)
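To make the collection, collation, analysis and reporting steps of that definition concrete, here is a minimal sketch that joins a hypothetical LMS activity export with a hypothetical SIS enrolment export; the file names and column names are assumptions, not references to any particular system.

```python
# Sketch of the collection -> collation -> analysis -> reporting cycle.
# File and column names are illustrative assumptions.
import pandas as pd

# Collection: load exports from the two source systems.
lms = pd.read_csv("lms_activity.csv")    # columns: student_id, logins, forum_posts
sis = pd.read_csv("sis_enrolments.csv")  # columns: student_id, programme, gpa

# Collation: join the sources on a shared student identifier.
combined = lms.merge(sis, on="student_id", how="inner")

# Analysis: a simple engagement indicator per programme.
summary = combined.groupby("programme")[["logins", "forum_posts"]].mean()

# Reporting: write the result out for staff to review.
summary.to_csv("engagement_by_programme.csv")
print(summary)
```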
12. Industry rhetoric
"Get answers to your most important questions like:
• How can I easily find students who are at-risk?"
13. Industry rhetoric
"Get answers to your most important questions like:
• How can I easily find students who are at-risk?"
• Yes, possible – much research in this area (a simple sketch follows below)
• However, ignores the complexity
• Context is critical
• Not all courses are alike – student diversity and approach
Overstated
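For a sense of what "finding students who are at risk" usually involves, the sketch below fits a logistic regression to made-up per-student activity counts and flags students with a high predicted risk. The features, the training data and the 0.5 cut-off are illustrative assumptions; as the slide notes, in practice both the useful features and the thresholds differ from course to course, which is exactly the complexity the vendor claim glosses over.

```python
# Illustrative at-risk model: features, data and threshold are assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Per-student activity features: [logins, assignments_submitted, forum_posts]
X_train = np.array([
    [25, 8, 12],
    [30, 9, 5],
    [4, 2, 0],
    [2, 1, 1],
    [18, 7, 3],
    [1, 0, 0],
])
# 1 = did not complete the course ("at risk"), 0 = completed.
y_train = np.array([0, 0, 1, 1, 0, 1])

model = LogisticRegression().fit(X_train, y_train)

# Score current students and flag those above a chosen risk threshold.
X_current = np.array([[3, 1, 0], [22, 8, 6]])
risk = model.predict_proba(X_current)[:, 1]
flagged = risk > 0.5
print(list(zip(risk.round(2), flagged)))
```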
14. Industry rhetoric
"Get answers to your most important questions like:
• Who are the most innovative instructors?"
15. Industry rhetoric
"Get answers to your most important questions like:
• Who are the most innovative instructors?"
• How and why? What defines innovative in this space, given the myriad of tools and learning approaches available?
Why?
16. Industry rhetoric
"…In five years the classroom will learn you! And personalize course work accordingly"
http://www.research.ibm.com/cognitive-computing/machine-learning-applications/decision-support-education.shtml#fbid=MRUeQg4jzVG
17. Industry rhetoric
"…In five years the classroom will learn you! And personalize course work accordingly"
• Currently available in: Cognitive Tutor, Knewton, Knowillage
• Ryan Baker – on/off-task behaviour; gaming and choice of major
Plausible
18. Industry rhetoric
"Enhance student outcomes with the ability to monitor, evaluate, and predict learner performance to drive retention and improve outcomes."
• Much work in this area to predict performance; however, intervention strategies are less well understood.
• Greater recognition of SRL (self-regulated learning)
Available but not utilised
http://www.brightspace.com/solutions/higher-education/advanced-analytics/
19. Industry rhetoric
"…predictive analytics capabilities help educators target learning strategies and pre-emptively mentor at-risk learners."
http://www.brightspace.com/solutions/higher-education/advanced-analytics/
https://www.flickr.com/photos/tadeeej/3228729514/
20. Industry rhetoric
Do we need predictive analytics here?
https://www.flickr.com/photos/tadeeej/3228729514/
21. Industry rhetoric
• Unlikely – practice is difficult to change. However, the first step is to aid identification.
• Tanes et al. (2011) – Course Signals feedback
• Instructors – feedback was motivational
• Student success related to instructional feedback
Tanes, Z., et al. (2011). Using Signals for appropriate feedback: Perceptions and practices. Computers & Education, 57(4), 2414-2422.
23. Research rhetoric
What is missing: a focus on learning process
• SRL proficiency (Gasevic; Winne)
• Discourse analysis and text mining (Rose)
• Learning design and instructional conditions (Lockyer; Gasevic)
• Learning dispositions (Deakin Crick; Buckingham Shum)
• Literacies or fluencies (Siemens)
• Creativity (Pei Ling Tan)
24. Research rhetoric
Great research, BUT:
• Tends to ignore the complexity of university-wide practice
• Predominantly small scale, and technology- and institution-specific
• Lacks guidance to aid further adoption
• Frequently requires high-level skills and capacities
25. Research rhetoric
Hence:
• Very few university-wide examples of LA adoption
• But obviously an area of increasing need and importance
Leads to questions related to how to implement, how to get started, and what data to use.
26. Learning Analytics
National project to benchmark LA status, policy and practices for Australian universities
27. Benchmarking
Interviews with 39 universities and 30 "experts":
• Identification of current practice, methods and approaches
• Identification of key drivers for institutions, stage of development, process for implementation, project leads
28. Benchmarking
Research perspective:
• Focus on understanding learning processes
• Broad range of data sets – larger size and range of data (relational data)
• Limited interest in the scalability of findings across the institution (at least not a stated intention)
29. Benchmarking
Research perspective:
"My hope [for LA] is that we can develop a better theory about how people learn and forge recommendations that might nudge learners toward more productive, more efficient, more satisfying ways of learning"
30. Benchmarking
University leaders' perspective:
• Primarily focused on retention: "It's [LA] a tool for improving retention"
• Limited mention of LA as a means to improve learning
• Main driver is budget (cost savings)
• Perception that it is only related to the LMS and SIS
• Limited number of data sets considered
31. Benchmarking
University leaders' perspective:
• Success is seen as staff access to information
• Limited understanding of the application of data-informed interventions
• Data visualisations – dashboard development is the endpoint and goal
• Few institutions with a stated LA policy and strategy
32. Benchmarking
• Widening gap between university administrators and researchers
• The administrator and industry perspectives are very similar
33. Reality
Reality is sobering:
• Need to develop greater understanding of the role of technology and the role of data in an institution
• Access to data does not mean a change in practice
• Interventions and early alerts must be constantly evaluated, revised and contextualised
34. Reality
2005 – Goldstein & Katz:
• Stage 1: Extraction and reporting of transaction-level data
• Stage 2: Analysis and monitoring of operational performance
• Stage 3: "What-if" decision support (such as scenario building)
• Stage 4: Predictive modeling & simulation
• Stage 5: Automatic triggers and alerts (interventions)
36. • Yanosky (2009) – 305 institutions: 58% at Stage 1, 20% at Stage 2
• Bichsel (2012) – interest in analytics is high, but many institutions had yet to make progress beyond basic reporting.
37. Reality
2014: LA organisational adoption is low:
• Australia is predominantly at a stage of basic reporting
• Very few institutions have an enterprise approach
• While the research is well progressed, implementation remains a challenge.
38. Reality
• Essentially, two models are emerging:
1. Solutions focused
• IT driven, or
• L&T driven, or
• Industry
2. Process focused
• Individual "faculty", or
• Networked and integrated
40. Reality
[Quadrant chart: adaptability of the system to meet organisational needs (low to high) against ease of adoption (low to high), positioning the solutions-focused and process-focused models]
42. Reality
[Quadrant chart: adaptability of the system to meet organisational needs (low to high) against long-term impact (low to high), positioning the solutions-focused and process-focused models]
43. Reality
Solutions focused – short-term gains
Advantages: cost; speed of delivery; ease of dissemination; scalable, risk mitigation
Disadvantages: locked in; short time for acceptance; lacks capacity building; access to data is often limited
44. Reality
Process focused – longer-term gains
Advantages: capacity building; adaptive to changing requirements; acceptance of process; shared ownership; evidence based
Disadvantages: time required; sustained leadership and principles of access; complexity; raises organisational threat
45. Reality
Common model – solutions focused:
• IT led and implemented
• Closed system focused on scalability, performance, and a list of features
• Dashboards/reports are important
• Dissemination and access gains [success is seen as staff access to information]
• Where is the why?
46. Conclusion
LA sophistication model
Siemens, G., Dawson, S., & Lynch, G. (2013). Improving the Productivity of the Higher Education Sector: Policy and Strategy for Systems-Level Deployment of Learning Analytics. Society for Learning Analytics Research for the Australian Government Office for Learning and Teaching.
49. Reality
Is there an alternative?
• What are the organisational needs, and how do we gain both impact and adoption?
• How do we merge both models to gain both short- and long-term impact?
50. An alternative
Developing models:
• Cross-organisation: IT, L&T, faculty, research, administrators
• Development of exemplars, and research informed
• Process is future-looking and agile
• Increased time required for acceptance and discussion
• Problem focused – understand the problem
51. An alternative
Developing models:
• Building organisational capacity
• Time for organisational acceptance
• Identify sites of interest and growth
• Research ideas promoted and faculty invited into new spaces
• Need to act on data and findings
52. Complex adaptive systems
• Education is complex
• Learning is complex
• Organisations are complex
• CAS are systems of large numbers of agents that interact and adapt or learn
• Non-linear and resilient
53. Complexity Leadership Theory
• CAS require new forms of leadership (Complexity Leadership Theory – Uhl-Bien et al.)
• Interactive, engaged, multi-level and contextual
• Takes advantage of the dynamic capabilities of the system
• Leadership vs leaders
Uhl-Bien, M., Marion, R., & McKelvey, B. (2007). Complexity Leadership Theory: Shifting leadership from the industrial age to the knowledge era. The Leadership Quarterly, 18(4), 298-318.
55. Complexity Leadership
Administrative leadership vs adaptive leadership:
• Administrative stifles adaptive (bureaucratic and top-down)
• However, it is driven and solution focused
56. Complexity Leadership
Administrative leadership vs adaptive leadership:
• Adaptive lacks integration
• However, it is capacity building and innovation focused
58. Enabling
• Leadership focused on process and enabling staff
• Developing awareness and building capacity
• Diverse teams represented: IT/L&T systems, data analysts, data wranglers, teaching staff, researchers
E.g. The Open University (UK), University of Michigan, University of Texas
60. Conclusion
• Change in education is complex and multi-faceted
• Requires new models for implementation and leadership:
• Enabling leadership
• Models that are agile and research informed
• Requires an inter-disciplinary approach
• Embrace friction – generates discussion and innovation
61. Conclusion
For the reality of LA to meet the rhetoric (to reach its potential):
• LA is not a technology
• LA is not a dashboard
• LA is not one individual
• LA is team based
• LA is dynamic and requires longer-term investment and process