This document summarizes an open academic analytics initiative that aimed to create an open-source early alert system. Key points:
- The initiative developed predictive models using historical student data to identify students at risk of not completing courses.
- The early alert system was deployed across four institutions, and research found it had a statistically significant positive impact on final course grades, particularly for low-income students.
- Instructors reported the system changed their pedagogy by making them more proactive in reaching out to struggling students.
- The document advocates for open learning analytics and describes Apereo's open source learning analytics platform that incorporates open standards, a learning record store, analytics processor and dashboards.
A Pulse of Predictive Analytics In Higher Education │ Civitas Learning
Civitas Learning presents the findings of our survey conducted during the September 2014 Civitas Learning Summit, where more than 100 leaders representing 40 Pioneer Partner institutions gathered to share more on their work. The survey, distributed to all participants, resulted in 74 responses highlighting how this cross-section of higher education institutions is using advanced analytics to power student success initiatives.
The presentation describes design decisions for developing open learning analytics solutions at scale. It discusses Marist College's Open Academic Analytics Initiative and how the predictive models developed during this research are packaged and made available as an open-source Risk Assessment API.
The following is a presentation given at the Open Apereo 2015 conference. It provides updates on the Apereo Learning Analytics initiative and the work that has been implemented over the past year since its inception in June 2014.
Modern Learning Ecosystem Design with xAPI – Margaret Roth
While the L&D community is increasingly familiar with the Experience API (xAPI) and its value for data collection and interoperability, few examples exist to clarify the value of xAPI as applied within different existing learning infrastructures. This session focused on sharing the ways xAPI can connect and provide value in any eLearning environment.
These slides present a series of different learning ecosystem configurations and the ways xAPI and a learning record store (LRS) can provide value in each case. The three main learning ecosystem configurations examined range from the simplest (LMS and LRS) to three systems connected (LMS, LRS, and CMS) to the fully modular (LRS, LMS, simulations, microlearning, performance assessment, and other tools). For each of these configurations, the presentation shares specific values and practical applications gained by connecting an xAPI LRS to the existing system.
This presentation was originally shared as part of the eLearning Guild's 2018 Learning Solutions conference on March 28, 2018.
Data Visualization and Learning Analytics with xAPI – Margaret Roth
With the Experience API we are able to collect more granular, high-resolution data from our learning tools and platforms. But once we have that data, how do we present it in ways that easily communicate the right insights to our stakeholders?
In this presentation from the xAPI Cohort's Spring 2018 session, you'll find a brief historical survey of data visualizations, three keys to designing good data visualizations, and case studies of xAPI-specific data visualizations and the insights they provided to organizations.
Why xAPI? A Business Leader's Getting Started Guide – Margaret Roth
For many L&D teams, getting started with xAPI is all about finding the balance between meeting strategic organization-wide data transformation goals and defining an achievable, feasible, and valuable team-level proof of concept. Getting to this happy medium requires not only an understanding of technical functionality, but more importantly the ability to help your stakeholders understand how they will benefit from an xAPI-powered learning ecosystem and data strategy.
In this webinar, you'll gain an understanding of the foundational conceptual and technical components of xAPI, see examples of xAPI-powered learning ecosystem pilot projects, and learn the necessary first steps to get your team xAPI ready for 2019!
This content was originally shared as part of the Why xAPI? webinar on December 13, 2018.
Learning Analytics: Seeking new insights from educational data – Andrew Deacon
CPUT Fundani TWT - 22 May 2014
Analytics is a buzzword that encompasses the analysis and visualisation of big data. Current interest results from the growing access to data and the many software tools now available to analyse this data in Higher Education, through platforms such as Learning Management Systems. This seminar provides an overview of current applications and uses of learning analytics and how it can help institutions of learning better support their learners. The illustrative examples look at institutional and social media data that together provide rich insights into institutional, teaching and learning issues. A few simple ways to perform such analytics in a context of Higher Education will be introduced.
Keynote Address, Expanding Horizons 2012, Macquarie University
http://staff.mq.edu.au/teaching/workshops_programs/expanding_horizons
"Learning Analytics": unprecedented data sets and live data streams about learners, with computational power to help make sense of it all, and new breeds of staff who can talk predictive models, pedagogy and ethics. This means rather different things to different people: unprecedented opportunity to study, benchmark and improve educational practice, at scales from countries and institutions, to departments, individual teachers and learners. "Benchmarking" may trigger dystopic visions of dumbed down proxies for 'real teaching and learning', but an emu response is no good. For educational institutions, our calling is to raise the quality of debate, shape external and internal policy, and engage with the companies and open communities developing the future infrastructure. How we deploy these new tools rests critically on assessment regimes, what can be logged and measured with integrity, and what we think it means to deliver education that equips citizens for a complex, uncertain world.
Jisc learning analytics MASHEIN Jan 2017 – Paul Bailey
Jisc Learning Analytics presentation at Leading Digital Learning: Key Issues for Small and Specialist Institutions, an event organised by MASHEIN (Management of Small Higher Education Institutions Network)
Educational Data Mining/Learning Analytics issue brief overview – Marie Bienkowski
An overview of the Draft Issue Brief prepared by SRI International for the US Department of Education on Educational Data Mining and Learning Analytics
xAPI and Machine Learning for Patient / Learner – Jessie Chuang
xAPI and machine learning can help us build "intelligent assistance" for patients and learners, but human-in-the-loop machine learning is important. We need good learning design from the beginning, and as data is returned to instructors and learners immediately, humans can provide valuable input to this human-machine collaboration.
Using learning analytics to improve student transition into and support throughout the 1st year – Tinne De Laet
Presentation supporting the ABLE and STELA workshop titled "Using learning analytics to improve student transition into and support throughout the 1st year" delivered at the EFYE 2016 conference in Gent, Belgium
Experience API recipes/visualization services are free for education institutions; partners are welcome.
Contact: mlearning@classroomaid.org
See detailed features list:
http://classroom-aid.com/xapi-and-analytics-services/
How to Plan for an xAPI Pilot at xAPI Camp DevLearn 2018 - Yet Analytics – Allie Tscheulin
From an organization-wide executive directive to become more data-driven, a retail corporate L&D team took an internal look at their own data practices. Realizing that they had an overwhelming lack of transparency into their learning initiatives and a great amount of data that had gone unused, the team developed a transformation vision to create a single system of record for learning to enable observability, granularity, and accountability for all team members. The team was committed to the vision of xAPI; however, the data and information they needed in order to make actionable change for their learners was locked away in non-interoperable formats, and they recognized the need to develop a data strategy and implementation plan.
*Originally presented on 10/23/2018 at xAPI Camp during DevLearn 2018 by Allie Tscheulin
Co-developing bespoke, enterprise-scale analytics systems with teaching staff – Danny Liu
Presentation at the NSW Learning Analytics Working Group meeting, 3 February 2016, at the University of Technology, Sydney. Covering projects from Macquarie University and the University of Sydney.
Technical Challenges for Realizing Learning Analytics – Ralf Klamma
Learntec 2015, January 28, 2015, Karlsruhe, Germany
Advanced Community Information Systems (ACIS) Group, RWTH Aachen University
This presentation proposes that Social Learning Analytics (SLA) can be usefully thought of as a subset of learning analytics approaches. SLA focuses on how learners build knowledge together in their cultural and social settings. In the context of online social learning, it takes into account both formal and informal educational environments, including networks and communities. The paper introduces the broad rationale for SLA by reviewing some of the key drivers that make social learning so important today. Five forms of SLA are identified, including those which are inherently social, and others which have social dimensions. The paper goes on to describe early work towards implementing these analytics on SocialLearn, an online learning space in use at the UK's Open University, and the challenges that this is raising. This work takes an iterative approach to analytics, encouraging learners to respond to and help to shape not only the analytics but also their associated recommendations.
Open Learning Analytics panel at Open Education Conference 2014 – Stian Håklev
The past five years have seen a dramatic growth in interest in the emerging field of Learning Analytics (LA), and particularly in the potential the field holds to address major challenges facing education. However, much of the work in the learning analytics landscape today is closed in nature, small in scale, tool- or software-centric, and relatively disconnected from other LA initiatives. This lack of collaboration, openness, and system integration often leads to fragmentation where learning data cannot be aggregated across different sources, institutions only have the option to implement "closed" systems, and cross-disciplinary research opportunities are limited. Beyond the immediate concerns this fragmentation creates for educators and learners, a closed approach dramatically limits our ability to build upon successes, learn from failures and move beyond the "pockets of excellence (and failures)" approach that typifies much of the educational technology landscape.
The potential benefits of openness as a core value within the learning analytics community are numerous. Learning initiatives could be informed by large-scale research projects. Open-source software, such as dashboards and analytics engines, could be available free of licensing costs and easily enhanced by others, and OERs could become more personalized to match learners' needs. Open data sets and reproducible papers could rapidly spread understanding of analytical approaches, enabling secondary analysis and comparison across research projects. To realize this future, leaders within the learning analytics, open technologies (software, standards, etc.), open research (open data, open predictive models, etc.) and open learning (OER, MOOCs, etc.) fields have established a "network of practice" aimed at connecting subject matter experts, projects, organizations and companies working in these domains. As an initial organizing event, these leaders organized an Open Learning Analytics (OLA) Summit directly following the 2014 Learning Analytics and Knowledge (LAK) conference this past March as a means to further the goal of establishing "openness" as a core value of the larger learning analytics movement. Additional details on the Summit and those involved can be found at: http://www.prweb.com/releases/2014/04/prweb11754343.htm.
This panel session will bring together several thought leaders from the Open Learning Analytics community who participated in the Summit to facilitate an interactive dialog with attendees on the intersection of learning analytics and open learning, open technologies, open data, and open research. The presenters represent a broad range of experience with institutional analytics projects, an open source development consortium, the sharing of open learner data, and academic research on open learning environments.
Open Learning Analytics Strategy for Student Success: The North Carolina Stat... – Joshua
The open learning analytics process is gaining traction in higher education as institutions consider how to leverage the power of predictive learning analytics to impact student success. Institutions embrace open source options as viable alternatives to the cost of proprietary solutions. This presentation was from a September 2015 webinar in which participants learned how North Carolina State University is pioneering the implementation of an open strategy for student success. The webinar also featured Marist College and was hosted by Unicon, Inc. The webinar was recorded and is available at: https://youtu.be/ODPTjNcqNuo
This slide deck was presented at the 2015 International Conference on Education Research.
I aggregated several of my other partial slides and reports to describe an adaptive learning model pertaining to the concept of learning analytics, as well as LOD for curriculum standards and digital resources. There is a short introduction to the project of ISO/IEC 20748 Learning analytics interoperability – Part 1: Reference model.
Education must capitalize on the trend within technology toward big data. New types of data are becoming available: from evidence-based approaches to xAPI and the whole Training and Learning Architecture (TLA), big data is the foundation of all of it.
Invited talk, INSIGHT Centre for Data Analytics, Univ. Galway, 2 Oct 2013, http://www.insight-centre.org
Abstract:
Data and analytics are transforming how organisations work in all sectors. While there are clearly ethical issues around big data and privacy, there may also be an argument that educational institutions have a moral obligation to use all the information they have to maximize the learner's progress. So, assuming education can't (arguably shouldn't) resist this revolution, the question is how to harness this new capability intelligently. Learning Analytics is an exploding research field and startup market: do leaders know what to ask when the vendors roll up with dazzling dashboards? In this talk I'll provide an overview of developments, and consider some of the key questions we should be asking. Like any modelling technology and accounting system, analytics are not neutral, and do not passively describe sociotechnical reality: they begin to shape it. Moreover, they start with the things that are easiest to count, which doesn't necessarily equate to the things we value in learning. Given the crisis in education at many levels, what realities do we want analytics to perpetuate, or bring into being?
Bio:
Simon Buckingham Shum is Professor of Learning Informatics at the UK Open University's Knowledge Media Institute. He researches, teaches and consults on Learning Analytics, Collective Intelligence and Argument Visualization. His background is B.Sc. Psychology, M.Sc. Ergonomics and Ph.D. Human-Computer Interaction. He co-edited Visualizing Argumentation (Springer 2003), the standard reference in the field, followed by Knowledge Cartography (2008). In the field of Learning Analytics, he served as Program Co-Chair of the 2nd International Learning Analytics LAK12 conference, chaired the LAK13 Discourse-Centric Learning Analytics workshop, and the LASI13 Dispositional Learning Analytics workshop. He is a co-founder of the Society for Learning Analytics Research, Compendium Institute, LearningEmergence.net, and was Co-Founder and General Editor of the Journal of Interactive Media in Education. He serves on the Advisory Groups for a variety of learning analytics initiatives in education and enterprise, and is a Visiting Fellow at University of Bristol Graduate School of Education. Contact him via http://simon.buckinghamshum.net
Monday 9 November
Session round 1
Title: Dashboards for learning analytics
Speaker(s): Renée Filius (Elevate), Alan Berg (Universiteit van Amsterdam)
Room: Rotterdam Hall
OAAI: Deploying an Open Ecosystem for Learner Analytics – Joshua
The Open Academic Analytics Initiative (OAAI), an NGLC grant recipient, has developed a predictive model for learner analytics using open-source tools, which we are releasing under an open-source license. We will share project outcomes along with research into effective OER-based intervention strategies and other critical learner analytics scaling factors.
Adjusting primitives for graph: SHORT REPORT / NOTES – Subhajit Sahu
Graph algorithms, like PageRank, commonly operate on Compressed Sparse Row (CSR), an adjacency-list based graph representation.
Multiply with different modes (map)
1. Performance of sequential execution based vs OpenMP based vector multiply.
2. Comparing various launch configs for CUDA based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential execution based vs OpenMP based vector element sum.
2. Performance of memcpy vs in-place based CUDA based vector element sum.
3. Comparing various launch configs for CUDA based vector element sum (memcpy).
4. Comparing various launch configs for CUDA based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparing various launch configs for CUDA based vector element sum (in-place).
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23... – John Andrews
SlideShare Description for "Chatty Kathy - UNC Bootcamp Final Project Presentation"
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
Techniques to optimize the PageRank algorithm usually fall into two categories: reducing the work per iteration, and reducing the number of iterations. These goals are often at odds with one another. Skipping computation on vertices which have already converged has the potential to save iteration time. Skipping in-identical vertices, which share the same in-links, helps reduce duplicate computations and thus could also reduce iteration time. Road networks often have chains which can be short-circuited before PageRank computation to improve performance; the final ranks of chain nodes can be easily calculated. This could reduce both the iteration time and the number of iterations. If a graph has no dangling nodes, the PageRank of each strongly connected component can be computed in topological order. This could help reduce the iteration time and the number of iterations, and also enable multi-iteration concurrency in PageRank computation. The combination of all of the above methods is the STICD algorithm [sticd]. For dynamic graphs, unchanged components whose ranks are unaffected can be skipped altogether.
Levelwise PageRank with Loop-Based Dead End Handling Strategy: SHORT REPORT ... – Subhajit Sahu
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components, and processes them in topological order, one level at a time. This enables calculation of ranks in a distributed fashion without per-iteration communication, unlike the standard method where all vertices are processed in each iteration. It however comes with a precondition of the absence of dead ends in the input graph. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. The slowdown on the GPU is likely caused by a large submission of small workloads, and is expected to be a non-issue when the computation is performed on massive graphs.
5. LESSONS LEARNED – VALUE OF AN OPEN PLATFORM
Open Academic Analytics Initiative
6. OAAI: Overview and Impact
EDUCAUSE Next Generation Learning Challenges (NGLC)
Funded by the Bill and Melinda Gates Foundation: $250,000 over a 15-month period
Goal: Leverage Big Data concepts to create an open-source academic early alert system and research “scaling factors”
7. OAAI Early Alert System Overview
Data sources:
• SIS Data – Student Aptitude Data (SATs, current GPA, etc.) and Student Demographic Data (age, gender, etc.)
• LMS Data – Sakai Event Log Data and Sakai Gradebook Data
Step #1: Developed predictive model using historical data ("Creating an Open Academic Early Alert System")
Predictive model scoring identifies students "at risk" to not complete the course
Academic Alert Report (AAR) delivered to instructors
Intervention deployed: "Awareness" or Online Academic Support Environment (OASE)
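The scoring step in the overview above can be illustrated with a minimal sketch. This is not the actual OAAI model (which was trained on historical SIS and LMS data); the feature names, weights, and 0.5 cutoff below are all hypothetical, chosen only to show how a logistic model turns student features into an "at risk" flag.

```python
# Hypothetical sketch, NOT the OAAI production model: scoring students
# as "at risk" with a simple logistic model over SIS + LMS features.
# Weights, feature names, and the 0.5 cutoff are all illustrative.
import math

# Illustrative weights, as if learned offline from historical data
WEIGHTS = {"gpa": -1.2, "logins_per_week": -0.15, "gradebook_avg": -0.04}
BIAS = 6.0

def risk_score(features: dict) -> float:
    """Probability that the student will NOT complete the course."""
    z = BIAS + sum(WEIGHTS[k] * features[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic sigmoid

students = {
    "s1": {"gpa": 2.0, "logins_per_week": 1, "gradebook_avg": 55},
    "s2": {"gpa": 3.7, "logins_per_week": 8, "gradebook_avg": 92},
}
# Students above the (hypothetical) cutoff would appear on the AAR
at_risk = {sid: risk_score(f) > 0.5 for sid, f in students.items()}
print(at_risk)  # s1 flagged, s2 not
```

In the real pipeline these flags would feed the Academic Alert Report rather than being printed.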
8. Research Design
Deployed OAAI system to 2200 students across four institutions
• Two Community Colleges
• Two Historically Black Colleges and Universities
Design: one instructor teaching 3 sections
• One section was the control group; the other two were treatment groups
Each instructor received an AAR three times during the
semester:
• Intervals were 25%, 50% and 75% into the semester
11. Fall ’12 Portability Findings
Conclusion
1. Predictive models are more
“portable” than anticipated.
2. It is possible to create generic
models that are then “tuned” for
use at specific types of
institutions.
3. It is possible to create a library of
open predictive models that
could be shared globally.
12. Intervention Research Findings
Final Course Grades
Analysis showed a statistically significant
positive impact on final course grades
• No difference between treatment groups
Saw larger impact in spring than fall
Similar trend among low-income students
[Chart: Mean Final Grade (%) for "at Risk" students, by group (Awareness, OASE, Control); y-axis 50–100]
13. Instructor Feedback
"Not only did this project directly assist my students by guiding students to
resources to help them succeed, but as an instructor, it changed my pedagogy;
I became more vigilant about reaching out to individual students and
providing them with outlets to master necessary skills.
P.S. I have to say that this semester, I received the highest volume of
unsolicited positive feedback from students, who reported that they felt I
provided them exceptional individual attention!"
14. Jayaprakash, S. M., Moody, E. W., Lauría, E. J., Regan, J. R.,
& Baron, J. D. (2014). Early alert of academically at-risk
students: An open source analytics initiative. Journal
of Learning Analytics, 1(1), 6–47.
More Research Findings…
16. Intersections between
openness and Learning Analytics
Open Source Learning Analytics Software
• Weka, Kettle, Pentaho, R, Python etc.
Open Standards and APIs for Learning Analytics
• Experience API, IMS Caliper/Sensor API
Open Models - Predictive models, knowledge maps, PMML etc.
Open Content/Access – Journals, whitepapers, policy documents
Openness or Transparency with regards to Ethics/Privacy
NOT anti-commercial – Commercial ecosystems help sustain OSS
18. Software Silos vs. Platforms
Many learning analytics solutions today are
“tool” or “software-centric”
• Analytics tools are built into existing software such as the
Learning Management System (LMS)
Can make it harder to capture data and
integrate across systems (limits Big Data)
A platform solution would allow institutions
to collect data from across many systems
• A “modularized platform” approach allows institutions to use all or just some components
• Integration points allow data to “flow” in for processing and results to flow out
19. Apereo Learning Analytics Initiative (LAI)
Goal: Operationalize outcomes from Learning Analytics research as means to
develop, maintain and sustain an open platform for Learning Analytics
Current Proof-of-Concept Projects
◦ University of Amsterdam – Larrisa (open-source Learning Record Store)
◦ Marist College – Learning Analytics Processor (LAP)
◦ Uniformed Services University – OpenDashboard
◦ Sinclair Community College – Student Success Plan
◦ Unicon – OpenLRS and commercial support services
Contact: Alan Berg, Community Officer
Email: analytics-coordinator@apereo.org,
Wiki Page: https://confluence.sakaiproject.org/x/rIB_BQ
GitHub: https://github.com/Apereo-Learning-Analytics-Initiative
20. Strategic Vision: Open Learning Analytics Platform
Collection – Standards-based data capture from any potential source using Experience API and/or IMS Caliper/Sensor API
Storage – Single repository for all learning-related data using the Learning Record Store (LRS) standard
Analysis – Flexible Learning Analytics Processor (LAP) that can handle data mining, data processing (ETL), predictive model scoring and reporting
Communication – Dashboard technology for displaying LAP output
Action – LAP output can be fed into other systems to trigger alerts, etc.
Library of Open Models
21. Learning Record Store & Data Collection
• OpenLRS is a secure, standards-based,
standalone Learning Record Store built to fill
the need for a high-I/O storage mechanism for
an open learning analytics environment
• Technical Stack
• Spring-Boot
• Pluggable Datastores (redis, elasticsearch, mongodb)
• xAPI integrations to get activity streams
• Roadmap
• Integration & Support for IMS Caliper
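To make the xAPI integration concrete, the sketch below builds a minimal statement of the kind an LMS such as Sakai could emit to a Learning Record Store like OpenLRS. The actor email, activity URL, and LRS endpoint mentioned in the comments are made-up placeholders, not values from the actual deployment; only the actor/verb/object shape follows the xAPI specification.

```python
# Illustrative sketch: minimal shape of an xAPI statement.
# The email address and activity URL are placeholder values.
import json
import uuid
from datetime import datetime, timezone

def build_statement(actor_email: str, verb: str, activity_id: str) -> dict:
    """Build a minimal valid xAPI statement (actor, verb, object)."""
    return {
        "id": str(uuid.uuid4()),
        "actor": {"mbox": f"mailto:{actor_email}", "objectType": "Agent"},
        "verb": {
            "id": f"http://adlnet.gov/expapi/verbs/{verb}",
            "display": {"en-US": verb},
        },
        "object": {"id": activity_id, "objectType": "Activity"},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

stmt = build_statement(
    "student@example.edu",                      # placeholder learner
    "completed",                                # ADL-registered verb
    "http://example.edu/course/bio101/quiz1",   # placeholder activity
)
# A real client would POST this JSON to the LRS statements endpoint
# (e.g. https://<lrs-host>/xAPI/statements) with the
# "X-Experience-API-Version" header set.
print(json.dumps(stmt, indent=2))
```

The LRS stores these statements as an activity stream, which downstream components such as the Learning Analytics Processor can then query.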
26. Open Dashboard
• Web application that provides a framework
for displaying analytics visualizations and
data views called “cards”.
• Cards represent a single discrete
visualization or data view but share an API
and data model
• LTI compliant
• Widget(Card) library for Learning Analytics
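The "card" idea above — discrete visualizations sharing one API and data model — can be sketched as follows. The class and field names here are hypothetical illustrations of the pattern, not OpenDashboard's actual (Java/LTI) API.

```python
# Hypothetical sketch of the "card" pattern: each card is a discrete
# visualization or data view, but all cards share one data model and
# one render contract. Names are illustrative, not OpenDashboard's API.
from dataclasses import dataclass, field

@dataclass
class Card:
    """One discrete visualization or data view on a dashboard."""
    card_type: str          # e.g. "early-alert", "engagement"
    title: str
    data: dict = field(default_factory=dict)

    def render(self) -> dict:
        # Shared output contract consumed by the dashboard shell
        return {"type": self.card_type, "title": self.title, "data": self.data}

# A dashboard is just an ordered collection of cards
dashboard = [
    Card("early-alert", "Early Alert Insights", {"at_risk": 7}),
    Card("engagement", "Course Engagement Pathways", {"views": 120}),
]
print([c.render()["title"] for c in dashboard])
```

Because every card honors the same render contract, the dashboard shell can lay out any mix of cards without knowing their internals, which is what makes a shared widget library practical.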
27. LAK15 Hackathon - Open Dashboards
Early Alert Insights Chart
Course Engagement Pathways – Resource &
Content Access
29. Demo Overview
• Three core components of a collection of
open source applications and services that
represent the “Analytics Diamond”
• Can be used individually or collectively
• Work with a shared infrastructure and data
model
Technologies:
• AngularJS
• Spring-Boot
• Pluggable Datastores
(redis, elasticsearch, mongodb)
Sakai
OpenLRS
Learning
Analytics
Processor
Open
Dashboard
xAPI
LTI
API
API
AWS
Local
31. Engaging with Apereo Learning Analytics
Initiative (LAI)
We believe in do-ocracy. If you see an opportunity or an area of
enrichment, then you should take leadership and the community will support
you. Bear this in mind as you ask me questions
Examples
Alan - community officer, organizes hackathons & workshops
Sandeep - warding incubation process, analytics
Patrick - communications officer, student requirements
Kate - marketing/communications, Evangelist
Josh - many roles (not even going to start)
Gary - builds the living daylights out of LAI. Sanity check, etc.
32. Engaging with Apereo Learning Analytics
Initiative (LAI)
Where to start? LEVEL 1 NINJA (in no particular order):
• Review the homepage
https://confluence.sakaiproject.org/display/LAI/learning+analytics+initiative
• Read the notes from the regular meetings
• Join the mailing list: analytics@apereo.org
(subscribe by sending a message to analytics+subscribe@apereo.org)
• Join the calls (every other Wednesday) :
https://confluence.sakaiproject.org/display/LAI/community+hangouts
• Review GitHub: https://github.com/apereo-learning-analytics-initiative
• Meet us at a BOF or online.
• Take on a role on a subject you care about
33. Engaging with Apereo Learning Analytics
Initiative (LAI)
Where to start? LEVEL 2 NINJA (in no particular order):
• Buy us/ME beer
• Host a hackathon or workshop
• Present at a conference
• New project / consortium building / grant proposal
• Enrich a current product
• Add parts to Apereo LAI
• Consider co-developing
• Act as a communication channel between organizations
• Surf, JISC , Apereo
• SoLAR, LACE
• Unicon, UvA, Marist, Hull, Oxford, <<your name here>>
34. Discussion and Q&A
Josh: josh.baron@marist.edu
Alan: a.m.berg@uva.nl
Sandeep: sandeep.jayaprakash1@marist.edu
Editor's Notes
OK, so what is the OAAI and how are we working to address this problem…with the goal of leveraging Big Data to create an open-source academic early alert system that allows us to predict which students are at risk to not complete the course (and do so early on in the semester) and then deploy an intervention to help that student succeed.
I’ll talk about our intervention strategies in a little more detail a bit later on in the presentation…