Blackboard’s data science team conducts large-scale analysis of the relationship between the use of our academic technologies and student impact, in order to inform product design, disseminate effective practices, and advance the base of empirical research in educational technologies.
In this presentation, John Whitmer, Director of Analytics & Research, will discuss findings from 2016. Some findings challenge our conventional wisdom, while others confirm what we believed to be true.
Archived presentation made to JISC Learning Analytics workgroup on Feb 22, 2017
What data from 3 million learners can tell us about effective course design - John Whitmer, Ed.D.
Presentation of research findings and implications from a large-scale analysis of LMS activity and grade data from across 927 institutions, 70,000 courses, and 3.3 million students. This webinar will speak to the promise (and potential pitfalls) of large-scale learning analytics research to promote student success.
Using Learning Analytics to Create our 'Preferred Future' - John Whitmer, Ed.D.
One certainty about the future of higher education is that online technologies will play an increasingly central role in the creation and delivery of learning experiences, whether through mobile apps, MOOCs, open content, ePortfolios, or other resources. As adoption increases, the 'digital exhaust' that records technology use has growing potential to illuminate student learning. The emergent field of Learning Analytics analyzes this data to provide actionable insights for students, faculty, and administrators. What have we learned in Learning Analytics to date? What challenges remain? How should we apply Learning Analytics to create our 'preferred future' that supports deep and meaningful learning?
Using Learning Analytics to Assess Innovation & Improve Student Achievement - John Whitmer, Ed.D.
Presentation about Learning Analytics for a JISC network event; discussion of research findings and implications for individuals and institutions considering a Learning Analytics project. Also discusses implications for my work with Blackboard on "Platform Analytics."
The Virtuous Loop of Learning Analytics & Academic Technology Innovation - John Whitmer, Ed.D.
Faculty and academic departments creating innovative educational practices are often starved for useful data and analysis to determine whether their innovations made a difference. Research has found that academic technology usage data is a statistically significant predictor of success, far more powerful than traditional demographic or academic preparedness variables. This leads to a "virtuous loop" in which digital technology adoption enables assessment, which in turn improves educational practices using those technologies.
This presentation was delivered at the Online Learning Consortium Collaborate Event, November 19, 2015.
Educational Data Mining in Program Evaluation: Lessons Learned - Kerry Rice
AET 2016. Researchers present findings from a series of data mining studies, primarily examining data mining as part of an innovative triangulated approach to program evaluation. Findings suggest that it is possible to apply EDM techniques in online and blended learning classrooms to identify key variables important to learner success. Lessons learned will be shared, as well as areas for improving data collection in learning management systems for meaningful analysis and visualization.
The Achievement Gap in Online Courses through a Learning Analytics Lens - John Whitmer, Ed.D.
Presentation at San Diego State University on April 12, 2013.
Educational researchers have found that students from under-represented minority families and other disadvantaged demographic backgrounds have lower achievement in online (or hybrid) courses compared to face-to-face course sections (Slate, Manuel, & Brinson Jr, 2002; Xu & Jaggars, 2013). However, these studies assume that "online course" is a homogeneous entity, and that student participation is uniform. The content and activity of the course is an opaque "black box", which leads to conclusions that are speculative at best and quite possibly further marginalize the very populations they intend to advocate for.
The emerging field of Learning Analytics promises to break open this black box to understand how students use online course materials and the relationship between this use and student achievement. In this presentation, we will explore the contours of Learning Analytics, look at current applications of analytics, and discuss research applying a Learning Analytics research method to students from at-risk backgrounds. The findings of this research challenge stereotypes of these students as technologically unsophisticated and identify concrete learning activities that can support their success.
Learning Analytics: What is it? Why do it? And how? - Timothy Harfield
Presentation delivered to graduate students at Emory University as part of a TATTO (Teaching Assistant Training and Teaching Opportunity) brown bag session.
ABSTRACT
Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. Data driven approaches to teaching and learning are rapidly being adopted within educational environments, but there is still much confusion about what learning analytics is, what it can do, and how it is best employed.
This talk will provide a general overview of the field of learning analytics, its terminology and methods, as well as contemporary ethical debates. It will also introduce several open source and Emory-supported analytics tools available to students and instructors to facilitate the achievement of various learning outcomes.
Learning Analytics: Seeking new insights from educational data - Andrew Deacon
CPUT Fundani TWT - 22 May 2014
Analytics is a buzzword that encompasses the analysis and visualisation of big data. Current interest results from the growing access to data and the many software tools now available to analyse this data in Higher Education, through platforms such as Learning Management Systems. This seminar provides an overview of current applications and uses of learning analytics and how it can help institutions of learning better support their learners. The illustrative examples look at institutional and social media data that together provide rich insights into institutional, teaching and learning issues. A few simple ways to perform such analytics in a context of Higher Education will be introduced.
Educational Data Mining in relation to Educational Statistics of Nepal - Roshan Bhandari
This is the final presentation delivered at the Institute of Engineering, Pulchowk Campus. We applied data mining techniques such as regression analysis and clustering to identify problems in education in Nepal. We collaborated with the Department of Education of Nepal for data. We proposed a composite measure called the "Educational Development Index" to gauge the relative development status of each district.
To read the complete report of our research, please see:
http://flipkarma.com/project/educational-data-mining-in-relation-to-educational/
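The "Educational Development Index" described above could be computed, in one plausible form, as a min-max-normalised composite of district-level indicators. The sketch below is an illustration under that assumption; the district names, indicator names, and values are hypothetical, not the study's data.

```python
# Hypothetical sketch of a composite "Educational Development Index" (EDI):
# min-max normalise each district-level indicator, then average them and
# rank districts by the resulting score. All data below are illustrative.

def min_max(values):
    """Scale a list of raw values to the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def edi(districts, indicators):
    """districts: list of names; indicators: dict of name -> raw values,
    one value per district, where higher always means better."""
    normalised = {k: min_max(v) for k, v in indicators.items()}
    scores = []
    for i, district in enumerate(districts):
        score = sum(col[i] for col in normalised.values()) / len(normalised)
        scores.append((district, round(score, 3)))
    return sorted(scores, key=lambda t: t[1], reverse=True)

districts = ["Kathmandu", "Dolpa", "Kaski"]          # illustrative only
indicators = {
    "literacy_rate":         [0.86, 0.55, 0.82],
    "teacher_student_ratio": [0.90, 0.40, 0.75],     # higher = better staffed
    "enrolment_rate":        [0.95, 0.60, 0.88],
}
ranking = edi(districts, indicators)
```

A districts-by-index ranking like this could then feed a clustering step to group districts into development tiers.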
This presentation to the MoodleMoot UK/I 2017 provides an overview of Learning Analytics for VLE/LMS data and lessons learned in practice from using this data to model student risk and other characteristics. The findings come from fundamental research and application of Blackboard's X-Ray Learning Analytics application.
A case of Mbeya University of Science and Technology (MUST)
By Dr. Joel S. Mtebe, Director, Center for Virtual Learning, University of Dar es Salaam, Tanzania
http://works.bepress.com/mtebe
Using learning analytics to improve student transition into and support throu... - Tinne De Laet
Presentation supporting the ABLE and STELA workshop titled "Using learning analytics to improve student transition into and support throughout the 1st year" delivered at the EFYE 2016 conference in Gent, Belgium
Introduction to Learning Analytics in Blackboard - Timothy Harfield
Instructions for how to use and interpret the "Activity Compared to Others" feature in Blackboard. (Requires installation of Blackboard Analytics for Learn)
The Open University (OU) is a global leader in quality online, open and distance education with more than 180,000 students and 8,000 faculty and staff. Like many organizations, the OU is embracing data and learning analytics as an increasingly important approach for understanding learner behaviors. During this Fischer Speaker Series event, Dr. Tynan explores the vagaries of leading an institutional strategy at scale, specifically focusing on faculty, student, and institutional engagement with analytics to support student success, detailing wins, pitfalls, and unexpected twists resulting in unintended but delightful outcomes.
Professor Belinda Tynan is the Pro-Vice-Chancellor (Learning Innovation) and Professor of Higher Education at the Open University, UK. Reporting to the Vice-Chancellor, the Pro-Vice-Chancellor for Learning Innovation contributes to the strategic vision and mission of the University and has a focus on supporting student success by providing executive leadership in the areas of innovation, strategy and policy development, production, informal learning and research and scholarship in technology enhanced learning.
The video of this presentation can be viewed at https://goo.gl/W8qpi6
Toward an automated student feedback system for text based assignments - Pete... - Blackboard APAC
As blended learning environments and digital technologies become integrated into the higher education sector, rich technologies such as analytics can help teaching staff identify students at risk, learning material that is not proving effective, and learning site designs that aid and facilitate improved learning. More recently, consideration has been given to automated essay scoring. Such systems can be used formatively, for example by providing feedback on initial assignment drafts, or summatively through the analysis of final assignment submissions. Further, providing students with quick feedback on written assignments opens the opportunity, through formative feedback, to improve learning outcomes.
This presentation details a current project developing a system to analyse text-based assignments. The project is being developed for broad application, but the findings focus on an undergraduate pilot subject: ‘Ideas that Shook the World’ (a compulsory first-year Bachelor of Arts subject taught on 5 campuses to more than 1000 students by 15 staff). Preliminary results of a first scan of assignments are presented, and the issues raised in developing the system are discussed together with an outline of additional work planned for the project. It is believed the work will have wide application where text-based assignments are utilised for assessment.
Open Learning Analytics panel at Open Education Conference 2014 - Stian Håklev
The past five years have seen a dramatic growth in interest in the emerging field of Learning Analytics (LA), and particularly in the potential the field holds to address major challenges facing education. However, much of the work in the learning analytics landscape today is closed in nature, small in scale, tool- or software-centric, and relatively disconnected from other LA initiatives. This lack of collaboration, openness, and system integration often leads to fragmentation where learning data cannot be aggregated across different sources, institutions only have the option to implement "closed" systems, and cross-disciplinary research opportunities are limited. Beyond the immediate concerns this fragmentation creates for educators and learners, a closed approach dramatically limits our ability to build upon successes, learn from failures, and move beyond the "pockets of excellence (and failures)" approach that typifies much of the educational technology landscape.
The potential benefits of openness as a core value within the learning analytics community are numerous. Learning initiatives could be informed by large-scale research projects. Open-source software, such as dashboards and analytics engines, could be available free of licensing costs and easily enhanced by others, and OERs could become more personalized to match learners' needs. Open data sets and reproducible papers could rapidly spread understanding of analytical approaches, enabling secondary analysis and comparison across research projects. To realize this future, leaders within the learning analytics, open technologies (software, standards, etc.), open research (open data, open predictive models, etc.) and open learning (OER, MOOCs, etc.) fields have established a "network of practice" aimed at connecting subject matter experts, projects, organizations and companies working in these domains. As an initial organizing event, these leaders organized an Open Learning Analytics (OLA) Summit directly following the 2014 Learning Analytics and Knowledge (LAK) conference this past March as a means to further the goal of establishing "openness" as a core value of the larger learning analytics movement. Additional details on the Summit and those involved can be found at: http://www.prweb.com/releases/2014/04/prweb11754343.htm.
This panel session will bring together several thought leaders from the Open Learning Analytics community who participated in the Summit to facilitate an interactive dialog with attendees on the intersection of learning analytics and open learning, open technologies, open data, and open research. The presenters represent a broad range of experience with institutional analytics projects, an open source development consortium, the sharing of open learner data, and academic research on open learning environments.
Advances in Learning Analytics and Educational Data Mining - MehrnooshV
This presentation covers the state of the art of Learning Analytics and Educational Data Mining. It was presented by Mehrnoosh Vahdat as the introductory tutorial of the Special Session 'Advances in Learning Analytics and Educational Data Mining' at the ESANN 2015 conference.
Designing, developing, and evaluating a real-time student dashboard - Bob Bodily
We discuss the technical infrastructure needed to capture student data in an open learning environment (beyond the LMS), our iterative design process along with dashboard prototypes, and our dashboard evaluation results from focus groups and a survey.
Visit BobBodily.com for more information about my research.
I attended the Pittsburgh Summer LearnLab at Carnegie Mellon over the summer (2016). The work that I did over the week of the LearnLab went into this presentation. I conducted two linear regression models, two support vector classification models, a hierarchical clustering analysis, and a Latent Class Analysis.
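As a minimal sketch of the first kind of model listed above, the snippet below fits a simple linear regression of final grade on weekly LMS activity using closed-form ordinary least squares. The data are synthetic illustrations, not the LearnLab dataset.

```python
# A minimal sketch of a simple linear regression (y = a + b*x) fit by
# closed-form ordinary least squares. All numbers below are synthetic.

def fit_simple_ols(x, y):
    """Return (intercept, slope) minimising squared error for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope

logins = [2, 5, 8, 11, 14, 17]       # weekly LMS logins (synthetic)
grades = [55, 62, 70, 74, 83, 88]    # final course grade (synthetic)
intercept, slope = fit_simple_ols(logins, grades)
predicted = intercept + slope * 10   # predicted grade at 10 logins/week
```

The fitted line necessarily passes through the mean point of the data, a handy sanity check on any OLS implementation.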
Using real-time dashboards to improve student engagement in virtual learning ... - Bob Bodily
In this presentation, I discuss the technical requirements for collecting learning analytics data in an open environment, the analytics system we have created to facilitate real-time data collection, screenshots of our student and instructor dashboards, and some statistical analyses conducted to improve our dashboards.
The RISE Framework: Using learning analytics for the continuous improvement o... - Bob Bodily
We present the Resource Inspection, Selection, and Enhancement (RISE) framework, a learning analytics framework designed to enable teachers to engage in the continuous improvement process. This framework helps identify resources that should be evaluated by a teacher or an instructional designer.
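One plausible reading of a RISE-style resource inspection, sketched below, crosses each resource's relative use with the relative performance of the students who used it to flag candidates for review. The quadrant labels, metrics, and numbers here are assumptions of this sketch, not necessarily the framework's exact definitions.

```python
# Hedged sketch of a quadrant-style resource inspection: compare each
# resource's use and its users' mean outcome against the medians across
# all resources, then label the four resulting quadrants. Illustrative only.

from statistics import median

def rise_quadrants(resources):
    """resources: dict of name -> (use_count, mean_outcome_of_users)."""
    use_med = median(u for u, _ in resources.values())
    out_med = median(o for _, o in resources.values())
    labels = {}
    for name, (use, outcome) in resources.items():
        hi_use, hi_out = use >= use_med, outcome >= out_med
        if hi_use and hi_out:
            labels[name] = "high use / high outcome (working well)"
        elif hi_use and not hi_out:
            labels[name] = "high use / low outcome (inspect for improvement)"
        elif not hi_use and hi_out:
            labels[name] = "low use / high outcome (promote more widely)"
        else:
            labels[name] = "low use / low outcome (candidate for removal)"
    return labels

resources = {  # illustrative numbers only
    "intro_video":   (420, 0.81),
    "quiz_review":   (390, 0.58),
    "extra_reading": (120, 0.84),
    "old_handout":   (60, 0.52),
}
flags = rise_quadrants(resources)
```

The "inspect for improvement" quadrant is the natural starting point for a teacher or instructional designer deciding where continuous-improvement effort will pay off most.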
Presentation by Rebecca Ferguson (IET, The Open University, UK) at e-Learning Korea 2016, held in Seoul, South Korea, in September 2016. This presentation, on visions of the future of learning analytics, is based on work carried out by the European consortium working on the Learning Analytics Community Exchange (LACE) project, and by the group working on the Learning Analytics for European Educational Policy (LAEP) project.
Examining the effect of a real time student dashboard on student behavior and... - Bob Bodily
In this presentation we present a randomized controlled trial conducted to determine the effect of a real-time student dashboard on student behavior and student achievement. We also present some of our design changes to increase student use of our dashboards.
Learning design meets learning analytics - Dr Bart Rienties, The Open University
8th UK Learning Analytics Network Meeting, The Open University, 2nd November 2016
1) The power of 151 Learning Designs on 113K+ students at the OU?
2) How can we use learning design to empower teachers?
3) How can Early Alert Systems improve Student Engagement and Academic Success? (Amara Atif, Macquarie University)
4) What evidence is there that learning design makes a difference over time and how students engage?
LAK '17 Trends and issues in student-facing learning analytics reporting sys... - Bob Bodily
This presentation was given at the 7th Learning Analytics and Knowledge conference (2017) in Vancouver, BC. It presents the trends and issues in student-facing learning analytics reporting research as identified by a literature review including over 90 articles.
Slides from Keynote presentation at the University of Southern California's 2015 Teaching with Technology annual conference.
"9:15 am – ANN Auditorium
Key Note: What Do We Mean by Learning Analytics?
Leah Macfadyen, Director for Evaluation and Learning Analytics, University of British Columbia
Executive Board, SoLAR (Society for Learning Analytics Research)
Leah Macfadyen will define and explore the emerging and interdisciplinary field of learning analytics in the context of quantified and personalized learning. Leah will use actual examples and case studies to illustrate the range of stakeholders learning analytics may serve, the diverse array of questions they may be used to address, and the potential impact of learning analytics in higher education."
The case for learning analytics - Jisc Digifest 2016Jisc
Jisc is developing a learning analytics service in consultation with universities and colleges, suppliers and key stakeholders. The rationale is to provide universities and colleges with a basic solution that can form the basis of a complete solution to all your learning analytics requirements.
We believe Jisc are uniquely placed to provide a national infrastructure that can support the future development of learning analytics within the UK.
This session will explore the case for learning analytics, does it work and do you need it?
E-SUPPORTING PERFORMANCE STYLES BASED ON LEARNING ANALYTICS FOR DEVELOPMENT O...IJITE
This study examines the effectiveness of delivering electronic performance-support styles, based on learning analytics, for developing teaching practices in science teaching. Electronic and face-to-face performance support was delivered according to analytics (participation rate, page views) extracted from platform data and observations. To determine effectiveness, the researchers designed an observation rubric based on teaching practice standards drawn from ASTE/NSTA and AITSL to observe the teaching practices of student science teachers. The participants were science students enrolled in educational diplomas; the researchers used mixed methods to collect qualitative and quantitative data, and the participants took part in a support program that uses data analysis to develop their teaching practices in science. The results showed that providing a support program informed by learning analytics helps improve the teaching practices of student science teachers.
This master class covers the latest developments and possibilities of learning analytics and addresses the issue of visualising data for teachers using current examples.
This class is organised in the context of the LACE (Learning Analytics Community Exchange) project which brings together existing key European players in the field of learning analytics & Educational Data Mining in order to support development of communities of practice and share emerging best practices.
Presentation LMU Munich: The power of learning analytics to unpack learning a...Bart Rienties
The power of learning analytics to unpack learning and teaching: a critical perspective
Ludwig-Maximilians-Universität München
Fakultät für Psychologie und Pädagogik
Materials for an introduction to adaptive learning and learning analytics, as well as efforts at interoperability standardization. These slides briefly cover the concept of adaptive learning, a reference model of learning analytics, data APIs for learning analytics, and the topic list of the standardization community (ISO/IEC JTC1 SC36).
In this presentation to NYU-Learn, I discuss my experience applying data science and machine learning in educational technology and assessment industries. I share tips for thinking about the importance of context and potential of scalability.
Collaborative Research: Stealth Assessment of SE Skills w/Learning AnalyticsJohn Whitmer, Ed.D.
Early description of a work in progress between ACT, UMBC, Blackboard and Vitalsource to investigate the relationship between social and emotional skills and learning analytics using machine and deep learning techniques. A few preliminary results.
Improving Student Achievement with New Approaches to DataJohn Whitmer, Ed.D.
Presentation delivered at WASC ARC conference on April 11, 2013 on the CSU Data Dashboard and Chico State Learning Analytics case study.
Chico State Case Study: Academic technologies collect highly detailed student usage data. How can this data be used to understand and predict student performance, especially of at-risk students? This presentation will discuss research on a high-enrollment undergraduate course exploring the relationship between LMS activity, student background characteristics, current enrollment information, and student achievement.
CSU Data Dashboard: By monitoring on-track indicators institutional leaders can better understand not only which milestones students are failing to reach, but why they are not reaching them. It can also help campuses to design interventions or policy changes to increase student success and to gauge the impact of interventions.
Learner Analytics Panel Session: Deja-Vu all over again? John Whitmer, Ed.D.
Panel presentation at the DET/CHE 2012 conference on November 28, 2012 by Kathy Fernandes (Chico State), James Frazee (San Diego State), Andrew Roderick (SFSU), and Deone Zell (CSU Northridge).
Improving student persistence, especially among under-represented minority students, is a driving goal at many colleges and universities. Academic technologies, such as the Learning Management System (LMS), are frequently used to deliver innovative pedagogical strategies to increase engagement and improve persistence. This study presents research on a redesigned hybrid high-enrollment undergraduate course exploring the relationship between LMS activity, student background characteristics, current enrollment information, and student achievement.
Many Hands Makes Light Work: Collaborating on Moodle Services and DevelopmentJohn Whitmer, Ed.D.
Presentation by Kathy Fernandes, Andrew Roderick, and John Whitmer at the US West Coast MoodleMoot 2012 on August 2, 2012.
Learning Management Systems have evolved from faculty sandboxes to complex enterprise learning environments. Meanwhile, budgets have plummeted and the LMS market has been undergoing rapid change. Many campuses have moved to Moodle to help stabilize their business and application environments. An important criterion behind this transition for many campuses has been the ability to ‘control their own destiny’ and collaborate with colleagues.
In this presentation, we will discuss the experience of campuses in the California State University system collaborating on Moodle technical development, user services, and support. Among the 10 campuses currently using or in transition to Moodle, we have developed a shared governance model with separate groups to administer policy-related issues and technical / UI issues. We will discuss the creation of a Moodle Shared Code base that is being used by several campuses, and the current migration of SCB features into Moodle v2.0. Moodle technical expertise is shared between campuses, and training resources have been leveraged across the CSU system. We will discuss the process and features that have led to successful (and not so successful) collaborative activities, as well as the services that have been created.
Presentation by John Whitmer, Michael Haskell (Cal Poly SLO), and Hillary Kaplowitz (CSU Northtridge) at US West Coast Moodle Moot 2012.
“Learner Analytics” has captured the attention of the media and is the topic of much debate in professional and academic circles. What lies behind the hype? In this presentation, we will discuss the state and limits of current research in LMS Learner Analytics. We will then look at examples of Learner Analytics in Moodle, including tools for faculty and reports that span the entire instance.
Learner Analytics and the “Big Data” Promise for Course & Program AssessmentJohn Whitmer, Ed.D.
Presentation delivered at the San Diego State University "One Day in May" conference on May 22, 201 by John Whitmer, Hillary Kaplowitz, and Thomas J. Norman
Universities archive massive amounts of data about students and their activities. Students also generate significant amounts of “digital exhaust” as they use academic technologies. How can faculty and administrators use automated analysis of this data to save time and conduct targeted interventions to improve student learning?
The emerging discipline of Learner Analytics conducts analysis of this data to learn about student behaviors, predict students at-risk of failure, and identify potential interventions to help those students. In this presentation, we will discuss the contours of this discipline and review the state of research conducted to date. We will then look at several examples of Learner Analytics services and hear from California State University educators who are using these tools to help their students. Finally, we will suggest some immediate ways that Analytics can be conducted at San Diego State.
Presenters:
John Whitmer, California State University, Chico
Hillary Kaplowitz, California State University, Northridge
Thomas J. Norman, CSU Dominguez Hills
Learning Analytics: Realizing the Big Data Promise in the CSUJohn Whitmer, Ed.D.
The word “analytics” has become a buzzword in current educational technology conversations, applied to everything from analysis of student work to LMS usage reporting to institutional analysis of ERP data. Broadly speaking, Learner Analytics refers to the analysis of student data using statistical techniques to improve decision-making. In the context of educational technology, Learner Analytics promises to improve our understanding of effective (and ineffective) student learning and technology usage. What progress have we seen in realizing this promise? This session offers a discussion of the promise of Learner Analytics, current research findings and tools, and explores examples from CSU Chico and the CSU Office of the Chancellor.
Current CSU LMS Activities: Campus and Systemwide StrategiesJohn Whitmer, Ed.D.
In this webinar from April 2010, Dr. David Levin from CSU Northridge and Dr. Linda Scott from CSU San Marcos spoke about their campus migrations from Blackboard to Moodle. They discussed the decision-making process on their campus, their timeline, course migrations, implementations, training and support resources, and lessons learned.
Kathy Fernandes and John Whitmer spoke about the Chancellor’s Office Initiative to provide systemwide LMS Services. These services began with the LMS RFP and CSU Sandboxes, and were expanded to provide an LMS “safety net” and a “superset” of LMS services that include systems, integrations, migrations, support services, and educational practices.
Participants will learn about these current efforts and plans for the implementation of the LMS recommendations approved by the CSU Academic Technology Steering Committee in December 2009.
Presentation to the CSU Community of Academic Technology Staff annual conference in April 2010. Topics discussed include the history of the LMS in the CSU, the current services coordinated through the Chancellor's Office, and upcoming services.
Presenters:
Kathy Fernandes, CSU Office of the Chancellor
John Whitmer, CSU Office of the Chancellor
Faculty Development across the California State University SystemJohn Whitmer, Ed.D.
In this presentation from the US West Coast Moodle Moot 2011, leaders from California State University campuses discuss their efforts to support the increased use of Moodle on their campus. The speakers represent campuses new to Moodle and mature deployments, and discuss the needs of new users and those further along in the adoption process. Issues discussed include: training resources, effective training modalities, critical training issues in Moodle, and more.
Presenters:
Cherie Blut, CSU San Marcos
Brett Christie, Sonoma State University
Maggie Beers, San Francisco State University
Deone Zell, CSU Northridge
Moderator: John Whitmer, CSU Office of the Chancellor
Partnership & Collaboration in Moodle Development: Making it WorkJohn Whitmer, Ed.D.
Presentation by Kathy Fernandes (CSU Office of the Chancellor), Andrew Roderick (San Francisco State University), and John Whitmer (CSU Office of the Chancellor)
US West Coast MoodleMoot 2011 (July 2011, Rohnert Park, CA)
As an open source application, Moodle has strong potential for collaborative partnerships, support services, and code development. This presentation will describe one year in the life of California State University Moodle Collaborations. Over the past year, the CSU has developed a governance process and established a new organizational culture while working on code development, training materials, migration tool, and expertise collaboration. We will discuss the balance of central coordination and campus leadership, technical issues and opportunities, and plans for the future.
Migrating to Moodle: Lessons Learned from Recent CSU MigrationsJohn Whitmer, Ed.D.
In this presentation from the US West Coast Moodle Moot 2011, leaders from California State University campuses that have recently migrated to Moodle discuss their campus decision-making process, the processes and technologies used to migrate content, and their process of implementation. The speakers represent campuses migrating from both Blackboard and WebCT, and a mix of small and large FTE campuses. Activities that benefited from multi-campus coordination and resource sharing are also discussed.
Presenters:
David Levin, CSU Northridge
Barbara Taylor, CSU San Marcos
Moderator: John Whitmer, CSU Office of the Chancellor
4. Educational Technology Assessment Hierarchy
Does it impact student learning? (Learning Analytics)
How many people use it? (Adoption)
Does it work? (SLAs)
5. What is Learning Analytics?
Learning and Knowledge Analytics Conference, 2011:
“...measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”
6. Meta-questions driving our Learning Analytics research @ Blackboard
1. How is student/faculty use of Bb platforms (e.g. Learn, Collab, etc.) related to student achievement? [or satisfaction, or risk, or ...]
2. Do these findings apply equally to students ‘at promise’ due to their academic achievement or background characteristics? (e.g. race, class, family education, geography)
3. What data elements, feature sets, and functionality can we create to integrate these findings into Bb products to help faculty improve student achievement?
8. Commitment to Privacy & Openness
• Analyze data records that are not only stripped of PII, but de-personalized (at individual & institutional levels)
• Share results and open discussion of procedures for analysis to inform the broader educational community
• Respect territorial jurisdictions and safe harbor provisions
11. Bb Study: Relationship of Time in Learn & Grade
• Distribution of Time Spent is highly skewed toward low access
• Transforming the data (log transform) can produce normal curves for analysis
• Of course, huge variation of quality within that time spent (of course materials, of student activity)
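The log transform mentioned on this slide can be illustrated with a minimal NumPy sketch on synthetic, right-skewed "minutes in the LMS" data (the data here is invented for illustration; it is not Blackboard's):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "minutes in the LMS": heavily right-skewed, like the
# distribution described on the slide (illustrative data only).
minutes = rng.lognormal(mean=3.0, sigma=1.2, size=10_000)

def skewness(x):
    """Sample skewness: the third standardized moment."""
    x = np.asarray(x)
    return np.mean((x - x.mean()) ** 3) / x.std() ** 3

raw_skew = skewness(minutes)          # strongly positive (long right tail)
log_skew = skewness(np.log(minutes))  # near zero: roughly symmetric/normal

print(f"raw skew: {raw_skew:.2f}, log skew: {log_skew:.2f}")
```

After the transform the distribution is close enough to normal for standard correlation and regression analysis, which is the point the slide makes.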
12. Findings: Relationship of Time in Learn & Grade
• Question: what is the relationship between student use of Learn and their course grade?
• Investigated at the student-course level (one student, one course)
• 1.2M students, 34,519 courses, 788 institutions
• Significant, but effect size < 1%
13. Finding: Tool Use & Grade
Tool use and Final Grade do not have a linear relationship; there is a diminishing marginal effect of tool use on Final Grade.
Interpretations
• Students absent from course activity are at greatest risk of low achievement.
• The first time you read/see a PowerPoint presentation, you learn a lot, but the second time you read/see it, you learn less.
• Getting from a 90% to a 95% requires more effort than getting from a 60% to a 65%.
Log transformation shows a stronger trend.
17. Investigation: Grade by Specific Tools Used
Question: what is the relationship between use of Learn and student grade, based on the tool used?
Analysis Steps
1. Filter data for courses with potentially meaningful use (>60 min average, enrollment >10 and <500, gradebook used)
2. Identify most frequently used tools
3. Separate tool use into no use & quartiles
4. Divide students into 3 groups by course grade:
• High (80+)
• Passing (60-79)
• Low/Failing (0-59)
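Steps 1, 3, and 4 of the analysis above can be sketched in pandas. Everything here is synthetic and the column names are illustrative assumptions, not the study's actual schema:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
# Hypothetical student-course records (columns are illustrative).
df = pd.DataFrame({
    "minutes": rng.lognormal(3.5, 1.0, 1000),
    "grade": rng.uniform(0, 100, 1000),
    "enrollment": rng.integers(5, 600, 1000),
})

# Step 1: keep courses with potentially meaningful use.
df = df[(df["enrollment"] > 10) & (df["enrollment"] < 500)]

# Step 3: split use into quartiles of time spent.
df["use_q"] = pd.qcut(df["minutes"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

# Step 4: three grade bands as on the slide (0-59, 60-79, 80+).
df["band"] = pd.cut(df["grade"], bins=[0, 60, 80, 101], right=False,
                    labels=["Low/Failing", "Passing", "High"])

print(df.groupby(["use_q", "band"], observed=True).size())
```

The resulting cross-tabulation of use quartile by grade band is the shape of table the slides go on to compare per tool.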
23. Research Questions
1. Are there systematic ways that instructors use LMS tools in their courses that span instructors and institutions?
2. What recommendations can be drawn for faculty, instructional designers, and other academic technology leaders seeking to increase the impact of LMS use at their institution?
Methods
1. Use the same filtered data sample of student-course data
2. Calculate relative student time per tool (as % of total course time), for comparison between courses
3. Cluster by patterns in the balance of time spent in each tool (unsupervised machine learning; k-means cluster analysis)
4. Add data as relevant to patterns about enrollment, total time, etc.
5. Make up cool names for each cluster and interpret meaning
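Steps 2 and 3 of the method above (relative time per tool, then k-means on that balance) can be sketched with scikit-learn. The three tools, k=4, and all values are illustrative assumptions, not the study's actual setup:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
# Hypothetical per-course minutes in three tools, e.g.
# [content, forums, assessments] (synthetic data).
raw = rng.lognormal(2.0, 1.0, size=(300, 3))

# Step 2: relative time per tool (% of total course time),
# so courses of different total activity are comparable.
shares = raw / raw.sum(axis=1, keepdims=True)

# Step 3: unsupervised k-means on the tool-time balance
# (k chosen arbitrarily here; the study would tune it).
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(shares)
print(np.bincount(km.labels_))  # courses per cluster
```

Each cluster centroid is a characteristic balance of tool use, which is what gets the "cool names" in step 5.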
31. Finding: Discussions with Low/High Average Use
Compare courses with low forum use to courses with forum use of >1 hour / student on average
32. Summary & Future Directions for DS Research
Summary
• Tremendous variation in use of Learn; most use skewed toward low/very low use.
• The importance of time spent in Learn for learning is also tremendously varied (“necessary” and “effective” use of Learn).
• Critical to account for this variation to understand the potential importance of Learn activity.
Future Directions
• Analyze quality of activity in greater depth (e.g. content of assignments, words in forum posts) to get insights into quality of interactions.
• Conduct time-series analysis (quantitative methods; design also needed); when someone accesses is more important than if they do.
• Create proxies/derived values for behavior (above average, at average, etc.) by tool.
34. Blackboard Analytics – Product Naming
Blackboard Analytics: data warehouse products
Blackboard Analytics: suite of analytics products
35. Blackboard Analytics – Product Portfolio
Blackboard Intelligence
• Analytics for Learn – LMS data
• Student Management – SIS data
• Finance, HR, Advancement – ERP data
Blackboard Predict
• Predictive analytics and early alerts for retention
• Provides data for faculty and advisors about at-risk students
• Formerly Blue Canary
X-Ray Learning Analytics
• Classroom engagement data for faculty
• Activity aggregated into 30+ visualizations
• Currently available for Moodlerooms & self-hosted Moodle only
Past view Current view Future view
37. Blackboard Analytics – Our Approach & Philosophy
Products that provide insight into the teaching and learning process.
Our Philosophy: data complements human decision-making.
Core competency: learning software and academic data.
A team of experts in the analytics field.