Presentation about Learning Analytics for a JISC network event: discussion of research findings and implications for individuals and institutions considering a Learning Analytics project. Also discusses implications for my work with Blackboard on "Platform Analytics."
The Achievement Gap in Online Courses through a Learning Analytics Lens
John Whitmer, Ed.D.
Presentation at San Diego State University on April 12, 2013.
Educational researchers have found that students from under-represented minority families and other disadvantaged demographic backgrounds have lower achievement in online (or hybrid) courses compared to face-to-face course sections (Slate, Manuel, & Brinson Jr, 2002; Xu & Jaggars, 2013). However, these studies assume that "online course" is a homogeneous entity, and that student participation is uniform. The content and activity of the course remain an opaque "black box", which leads to conclusions that are speculative at best and that quite possibly further marginalize the very populations they intend to advocate for.
The emerging field of Learning Analytics promises to break open this black box to understand how students use online course materials and the relationship between this use and student achievement. In this presentation, we will explore the contours of Learning Analytics, look at current applications of analytics, and discuss research applying a Learning Analytics research method to students from at-risk backgrounds. The findings of this research challenge stereotypes of these students as technologically unsophisticated and identify concrete learning activities that can support their success.
Using Learning Analytics to Create our 'Preferred Future'
John Whitmer, Ed.D.
One certainty about the future of higher education is that online technologies will play an increasingly central role in the creation and delivery of learning experiences, whether through mobile apps, MOOCs, open content, ePortfolios, or other resources. As adoption increases, the ‘digital exhaust’ recording technology use has growing potential to help us understand student learning. The emergent field of Learning Analytics analyzes this data to provide actionable insights for students, for faculty, and for administrators. What have we learned in Learning Analytics to date? What challenges remain? How should we apply Learning Analytics to create a ‘preferred future’ that supports deep and meaningful learning?
The Virtuous Loop of Learning Analytics & Academic Technology Innovation
John Whitmer, Ed.D.
Faculty and academic departments creating innovative educational practices are often starved for useful data and analysis to determine whether their innovations made a difference. Research has found that this data is a statistically significant predictor of success, much more powerful than traditional demographic or academic-preparedness variables. This leads to a “virtuous loop” in which digital technology adoption enables assessment, which in turn improves educational practices using those technologies.
This presentation was delivered at the Online Learning Consortium Collaborate Event, November 19, 2015.
What data from 3 million learners can tell us about effective course design
John Whitmer, Ed.D.
Presentation of research findings and implications from a large-scale analysis of LMS activity and grade data from across 927 institutions, 70,000 courses, and 3.3 million students. This webinar will speak to the promise (and potential pitfalls) of large-scale learning analytics research to promote student success.
Improving Student Achievement with New Approaches to Data
John Whitmer, Ed.D.
Presentation delivered at WASC ARC conference on April 11, 2013 on the CSU Data Dashboard and Chico State Learning Analytics case study.
Chico State Case Study: Academic technologies collect highly detailed student usage data. How can this data be used to understand and predict student performance, especially of at-risk students? This presentation will discuss research on a high-enrollment undergraduate course exploring the relationship between LMS activity, student background characteristics, current enrollment information, and student achievement.
CSU Data Dashboard: By monitoring on-track indicators, institutional leaders can better understand not only which milestones students are failing to reach, but why they are not reaching them. It can also help campuses to design interventions or policy changes to increase student success and to gauge the impact of those interventions.
Blackboard’s data science team conducts large-scale analysis of the relationship between the use of our academic technologies and student impact, in order to inform product design, disseminate effective practices, and advance the base of empirical research in educational technologies.
In this presentation, John Whitmer, Director of Analytics & Research, will discuss findings from 2016. Some findings challenge our conventional wisdom, while others confirm what we believed to be true.
Archived presentation made to JISC Learning Analytics workgroup on Feb 22, 2017
22 January 2018 HEFCE open event “Using data to increase learning gains and teaching excellence”
Bart Rienties
With the Teaching Excellence Framework being implemented across England, a lot of higher education institutions have started to ask questions about what it means to be “excellent” in teaching. In particular, with the rich and complex data that all educational institutions gather that could potentially capture learning gains, what do we actually know about our students’ learning journeys? What kinds of data could be used to infer whether our students are actually making affective (e.g., motivation), behavioural (e.g., engagement), and/or cognitive learning gains? Please join us on 22 January 2018 in lovely Milton Keynes at a free OU- and HEFCE-supported event on Using data to increase learning gains and teaching excellence.
14.00-15.00 Measuring learning gains with (psychometric) questionnaires
Dr Sonia Ilie, Prof Jan Vermunt, Prof Anna Vignoles (University of Cambridge, UK): Learning gain: from concept to measurement
Dr Fabio Arico (University of East Anglia): Learning Gain and Confidence Gain Through Peer-instruction: the role of pedagogical design
Dr Paul Mcdermott & Dr Robert Jenkins (University of East Anglia): A Methodology that Makes Self-Assessment an Implicit Part of the Answering Process
15.00-15.45 Measuring employability learning gains
Dr Heike Behle (University of Warwick): Measuring employability gain in Higher Education. A case study using R2 Strengths
Fiona Cobb, Dr Bob Gilworth, David Winter (University of London): Careers Registration Learning Gain project
Students First 2020 - Usage and impact of academic support
Studiosity.com
Comparing Studiosity with other forms of Academic Support – An ‘ecosystem’ of student support services.
Jennifer Lawrence, Program Director, University of New England
Learning Analytics: What is it? Why do it? And how?
Timothy Harfield
Presentation delivered to graduate students at Emory University as part of a TATTO (Teaching Assistant Training and Teaching Opportunity) brown bag session.
ABSTRACT
Learning analytics is the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. Data-driven approaches to teaching and learning are rapidly being adopted within educational environments, but there is still much confusion about what learning analytics is, what it can do, and how it is best employed.
This talk will provide a general overview of the field of learning analytics, its terminology and methods, as well as contemporary ethical debates. It will also introduce several open source and Emory-supported analytics tools available to students and instructors to facilitate the achievement of various learning outcomes.
Students First 2020 - Embracing and effectively leveraging online student sup...
Studiosity.com
Students First 2020 - Prof Philippa Levy, PVC Student Learning at The University of Adelaide, discusses the path to successfully adopting Studiosity, and what has happened since for academic success, confidence, and student satisfaction. Prof Levy also looks at results and engagement for non-traditional students and international students.
Personalized Online Practice Systems for Learning Programming
Peter Brusilovsky
Computer programming is quickly transitioning from being just a key competency in computer and information science majors to being a desired skill for students in a wide range of fields. Yet it is also one of the most challenging subjects to learn. While learning by doing is a critical component in mastering programming skills, neither the traditional educational process nor standard learning support tools provide sufficient opportunities for programming practice. In this talk, I will present our research on personalized programming practice systems for Java, Python, and SQL, which attempt to bridge this known gap in learning programming. A programming practice system engages students in practicing programming skills beyond a relatively small number of graded assignments and exams. To support learning by doing, an online practice system offers a range of interactive “smart content,” such as program animations, worked examples, and various kinds of programming problems with automatic assessment. The main challenges for online practice systems are to motivate students to practice and to guide them to the most appropriate smart content given their course goals and knowledge levels. In this talk, I will review a range of AI technologies, such as student modeling, navigation support, social comparison, and content recommendation, which support efficient programming practice. I will also discuss how a personalized practice system could support the COVID-19-influenced switch to online learning while maintaining the extensive level of feedback expected from an efficient learning process.
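The student-modeling and content-recommendation ideas mentioned in the abstract can be illustrated with a minimal sketch. Everything below (the function names, the mastery-update rule, and the "sweet spot" readiness heuristic) is an assumption made for illustration, not the approach used in the systems the talk describes:

```python
# Illustrative sketch only: a per-concept student model that nudges a
# mastery estimate after each attempt, plus a recommender that picks the
# practice item closest to a target difficulty ("sweet spot").

def update_mastery(mastery, concept, correct, rate=0.3):
    """Return a new mastery dict, moved toward 1.0 on a correct attempt
    and toward 0.0 on an incorrect one (exponential smoothing)."""
    target = 1.0 if correct else 0.0
    current = mastery.get(concept, 0.0)
    new = dict(mastery)
    new[concept] = current + rate * (target - current)
    return new

def recommend(items, mastery, sweet_spot=0.5):
    """Recommend the item whose average concept mastery is closest to the
    sweet spot: challenging, but not out of reach."""
    def readiness(item):
        avg = sum(mastery.get(c, 0.0) for c in item["concepts"]) / len(item["concepts"])
        return abs(avg - sweet_spot)
    return min(items, key=readiness)
```

A learner who repeatedly answers "loops" problems correctly would see their "loops" estimate rise, steering recommendations toward items covering concepts they have not yet mastered.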
Learning design meets learning analytics: Dr Bart Rienties, Open University
Bart Rienties
8th UK Learning Analytics Network Meeting, The Open University, 2nd November 2016
1) The power of 151 Learning Designs on 113K+ students at the OU?
2) How can we use learning design to empower teachers?
3) How can Early Alert Systems improve Student Engagement and Academic Success? (Amara Atif, Macquarie University)
4) What evidence is there that learning design makes a difference over time and how students engage?
Using learning analytics to improve student transition into and support throughout the 1st year
Tinne De Laet
Presentation supporting the ABLE and STELA workshop titled "Using learning analytics to improve student transition into and support throughout the 1st year" delivered at the EFYE 2016 conference in Gent, Belgium
ABLE - the NTU Student Dashboard - University of Derby
Ed Foster
Implementing a university-wide learning analytics system.
Presentation Overview:
- Introduction
- Developing the NTU Student Dashboard
- Transitioning from pilot phase to whole institution roll-out
- Embedding the resource into working practices
- Future development
Online writing feedback: A national study exploring the service and learning ...
Studiosity.com
Professor Chris Tisdell, Scientia Education Academy Fellow at the University of New South Wales (...and YouTube star, mathematician, former DJ...) kicked off the day by talking about student word choice, feedback, psychology, and wellbeing.
Chris presented findings from a national study which used feedback from students at more than 20 universities. Why? After every Studiosity session, students give feedback. That feedback needs to be analysed and used in practical ways (especially recalling Associate Professor Phill Dawson on Day One, who discussed the importance of feedback literacy and translating it into action). Online, 24/7 support is needed as much to fulfil student expectations for their overall university service experience as it is for delivering learning outcomes.
This year's Studiosity 'Students First' Symposium was hosted at La Trobe University City Campus, 25 and 26 July 2019.
Keynote H818 The Power of (In)formal learning: a learning analytics approach
Bart Rienties
A special thanks to Avinash Boroowa, Simon Cross, Lee Farrington-Flint, Christothea Herodotou, Lynda Prescott, Kevin Mayles, Tom Olney, Lisette Toetenel, John Woodthorpe, and others. A special thanks to Prof Belinda Tynan for her continuous support of analytics at the OU UK.
State and Directions of Learning Analytics Adoption (Second edition)
Dragan Gasevic
The analysis of data collected from user interactions with educational and information technology has attracted much attention as a promising approach for advancing our understanding of the learning process. This promise motivated the emergence of the new field of learning analytics and mobilized the education sector to embrace the use of data for decision-making. This talk will first introduce the field of learning analytics and touch on lessons learned from some well-known case studies. The talk will then identify critical challenges that require immediate attention in order for learning analytics to make a sustainable impact on learning, teaching, and decision making. The talk will conclude by discussing a set of milestones selected as critical for the maturation of the field of learning analytics. The most important takeaway from the talk will be that
- systemic approaches to the development and adoption of learning analytics are critical,
- multidisciplinary teams are necessary to unlock a full potential of learning analytics, and
- capacity development at institutional levels through the inclusion of diverse stakeholders is essential for full learning analytics adoption.
This is the second edition of a talk previously given under the same title on several occasions. The second edition reflects many developments that have happened in the field of learning analytics, especially those in the following two projects: http://he-analytics.com and http://sheilaproject.eu.
Toward an automated student feedback system for text based assignments - Pete...
Blackboard APAC
As blended learning environments and digital technologies become integrated into the higher education sector, rich technologies such as analytics can help teaching staff identify students at risk, learning material that is not proving effective, and learning site designs that aid and facilitate improved learning. More recently, consideration has been given to automated essay scoring. Such systems can be used in a formative way, such as providing feedback on initial assignment drafts, or summatively, through the analysis of final assignment submissions. Further, providing students with quick feedback on written assignments opens the opportunity, through formative feedback, to improve learning outcomes.
This presentation details a current project developing a system to analyse text-based assignments. The project is being developed for broad application, but the findings focus on an undergraduate pilot subject: ‘Ideas that Shook the World’ (a compulsory first year Bachelor of Arts subject taught on 5 campuses to more than 1000 students by 15 staff). Preliminary results of a first scan of assignments are presented, and the issues raised in developing the system are presented together with an outline of additional work planned for the project. It is believed the work will have wide application where text-based assignments are utilised for assessment.
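As an illustration of the general idea of automated formative feedback on text (the project's actual analysis method is not described in this abstract), one common baseline compares a draft against a model answer by cosine similarity of word counts. All function names and the threshold below are assumptions for illustration:

```python
# Hypothetical baseline, not the project's method: bag-of-words cosine
# similarity between a student draft and a model answer, used as a crude
# "is this on topic?" formative-feedback signal.
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words vector as a word -> count mapping."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def draft_feedback(draft, model_answer, threshold=0.4):
    """Return (score, message); the threshold is an arbitrary cut-off."""
    score = cosine(vectorize(draft), vectorize(model_answer))
    return score, ("on topic" if score >= threshold else "revisit the key ideas")
```

Real automated essay scoring systems go far beyond this (syntax, coherence, trained models), but the sketch shows why quick draft-stage feedback is computationally cheap to provide.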
Course-Adaptive Content Recommender for Course Authoring
Peter Brusilovsky
Developing online courses is a complex and time-consuming process that involves organizing a course into a sequence of topics and allocating the appropriate learning content within each topic. This task is especially difficult in complex domains like programming, due to the incremental nature of programming knowledge, where new topics extensively build upon domain concepts that were introduced in earlier lessons. In this paper, we propose a course-adaptive content-based recommender system that assists course authors and instructors in selecting the most relevant learning material for each course topic. The recommender system adapts to the deep prerequisite structure of the course as envisioned by a specific instructor, while unobtrusively deducing that structure from problem-solving examples that the instructor uses to present course concepts. We assessed the quality of recommendations and examined several aspects of the recommendation process by using three datasets collected from two different courses. While the presented recommender system was built for the domain of introductory programming, our course-adaptive recommendation approach could be used in a variety of other domains.
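The course-adaptive, content-based recommendation idea can be sketched as a toy ranking function: reward overlap with the current topic's concepts, penalize concepts the course has not yet introduced. The scoring rule and weights are illustrative assumptions, not the paper's algorithm:

```python
# Toy sketch of course-adaptive content recommendation (illustrative
# only): candidate items are ranked by topic relevance minus a penalty
# for "premature" concepts the course has not introduced yet.

def score_item(item_concepts, topic_concepts, introduced):
    """Higher is better: topic-relevant concepts minus a (arbitrary)
    2x penalty for concepts taught later in this course's sequence."""
    relevant = len(item_concepts & topic_concepts)
    premature = len(item_concepts - introduced)
    return relevant - 2 * premature

def rank_candidates(candidates, topic_concepts, introduced):
    """candidates: dict of item id -> set of concepts it covers.
    Returns item ids, best match first."""
    return sorted(candidates,
                  key=lambda cid: score_item(candidates[cid], topic_concepts, introduced),
                  reverse=True)
```

In the actual system, the "introduced" set would not be hand-authored; as the abstract describes, the prerequisite structure is deduced from the worked examples the instructor uses to present concepts.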
Conducting Research on Blended and Online Education, Workshop
Tanya Joosten
Conducting Research on Blended and Online Education
October 14, 2015 - 8:30am
Lead Presenter: Tanya Joosten (University of Wisconsin - Milwaukee, USA)
Nori Barajas-Murphy (University of La Verne, USA)
Track: Learning Effectiveness
Pre-Conference Workshop
Location: Oceanic 7
Session Duration: 3 Hours
Pre-Conference Workshop Session 3
This workshop consists of practice-based research planning activities to help you prepare for conducting research at the course or program level. Specifically, we will utilize the distance education research model developed by the National Research Center for Distance Education and Technological Advancements (DETA) to guide the development of research plans for blended and online education. Attendees will walk away with a research agenda and the necessary tools to help them conduct research on their campus as part of the National DETA Research Center initiative.
The University of Wisconsin-Milwaukee (UWM) established a National Distance Education and Technological Advancement (DETA) Research Center in 2014 to conduct cross-institutional data collection with 2-year and 4-year Institutions of Higher Education (IHEs), funded by the U.S. Department of Education Fund for Improvement of Postsecondary Education (FIPSE). UWM has partnered with the University of Wisconsin System, UW-Extension, Milwaukee Area Technical College (MATC), EDUCAUSE Learning Initiative (ELI), and leaders across the nation to develop a research model. The model is intended to promote student access and success through evidence-based online learning practices and learning technologies.
The DETA Center looks to identify and evaluate effective course and institutional practices in online learning (including competency-based education) for underrepresented individuals (e.g., economically disadvantaged students, adult learners, and students with disabilities) through rigorous research. Furthermore, although the research currently focuses on postsecondary U.S. institutions, the DETA Center looks to extend its work to K-12 and international settings -- all are welcome!
This workshop will prepare attendees to take a plan back to their own institution to successfully gather research on blended and online teaching and learning.
For more on DETA, visit http://www.uwm.edu/deta.
Land of The Learning Giants: The Rise of MOOCs
Eamon Costello
Massive Open Online Courses (MOOCs) have been heralded and decried in roughly equal measure over the last four years. Their ultimate purpose and the effect they are having are still uncertain, but given the level of maturity the phenomenon has now reached, we ought now to be able to answer some questions about it. Following an overview of key issues for educational research on the topic of MOOCs, this paper presents findings from studies we have conducted into:
* Representations of MOOCs in the Irish Print Media: What are the narratives, who is telling it and why?
* Quality of education in MOOCs in particular regarding online testing
* The strategic drivers for higher education institutions in Ireland to develop MOOCs
Educational Data Mining in Program Evaluation: Lessons Learned
Kerry Rice
AET 2016 Researchers present findings from a series of data mining studies, primarily examining data mining as part of an innovative triangulated approach to program evaluation. Findings suggest that it is possible to apply EDM techniques in online and blended learning classrooms to identify key variables important to the success of learners. Lessons learned will be shared, as well as areas for improving data collection in learning management systems for meaningful analysis and visualization.
Presentations morning session 22 January 2018 HEFCE open event “Using data to increase learning gains and teaching excellence”
Bart Rienties
With the Teaching Excellence Framework being implemented across England, a lot of higher education institutions have started to ask questions about what it means to be “excellent” in teaching. In particular, with the rich and complex data that all educational institutions gather that could potentially capture learning gains, what do we actually know about our students’ learning journeys? What kinds of data could be used to infer whether our students are actually making affective (e.g., motivation), behavioural (e.g., engagement), and/or cognitive learning gains? Please join us on 22 January 2018 in lovely Milton Keynes at a free OU- and HEFCE-supported event on Using data to increase learning gains and teaching excellence.
10.30-11.00 Welcome and Coffee
11.00-11.30 Lightning presentations by participants, outlining insights about learning gains
11.30-13.00 Insights from the ABC-Learning Gains project
Dr Jekaterina Rogaten (OU): Reviewing affective, behavioural and cognitive learning gains in higher education of 54 learning gains studies
Prof Bart Rienties & Dr Jekaterina Rogaten (OU): Are assessment scores good proxies of estimating learning gains: a large-scale study amongst humanities and science students
Prof Rhona Sharpe (University of Surrey) & Dr Simon Cross (OU): Insights from 45 qualitative interviews with different learning gain paths of high and low achievers
Dr Ian Scott (Oxford Brookes) & Dr Simon Lygo-Baker (OU): Making sense of learning trajectories: a qualitative perspective
Students First 2020 - Usage and impact of academic supportStudiosity.com
Comparing Studiosity with other forms of Academic Support – An ‘ecosystem’ of student support services.
Jennifer Lawrence, Program Director, University of New England
Learning Analytics: What is it? Why do it? And how?Timothy Harfield
Presentation delivered to graduate students at Emory University as part of a TATTO (Teaching Assistant Training and Teaching Opportunity) brown bag session.
ABSTRACT
Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. Data driven approaches to teaching and learning are rapidly being adopted within educational environments, but there is still much confusion about what learning analytics is, what it can do, and how it is best employed.
This talk will provide a general overview of the field of learning analytics, its terminology and methods, as well as contemporary ethical debates. It will also introduce several open source and Emory-supported analytics tools available to students and instructors to facilitate the achievement of various learning outcomes.
Students First 2020 - Embracing and effectively leveraging online student sup...Studiosity.com
Students First 2020 - Prof Philippa Levy, PVC Student Learning at The University of Adelaide, discusses the path to successfully adopting Studiosity, and what has happened since for academic success, confidence, and student satisfaction. Prof Levy also looks at results and engagement for non-traditional students and international students.
Personalized Online Practice Systems for Learning ProgrammingPeter Brusilovsky
Computer programming is quickly transitioning from being just a key competency in computer and information science majors to being a desired skill for students in a wide range of fields. Yet, it is also one of the most challenging subjects to learn. While learning by doing is a critical component in mastering programming skills, neither the traditional educational process nor standard learning support tools provide sufficient opportunities for programming practice. In this talk, I will present our research on personalized programming practice systems for Java, Python, and SQL, which attempt to bridge this known gap in learning programming. A programming practice system engages students in practicing programming skills beyond a relatively small number of graded assignments and exams. To support learning by doing, an online practice system offers a range of interactive “smart content” such as program animations, worked examples, and various kinds of programming problems with an automatic assessment. The main challenges for online practice systems are to motivate students to practice and to guide them to the most appropriate smart content given their course goals and knowledge levels. In this talk, I will review a range of AI technologies, such as student modeling, navigation support, social comparison, and content recommendation, which support efficient programming practice. I will also discuss how personalized practice system could support COVID-19-influenced switch to online learning while maintaining an extensive level of feedback expected from an efficient learning process.
Learning design meets learning analytics: Dr Bart Rienties, Open UniversityBart Rienties
8th UK Learning Analytics Network Meeting, The Open University, 2nd November 2016
1) The power of 151 Learning Designs on 113K+ students at the OU?
2) How can we use learning design to empower teachers?
3) How can Early Alert Systems improve Student Engagement and Academic Success? (Amara Atif, Macquarie University)
4) What evidence is there that learning design makes a difference over time and how students engage?
Using learning analytics to improve student transition into and support throughout the 1st year - Tinne De Laet
Presentation supporting the ABLE and STELA workshop titled "Using learning analytics to improve student transition into and support throughout the 1st year" delivered at the EFYE 2016 conference in Gent, Belgium
ABLE - the NTU Student Dashboard - University of Derby - Ed Foster
Implementing a university-wide learning analytics system.
Presentation Overview:
- Introduction
- Developing the NTU Student Dashboard
- Transitioning from pilot phase to whole institution roll-out
- Embedding the resource into working practices
- Future development
Online writing feedback: A national study exploring the service and learning ... - Studiosity.com
Professor Chris Tisdell, Scientia Education Academy Fellow at the University of New South Wales (...and YouTube star, mathematician, former DJ...) kicked off the day by talking about student word choice, feedback, psychology, and wellbeing.
Chris presented findings from a national study which used the feedback from students from more than 20 universities. Why? After every Studiosity session, students give feedback. That feedback from students needs to be analysed and used in practical ways (especially recalling Associate Professor Phill Dawson on Day One, who discussed the importance of feedback literacy and translating it into action.) Online, 24/7 support is needed as much to fulfil student expectations for their overall university service experience, as it is needed for delivering learning outcomes.
This year's Studiosity 'Students First' Symposium was hosted at La Trobe University City Campus, 25 and 26 July 2019.
Keynote H818 The Power of (In)formal learning: a learning analytics approach - Bart Rienties
A special thanks to Avinash Boroowa, Simon Cross, Lee Farrington-Flint, Christothea Herodotou, Lynda Prescott, Kevin Mayles, Tom Olney, Lisette Toetenel, John Woodthorpe and others… A special thanks to Prof Belinda Tynan for her continuous support of analytics at the OU UK.
State and Directions of Learning Analytics Adoption (Second edition) - Dragan Gasevic
The analysis of data collected from user interactions with educational and information technology has attracted much attention as a promising approach for advancing our understanding of the learning process. This promise motivated the emergence of the new field of learning analytics and mobilized the education sector to embrace the use of data for decision-making. This talk will first introduce the field of learning analytics and touch on lessons learned from some well-known case studies. The talk will then identify critical challenges that require immediate attention in order for learning analytics to make a sustainable impact on learning, teaching, and decision making. The talk will conclude by discussing a set of milestones selected as critical for the maturation of the field of learning analytics. The most important takeaways from the talk will be that
- systemic approaches to the development and adoption of learning analytics are critical,
- multidisciplinary teams are necessary to unlock a full potential of learning analytics, and
- capacity development at institutional levels through the inclusion of diverse stakeholders is essential for full learning analytics adoption.
This is the second edition of a talk previously given under the same title on several occasions. The second edition reflects many developments that have happened in the field of learning analytics, especially those in the following two projects: http://he-analytics.com and http://sheilaproject.eu.
Toward an automated student feedback system for text-based assignments - Pete... - Blackboard APAC
As the use of blended learning environments and digital technologies becomes integrated into the higher education sector, rich technologies such as analytics can help teaching staff identify students at risk, learning material that is not proving effective, and learning site designs that aid and facilitate improved learning. More recently, consideration has been given to automated essay scoring. Such systems can be used formatively, for example by providing feedback on initial assignment drafts, or summatively, through the analysis of final assignment submissions. Further, providing students with quick feedback on written assignments opens the opportunity, through formative feedback, to improve learning outcomes.
This presentation details a current project developing a system to analyse text-based assignments. The project is being developed for broad application, but the findings focus on an undergraduate pilot subject: ‘Ideas that Shook the World’ (a compulsory first year Bachelor of Arts subject taught on 5 campuses to more than 1000 students by 15 staff). Preliminary results of a first scan of assignments are presented, and the issues raised in developing the system are discussed together with an outline of additional work planned for the project. It is believed the work will have wide application where text-based assignments are utilised for assessment.
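As a rough illustration of one signal such a system might compute, the sketch below scores a draft against an exemplar answer with a bag-of-words TF-IDF cosine similarity. This is a generic textbook technique, not the project's actual method; the function names and example texts are invented:

```python
# Toy formative-feedback signal: how close is a student draft to an exemplar
# answer under TF-IDF weighting? (Illustrative only; real essay-scoring
# systems use far richer linguistic features.)
import math
from collections import Counter

def tfidf_vectors(docs):
    """One term -> weight dict per document: raw TF x (log(N/df) + 1)."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(docs)
    df = Counter()
    for tokens in tokenized:
        df.update(set(tokens))          # document frequency per term
    vecs = []
    for tokens in tokenized:
        tf = Counter(tokens)
        vecs.append({t: tf[t] * (math.log(n / df[t]) + 1.0) for t in tf})
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

draft = "photosynthesis converts light energy into chemical energy"
exemplar = "photosynthesis converts light energy to chemical energy stored in glucose"
sim = cosine(*tfidf_vectors([draft, exemplar]))
print(round(sim, 2))  # a value between 0 (unrelated) and 1 (identical)
```

A system could surface low-similarity drafts for instructor attention or prompt the student toward missing concepts, which is the formative use the abstract describes.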
Course-Adaptive Content Recommender for Course Authoring - Peter Brusilovsky
Developing online courses is a complex and time-consuming process that involves organizing a course into a sequence of topics and allocating the appropriate learning content within each topic. This task is especially difficult in complex domains like programming, due to the incremental nature of programming knowledge, where new topics extensively build upon domain concepts that were introduced in earlier lessons. In this paper, we propose a course-adaptive content-based recommender system that assists course authors and instructors in selecting the most relevant learning material for each course topic. The recommender system adapts to the deep prerequisite structure of the course as envisioned by a specific instructor, while unobtrusively deducing that structure from problem-solving examples that the instructor uses to present course concepts. We assessed the quality of recommendations and examined several aspects of the recommendation process by using three datasets collected from two different courses. While the presented recommender system was built for the domain of introductory programming, our course-adaptive recommendation approach could be used in a variety of other domains.
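The core idea of the abstract can be caricatured in a few lines: rank candidate materials by overlap with a topic's concepts while penalising concepts the course has not yet introduced. The data structures and scoring weights here are my own assumptions for illustration, not the paper's algorithm:

```python
# Toy course-adaptive ranking: relevance to the topic minus a penalty for
# concepts the instructor has not yet introduced in this course's sequence.

def rank_content(candidates, topic_concepts, introduced):
    """Sort candidate items, best first, for a given course topic."""
    def score(item):
        concepts = item["concepts"]
        relevant = len(concepts & topic_concepts)      # overlap with topic
        premature = len(concepts - introduced)          # uncovered prerequisites
        return relevant - 2 * premature                 # weight is illustrative
    return sorted(candidates, key=score, reverse=True)

candidates = [
    {"id": "ex-recursion", "concepts": {"recursion", "base-case"}},
    {"id": "ex-loops", "concepts": {"while", "counter"}},
]
topic = {"while", "counter"}
introduced = {"while", "counter", "assignment"}   # what the course has covered
print(rank_content(candidates, topic, introduced)[0]["id"])  # -> ex-loops
```

The paper's contribution is that the `introduced` structure is deduced automatically from the instructor's own worked examples rather than authored by hand; the sketch treats it as a given set.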
Conducting Research on Blended and Online Education, Workshop - Tanya Joosten
Conducting Research on Blended and Online Education
October 14, 2015 - 8:30am
Lead Presenter: Tanya Joosten (University of Wisconsin - Milwaukee, USA)
Nori Barajas-Murphy (University of La Verne, USA)
Track: Learning Effectiveness
Pre-Conference Workshop
Location: Oceanic 7
Session Duration: 3 Hours
Pre-Conference Workshop Session 3
This workshop consists of practice-based research planning activities to help you prepare for conducting research at the course or program level. Specifically, we will utilize the distance education research model developed by the National Research Center for Distance Education and Technological Advancements (DETA) to guide the development of research plans for blended and online learning. Attendees will walk away with a research agenda and the necessary tools to help them conduct research on their campus as part of the National DETA Research Center initiative.
The University of Wisconsin-Milwaukee (UWM) established a National Distance Education and Technological Advancement (DETA) Research Center in 2014 to conduct cross-institutional data collection with 2-year and 4-year Institutions of Higher Education (IHEs) funded by the U.S. Department of Education Fund for Improvement of Postsecondary Education (FIPSE). UWM has partnered with the University of Wisconsin System, UW-Extension, Milwaukee Area Technical College (MATC), EDUCAUSE Learning Initiative (ELI), and leaders across the nation to develop a research model. This model is to promote student access and success through evidence-based online learning practices and learning technologies.
The DETA Center looks to identify and evaluate effective course and institutional practices in online learning (including competency-based education) for underrepresented individuals (i.e., economically disadvantaged, adult learners, disabled) through rigorous research. Furthermore, although the research currently is focused on postsecondary U.S. institutions, the DETA Center looks to advance their work in K-12 and internationally -- all are welcome!
This workshop will prepare attendees to take a plan back to their own institution to successfully gather research on blended and online teaching and learning.
For more on DETA, visit http://www.uwm.edu/deta.
Land of The Learning Giants: The Rise of MOOCs - Eamon Costello
Massive Open Online Courses (MOOCs) have been heralded and decried in roughly equal measure over the last four years. Their ultimate purpose and the effect they are having are still uncertain, but given the level of maturity that the field has now reached, we ought now to be able to attempt to answer some questions about this phenomenon. Following an overview of key issues for educational research on the topic of MOOCs, this paper presents findings from studies we have conducted into:
* Representations of MOOCs in the Irish Print Media: What are the narratives, who is telling it and why?
* Quality of education in MOOCs in particular regarding online testing
* The strategic drivers for higher education institutions in Ireland to develop MOOCs
Educational Data Mining in Program Evaluation: Lessons Learned - Kerry Rice
AET 2016. Researchers present findings from a series of data mining studies, primarily examining data mining as part of an innovative triangulated approach to program evaluation. Findings suggest that it is possible to apply EDM techniques in online and blended learning classrooms to identify key variables important to the success of learners. Lessons learned will be shared, as well as areas for improving data collection in learning management systems for meaningful analysis and visualization.
Presentations from the morning session of the 22 January 2018 HEFCE open event “Using data to increase learning gains and teaching excellence” - Bart Rienties
With the Teaching Excellence Framework being implemented across England, a lot of higher education institutions have started to ask questions about what it means to be “excellent” in teaching. In particular, with the rich and complex data that all educational institutions gather that could potentially capture learning gains, what do we actually know about our students’ learning journeys? What kinds of data could be used to infer whether our students are actually making affective (e.g., motivation), behavioural (e.g., engagement), and/or cognitive learning gains? Please join us on 22 January 2018 in lovely Milton Keynes at a free OU- and HEFCE-supported event on Using data to increase learning gains and teaching excellence.
10.30-11.00 Welcome and Coffee
11.00-11.30 Lightning presentations by participants, outlining insights about learning gains
11.30-13.00 Insights from the ABC-Learning Gains project
Dr Jekaterina Rogaten (OU): Reviewing affective, behavioural and cognitive learning gains in higher education of 54 learning gains studies
Prof Bart Rienties & Dr Jekaterina Rogaten (OU): Are assessment scores good proxies of estimating learning gains: a large-scale study amongst humanities and science students
Prof Rhona Sharpe (University of Surrey) & Dr Simon Cross (OU): Insights from 45 qualitative interviews with different learning gain paths of high and low achievers
Dr Ian Scott (Oxford Brookes) & Dr Simon Lygo-Baker (OU): Making sense of learning trajectories: a qualitative perspective
Conducting Research on Blended and Online Education: A Research Toolkit - Tanya Joosten
An ELI Short Course delivered on May 16th, 2016.
This session consists of practice-based research planning activities to help participants prepare for conducting research at the course or program level. Specifically, we will utilize the distance education research toolkit developed by the National Research Center for Distance Education and Technological Advancements (DETA) to guide the development of research plans for blended and online learning. Attendees will walk away with a research agenda and the necessary tools to help them conduct research on their campus as part of the National DETA Research Center initiative. The DETA Center seeks to identify and evaluate effective course and institutional practices in online learning (including competency-based education) for underrepresented learners.
Objectives:
After participating in this webinar, participants will be able to:
Develop research questions
Clarify variables and measures
Identify data gathering techniques
Consider other actionable milestones necessary to conduct rigorous research
http://www.educause.edu/events/eli-webinar-conducting-research-blended-and-online-education
The power of learning analytics to measure learning gains: an OU, Surrey and ... - Bart Rienties
Learning gains have become increasingly prominent within the HE literature, have gained traction in UK government policies, and are at the heart of the Teaching Excellence Framework (TEF). This raises the question of the extent to which the teaching and learning environment can actually predict students’ learning gains using the principles of learning analytics. In this presentation, which is joint work with the University of Surrey and Oxford Brookes, I will focus on some preliminary findings based upon developing and testing an Affective-Behaviour-Cognition learning gains model using a longitudinal approach. The main aim of the research is to examine whether learning gains occur on all three levels of the Affective-Behaviour-Cognition model and whether any particular student or course characteristics can predict learning gains, or a lack of learning and dropout. For more info, see https://abclearninggains.com/
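For readers unfamiliar with the term, the simplest single-score version of a learning gain is Hake's normalised gain: the fraction of the available headroom between pre-test and maximum score that a student actually gained. The longitudinal ABC model above is far richer than this; the sketch is only to show the basic arithmetic:

```python
# Hake's normalised gain: (post - pre) / (max - pre).
# A student who scores 40 then 70 out of 100 closed half of the 60-point gap.

def normalised_gain(pre, post, max_score=100.0):
    """Fraction of available headroom gained; 0.0 if there was no headroom."""
    if pre >= max_score:
        return 0.0
    return (post - pre) / (max_score - pre)

print(normalised_gain(40, 70))  # 30 points gained out of 60 possible -> 0.5
```

Normalised gain is popular because it makes gains comparable across students with very different starting points, which raw post-minus-pre differences do not.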
Promoting Effective Teaching and Learning Ecosystems via Research Proven Practice - Tanya Joosten
ELI Leadership Seminar, 2016, San Antonio TX
The ELI Leadership Seminar, "Promoting a Sustainable and Effective Teaching and Learning Ecosystem via Research Proven Practice," is an extended learning opportunity threaded throughout the annual meeting program. The goals for this seminar are to:
Enable quality teaching and learning through evidence-based faculty development to diffuse proven instructional interventions and practices
Discover ways to gather evidence using a research model for online learning, including key research questions driving inquiry
Explore different research designs (experimental and survey with data mining) for studying teaching and learning innovations
Develop a research plan for your program or institution that will assist in identifying effective instructional and institutional practices in blended and online learning
Identify potential methods of effectively engaging faculty in teaching and researching innovations in student learning
Learn about institutional mechanisms that can impact quality in teaching and learning, particularly in blended and online environments
Enable participants to network with peers interested in promoting effective teaching and learning through research on blended and online programming at universities
Participants, both new and experienced, will benefit from peer interaction and the opportunity to network and engage with leaders during small group discussions. Participants will meet with, share with, and learn from a cohort of peers from a wide range of positions supporting teaching and learning from different types of higher education institutions.
The Open University (OU) is a global leader in quality online, open and distance education with more than 180,000 students and 8,000 faculty and staff. Like many organizations, the OU is embracing data and learning analytics as an increasingly important approach for understanding learner behaviors. During this Fischer Speaker Series event, Dr. Tynan explores the vagaries of leading an institutional strategy at scale, specifically focusing on faculty, student and institutional engagement with analytics to support student success, detailing wins, pitfalls and unexpected twists resulting in unintended but delightful outcomes.
Professor Belinda Tynan is the Pro- Vice-Chancellor (Learning Innovation) and Professor of Higher Education at the Open University, UK. Reporting to the Vice-Chancellor, the Pro-Vice-Chancellor for Learning Innovation contributes to the strategic vision and mission of the University and has a focus on supporting student success by providing executive leadership in the areas of innovation, strategy and policy development, production, informal learning and research and scholarship in technology enhanced learning.
The video of this presentation can be viewed at https://goo.gl/W8qpi6
Dr. Nasrin Nazemzadeh, PhD Dissertation Defense, Dr. William Allan Kritsonis,... - William Kritsonis
Dr. William Allan Kritsonis, PhD Dissertation Chair for Dr. Nasrin Nazemzadeh, PhD Program in Educational Leadership, PVAMU, Member of the Texas A&M University System.
Using intelligent tutoring systems, virtual laboratories, simulations, and frequent opportunities for assessment and feedback, The Open Learning Initiative (OLI) builds open learning environments that support continuous improvement in teaching and learning.
One of the most powerful features of web-based learning environments is that we can embed assessment into virtually all instructional activities. As students interact with OLI environments, we collect real-time data on student work. We use this data to create four positive feedback loops:
• feedback to students
• feedback to instructors
• feedback to course designers
• feedback to learning science researchers
In this JumpStart Session, we demonstrate how OLI uses the web to deliver online instruction that instantiates course designs based on research and how the learning environments, in turn, support ongoing research. We will discuss the Community College Open Learning Initiative (CC-OLI) and how faculty and colleges across the country can participate in CC-OLI and the connection between CC-OLI and Washington State’s Open Course Library project.
Lessons learned from 200K students and 2 GB of learning gains data - Bart Rienties
With the introduction of the Teaching Excellence Framework, a lot of attention is focussed on measuring learning gains. A vast body of research has found that individual student characteristics influence academic progression over time. This case study aims to explore how advanced statistical techniques, in combination with Big Data, can be used to provide potentially new insights into how students are progressing over time, and in particular how students’ socio-demographics (i.e., gender, ethnicity, socioeconomic status, prior educational qualifications) influence students’ learning trajectories.
Professor Bart Rienties, Open University UK
https://warwick.ac.uk/services/aro/dar/quality/legacy/anagendaforchange/
Learning Analytics and the Scholarship of Teaching and Learning - an obvious ... - Blackboard APAC
The scholarship of teaching and learning (SoTL) essentially advocates for a research approach to be applied to the improvement of learning and teaching. It encourages teachers to reflect in a scholarly way on their teaching practice and at the more advanced level to undertake research on teaching practice and curriculum. Learning analytics has the potential to provide data on elements of the teaching process which have to date been difficult to measure particularly for the broader cohort of teachers.
This presentation will draw attention to the connection between SoTL and learning analytics and prompt participants to think about how learning analytics can be used in a wider context to contribute to changes in teaching design and practice.
In this presentation to NYU-Learn, I discuss my experience applying data science and machine learning in educational technology and assessment industries. I share tips for thinking about the importance of context and potential of scalability.
Collaborative Research: Stealth Assessment of SE Skills w/ Learning Analytics - John Whitmer, Ed.D.
Early description of a work in progress between ACT, UMBC, Blackboard and Vitalsource to investigate the relationship between social and emotional skills and learning analytics using machine and deep learning techniques. A few preliminary results.
This presentation to the MoodleMoot UK/I 2017 provides an overview of Learning Analytics for VLE/LMS data and lessons learned in practice from using this data to model student risk and other characteristics. The findings come from fundamental research and application of Blackboard's X-Ray Learning Analytics application.
Learner Analytics Panel Session: Deja-Vu all over again? - John Whitmer, Ed.D.
Panel presentation at the DET/CHE 2012 conference on November 28, 2012 by Kathy Fernandes (Chico State), James Frazee (San Diego State), Andrew Roderick (SFSU), and Deone Zell (CSU Northridge).
Improving student persistence, especially among under-represented minority students, is a driving goal at many colleges and universities. Academic technologies, such as the Learning Management System (LMS), are frequently used to deliver innovative pedagogical strategies to increase engagement and improve persistence. This study presents research on a redesigned hybrid high-enrollment undergraduate course exploring the relationship between LMS activity, student background characteristics, current enrollment information, and student achievement.
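The simplest version of the relationship such a study examines is a correlation between an LMS activity measure and achievement. The sketch below computes a Pearson correlation over invented values purely for illustration; the actual study controls for background characteristics and enrollment information, which a raw correlation does not:

```python
# Pearson correlation between an LMS activity count and course grade.
# Data values are fabricated for the example.
import math

def pearson(xs, ys):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

logins = [3, 10, 14, 22, 30]       # hypothetical LMS logins per student
grades = [55, 62, 70, 81, 90]      # hypothetical final course grades
r = pearson(logins, grades)
print(round(r, 3))  # strong positive correlation on this toy data
```

A correlation like this is only a starting point: the study's contribution is relating activity to achievement while accounting for who the students are, not just how often they log in.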
Many Hands Makes Light Work: Collaborating on Moodle Services and Development - John Whitmer, Ed.D.
Presentation by Kathy Fernandes, Andrew Roderick, and John Whitmer at the US West Coast MoodleMoot 2012 on August 2, 2012.
Learning Management Systems have evolved from faculty sandboxes to complex enterprise learning environments. Meanwhile, budgets have plummeted and the LMS market has been undergoing rapid change. Many campuses have moved to Moodle to help stabilize their business and application environments. An important criterion behind this transition for many campuses has been the ability to ‘control their own destiny’ and collaborate with colleagues.
In this presentation, we will discuss the experience of campuses in the California State University system collaborating on Moodle technical development, user services, and support. Among the 10 campuses currently using or in transition to Moodle, we have developed a shared governance model with separate groups to administer policy-related issues and technical / UI issues. We will discuss the creation of a Moodle Shared Code base that is being used by several campuses, and the current migration of SCB features into Moodle v2.0. Moodle technical expertise is shared between campuses, and training resources have been leveraged across the CSU system. We will discuss the process and features that have led to successful (and not so successful) collaborative activities, as well as the services that have been created.
Presentation by John Whitmer, Michael Haskell (Cal Poly SLO), and Hillary Kaplowitz (CSU Northridge) at the US West Coast Moodle Moot 2012.
“Learner Analytics” has captured the attention of the media and is the topic of much debate in professional and academic circles. What lies behind the hype? In this presentation, we will discuss the state of, and limits to, current research in LMS Learner Analytics. We will then look at examples of Learner Analytics in Moodle, including tools for faculty and reports spanning the entire instance.
Learner Analytics and the “Big Data” Promise for Course & Program Assessment - John Whitmer, Ed.D.
Presentation delivered at the San Diego State University "One Day in May" conference on May 22, 201 by John Whitmer, Hillary Kaplowitz, and Thomas J. Norman
Universities archive massive amounts of data about students and their activities. Students also generate significant amounts of “digital exhaust” as they use academic technologies. How can faculty and administrators use automated analysis of this data to save time and conduct targeted interventions to improve student learning?
The emerging discipline of Learner Analytics conducts analysis of this data to learn about student behaviors, predict students at-risk of failure, and identify potential interventions to help those students. In this presentation, we will discuss the contours of this discipline and review the state of research conducted to date. We will then look at several examples of Learner Analytics services and hear from California State University educators who are using these tools to help their students. Finally, we will suggest some immediate ways that Analytics can be conducted at San Diego State.
Presenters:
John Whitmer, California State University, Chico
Hillary Kaplowitz, California State University, Northridge
Thomas J. Norman, CSU Dominguez Hills
Learning Analytics: Realizing the Big Data Promise in the CSU - John Whitmer, Ed.D.
The word “analytics” has become a buzzword in current educational technology conversations, applied to everything from analysis of student work to LMS usage reporting to institutional analysis of ERP data. Broadly speaking, Learner Analytics refers to the analysis of student data using statistical techniques to improve decision-making. In the context of educational technology, Learner Analytics promises to improve our understanding of effective (and ineffective) student learning and technology usage. What progress have we seen in realizing this promise? This session offers a discussion of the promise of Learner Analytics, current research findings and tools, and explores examples from CSU Chico and the CSU Office of the Chancellor.
Current CSU LMS Activities: Campus and Systemwide Strategies - John Whitmer, Ed.D.
In this webinar from April 2010, Dr. David Levin from CSU Northridge and Dr. Linda Scott from CSU San Marcos spoke about their campus migrations from Blackboard to Moodle. They discussed the decision-making process on their campus, their timeline, course migrations, implementations, training and support resources, and lessons learned.
Kathy Fernandes and John Whitmer spoke about the Chancellor’s Office Initiative to provide systemwide LMS Services. These services began with the LMS RFP and CSU Sandboxes, and were expanded to provide an LMS “safety net” and a “superset” of LMS services that include systems, integrations, migrations, support services, and educational practices.
Participants will learn about these current efforts and plans for the implementation of the LMS recommendations approved by the CSU Academic Technology Steering Committee in December 2009.
Presentation to the CSU Community of Academic Technology Staff annual conference in April 2010. Topics discussed include the history of the LMS in the CSU, the current services coordinated through the Chancellor's Office, and upcoming services.
Presenters:
Kathy Fernandes, CSU Office of the Chancellor
John Whitmer, CSU Office of the Chancellor
Faculty Development across the California State University System - John Whitmer, Ed.D.
In this presentation from the US West Coast Moodle Moot 2011, leaders from California State University campuses discuss their efforts to support the increased use of Moodle on their campus. The speakers represent campuses new to Moodle and mature deployments, and discuss the needs of new users and those further along in the adoption process. Issues to be discussed include: training resources, effective training modalities, critical training issues in Moodle, and more.
Presenters:
Cherie Blut, CSU San Marcos
Brett Christie, Sonoma State University
Maggie Beers, San Francisco State University
Deone Zell, CSU Northridge
Moderator: John Whitmer, CSU Office of the Chancellor
Partnership & Collaboration in Moodle Development: Making it Work - John Whitmer, Ed.D.
Presentation by Kathy Fernandes (CSU Office of the Chancellor), Andrew Roderick (San Francisco State University), and John Whitmer (CSU Office of the Chancellor)
US West Coast MoodleMoot 2011 (July 2011, Rohnert Park, CA)
As an open source application, Moodle has strong potential for collaborative partnerships, support services, and code development. This presentation will describe one year in the life of California State University Moodle Collaborations. Over the past year, the CSU has developed a governance process and established a new organizational culture while working on code development, training materials, migration tool, and expertise collaboration. We will discuss the balance of central coordination and campus leadership, technical issues and opportunities, and plans for the future.
Migrating to Moodle: Lessons Learned from Recent CSU Migrations - John Whitmer, Ed.D.
In this presentation from the US West Coast Moodle Moot 2011, leaders from California State University campuses that have recently migrated to Moodle discuss their campus decision-making process, the processes and technologies used to migrate content, and their process of implementation. The speakers represent campuses migrating from both Blackboard and WebCT, and a mix of small and large FTE campuses. Activities that benefited from multi-campus coordination and resource sharing are also discussed.
Presenters:
David Levin, CSU Northridge
Barbara Taylor, CSU San Marcos
Moderator: John Whitmer, CSU Office of the Chancellor
A Strategic Approach: GenAI in Education - Peter Windle
Artificial Intelligence (AI) technologies such as Generative AI, image generators and Large Language Models have had a dramatic impact on teaching, learning and assessment over the past 18 months. The most immediate threat AI posed was to academic integrity, with Higher Education Institutions (HEIs) focusing their efforts on combating the use of GenAI in assessment. Guidelines were developed for staff and students, and policies were put in place. Innovative educators have forged paths in the use of Generative AI for teaching, learning and assessment, leading to pockets of transformation springing up across HEIs, often with little or no top-down guidance, support or direction.
This Gasta posits a strategic approach to integrating AI into HEIs to prepare staff, students and the curriculum for an evolving world and workplace. We will highlight the advantages of working with these technologies beyond the realm of teaching, learning and assessment by considering prompt engineering skills, industry impact, curriculum changes, and the need for staff upskilling. In contrast, not engaging strategically with Generative AI poses risks, including falling behind peers, missed opportunities and failing to ensure our graduates remain employable. The rapid evolution of AI technologies necessitates a proactive and strategic approach if we are to remain relevant.
Honest Reviews of Tim Han LMA Course Program.pptxtimhan337
Personal development courses are widely available today, with each one promising life-changing outcomes. Tim Han’s Life Mastery Achievers (LMA) Course has drawn a lot of interest. In addition to offering my frank assessment of Success Insider’s LMA Course, this piece examines the course’s effects via a variety of Tim Han LMA course reviews and Success Insider comments.
Model Attribute Check Company Auto PropertyCeline George
In Odoo, the multi-company feature allows you to manage multiple companies within a single Odoo database instance. Each company can have its own configurations while still sharing common resources such as products, customers, and suppliers.
How to Make a Field invisible in Odoo 17Celine George
It is possible to hide or invisible some fields in odoo. Commonly using “invisible” attribute in the field definition to invisible the fields. This slide will show how to make a field invisible in odoo 17.
2024.06.01 Introducing a competency framework for languag learning materials ...Sandy Millin
http://sandymillin.wordpress.com/iateflwebinar2024
Published classroom materials form the basis of syllabuses, drive teacher professional development, and have a potentially huge influence on learners, teachers and education systems. All teachers also create their own materials, whether a few sentences on a blackboard, a highly-structured fully-realised online course, or anything in between. Despite this, the knowledge and skills needed to create effective language learning materials are rarely part of teacher training, and are mostly learnt by trial and error.
Knowledge and skills frameworks, generally called competency frameworks, for ELT teachers, trainers and managers have existed for a few years now. However, until I created one for my MA dissertation, there wasn’t one drawing together what we need to know and do to be able to effectively produce language learning materials.
This webinar will introduce you to my framework, highlighting the key competencies I identified from my research. It will also show how anybody involved in language teaching (any language, not just English!), teacher training, managing schools or developing language learning materials can benefit from using the framework.
Francesca Gottschalk - How can education support child empowerment.pptxEduSkills OECD
Francesca Gottschalk from the OECD’s Centre for Educational Research and Innovation presents at the Ask an Expert Webinar: How can education support child empowerment?
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...Levi Shapiro
Letter from the Congress of the United States regarding Anti-Semitism sent June 3rd to MIT President Sally Kornbluth, MIT Corp Chair, Mark Gorenberg
Dear Dr. Kornbluth and Mr. Gorenberg,
The US House of Representatives is deeply concerned by ongoing and pervasive acts of antisemitic
harassment and intimidation at the Massachusetts Institute of Technology (MIT). Failing to act decisively to ensure a safe learning environment for all students would be a grave dereliction of your responsibilities as President of MIT and Chair of the MIT Corporation.
This Congress will not stand idly by and allow an environment hostile to Jewish students to persist. The House believes that your institution is in violation of Title VI of the Civil Rights Act, and the inability or
unwillingness to rectify this violation through action requires accountability.
Postsecondary education is a unique opportunity for students to learn and have their ideas and beliefs challenged. However, universities receiving hundreds of millions of federal funds annually have denied
students that opportunity and have been hijacked to become venues for the promotion of terrorism, antisemitic harassment and intimidation, unlawful encampments, and in some cases, assaults and riots.
The House of Representatives will not countenance the use of federal funds to indoctrinate students into hateful, antisemitic, anti-American supporters of terrorism. Investigations into campus antisemitism by the Committee on Education and the Workforce and the Committee on Ways and Means have been expanded into a Congress-wide probe across all relevant jurisdictions to address this national crisis. The undersigned Committees will conduct oversight into the use of federal funds at MIT and its learning environment under authorities granted to each Committee.
• The Committee on Education and the Workforce has been investigating your institution since December 7, 2023. The Committee has broad jurisdiction over postsecondary education, including its compliance with Title VI of the Civil Rights Act, campus safety concerns over disruptions to the learning environment, and the awarding of federal student aid under the Higher Education Act.
• The Committee on Oversight and Accountability is investigating the sources of funding and other support flowing to groups espousing pro-Hamas propaganda and engaged in antisemitic harassment and intimidation of students. The Committee on Oversight and Accountability is the principal oversight committee of the US House of Representatives and has broad authority to investigate “any matter” at “any time” under House Rule X.
• The Committee on Ways and Means has been investigating several universities since November 15, 2023, when the Committee held a hearing entitled From Ivory Towers to Dark Corners: Investigating the Nexus Between Antisemitism, Tax-Exempt Universities, and Terror Financing. The Committee followed the hearing with letters to those institutions on January 10, 202
Using Learning Analytics to Assess Innovation & Improve Student Achievement
1. Using Learning Analytics to Assess
Innovation & Improve Student
Achievement
John Whitmer, Ed.D.
john.whitmer@blackboard.com
@johncwhitmer
UK Learning Analytics Network Event (JISC)
March 5, 2015
http://bit.ly/jwhitmer-jisc
2. Quick bio
15 years managing academic technology
at public higher ed institutions
(R1, 4-year, CC’s)
• Always multi-campus projects, innovative uses
of academic technologies
• Most recently: California State University,
Chancellor’s Office, Academic Technology Services
Doctorate in Education from UC Davis (2013)
with Learning Analytics study on Hybrid,
Large Enrollment course
Active academic research practice
(San Diego State Learning Analytics, MOOC
Research Initiative, Udacity SJSU Study…)
3. Meta-questions driving my research
1. How can we provide students with
immediate, real-time feedback? (esp
identify students at-risk of failing a
course)
2. How can we design effective
interventions for these students?
3. How can we assess innovations
(or status quo deployments) of
academic technologies?
4. Do these findings apply equally to students 'at promise' due to their background (e.g. race, class, family education, geography)?
4. Outline
1. Defining & Positioning Learning Analytics
2. A Few Empirical Research Findings
• Understanding Contradictory Outcomes in a Redesigned Hybrid Course
(Chico State)
• Creating Accurate Learning Analytics Triggers & Effective Interventions
(SDSU)
3. How we’re Applying this Research @ Blackboard
4. Discussion
5. The Economist. (2010, November 4). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy.
6. 200MB of data emissions annually
The Economist. (2010, November 4). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy.
7. Logged into course within 24 hours
Interacts frequently in discussion boards
Failed first exam
Hasn't taken college-level math
No declared major
8. What is learning analytics?
"...measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs."
Learning and Knowledge Analytics Conference, 2011
9. Strong interest by faculty & students
From Eden Dahlstrom, D. Christopher Brooks, and Jacqueline Bichsel. The Current Ecosystem of Learning Management Systems in Higher Education: Student, Faculty,
and IT Perspectives. Research report. Louisville, CO: ECAR, September 2014. Available from http://www.educause.edu/ecar.
13. Study Overview
Course redesigned for hybrid delivery in year-long program
Enrollment: 373 students (54% increase in largest section)
Highest LMS usage on entire campus, Fall 2010 (>250k hits)
Bimodal outcomes:
• 10% increased SLO mastery
• 7% & 11% increase in DWF (54 F's)
Why? Can't tell with aggregated reporting data
14. Grades Significantly Related to Access
Course: “Introduction to Religious Studies”
CSU Chico, Fall 2010 (n=373)
Variable: % variance explained
Total Hits: 23%
Assessment activity hits: 22%
Content activity hits: 17%
Engagement activity hits: 16%
Administrative activity hits: 12%
Mean value, all significant variables: 18%
15. LMS Activity better Predictor than
Demographic/Educational Variables
Variable: % variance explained
HS GPA: 9%
URM and Pell-Eligibility Interaction: 7%
Under-Represented Minority: 4%
Enrollment Status: 3%
URM and Gender Interaction: 2%
Pell Eligible: 2%
First in Family to Attend College: 1%
Mean value, all significant variables: 4%
Not Statistically Significant
Gender
Major-College
22. 1. Identify courses and recruit instructors
2. Prior to course start, review syllabus, schedule meaningful
“triggers” for each course (e.g. attendance, graded items,
Blackboard use, etc.)
3. Run reports in Blackboard, Online Homework/Quiz software to
identify students with low activity or performance (~ weekly)
4. Send "flagged" students in the experimental group a
notification/intervention
5. Aggregate data, add demographic data. Analyze.
Study Protocol
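The weekly flagging step in the protocol above can be sketched in a few lines of code. This is an illustrative sketch only, not the study's actual tooling: the field names, thresholds, and trigger definitions here are hypothetical stand-ins for the course-specific triggers (attendance, graded items, Blackboard use) scheduled with each instructor.

```python
# Hypothetical sketch of the weekly trigger report: flag students whose
# activity or performance falls below instructor-set thresholds.
def flag_students(records, min_logins=2, min_quiz_score=0.6):
    """Return (student_id, reasons) pairs for students tripping any trigger."""
    flagged = []
    for r in records:
        reasons = []
        if r["lms_logins"] < min_logins:
            reasons.append("low LMS activity")
        if r.get("quiz_score") is not None and r["quiz_score"] < min_quiz_score:
            reasons.append("low quiz performance")
        if reasons:
            flagged.append((r["student_id"], reasons))
    return flagged

# One week of (invented) activity data for two students
week = [
    {"student_id": "s1", "lms_logins": 5, "quiz_score": 0.85},
    {"student_id": "s2", "lms_logins": 1, "quiz_score": 0.40},
]
print(flag_students(week))  # s2 trips both triggers
```

In the study, flagged students in the experimental group would then receive the notification/intervention described in step 4.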
23. Key Questions
1. Are triggers accurate predictors of course grade?
2. Do interventions (based on triggers) improve student grades?
3. Do these relationships vary based on student background
characteristics?
27. A Typical Intervention: “Concerned Friend” tone
… data that I've gathered over the years via clickers indicates
that students who attend every face-to-face class meeting reduce
their chances of getting a D or an F in the class from
almost 30% down to approximately 8%.
So, please take my friendly advice and attend class and
participate in our classroom activities via your clicker. You'll be
happy you did!
Let me know if you have any questions.
Good luck,
Dr. Laumakis
28. Poll question
Did triggers predict achievement? At what level of significance? How much variation in student grade was explained?
A. Not significant
B. <10%, significant at .05 level
C. 20%, significant at .01 level
D. 30%, significant at .001 level
E. 50%+, significant at .0001 level
29. Poll question
Did triggers predict achievement? At what level of significance? How much variation in student grade was explained?
A. Not significant
B. <10%, significant at .05 level
C. 20%, significant at .01 level
D. 30%, significant at .001 level
E. 50%+, significant at .0001 level (Spring 2014, Fall 2014)
30. Statistics
Learning analytics triggers vs. final course points
Spring 2014: 4 sections, 2 courses, 882 students
Psychology: p<0.0001; r2=0.4828
Statistics: p<0.0001; r2=0.6558
31. Fall 2014 results: Almost identical
5 Sections, 3 Courses, N=1,220 students
p<0.00001; r2=0.4836
32. Spring 2015 Results (tentative): lower relationship
8 Sections, 5 Courses, N=1,390 students
p<0.00001; r2=0.28
34. Explained by differences between courses
(Spring 2015 Results by Course)
Course: R2 (all triggers) | R2 (no grades)
Anthro1 (Online): 0.54 | 0.65
Anthro3 (In Person): not significant | not significant
Comp Eng: not significant | not significant
Econ4: not significant | not significant
Psych1: 0.58 | 0.4
Psych2: 0.41 | 0.23
Stat3: 0.2 | 0.11
Stat4: 0.33 | 0.21
35. So did the interventions make a
difference in learning outcomes?
37. Experimental Participation vs. Repeatable Grade (Pell-Eligible)
(n=168, Spring 2014, PSY 101)
No Interventions (n=87, PSY, Pell-eligible): 77% passing grade, 23% repeatable grade
Interventions (n=81, PSY, Pell-eligible): 91% passing grade, 9% repeatable grade
24 additional Pell-eligible students would have passed the class if the intervention had been applied to all participating students.
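The "24 additional students" claim on this slide can be checked with simple arithmetic. The rates and counts below are taken from the slide; the extrapolation assumes the treated group's repeatable-grade rate would hold for the whole population:

```python
# Back-of-the-envelope check of the slide's extrapolation.
n_total = 168        # all Pell-eligible PSY 101 participants, Spring 2014
rate_control = 0.23  # repeatable-grade rate, no-intervention group
rate_treated = 0.09  # repeatable-grade rate, intervention group

# If everyone had been treated, the drop in repeatable grades is the
# rate difference applied to the full group.
extra_passes = round(n_total * (rate_control - rate_treated))
print(extra_passes)  # -> 24 additional passing students
```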
38. Fall 2014 / Spring 2015 Intervention Results:
No Significant Difference Between Experimental/Control Groups.
39. One Explanation: Low Reach
Fall 2014 (n = 1,220)
Row Labels # Triggers Message Open Rate Clickthrough Rate
Econ1 8 76% 36%
Psych1 6 70% 29%
Psych2 7 69% 35%
Stat3 9 62% 25%
Stat4 8 65% 27%
Grand Total 38 68% 30%
Spring 2015 (n = 1,138)
Row Labels # Triggers Message Open Rate Clickthrough Rate
Anthro-In Person 17 57% 10%
Anthro-Online 7 71% 35%
Comp Engineering 15 52% 14%
Econ 15 44% 13%
Psych1 17 60% 13%
Psych2 17 63% 13%
Stat3 21 64% 9%
Stat4 20 55% 5%
Grand Total 129 58% 12%
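The "low reach" explanation can be quantified from the table above. One caveat: the slide does not say whether clickthrough rate is measured among opened messages or among all messages sent; the sketch below assumes the former, which makes the reach estimate even lower:

```python
# Rough reach estimate for Spring 2015 interventions (grand-total row).
# Assumption (not stated on the slide): clickthrough is measured among
# opened messages, so full engagement requires open AND click.
open_rate = 0.58
clickthrough_rate = 0.12

effective_reach = open_rate * clickthrough_rate
print(round(effective_reach, 3))  # fraction of flagged students fully engaging
```

Under that assumption, only about 7% of flagged students both opened an intervention message and clicked through, which is consistent with the null result on the previous slide.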
40. Proposed Next Steps
• Add interventions that move “beyond informing”
students to address underlying study skills and behaviors
– Supplemental Instruction <http://www.umkc.edu/asm/si/>
– Adaptive Release within online courses (content, activities)
41. Conclusions and Implications
1. Data from academic technology use predicts student achievement; diverse sources provide better predictions.
2. Tech use > demographic data to predict course success; adding demographic data provides nuanced understandings and identifies trends not otherwise visible.
3. Academic technology use is not a "cause" in itself, but reveals underlying study habits and behaviors (e.g. effort, time on task, massed vs. distributed activity).
4. Predictions are necessary, but not sufficient, to change academic outcomes. Research into interventions is promising.
5. We're at an early stage in Learning Analytics; expect quantum leaps in the near future.
42. 4. How we're Applying this Research @ Blackboard
43. Blackboard's "Platform Analytics" Project
A new effort to enhance our analytics offerings across our academic technology applications that includes:
• Improved instrumentation for learning activity within applications
• Applied findings from analysis (inc. inferential statistics and data mining)
• Integrated analytics into user experiences (inc. student and faculty)
• Aggregated usage data across cloud applications (anonymized, rolled-up, privacy-compliant)
48. Factors affecting growth of learning analytics
(Slide plots each factor along two axes: Enabler vs. Constraint, and Widespread vs. Rare)
• New education models
• Resources ($$$, talent)
• Data governance (privacy, security, ownership)
• Clear goals and linked actions
• Data valued in academic decisions
• Tools/systems for data co-mingling and analysis
• Academic technology adoption
• Low data quality (fidelity with meaningful learning)
• Difficulty of data preparation
• Not invented here syndrome
49. Call to action [with amendments]
(from a May 2012 Keynote Presentation @ San Diego State U)
You’re not behind the curve, this is a rapidly emerging area
that we can (should) lead... [together with interested partners]
Metrics reporting is the foundation for analytics
[don’t under or over-estimate the importance]
Start with what you have! Don’t wait for student characteristics and
detailed database information; LMS data can provide significant insights
If there are any ed tech software folks in the audience,
please help us with better reporting!
[we're working on it and feel your pain!]
2013 SP – Study began with two courses (N≈2,000) in Spring 2014
Weekly reports; triggered students sent email and multimedia “interventions” (low/high intensity)
Our hypothesis was, as you will see, that as the number of trigger events increased, so would the likelihood of having to repeat the course.
We focused on high needs courses.
16-38% “Repeatable grades”
James
Weekly reports; triggered students sent email "interventions" (low intensity). Did people show up? i.e., clicker (participation/attendance points).
Talking points:
Almost ¾ of students got at least one trigger in each course
More PSY students got interventions than Stat students (b/c not completing homework)
The pattern of the # of interventions in both courses is about the same – high up to 2-3, then trails off.
Interesting findings when you consider that the triggers were very different between courses (e.g. PSY had only 2 graded items; PSY used online homework, Stat used online quizzes, etc.).
James: most recent semester: different trajectory by course, although see trail-off pattern over time, most of the triggers in small range (1-3)
But significant differences between courses in number of triggers and number of students “activating” trigger.
Show differences as we have expanded the study over time.
James
For the experimental students, we have expanded the interventions over time to use different modalities, but the focus of all interventions has been on increasing students' awareness of their at-risk behaviors/status, so that they can do something about it (self-regulate).
Message written by instructor, with strong attention to tone: a “concerned friend”, which we thought would be more effective with students.
Also include serious results that might resonate with them and they would take notice about.
John
John
These graphs illustrate that DECREASES in triggers are related to INCREASES in student grade.
(explanation: Each dot is a student; Y axis is the total points (lower to higher), and X axis is the total # of triggers (higher to lower))
Statistically significant results for both courses; probability of being due to chance less than 1 in 10,000.
Size of effect different: PSY: triggers explain 48% variation in final grade
STAT: triggers explain 66% of variation in final grade (if remove graded items from Stat, triggers explains 49%)
John
John
John: Million-dollar question: did this reduce student repeatable-grade rates?
John
James
Definition: Supplemental Instruction (SI) is an academic assistance program that utilizes peer-assisted study sessions. SI sessions are regularly-scheduled, informal review sessions in which students compare notes, discuss readings, develop organizational tools, and predict test items. Students learn how to integrate course content and study skills while working together. The sessions are facilitated by “SI leaders”, students who have previously done well in the course and who attend all class lectures, take notes, and act as model students. Purpose: To increase retention within targeted historically difficult courses
To improve student grades in targeted historically difficult courses
To increase the graduation rates of students
Participants: SI is a "free service" offered to all students in a targeted course. SI is a non-remedial approach to learning, as the program targets high-risk courses rather than high-risk students. All students are encouraged to attend SI sessions, as it is a voluntary program. Students with varying levels of academic preparedness and diverse ethnicities participate. There is no remedial stigma attached to SI since the program targets high-risk courses rather than high-risk students.
John – first two points (provide examples of academic technology use when explaining first bullet)
James – last two points (because so much of a students participation and performance in a class is logged via academic technology tools it provides us with evidence of student behavior which reveal patterns that can predict struggling students or successful students)
More and better data: More data sources, less structured data, integrated data, potential for expansion of data sources in future. Unlock meaning from data that’s meaningful/available.
Intended for multiple audiences: Blackboard internal development, CIOs/administrative customers interested in metrics, faculty/student customers interested in learning/teaching, researchers interested in underlying relationships.