Learning analytics aims to collect, analyze, and report on student data to understand and optimize learning. It draws on fields like education theory, social network analysis, and machine learning. Current analytics focus on retention and early intervention but are progressing towards predictive, adaptive, and recommender systems. Tools like SNAPP provide lightweight social network analysis to help interpret student interaction patterns and evaluate teaching interventions. However, ensuring privacy and developing analytics with appropriate educational context remain ongoing challenges.
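Tools like SNAPP build such interaction views from discussion-forum reply data. As a minimal sketch of the idea, with entirely invented reply pairs (none of this reflects SNAPP's actual implementation):

```python
from collections import Counter

# Hypothetical forum reply log: (poster, replied_to) pairs
replies = [
    ("alice", "bob"), ("bob", "alice"), ("carol", "alice"),
    ("alice", "carol"), ("dave", "alice"),
]

# Out-degree: how often each student replies to others
out_degree = Counter(src for src, _ in replies)
# In-degree: how often each student is replied to
in_degree = Counter(dst for _, dst in replies)

# Students who receive no replies may be isolated in the discussion
participants = {p for pair in replies for p in pair}
isolated = participants - set(in_degree)

print(out_degree["alice"], in_degree["alice"], isolated)  # 2 3 {'dave'}
```

An instructor reading such counts might notice that one student attracts most replies while another receives none, a pattern a tool like SNAPP surfaces visually.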
June presentations: org_adoption_learning_analytics - Shane Dawson
Learning analytics (LA) has been touted as a game changer for education. The rapidly growing literature associated with the field promotes this fervour by citing the vast impact LA can and will have in the education space. From the detection of at-risk students to address retention and performance, to building self-regulated learning, to the development and identification of 21st-century literacies, to the realisation of personalised learning, there appears to be little that LA cannot contribute to within learning and teaching practice. However, if LA is such an impactful, desirable and worthy endeavour that can effectively improve learning, and our understanding of the learning process, why are there so few examples of institutional LA adoption?
Toward an automated student feedback system for text-based assignments - Pete... (Blackboard APAC)
As the use of blended learning environments and digital technologies becomes integrated into the higher education sector, rich technologies such as analytics can help teaching staff identify students at risk, learning material that is not proving effective, and learning site designs that aid and facilitate improved learning. More recently, consideration has been given to automated essay scoring. Such systems can be used formatively, such as providing feedback on initial assignment drafts, or summatively through the analysis of final assignment submissions. Further, providing students with quick formative feedback on written assignments opens the opportunity to improve learning outcomes.
This presentation details a current project developing a system to analyse text-based assignments. The project is being developed for broad application, but the findings focus on an undergraduate pilot subject: ‘Ideas that Shook the World’ (a compulsory first-year Bachelor of Arts subject taught on 5 campuses to more than 1000 students by 15 staff). Preliminary results of a first scan of assignments are presented, and the issues raised in developing the system are discussed together with an outline of additional work planned for the project. It is believed the work will have wide application where text-based assignments are used for assessment.
Education, data policy and practice - Kim Schildkamp (EduSkills OECD)
This presentation was given by Kim Schildkamp of the University of Twente, Netherlands at the GCES Conference on Education Governance: The Role of Data in Tallinn on 12 February during the session on Keynote: Education data, policy and practice.
Higher Education & Game Principles: Context, Theory & Application - Daniel La... (Blackboard APAC)
This presentation reports on the efficacy of a mobile learning intervention that combined ‘push notifications’ and game principles within a timed quiz app. An institutional interdisciplinary case study compared rates of student retention and academic performance with students’ usage of a purpose-designed learning app. Leading up to lectures, the app pushed daily quizzes to students’ personal mobile devices and rewarded them with feedback, points, badges and a position on a leaderboard. During this session, the findings of this study will be discussed and conclusions drawn about what they mean for future research into higher education learning enabled via mobile app technologies.
Using Learning Analytics to Assess Innovation & Improve Student Achievement - John Whitmer, Ed.D.
Presentation about Learning Analytics for a JISC network event; discussion of research findings and implications for individuals and institutions considering a Learning Analytics project. Also discusses implications for my work with Blackboard on "Platform Analytics."
Speakers:
David Lewis, senior analytics consultant, Jisc
Martin Lynch, learning systems manager, University of South Wales
An opportunity to find out how an institution has been implementing learning analytics to support the student journey, and to discuss issues and possibilities that the use of learning analytics may create.
Improving Student Achievement with New Approaches to Data - John Whitmer, Ed.D.
Presentation delivered at WASC ARC conference on April 11, 2013 on the CSU Data Dashboard and Chico State Learning Analytics case study.
Chico State Case Study: Academic technologies collect highly detailed student usage data. How can this data be used to understand and predict student performance, especially of at-risk students? This presentation will discuss research on a high-enrollment undergraduate course exploring the relationship between LMS activity, student background characteristics, current enrollment information, and student achievement.
CSU Data Dashboard: By monitoring on-track indicators, institutional leaders can better understand not only which milestones students are failing to reach, but why they are not reaching them. It can also help campuses to design interventions or policy changes to increase student success, and to gauge their impact.
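As a minimal sketch of what an on-track indicator check could look like (the milestone names and logic below are invented for illustration, not drawn from the CSU dashboard):

```python
# Hypothetical set of on-track milestones an institution might monitor
MILESTONES = {"orientation_done", "first_assignment_submitted", "week3_login"}

def missing_milestones(student_events):
    """Return the on-track indicators a student has not yet met, sorted."""
    return sorted(MILESTONES - set(student_events))

# A student who has only completed orientation is flagged on two indicators
print(missing_milestones({"orientation_done"}))
# ['first_assignment_submitted', 'week3_login']
```

A real dashboard would aggregate such flags across a cohort to show not only which milestones students miss, but where interventions are most needed.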
The Virtuous Loop of Learning Analytics & Academic Technology Innovation - John Whitmer, Ed.D.
Faculty and academic departments creating innovative educational practices are often starved for useful data and analysis to determine whether their innovations made a difference. Research has found that this data is a statistically significant predictor of success, far more powerful than traditional demographic or academic preparedness variables. This leads to a “virtuous loop” in which digital technology adoption enables assessment, which then improves educational practices using those technologies.
This presentation was delivered at the Online Learning Consortium Collaborate Event, November 19, 2015.
Presentation given at SCONUL 2014, the summer conference of The Society of College, National and University Libraries, Glasgow, June 2014. The presentation focuses on frequently asked questions (FAQs) about learning analytics, with the emphasis on the role and perspective of libraries in this area.
The popular media tell us that we live in an age of disengagement. 21st-century professors are told they need to design curriculum to support student success and create an engaging classroom, whether face-to-face, online, or in a blended learning environment. Creating engaging learning environments with technology will be essential to embrace 21st-century learners and their ever-evolving learning styles. Information Technology is dedicated to this philosophy and collaborates with other institutions and with our own faculty on varying technologies and learning concepts to generate innovation in technology and learning engagement in tandem. Information Technology invites the Stevens community to explore how educators can use tools such as apps, clickers, open education resources, mobile learning, and collaborative learning platforms from Google Hangouts to Massive Open Online Courses, and embrace the engagement strategies of social media.
What data from 3 million learners can tell us about effective course design - John Whitmer, Ed.D.
Presentation of research findings and implications from a large-scale analysis of LMS activity and grade data from across 927 institutions, 70,000 courses, and 3.3 million students. This webinar will speak to the promise (and potential pitfalls) of large-scale learning analytics research to promote student success.
A tale of two universities - organic growth of learning analytics through bes... - Danny Liu
ACODE70
Despite calls for actionable information, few learning analytics approaches nationally allow staff to easily ‘do’ anything with data. Coupled with the typically long development cycles of software tools, this has the potential to stall uptake of learning analytics by interested staff. This presentation will outline two approaches at the University of Sydney and Macquarie University where staff were closely involved in the coevolution and development of two bespoke learning analytics tools to personalise student-staff interactions at scale. This allowed the tools to meet pressing needs, and has led to substantial organic adoption and positive student outcomes. These highlight the importance of grassroots developments for building wider learning analytics capabilities.
Presentation of a paper at the ASCILITE Conference, discussing how we need to share the findings of failed research so we can learn from others' mistakes. The full paper may be found at: https://www.researchgate.net/publication/311108135_Failing_forward_in_research_around_technology_enhanced_learning
Using Learning Analytics to Create our 'Preferred Future' - John Whitmer, Ed.D.
One certainty about the future of higher education is that online technologies will play an increasingly central role in the creation and delivery of learning experiences, whether through mobile apps, MOOCs, open content, ePortfolios, or other resources. As adoption increases, the ‘digital exhaust’ recording technology use has increasing potential to help us understand student learning. The emergent field of Learning Analytics analyzes this data to provide actionable insights for students, for faculty, and for administrators. What have we learned in Learning Analytics to date? What challenges remain? How should we apply Learning Analytics to create our ‘preferred future’ that supports deep and meaningful learning?
Conducting Research on Blended and Online Education, Workshop - Tanya Joosten
Conducting Research on Blended and Online Education
October 14, 2015 - 8:30am
Lead Presenter: Tanya Joosten (University of Wisconsin - Milwaukee, USA)
Nori Barajas-Murphy (University of La Verne, USA)
Track: Learning Effectiveness
Pre-Conference Workshop
Location: Oceanic 7
Session Duration: 3 Hours
Pre-Conference Workshop Session 3
This workshop consists of practice-based research planning activities to help you prepare for conducting research at the course or program level. Specifically, we will utilize the distance education research model developed by the National Research Center for Distance Education and Technological Advancements (DETA) to guide the development of research plans for blended and online learning. Attendees will walk away with a research agenda and the necessary tools to help them conduct research on their campus as part of the National DETA Research Center initiative.
The University of Wisconsin-Milwaukee (UWM) established a National Distance Education and Technological Advancement (DETA) Research Center in 2014 to conduct cross-institutional data collection with 2-year and 4-year Institutions of Higher Education (IHEs), funded by the U.S. Department of Education Fund for Improvement of Postsecondary Education (FIPSE). UWM has partnered with the University of Wisconsin System, UW-Extension, Milwaukee Area Technical College (MATC), EDUCAUSE Learning Initiative (ELI), and leaders across the nation to develop a research model. The model aims to promote student access and success through evidence-based online learning practices and learning technologies.
The DETA Center looks to identify and evaluate effective course and institutional practices in online learning (including competency-based education) for underrepresented individuals (e.g., economically disadvantaged students, adult learners, and students with disabilities) through rigorous research. Furthermore, although the research is currently focused on postsecondary U.S. institutions, the DETA Center looks to advance its work in K-12 and internationally -- all are welcome!
This workshop will prepare attendees to take a plan back to their own institution to successfully gather research on blended and online teaching and learning.
For more on DETA, visit http://www.uwm.edu/deta.
Sdal air education workforce analytics workshop jan. 7, 2014.pptx - kimlyman
The American Institutes for Research (AIR) and Virginia Tech are collaborating to explore and develop new approaches to combining, manipulating and understanding big data. The two are also looking at how big data analytics can help answer questions critical to solving issues in education, workforce, health, and human and social development. They held two workshops on January 7 and 27, 2014: the first on Education and Workforce Analytics and the second on Health and Social Development Analytics.
Conducting Research on Blended and Online Education: A Research Toolkit - Tanya Joosten
An ELI Short Course delivered on May 16th, 2016.
This session consists of practice-based research planning activities to help participants prepare for conducting research at the course or program level. Specifically, we will utilize the distance education research toolkit developed by the National Research Center for Distance Education and Technological Advancements (DETA) to guide the development of research plans for blended and online learning. Attendees will walk away with a research agenda and the necessary tools to help them conduct research on their campus as part of the National DETA Research Center initiative. The DETA Center seeks to identify and evaluate effective course and institutional practices in online learning (including competency-based education) for underrepresented learners.
Objectives:
After participating in this webinar, participants will be able to:
- Develop research questions
- Clarify variables and measures
- Identify data gathering techniques
- Consider other actionable milestones necessary to conduct rigorous research
http://www.educause.edu/events/eli-webinar-conducting-research-blended-and-online-education
Co-developing bespoke, enterprise-scale analytics systems with teaching staff - Danny Liu
Presentation at the NSW Learning Analytics Working Group meeting, 3 February 2016, at the University of Technology, Sydney. Covering projects from Macquarie University and the University of Sydney.
Learning Analytics: Seeking new insights from educational data - Andrew Deacon
CPUT Fundani TWT - 22 May 2014
Analytics is a buzzword encompassing the analysis and visualisation of big data. Current interest stems from growing access to data in Higher Education, through platforms such as Learning Management Systems, and from the many software tools now available to analyse this data. This seminar provides an overview of current applications and uses of learning analytics and how it can help institutions of learning better support their learners. The illustrative examples look at institutional and social media data that together provide rich insights into institutional, teaching and learning issues. A few simple ways to perform such analytics in a Higher Education context will be introduced.
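As a toy illustration of the kind of simple LMS analysis such a seminar might introduce, relating activity counts to achievement (the records below are entirely fabricated):

```python
import statistics

# Fabricated per-student records: (LMS login count, final grade)
records = [(5, 52.0), (12, 61.0), (20, 74.0), (33, 80.0), (41, 88.0)]
logins = [x for x, _ in records]
grades = [y for _, y in records]

# Pearson correlation between activity and achievement, computed by hand
mx, my = statistics.fmean(logins), statistics.fmean(grades)
cov = sum((x - mx) * (y - my) for x, y in zip(logins, grades))
sx = sum((x - mx) ** 2 for x in logins) ** 0.5
sy = sum((y - my) ** 2 for y in grades) ** 0.5
r = cov / (sx * sy)

print(round(my, 1), r > 0.9)  # 71.0 True
```

A strong positive correlation in real data would motivate a closer look, though activity counts alone say little about why students succeed.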
Slides from Keynote presentation at the University of Southern California's 2015 Teaching with Technology annual conference.
"9:15 am – ANN Auditorium
Keynote: What Do We Mean by Learning Analytics?
Leah Macfadyen, Director for Evaluation and Learning Analytics, University of British Columbia
Executive Board, SoLAR (Society for Learning Analytics Research)
Leah Macfadyen will define and explore the emerging and interdisciplinary field of learning analytics in the context of quantified and personalized learning. Leah will use actual examples and case studies to illustrate the range of stakeholders learning analytics may serve, the diverse array of questions they may be used to address, and the potential impact of learning analytics in higher education."
UCISA Learning Analytics Pre-Conference Workshop - Mike Moore
UCISA Learning Analytics Pre-Conference Workshop
Mike Moore - Sr. Advisory Consultant - Analytics
Desire2Learn, Inc.
UCISA Conference 2014, Brighton, UK
Presented Mar 26, 2014
Learning analytics and Moodle: So much we could measure, but what do we want to measure? A presentation to the USQ Math and Sciences Community of Practice, May 2013.
EMMA Summer School - Rebecca Ferguson - Learning design and learning analytic... - EUmoocs
This hands-on workshop will work with learning design tools and with massive open online courses (MOOCs) on the FutureLearn platform to explore how learning design can be used to influence the choice and design of learning analytics. This workshop will be of interest to people who are involved in the design or presentation of online courses, and to those who want to find out more about learning design, learning analytics or MOOCs. Participants will find it helpful to have registered for FutureLearn and explored the platform for a short time in advance of the workshop.
This presentation was given during the EMMA Summer School, which took place in Ischia (Italy) on 4-11 July 2015.
More info on the website: http://project.europeanmoocs.eu/project/get-involved/summer-school/
Follow our MOOCs: http://platform.europeanmoocs.eu/MOOCs
Design and deliver your MOOC with EMMA: http://project.europeanmoocs.eu/project/get-involved/become-an-emma-mooc-provider/
In May 2018 I ran an e-Assessment workshop for members of the Griffith University Assessment Committee.
Topics included:
- What do we already understand about digital assessment?
- What are our current pain-points?
- Identifying where these sit on our assessment lifecycle
- Emerging tools and techniques, such as:
  - Contract cheating, and some ways to address it
  - Digital exams and proctoring: some tools now available
  - Conditional assessments and marking tools
  - What's possible in Office 365 + BB
  - Use of voice in assessment
Introduction to AI for Nonprofits with Tapp Network - TechSoup
Dive into the world of AI! Experts Jon Hill and Tareq Monaur will guide you through AI's role in enhancing nonprofit websites and basic marketing strategies, making it easy to understand and apply.
Francesca Gottschalk - How can education support child empowerment.pptx - EduSkills OECD
Francesca Gottschalk from the OECD’s Centre for Educational Research and Innovation presents at the Ask an Expert Webinar: How can education support child empowerment?
2024.06.01 Introducing a competency framework for language learning materials ... - Sandy Millin
http://sandymillin.wordpress.com/iateflwebinar2024
Published classroom materials form the basis of syllabuses, drive teacher professional development, and have a potentially huge influence on learners, teachers and education systems. All teachers also create their own materials, whether a few sentences on a blackboard, a highly-structured fully-realised online course, or anything in between. Despite this, the knowledge and skills needed to create effective language learning materials are rarely part of teacher training, and are mostly learnt by trial and error.
Knowledge and skills frameworks, generally called competency frameworks, for ELT teachers, trainers and managers have existed for a few years now. However, until I created one for my MA dissertation, there wasn’t one drawing together what we need to know and do to be able to effectively produce language learning materials.
This webinar will introduce you to my framework, highlighting the key competencies I identified from my research. It will also show how anybody involved in language teaching (any language, not just English!), teacher training, managing schools or developing language learning materials can benefit from using the framework.
How to Make a Field invisible in Odoo 17Celine George
It is possible to hide or invisible some fields in odoo. Commonly using “invisible” attribute in the field definition to invisible the fields. This slide will show how to make a field invisible in odoo 17.
Palestine last event orientationfvgnh .pptxRaedMohamed3
An EFL lesson about the current events in Palestine. It is intended to be for intermediate students who wish to increase their listening skills through a short lesson in power point.
Instructions for Submissions thorugh G- Classroom.pptxJheel Barad
This presentation provides a briefing on how to upload submissions and documents in Google Classroom. It was prepared as part of an orientation for new Sainik School in-service teacher trainees. As a training officer, my goal is to ensure that you are comfortable and proficient with this essential tool for managing assignments and fostering student engagement.
The Roman Empire A Historical Colossus.pdfkaushalkr1407
The Roman Empire, a vast and enduring power, stands as one of history's most remarkable civilizations, leaving an indelible imprint on the world. It emerged from the Roman Republic, transitioning into an imperial powerhouse under the leadership of Augustus Caesar in 27 BCE. This transformation marked the beginning of an era defined by unprecedented territorial expansion, architectural marvels, and profound cultural influence.
The empire's roots lie in the city of Rome, founded, according to legend, by Romulus in 753 BCE. Over centuries, Rome evolved from a small settlement to a formidable republic, characterized by a complex political system with elected officials and checks on power. However, internal strife, class conflicts, and military ambitions paved the way for the end of the Republic. Julius Caesar’s dictatorship and subsequent assassination in 44 BCE created a power vacuum, leading to a civil war. Octavian, later Augustus, emerged victorious, heralding the Roman Empire’s birth.
Under Augustus, the empire experienced the Pax Romana, a 200-year period of relative peace and stability. Augustus reformed the military, established efficient administrative systems, and initiated grand construction projects. The empire's borders expanded, encompassing territories from Britain to Egypt and from Spain to the Euphrates. Roman legions, renowned for their discipline and engineering prowess, secured and maintained these vast territories, building roads, fortifications, and cities that facilitated control and integration.
The Roman Empire’s society was hierarchical, with a rigid class system. At the top were the patricians, wealthy elites who held significant political power. Below them were the plebeians, free citizens with limited political influence, and the vast numbers of slaves who formed the backbone of the economy. The family unit was central, governed by the paterfamilias, the male head who held absolute authority.
Culturally, the Romans were eclectic, absorbing and adapting elements from the civilizations they encountered, particularly the Greeks. Roman art, literature, and philosophy reflected this synthesis, creating a rich cultural tapestry. Latin, the Roman language, became the lingua franca of the Western world, influencing numerous modern languages.
Roman architecture and engineering achievements were monumental. They perfected the arch, vault, and dome, constructing enduring structures like the Colosseum, Pantheon, and aqueducts. These engineering marvels not only showcased Roman ingenuity but also served practical purposes, from public entertainment to water supply.
Biological screening of herbal drugs: Introduction and Need for
Phyto-Pharmacological Screening, New Strategies for evaluating
Natural Products, In vitro evaluation techniques for Antioxidants, Antimicrobial and Anticancer drugs. In vivo evaluation techniques
for Anti-inflammatory, Antiulcer, Anticancer, Wound healing, Antidiabetic, Hepatoprotective, Cardio protective, Diuretics and
Antifertility, Toxicity studies as per OECD guidelines
2. What about today?
• Current state of play
• What analytics are in place?
• What questions and data?
• Patterns of data – importance of context
• Analysis tools - SNA
• Curriculum analytics
• Privacy/ ethics
• Questions, concerns or issues
3. Where are LA?
Technology trigger → Peak of inflated expectations → Trough of disillusionment → Slope of enlightenment → Plateau of productivity
(the phases of the Gartner hype cycle)
5. …is the collection, collation, analysis and
reporting of data about learners and their
contexts, for the purposes of understanding
and optimizing learning
Learning Analytics
6. Ed theory, ed practice, SNA, data mining, machine learning, the semantic web, data visualisation, sense-making, psychology (social, cognitive, organisational), learning sciences
Learning Analytics
7. Creatures of habit (study, communication and search patterns, networks, credit-card security, movies)
What do patterns indicate, and what do changes in habit indicate?
Learning Analytics
8. • 5 billion mobile phones in use (2010)
• 30 billion pieces of content shared on Facebook every month
• $600 buys a disk drive that can store all of the world's music
• 60% potential increase in operating margins for retailers using big data
It's accessible, cheap and critical
Big data
Manyika, J., et al. (2011). Big Data: The Next Frontier for Innovation, Competition, and Productivity:
McKinsey Global Institute
9. Examples
Develop a predictive algorithm to identify who will be
admitted to a hospital using historical claims data
Kaggle: connect with
data scientists
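The Kaggle task above (predict who will be admitted to a hospital from historical claims) can be sketched in miniature. Everything below — features, thresholds, synthetic labels — is invented for illustration; a real entry would train on the competition's actual claims records:

```python
import math
import random

# Hypothetical toy records: (prior_claims, chronic_conditions, age_band),
# with label 1 if the member was admitted the following year.
random.seed(0)

def make_record():
    prior = random.randint(0, 10)
    chronic = random.randint(0, 3)
    age = random.randint(0, 4)
    # Relationship used only to generate labels (unknown to the model)
    p = 1 / (1 + math.exp(-(0.4 * prior + 0.8 * chronic + 0.3 * age - 4)))
    return [prior, chronic, age], 1 if random.random() < p else 0

train = [make_record() for _ in range(500)]

# Logistic regression fitted by plain gradient descent
weights = [0.0, 0.0, 0.0]
bias = 0.0
lr = 0.01
for _ in range(300):
    for x, y in train:
        z = sum(w * v for w, v in zip(weights, x)) + bias
        pred = 1 / (1 + math.exp(-z))
        err = pred - y
        weights = [w - lr * err * v for w, v in zip(weights, x)]
        bias -= lr * err

def admission_risk(x):
    """Predicted probability of admission for a feature vector."""
    z = sum(w * v for w, v in zip(weights, x)) + bias
    return 1 / (1 + math.exp(-z))

# A member with many prior claims should score higher than one with none
print(admission_risk([9, 3, 4]) > admission_risk([0, 0, 0]))
```

The same pattern — train on historical outcomes, score new cases — underlies the student-at-risk analytics discussed later in the deck.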
12. “Data is the new oil”
Higher education:
• Lots of isolated work targeting attrition; few large enterprise examples
• Commercial offerings – IBM, D2L S3, Blackboard Analytics
Why?
Where is LA?
13. • “High potential but low mindset”
• Target rapid returns – students at risk
• Predictive Analytics Research Framework
• What data? Let's define terms
Potential is there
Manyika, J., et al. (2011). Big Data: The Next Frontier for Innovation, Competition, and Productivity:
McKinsey Global Institute
19. What questions?
What questions are learning analytics attempting
to address?
What analytics work is being undertaken at your
institution?
How far has this progressed?
21. Rapidly moving beyond simple reports, attrition measures and student learning support
to predictive, adaptive and recommender states
• Akin to iTunes Genius, Amazon, Gmail
Future
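The recommender state named above can be illustrated with a minimal neighbour-based recommender. The student names and resource-usage sets are invented; systems like the ones cited do this at vastly greater scale:

```python
import math

# Hypothetical student-resource interactions: which resources each student used.
usage = {
    "alice": {"quiz1", "video1", "forum"},
    "bob":   {"quiz1", "video1", "reading2"},
    "cara":  {"video1", "reading2"},
    "dan":   {"quiz1", "forum"},
}

def cosine(a, b):
    """Cosine similarity between two binary usage sets."""
    return len(a & b) / math.sqrt(len(a) * len(b)) if a and b else 0.0

def recommend(student):
    """Suggest resources used by similar students but not yet by this one."""
    mine = usage[student]
    scores = {}
    for other, theirs in usage.items():
        if other == student:
            continue
        sim = cosine(mine, theirs)
        for item in theirs - mine:
            scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice"))  # → ['reading2']
```

Swapping resources for songs or products gives the Genius/Amazon analogy directly; swapping them for learning activities gives the adaptive-curriculum case.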
23. Example Knewton:
2012 - 500,000 students
2013 - 5 million students
2014 - 15 million students
1 million points of data per student.
Curriculum and activities modified based on the
individual student.
Future
25. In Australia:
• Largely focused on retention and early
intervention
Examples:
• UniSA – ESAP
• QUT – student success and retention
• CQU – indicators project
• UTS – data intensive university
Potential is there
41. What are the predictors of failure and
retention?
• Prior grades
• Low SE
• First in family
• Study load
• Engagement online (time of login/
discussion activity)
Risk assessment
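A minimal sketch of turning the predictors above into risk flags. The field names and cut-off values are invented for illustration, not an institutional model; real thresholds would be derived from the kinds of predictive analyses described earlier:

```python
# Hypothetical thresholds; real cut-offs would come from institutional data.
def risk_flags(student):
    """Return the list of risk indicators triggered by a student record."""
    flags = []
    if student.get("prior_gpa", 7.0) < 4.0:
        flags.append("low prior grades")
    if student.get("first_in_family"):
        flags.append("first in family")
    if student.get("study_load_units", 4) > 4:
        flags.append("heavy study load")
    if student.get("logins_last_fortnight", 0) < 2:
        flags.append("low online engagement")
    if student.get("forum_posts", 0) == 0:
        flags.append("no discussion activity")
    return flags

student = {"prior_gpa": 3.5, "first_in_family": True,
           "logins_last_fortnight": 1, "forum_posts": 0}
print(risk_flags(student))
```

In practice these indicators would feed a weighted model rather than simple rules, but the rule form makes the individual predictors easy to audit.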
42. What patterns do you expect?
LMS activity
Class interaction
Assessment
Qualitative
Survey
Networks
What patterns?
44. Analytic techniques: SNA
The peer group is the “single most potent source of influence”
Astin, A. (1993). What matters in college: Four critical years revisited.
San Francisco: Jossey-Bass.
Student Networks
47. SNAPP
• Social Networks Adapting Pedagogical Practice
• Focus on student relationships (learning
networks)
• Simple visualizations to assist with
interpretation and evaluate impact of activities
• Lightweight analytics tool
• Bookmarklet
• Rapid and easy dissemination
Bakharia, A., & Dawson, S. (2011). SNAPP: a bird's-eye view of temporal participant interaction.
Paper presented at the 1st International Conference on Learning Analytics and Knowledge, Banff,
Alberta, Canada
50. SNAPP
• No need to access database
• No need for Admin rights (installation of a
bookmark)
• Broad accessibility and compatibility
• Fast delivery mechanisms – focus on simplicity
51. Measuring interaction: Forum A vs Forum B
14 messages posted by 4 participants
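The point of comparing two forums with identical headline counts — the same number of messages can hide very different interaction structures — can be sketched with two invented reply logs (all names and edges are illustrative, not SNAPP output):

```python
from collections import Counter

# Hypothetical reply edges (who_replied, who_they_replied_to).
# Both forums: 14 messages from 4 participants, very different patterns.
forum_a = [("s1", "t"), ("s2", "t"), ("s3", "t"), ("s1", "t"), ("s2", "t"),
           ("s3", "t"), ("s1", "t"), ("s2", "t"), ("s3", "t"), ("s1", "t"),
           ("s2", "t"), ("s3", "t"), ("s1", "t"), ("s2", "t")]   # instructor-centred
forum_b = [("s1", "s2"), ("s2", "s3"), ("s3", "t"), ("t", "s1"), ("s2", "s1"),
           ("s1", "s3"), ("s3", "s2"), ("t", "s3"), ("s2", "t"), ("s1", "t"),
           ("s3", "s1"), ("t", "s2"), ("s2", "s1"), ("s1", "s2")]  # distributed

def in_degree_share(edges):
    """Fraction of all replies received by the most-replied-to participant."""
    counts = Counter(dst for _, dst in edges)
    return max(counts.values()) / len(edges)

print(in_degree_share(forum_a))  # → 1.0: every reply goes to the instructor
print(in_degree_share(forum_b))  # much lower: replies spread across peers
```

Simple network measures like this are what SNAPP-style visualisations let staff read at a glance: message counts alone cannot distinguish a hub-and-spoke forum from a genuinely interactive one.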
56. Other tools
Jigsaw: visual analytics for documents
http://www.cc.gatech.edu/gvu/ii/jigsaw/
Netlytic: text analysis & SNA for Twitter, YouTube, blogs, etc.
Gephi.org
64. Teaching Presence
• Staff intervention
• High – 70% of networks
• Low – 10% of networks
• Why?
• The pursuit of community
Dawson, S. (2006). Online forum discussion interactions as an indicator of student community.
Australasian Journal of Educational Technology, 22(4), 495-510.
Dawson, S. (2010). 'Seeing' the learning community: An exploration of the development of a
resource for monitoring online student networking. British Journal of Educational Technology,
41(5), 736–752.
65. Curriculum analytics
Need context in order to move from predictions to recommender systems
Lecture, seminar, group work, community, online, hybrid, blended
What teaching model?
What outcomes?
71. Curriculum analytics
What outcomes, what experiences?
• Assessment
• Learning outcomes
• Learning experiences
• Graduate attributes
• Automated portfolio / learning relationship
72. Privacy and ethics
• Who “owns” data?
• Are analytics an invasion of privacy?
• If we can identify students at risk is there an
obligation to intervene?