The document discusses designing learning analytics tools with a human-centered approach by involving intended users. It notes that past learning analytics have focused more on technical systems than human ones. Only a small percentage of tools reported user needs analysis or usability testing. This can result in tools being misaligned with user needs and perceptions, undermining trust. The presentation describes NYU's learning analytics work which aims to build partnerships putting people first. It discusses initial design processes, fieldwork examining instructor analytics use, and implications for tool redesign and implementation supports to better facilitate pedagogical decision-making.
Scalable, Actionable, and Ethical Learning Dashboards: a reality check – Tinne De Laet
Keynote presentation at Edmedia 2018 conference: https://www.aace.org/conf/edmedia/speakers/.
Results of Erasmus+ projects ABLE (www.ableproject.eu) and STELA (www.stela-project.eu) on learning dashboards for supporting first-year students.
Listening to Teachers’ Needs: Human-centred Design for Mobile Technologies in... – Renée Schulz
This is the presentation for my PhD defense given on the 21st March 2018. The full dissertation should be available in AURA soon (University of Agder/ Universitetet i Agder), Norway.
Speakers:
David Lewis, senior analytics consultant, Jisc
Martin Lynch, learning systems manager, University of South Wales
An opportunity to find out how an institution has been implementing learning analytics to support the student journey, and to discuss the issues and possibilities that the use of learning analytics may create.
Course revision is a reality of daily life in higher education. Each semester, faculty review their courses to ensure that they are presenting current concepts and providing proper methods of assessment and interaction for their students. Unfortunately, most review and revision is done during periods of frantic activity just before or during the beginning of the semester. This methodology does not allow for deep consideration of issues and can negatively affect learning for students.
Focused revision is a methodology of review that tasks faculty to review a course over a longer period of time and focus on one pedagogical aspect, such as interaction, content presentation, rubric development, etc. Focusing on a specific aspect of a course, to the exclusion of others, increases the efficacy of that aspect of the course while maintaining the current level of quality on the other aspects. This methodology also changes course revision from a summative process to a formative process and allows for the effective inclusion of student feedback into course design. The process also allows faculty to create efficiencies in their process to maximize time and minimize work. Multiple focused revisions may build on each other to create a synergy between course components, thus creating a more effective learning environment in both the physical and the digital classrooms, leading to increased student engagement and learning.
ASCILITE Webinar: A review of five years of implementation and research in al... – Bart Rienties
Date and time: Wednesday 20 September 2017 at 5pm AEST
Abstract: The Open University UK (OU) has been one of the few institutions that have explicitly and systematically captured designs for learning at a large scale. By applying advanced analytical techniques to large and fine-grained datasets, we have been unpacking the complexity of instructional practices, as well as providing empirical evidence of how learning designs influence student behaviour, satisfaction, and performance. This seminar will discuss the implementation of learning design at the OU over the last five years, and review empirical evidence from several studies that have linked learning design with learning analytics. Recommendations are put forward to support future adoption of the learning design approach, and potential research trajectories.
https://ascilite.org/get-involved/sigs/learning-analytics-sig/
www.bartrienties.nl
A Development of Students’ Worksheet Based on Contextual Teaching and Learning – IOSRJM
This research aims to develop a students’ worksheet and to determine its quality in terms of validity and practicality, based on assessments by subject-matter experts, design experts, and media specialists, and on an individual student trial, a small-group student trial, and a field trial. The study adapts the ADDIE development model, which consists of five stages: 1) Analysis, 2) Design, 3) Development, 4) Implementation, and 5) Evaluation. For the mathematics worksheet on algebraic factorization based on Contextual Teaching and Learning, the expert assessments were: 1) the subject-matter experts gave a mean of 3.81 ("Good"), or 76.2%, in the category "Very Decent"; 2) the design experts gave a mean of 3.62 ("Good"), or 72.4%, in the category "Decent"; 3) the media experts gave a mean of 4.43 ("Good"), or 88.6%, in the category "Very Decent". The student assessment was carried out in three stages: 1) the individual trial yielded a mean of 4.75 ("Very Good"), or 95%, in the category "Very Decent"; 2) the small-group trial yielded a mean of 4.58 ("Very Good"), or 91.6%, in the category "Very Decent"; 3) the field trial yielded a mean of 4.43 ("Very Good"), or 88.6%, in the category "Very Decent".
Thus, the mathematics worksheet on algebraic factorization based on Contextual Teaching and Learning (CTL) is declared valid and practical, and can be used as learning material for mathematics on algebraic factorization.
The power of learning analytics to measure learning gains: an OU, Surrey and ... – Bart Rienties
Learning gain has become increasingly prominent in the HE literature, has gained traction in UK government policies, and is at the heart of the Teaching Excellence Framework (TEF). This raises the question of the extent to which the teaching and learning environment can actually predict students’ learning gains using principles of learning analytics. In this presentation, which is joint work with the University of Surrey and Oxford Brookes, I will focus on preliminary findings from developing and testing an Affective-Behaviour-Cognition learning gains model using a longitudinal approach. The main aim of the research is to examine whether learning gains occur on all three levels of the Affective-Behaviour-Cognition model, and whether particular student or course characteristics can predict learning gains, or a lack of learning and dropout. For more info, see https://abclearninggains.com/
22 January 2018 HEFCE open event “Using data to increase learning gains and t... – Bart Rienties
With the Teaching Excellence Framework being implemented across England, a lot of higher education institutions have started to ask questions about what it means to be “excellent” in teaching. In particular, with the rich and complex data that all educational institutions gather that could potentially capture learning gains, what do we actually know about our students’ learning journeys? What kinds of data could be used to infer whether our students are actually making affective (e.g., motivation), behavioural (e.g., engagement), and/or cognitive learning gains? Please join us on 22 January 2018 in lovely Milton Keynes at a free OU- and HEFCE-supported event on Using data to increase learning gains and teaching excellence.
14.00-15.00 Measuring learning gains with (psychometric) questionnaires
Dr Sonia Ilie, Prof Jan Vermunt, Prof Anna Vignoles (University of Cambridge, UK): Learning gain: from concept to measurement
Dr Fabio Arico (University of East Anglia): Learning Gain and Confidence Gain Through Peer-instruction: the role of pedagogical design
Dr Paul Mcdermott & Dr Robert Jenkins (University of East Anglia): A Methodology that Makes Self-Assessment an Implicit Part of the Answering Process
15.00-15.45 Measuring employability learning gains
Dr Heike Behle (University of Warwick): Measuring employability gain in Higher Education. A case study using R2 Strengths
Fiona Cobb, Dr Bob Gilworth, David Winter (University of London): Careers Registration Learning Gain project
2019 Midwest Scholarship of Teaching & Learning (SoTL) conference presentation. The goal of this presentation is to share our data-informed approach to re-engineering the exam design, delivery, grading, and item analysis process in order to construct better exams that maximize all students’ potential to flourish. Can we make the use of exam analytics so easy and time-efficient that faculty clearly see the benefit? For more info see our blog at https://kaneb.nd.edu/real/
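The item analysis mentioned in the abstract typically covers item difficulty and discrimination. The following is a minimal sketch using synthetic data (the presenters' actual tooling is not described here): difficulty is the proportion of students answering an item correctly, and discrimination compares correct-answer rates between the top and bottom halves of scorers.

```python
# Hypothetical sketch of basic exam item analysis on a small
# synthetic response matrix: rows = students, cols = items, 1 = correct.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
]

n_students = len(responses)
n_items = len(responses[0])
totals = [sum(row) for row in responses]

# Item difficulty: fraction of students answering the item correctly.
difficulty = [sum(row[j] for row in responses) / n_students
              for j in range(n_items)]

# Discrimination index: correct rate in the top half of scorers
# minus the correct rate in the bottom half (ties broken arbitrarily).
order = sorted(range(n_students), key=lambda i: totals[i], reverse=True)
half = n_students // 2
top, bottom = order[:half], order[half:]
discrimination = [
    sum(responses[i][j] for i in top) / len(top)
    - sum(responses[i][j] for i in bottom) / len(bottom)
    for j in range(n_items)
]

print([round(d, 2) for d in difficulty])
print([round(d, 2) for d in discrimination])
```

A positive discrimination index suggests the item separates stronger from weaker students; an item everyone gets right (or wrong) discriminates poorly regardless of its difficulty.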
The impact of gamification technology on students’ performance and motivation – Sherin El-Rashied
This study examines the impact of gamification technology on students’ performance and motivation in schools. Gamification is considered a new trend, as many people are unaware of the word “gamification” and of its relation to learning.
Data Driven College Counseling by SchooLinks – Katie Fang
This workshop will introduce school counselors and administrators to a framework for data-driven college planning and accountability. Attendees will learn about data collection, pattern analysis, and translating insight into intervention to best support students in their college planning process. No special statistical knowledge is required for this session, just enthusiasm for understanding how using data unlocks better student outcomes.
The Power of Learning Analytics: Is There Still a Need for Educational Research? – Bart Rienties
Across the globe many institutions and organisations have high hopes that learning analytics can play a major role in helping their organisations remain fit-for-purpose, flexible, and innovative. A broad goal of learning analytics is to apply the outcomes of analysing data gathered by monitoring and measuring the learning process. Learning analytics applications in education are expected to provide institutions with opportunities to support learner progression, but more importantly provide personalised, rich learning on a large scale. Substantial progress in learning analytics research has been made in the last few years.
Researchers in learning analytics use a range of advanced computational techniques (e.g., Bayesian modelling, cluster analysis, natural language processing, machine learning) for predicting which learners are likely to fail or succeed, and how to provide appropriate support in a flexible and adaptive manner.
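As a minimal sketch of the kind of predictive modelling described (synthetic data, hypothetical feature choices; not any specific institution's model), a tiny logistic regression can relate engagement features to pass/fail outcomes:

```python
import math

# Hypothetical example: predicting pass/fail from two synthetic
# engagement features (VLE clicks scaled by 10, forum posts) with a
# logistic regression trained by stochastic gradient descent.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.1, epochs=2000):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log-loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict_proba(w, b, x):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

# Synthetic cohort: [clicks/10, forum posts]; label 1 = passed.
X = [[1.0, 0.0], [2.0, 1.0], [8.0, 5.0], [9.0, 4.0], [1.5, 0.5], [7.5, 6.0]]
y = [0, 0, 1, 1, 0, 1]

w, b = train(X, y)
at_risk = predict_proba(w, b, [1.2, 0.0])   # low engagement
on_track = predict_proba(w, b, [8.5, 5.0])  # high engagement
print(round(at_risk, 2), round(on_track, 2))
```

In practice, researchers use far richer feature sets and the more advanced techniques named above; the point of the sketch is only the shape of the pipeline: engagement data in, a calibrated risk probability out, which can then trigger flexible, adaptive support.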
In this keynote, I will argue that unless educational researchers at EARLI embrace some of the key principles, methods, and approaches of learning analytics, educational researchers may be left behind. In particular, a main merit of learning analytics is linking large datasets of actual learning processes and outcomes with learning dispositions and learner characteristics. Using evidence-based approaches, rapid insights are developed into how learning designs and learning processes can be optimised to maximise the potential of each learner. For example, our recent research with 151 modules and 133K students at the Open University UK indicates that learning design has a strong impact on student behaviour, satisfaction, and performance. Learning analytics can also drive learning in more “traditional”, face-to-face contexts. For example, by measuring emotions, epistemological expressions, and cross-cultural dialogue, social interactions can be effectively supported by innovative dashboards and adaptive approaches. I aim to unpack the advantages and limitations of learning analytics, and how EARLI researchers can embrace such data-driven research approaches.
More info at www.bartrienties.nl
Slides from Keynote presentation at the University of Southern California's 2015 Teaching with Technology annual conference.
"9:15 am – ANN Auditorium
Keynote: What Do We Mean by Learning Analytics?
Leah Macfadyen, Director for Evaluation and Learning Analytics, University of British Columbia
Executive Board, SoLAR (Society for Learning Analytics Research)
Leah Macfadyen will define and explore the emerging and interdisciplinary field of learning analytics in the context of quantified and personalized learning. Leah will use actual examples and case studies to illustrate the range of stakeholders learning analytics may serve, the diverse array of questions they may be used to address, and the potential impact of learning analytics in higher education."
A seminar drawn from two projects that explored a range of assessment practices, and examined how they are implemented by establishing and comparing attitudes to assessment amongst tutors and students within three ODL environments: University of London International Programmes, King’s College London (ODL programmes) and the Open University.
EMMA Summer School - Rebecca Ferguson - Learning design and learning analytic... – EUmoocs
This hands-on workshop will work with learning design tools and with massive open online courses (MOOCs) on the FutureLearn platform to explore how learning design can be used to influence the choice and design of learning analytics. This workshop will be of interest to people who are involved in the design or presentation of online courses, and to those who want to find out more about learning design, learning analytics or MOOCs. Participants will find it helpful to have registered for FutureLearn and explored the platform for a short time in advance of the workshop.
This presentation was given during the EMMA Summer School, that took place in Ischia (Italy) on 4-11 July 2015.
More info on the website: http://project.europeanmoocs.eu/project/get-involved/summer-school/
Follow our MOOCs: http://platform.europeanmoocs.eu/MOOCs
Design and deliver your MOOC with EMMA: http://project.europeanmoocs.eu/project/get-involved/become-an-emma-mooc-provider/
Learning Analytics and the Scholarship of Teaching and Learning - an obvious ... – Blackboard APAC
The scholarship of teaching and learning (SoTL) essentially advocates for a research approach to be applied to the improvement of learning and teaching. It encourages teachers to reflect in a scholarly way on their teaching practice and at the more advanced level to undertake research on teaching practice and curriculum. Learning analytics has the potential to provide data on elements of the teaching process which have to date been difficult to measure particularly for the broader cohort of teachers.
This presentation will draw attention to the connection between SoTL and learning analytics and prompt participants to think about how learning analytics can be used in a wider context to contribute to changes in teaching design and practice.
ABLE - the NTU Student Dashboard - University of Derby – Ed Foster
Implementing a university-wide learning analytics system.
Presentation Overview:
- Introduction
- Developing the NTU Student Dashboard
- Transitioning from pilot phase to whole institution roll-out
- Embedding the resource into working practices
- Future development
Learning Analytics for online and on-campus education: experience and research – Tinne De Laet
This presentation was given by Tinne De Laet, KU Leuven, as a keynote during the event http://www.educationandlearning.nl/agenda/2017-10-13-cel-innovation-room-10-learning-and-academic-analytics organised by Leiden University, Erasmus University Rotterdam, and Delft University of Technology.
The presentation presents the results of two case studies from the Erasmus+ projects ABLE and STELA, and provides nine recommendations regarding learning analytics.
Collaborative, Program-wide Alignment of Assessments and ePortfolios to Build... – ePortfolios Australia
During their course of study, medical science students are generally unaware that they are developing professional skills related to graduate capabilities. Interestingly, at a program level the institution finds it difficult to view the development of these capabilities. In this session we will discuss our own learning journey as discipline specific teachers who have worked collaboratively to implement ePortfolios and rubrics across courses and within the medical science degree program at UNSW Australia. Our approach to supporting student learning and development of reflective practice and professional skills in teamwork by cross-discipline alignment of assessment coupled with ePortfolio thinking and doing will be presented.
Evidence Based Decision Making in the Classroom Panel – alywise
Slides from Daltai-supported seminar to explore the role of continuous professional development in supporting staff & student engagement with learning analytics.
Data Archeology - A theory- and context-informed approach to analyzing data t... – alywise
Theoretical overview and two examples of Data Archeology - a need to deeply understand context and engage in ground-truthing when analyzing large sets of digital data.
Francesca Gottschalk - How can education support child empowerment.pptx – EduSkills OECD
Francesca Gottschalk from the OECD’s Centre for Educational Research and Innovation presents at the Ask an Expert Webinar: How can education support child empowerment?
Honest Reviews of Tim Han LMA Course Program.pptx – timhan337
Personal development courses are widely available today, each promising life-changing outcomes. Tim Han’s Life Mastery Achievers (LMA) Course has drawn a lot of interest. In addition to offering my frank assessment of Success Insider’s LMA Course, this piece examines the course’s effects through a variety of Tim Han LMA course reviews and Success Insider comments.
Read: The latest issue of The Challenger is here! We are thrilled to announce that our school paper has qualified for the National Schools Press Conference (NSPC) 2024. Thank you for your unwavering support and trust. Dive into the stories that made us stand out!
Synthetic Fiber Construction in lab.pptx – Pavel (NSTU)
Synthetic fiber production is a fascinating and complex field that blends chemistry, engineering, and environmental science. By understanding these aspects, students can gain a comprehensive view of synthetic fiber production, its impact on society and the environment, and the potential for future innovations. Synthetic fibers play a crucial role in modern society, affecting many aspects of daily life and industry, and offering a range of benefits from cost-effectiveness and versatility to innovative applications and performance characteristics. While they pose environmental challenges, ongoing research and development aim to create more sustainable and eco-friendly alternatives. Understanding the importance of synthetic fibers helps in appreciating their role in the economy, industry, and daily life, while also emphasizing the need for sustainable practices and innovation.
Operation “Blue Star” is the only event in the history of independent India where the state went to war with its own people. Even after about 40 years, it is not clear whether it was the culmination of the state’s anger toward the people of the region, a political game of power, or the start of a dictatorial chapter in the democratic setup.
The people of Punjab felt alienated from the mainstream due to the denial of their just demands during a long democratic struggle since independence. As has happened all over the world, this led to a militant struggle with great loss of life among military, police, and civilian personnel. The killing of Indira Gandhi and the massacre of innocent Sikhs in Delhi and other Indian cities were also associated with this movement.
How to Make a Field invisible in Odoo 17 – Celine George
It is possible to hide, or make invisible, certain fields in Odoo, commonly by using the “invisible” attribute in the field definition. This slide will show how to make a field invisible in Odoo 17.
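As an illustrative sketch of the "invisible" attribute mentioned above (the view id and condition here are hypothetical, chosen for the example), in Odoo 17 a view can hide a field, optionally based on a condition expression:

```xml
<record id="view_partner_form_demo" model="ir.ui.view">
  <field name="name">res.partner.form.demo</field>
  <field name="model">res.partner</field>
  <field name="inherit_id" ref="base.view_partner_form"/>
  <field name="arch" type="xml">
    <field name="phone" position="attributes">
      <!-- Hide the phone field whenever the partner is a company;
           use invisible="1" to hide it unconditionally. -->
      <attribute name="invisible">is_company</attribute>
    </field>
  </field>
</record>
```

In Odoo 17 the condition is written directly in the `invisible` attribute as a Python-like expression over the record's fields, rather than in the older `attrs` dictionary.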
Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdfTechSoup
In this webinar you will learn how your organization can access TechSoup's wide variety of product discount and donation programs. From hardware to software, we'll give you a tour of the tools available to help your nonprofit with productivity, collaboration, financial management, donor tracking, security, and more.
Introduction to AI for Nonprofits with Tapp NetworkTechSoup
Dive into the world of AI! Experts Jon Hill and Tareq Monaur will guide you through AI's role in enhancing nonprofit websites and basic marketing strategies, making it easy to understand and apply.
Embracing GenAI - A Strategic ImperativePeter Windle
Artificial Intelligence (AI) technologies such as Generative AI, Image Generators and Large Language Models have had a dramatic impact on teaching, learning and assessment over the past 18 months. The most immediate threat AI posed was to Academic Integrity with Higher Education Institutes (HEIs) focusing their efforts on combating the use of GenAI in assessment. Guidelines were developed for staff and students, policies put in place too. Innovative educators have forged paths in the use of Generative AI for teaching, learning and assessments leading to pockets of transformation springing up across HEIs, often with little or no top-down guidance, support or direction.
This Gasta posits a strategic approach to integrating AI into HEIs to prepare staff, students and the curriculum for an evolving world and workplace. We will highlight the advantages of working with these technologies beyond the realm of teaching, learning and assessment by considering prompt engineering skills, industry impact, curriculum changes, and the need for staff upskilling. In contrast, not engaging strategically with Generative AI poses risks, including falling behind peers, missed opportunities and failing to ensure our graduates remain employable. The rapid evolution of AI technologies necessitates a proactive and strategic approach if we are to remain relevant.
Overview on Edible Vaccine: Pros & Cons with Mechanism
Designing Learning Analytics for Humans with Humans
1. DESIGNING LEARNING ANALYTICS FOR HUMANS WITH HUMANS
Alyssa Friend Wise
Associate Professor, New York University
Director, NYU-LEARN
SoLAR Webinar, Oct 16th, 2019
@NYU_LEARN @alywise
nyu.edu/learn-analytics
2. Meet Our Team
Yeonji Jung, Sameen Reza, Alyssa Wise, JP Saramiento, Eunyoung Jeon, Sophia Lu, Trang Tran, Sophie Sommer, Fabio Campos, Ofer Chen, Yoav Bergner, Susana Toro, Xavier Ochoa, Yu Wang, Shiri Mund, Jing Zhang, Qiujie Li
3. Ben Maddox
Chief Instructional Technology Officer
Jason Korenkiewicz
Director of Instructional Technology Tools & Services
Elizabeth McAlpin
Project Director of Research & Outcomes Assessment
With thanks to our amazing partners at NYU-IT
Andrew Brackett
Learning Analytics Specialist
Robert Egan
eLearning Specialist
4. And the many members of the larger LEARN community
across NYU who participated in the projects described today
Faculty of Arts & Sciences: Selin Kalaycioglu, Lucy Appert, Tyrell Davis
Stern School of Business: Kristen Sosulski, Ben Bowman, Sean Diaz, Marian Tes, Daniel de Valk
NYU Libraries: Andrew Battista, Denis Rubin
School for Professional Studies: Victoria Axelrod
5. “LEARNING ANALYTICS EXIST AS PART OF A SOCIO-TECHNICAL SYSTEM WHERE HUMAN DECISION-MAKING AND CONSEQUENT ACTIONS ARE AS MUCH A PART OF ANY SUCCESSFUL ANALYTICS SOLUTION AS THE TECHNICAL COMPONENTS"
Van Harmelen & Workman (2012)
7. AND YET...
ONLY 6% OF STUDENT-FACING LEARNING ANALYTICS SYSTEMS DESCRIBED IN THE LITERATURE 2004-2016 REPORTED A CLEAR, EXPLICIT NEEDS ANALYSIS
AND ONLY 11% REPORTED ANY FORM OF USABILITY TESTING
Bodily & Verbert (2017)
8. AND YET...
ONLY 30% OF DASHBOARDS DESCRIBED IN THE LITERATURE 2010-2015 INCLUDED A REPORT OF AUTHENTIC USER EVALUATION
Schwendimann et al. (2016)
10. WHY DOES THIS MATTER?
TOOLS THAT ARE DESIGNED WITHOUT CONSIDERATION OF USERS' NEEDS AND THE SITUATIONS IN WHICH THEY WILL USE THEM ARE UNLIKELY TO IMPACT REAL-WORLD PRACTICES IN ANY MEANINGFUL WAY
Cuban (2001)
11. The first decade of Learning Analytics has
focused more on technical systems than
human ones
This represents a large gulf between current practice and what is known about best practices for Human-Computer Interaction design
Consequently there is now great interest in
involving the intended users of learning
analytics in their design
IN SUMMARY
12. Special Section: Human-Centred Learning
Analytics
Working Together in Learning Analytics: Towards the
Co-Creation of Value
Co-Designing a Real-Time Classroom Orchestration Tool
to Support Teacher-AI Complementarity
Teaching with Analytics: Towards a Situated Model of
Instructional Decision-Making
Designing in Context: Reaching Beyond Usability in
Learning Analytics Dashboard Design
Engaging Faculty in Learning Analytics: Agents of
Institutional Culture Change
Journal of Learning Analytics 6(2) – Summer 2019
learning-analytics.info
14. Learning Analytics @ NYU
a collaborative effort, focused on community
change, that puts people, not data, first
We build partnerships
between researchers,
information technology staff,
faculty, administrators and
students to jointly advance
data-informed teaching and
learning
We create and
support effective
teaching and learning
tools that augment
human capacity to
improve educational
processes
17. INITIAL DESIGN PROCESS
Establish the scope and goals for the
project
• University-wide service to operate at
scale
• Support data-informed decision-making
• Instructor of record at the heart of
service
Some starting strategies
• Draw on existing knowledge and
relationships
NYU INSTRUCTIONAL DASHBOARD
18. INITIAL DESIGN PROCESS
Use cases drive design with instructor
questions as the starting point
NYU INSTRUCTIONAL DASHBOARD
19. Learning Analytics Dashboard Design v1
Resource Activity View: for each student and each resource, the number of times accessed and the duration of access time
Purpose: to identify students (engaging / not engaging with the resources) & resources (frequently / infrequently accessed)
20. Learning Analytics Dashboard Design v1
Quiz Results View: score for each quiz item (at the class level) and detailed results for each item
Purpose: to identify aspects of the materials which were difficult for students
22. FIRST FORAYS TO THE FIELD
NYU INSTRUCTIONAL DASHBOARD
(Not surprisingly) the process of actually using analytics to inform pedagogical decisions is complex
TENSION: instructors' excitement & high perceived value around analytics release/use << >> struggles in connecting the data with their teaching and routines
GAP: tool-provided data about student activity << >> use of information to guide sense-making & pedagogical action
We need to examine the process of analytics use in situ
23. We need to examine the process of analytics use in situ
Key Questions: In what ways do instructors make pedagogical decisions based on analytic data? What implications for LA design and implementation can be drawn based on this?
Approach: case studies with all 5 instructors who used the LA dashboard in their teaching during that first semester
24. Template for Inquiry
Sense-Making: Ask Questions; Get Oriented / Focused Attention; Interpret Data; Use Context
Pedagogical Response: Take Action (Whole-Class Scaffolding, Targeted Scaffolding, Revise Course Design); Check Impact
Q1. How Do Instructors Ask
Questions of the Analytics?
Q2. How Do Instructors
Interpret the Analytics?
Q3. How Do Instructors
Respond to the Analytics?
Q4. How Do Instructors
Check the Impact of
Actions?
starting from the literature
Q5. What are other
important aspects of
instructors’ analytics use?
25. Q1. How Do Instructors Ask
Questions of the Analytics?
Approaching the Analytics
Based on Existing Areas
of Curiosity
Developing Questions
through Interacting with
the Analytics
Q2. How Do Instructors
Interpret the Analytics?
Getting Oriented through
Focused Attention to
the Analytics
Examining Changes of
Student Engagement
over Time
The Need for
a Reference Point
Triangulating the Analytics
with Additional Information
about Students
Using the Course Context
to Explain/Question
the Analytics
Inconsistent Attribution of
Analytic Results
Q3. How Do Instructors
Respond to the Analytics?
Taking Action via
Whole Class Scaffolding
Taking Actions via
Targeted Scaffolding
Taking Actions via
Revising Course Design
Wait-and-See
Reflecting on Pedagogical
Strategies and Knowledge
Q4. How Do Instructors
Check the Impact of
Actions?
Q5. What are Other
Important Aspects of
Instructor Analytics Use?
Data Interpretation Is
Affective as Well as
Cognitive
Wrestling with Questions
of Transparency around
Analytics
Experiencing a Learning
Curve in Analytics Use
Potential Value of
Collaborative
Interpretation
Disconnection between
Pedagogical Approaches
and Data Presented
Misalignment between
Instructor and System
Timing
Analytics Seen as Useful
but not Essential
Emergent Themes
26. A Model of Instructor Analytics Use
Sense-Making: Area of Curiosity -> Question Generation -> Get Oriented / Focused Attention -> Find Absolute & Relative Reference Points -> Interpret Data (Read Data, Triangulate, Contextualize, Make Attribution, Explain Pattern)
Pedagogical Response: Take Action (Whole-Class Scaffolding, Targeted Scaffolding, Revise Course Design); Wait-and-See; Reflect on Pedagogy; Check Impact
Affective processes run throughout both phases
Wise, A. F., & Jung, Y. (2019). Teaching with Analytics: Towards a Situated Model of
Instructional Decision-Making. Journal of Learning Analytics, 6(2), 53-69.
27. I. Design to Support Processes of Use
Features for Question Generation & Maintenance
Support for Working with Reference Points
Visual Aids for Finding Entry Points
Flags for Later Decisions to Take Action
II. Align Information with Pedagogical Concerns
Organize Information from Teaching Perspective
Align System Timing with Teaching Practices
III. Support Sense-making Conversations
Switch to De-identified Views for Sharing
The model offers a clear starting place to (re)design LA to support
instructors’ pedagogical decision-making by guiding designers in
thinking ahead to instructor use during the design process
Implications for Dashboard Redesign
28. Our partnership with IT has led to new models of
dashboard (re)design and iterative improvement cycles
Assessment View
Learning Analytics Dashboard Design v2
29. I. Link Pedagogical Questions, Answers, & Actions Together
Interpretive dashboard shell plus weekly emails
II. Support Collaborative Interpretation & Feedback
Workshops & One-on-one coaching sessions
III. Cultivate Contextualized & On-Going Networks
Local instructor communities of practice around analytics use
Along with design efforts, it is also important to consider
implementation supports to facilitate translating information
into actionable insights.
Implications for Implementation Supports
34. Key Questions for Participatory Design
Who is in the room? (von Hippel, 2005)
What do they see? What are they told?
What are they invited to do? (Verbert, 2014)
At what stages of the process do they participate?
Note that students are frequently excluded (Marjanovic, 2014)
35. Student-Facing Learning Analytics Project Phases
NEEDS ANALYSIS -> CO-DESIGN -> PROTOTYPE ITERATING
Steering Committee
Interviews: advisors -> faculty -> students
Personas: salient challenges
Quotes: represent personas
38. THREE KEY DESIGN INGREDIENTS
WIDE PROBLEM
SCOPE
The dashboard, data
and academics are
not the limit. Project
objectives flexible
from the beginning.
GENERATIVE
TENSIONS
SAFE DESIGN
SPACE
39. THREE KEY DESIGN INGREDIENTS
WIDE PROBLEM
SCOPE
GENERATIVE
TENSIONS
SAFE DESIGN
SPACE
Doc students led
workshops for
undergrads: a
learning experience
for everyone
involved.
40. THREE KEY DESIGN INGREDIENTS
GENERATIVE
TENSIONS
Leveraging tensions
as fuel for design;
going beyond "tell
me what you want".
SAFE DESIGN
SPACE
WIDE PROBLEM
SCOPE
42. THREE FUNDAMENTAL TOOLS
EMPATHY
MAPPING
The entire design process was centered on what the defined personas would feel or think about our ideas.
LIVE
PROTOTYPING
DESIGN CARDS
43. Student Persona Descriptions
OS (Overwhelmed Student)
• College feels a lot more challenging
than high school
• Is reassessing who they are (not
the best in class any more)
• May have some habits that could
be improved around:
• Writing
• Time management
• Advocating for themselves
FG (First Gen Transitioner)
• NYU perceived as great
opportunity: stakes very high
• Generally very proficient and
recognized in their environments
but not recognized as much in new
space
• May feel different to classmates
• High pressure from family and fear
of failure
• May have trouble navigating
additional opportunities
44. Student Personas Shared via Quotes
Olivia (Overwhelmed Student)
Says
• I used to do well in school. Really
well. I was the Valedictorian, and I
always knew that I wanted to come
to NYC. Now I am not so sure.
• I am confused about my grades. I
am good at studying. I did well in
school.
Does
• Tends to open the readings one or
two days before class, sometimes
the morning before the class itself.
Frank (First Gen Transitioner)
Says
• I had a friend, one of these
mentors, that told me how to
navigate the statistics course….
there was all this stuff out there. I
wish he would have told me earlier.
• I have learned to pace myself when
studying, and do a little every day. I
don’t know where I got that from,
maybe another student.
Does
• Studies a bit every day. Without
much structure; just allots a number
of hours to study and reads or
writes whatever is most urgent.
48. THREE FUNDAMENTAL TOOLS
LIVE PROTOTYPING
A UX designer
materialized ideas into
sketches with
students in situ, during
the workshops.
EMPATHY
MAPPING
DESIGN CARDS
49. “HIVE” design – from Ideation to Prototype
First ideas
drawn by
students
53. Social
Sharing of
information through
the system seen as
one of the powerful
possibilities of data.
Holistic
Students
underscored needs
beyond academic
help.
Data
Using data was only part of the story.
only part of the story.
Thus, students came
up with tools that are
more than
dashboards.
EMERGENT THEMES FROM STUDENTS
55. FINAL TAKEAWAYS FOR
DESIGNING LA FOR HUMANS
WITH HUMANS
Gathering (useful) input from humans
to inform analytics is about much
more than simply asking people what
they would like
Both efforts described here led to a
variety of things we never would have
imagined otherwise
There was a concern with burdening
already overstretched students and
56. Yeonji Jung, JP Saramiento, Fabio Campos
Special thanks to the LEARN PhD students who
spearheaded work on the projects described today
58. DESIGNING LEARNING ANALYTICS FOR HUMANS WITH HUMANS
Alyssa Friend Wise
Associate Professor, New York University
Director, NYU-LEARN
SoLAR Webinar, Oct 16th, 2019
@NYU_LEARN @alywise
nyu.edu/learn-analytics
Editor's Notes
(6 advisers interviewed – led to the students)
Schematics – phases for LAK. How sampled people.
Learning analytics (LA) is a technology for enabling better decision-making by teachers, students, and other educational stakeholders by providing them with timely and actionable information about learning-in-process on an ongoing basis. To be effective LA tools must thus not only be technically robust but also designed to support use by real people. One powerful strategy for achieving this goal is to involve those who will (hopefully!) use the learning analytics in their design.
Schwendimann, B. A., Rodriguez-Triana, M. J., Vozniuk, A., Prieto, L. P., Boroujeni, M. S., Holzer, A., ... & Dillenbourg, P. (2016). Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies, 10(1), 30-41.
Bodily, R., & Verbert, K. (2017). Review of research on student-facing learning analytics dashboards and educational recommender systems. IEEE Transactions on Learning Technologies, 10(4), 405-418.
Maybe they were involved a bit, but bringing them into a lab doesn’t cut it
de Quincey, E., Briggs, C., Kyriacou, T., & Waller, R. (2019, March). Student Centred Design of a Learning Analytics System. In Proceedings of the 9th International Conference on Learning Analytics & Knowledge (pp. 353-362)
Cuban, L. (2001). Oversold and underused: Computers in the classroom. Cambridge, MA: Harvard University Press
Gathering information from intended users
Observing teaching and learning practices
Directly engaging them in participatory designIn this webinar, I'll present a diverse set of examples of the ways that NYU's Learning Analytics Research Network (NYU-LEARN) is including educators and students in the process of building and implementing learning analytics.
involve students in the creation and revision of learning analytics solutions for their own use
I research on you, ask you things, create with you (bring you here, I go to there)
Key theme of how to deal with the lack of awareness and ideas faculty may have about how to do this and what is possible
Not just going to them and saying “what do you want from the data”
We didn’t go with a blank slate
We talked to them
The conversation was questions do you have (not what data do you want to see)
(5)
The dashboards used in this study were developed by NYU IT based on the consultations with each instructor about the kinds of student activity and performance information they would like to see in the dashboards.
(6) The dashboard for each instructor consists of three to four distinct views. For example, the student resource activity view was developed to help instructors identify which students were not engaging with the resources or which course materials were not frequently accessed.
This view displayed which course resource was accessed by each student, and visualized its number of times as size and its duration as color.
(7) Another view displayed which quiz item was difficult by the class, and visualized its average score and the number of students who completed this item.
Very different than usability testing
Tension -> opportunity. Creating opportunities for tension is generative, rather than a “pat” process.
Shout out – designers are aware of this, but institutions aren’t -> create space in your “production schedule” for this. Create space with the users for this.
People never use your tool the way you expect.
Creating the dashboard alone isn’t enough -> other artifacts (website, community)….
Semi-structured retrospective interviews where instructors walked us through their process of dashboard use
(9)
Then, we tried to align the empirical findings with the hypothetical model. 20 themes emerged.
However, no theme emerged around checking impact. Rather, seven themes emerged related to other important aspects of instructors' use of the analytics.
These themes were synthesized into the final model we conceptualized as the process of instructor analytics use.
(1) The model we conceptualized consists of a two-part structure with multiple phases. First, sense-making. Second, pedagogical response.
The project emerged as a collaboration between NYU IT + LEARN.
IT had data which had been wrangled for previous project and saw opportunity.
VALUE OF STUDENT_FACING DATA
LEARN saw opportunity to approach process differently from faculty dashboard.
Because when designing for students issues of power are more “dangerous” (with tools designed “on” students and not “for” students, Lab and IT discussed using participatory design as a process.)
IT was keen, and *importantly* willing to slow down their delivery timelines to accommodate
There are different levels of participation in design.
As discussed before, the lowest one is asking stakeholders, the highest one having stakeholders involved in every phase of the process.
A way to conceptualise this is, to quote Hamilton, is to ask ourselves who is in the “room where it happens” and who isn’t. Who are we inviting to be part of the process?
How much information do they have when they are in the room? Do they know all information of the project or are elements hidden to them, either because they could be complicated institutionally, or because there are barriers of knowledge (differences between experts and amateurs).
At what times do they access the information and can participate? Is it constant or sporadic? Are the researchers and designers interpreting the stakeholder’s desires or are stakeholders designing themselves?
In general it has been the case that students are excluded from these processes. We wanted to flip that in this project, inviting them and other stakeholder to multiple phases of the project.
PHASES OF PROJECT:
On a grand level, we would have three phases;
A needs analysis, where we would assemble stakeholders and interview stakeholders to understand the problems and opportunities in this space better,
A process of co-design, where students, researchers and designers would participate in workshops to work together to come up with solutions to the challenges, and
Iterative phases of prototyping, testing, and repeating to improve the prototypes.
FROM -> WITH -> FROM / WITH
STEPS
We formed a steering committee with different stakeholders. We had:
Representatives from IT
Academics
The director of educational technology from one of the schools at NYU
The head of advising staff at that school.
PHASES OF PROJECT:
With the help of the committee we outlined stakeholders who we would interview: faculty (3), advisors (6) and students (15).
Advisor interviews surfaced that there were groups of students who were particularly challenged, and we discussed making them the “extreme users” who we would design for.
Extreme users is a technique used often in design; you find someone who could use the tool, but who may have special challenges or needs, or use it very differently from other users. The assumption is that designing for these extreme needs allows for designs that meet the needs of average users better, and also meets the needs of those users who are not average.
In a university this is especially important if we want to have a focus on equity.
We identified extreme users as ‘first gen” students, who have special challenges in the school to college transition.
CALL FOR WORKSHOP
We made an open call to students, with advertisement in housing and the library. We also emailed a list of 100 first-generation students directly, from which many of the participants in the workshop came.
The workshop format was chosen because of the difficulty of having a large number of students involved with the project for long periods of time. However, this created challenges of its own, and planning of the workshop was done to address those challenges.
DESIGN SESSIONS:
Our objectives in these sessions were:
To gain insight into what types of tools students think they would use.
Collaborate with students in designing these tools.
Also, we gained insights into some of the challenges and fears of students when using tools that use their data. This was not unexpected but had not been incorporated into our design.
FOR THE SESSIONS, we had key ingredients we believe were central to the success;
Leave the problem space wide and open. Sometimes when designing we limit ourselves to a clear problem, and that is, partly, the suggestion in some design thinking methodologies. However, because students are not designers, there is an iterative process between finding solutions and gaining more insights about the problems, which leads to more nuanced or interesting solutions. Moreover, as the team explored the space, new issues and problems arose. For example, the design which the team finally developed as a promising solution, the “Hive”, emerged from the question of dealing with competing deadlines in multiple courses, which wasn't even at the forefront of the discussion of the challenges to design for in the first of the design sessions. The openness allows for increased depth and exploration.
Power dynamics can be a challenge and an opportunity in workshops like these. Maybe part of the reason these sessions generated insights was that the facilitators were PhD students, whom the student participants could relate to. Similarly, in the faculty project there was a certain rapport in the interviews because the interviewer was a fellow academic. Signaling that the space is safe by encouraging dissent, showing that you (the facilitator) do not have all the answers, performing low hierarchy, and using humor, among other techniques, is important to allow participants to be creative and sincere.
More tangible than "I will create something for myself" → I will learn something today: DESIGN.
2. Look for and feed tensions. In human-centered design we sometimes address users directly and ask them for their needs. But sometimes the need is unclear, and they may not even know they could use a solution until they see it. Often, the “real” or surprising needs and solutions come when tensions arise. Don't sweep them under the rug, and be attentive to them.
In our sessions, one of the students, for example, asked the facilitators whether the objective was to make students more academically successful, or happy. This led to a deep conversation about student wellbeing and the role of the university in student’s lives, which is represented in designs for an emotional tracker as part of the proposed solutions.
Phases of the workshop (4 min):
Ice breakers or fire starters: given the little time for interaction between participants and the need for them to become comfortable with each other (because of the relevance of psychological safety to lead to creative insights), ice breakers are an overlooked but important part of the process. The ones chosen in this workshop were aimed at inciting group work and also making participants physically connect and be vulnerable with each other.
Design Thinking Mini Lecture: In the workshop, participants learned about design thinking methodology before applying it.
Empathy Mapping: Traditional Human-centered design technique.
Personas were based on research discovery interviews, presented as an open-ended challenge to student participants in the form of quotes extracted from the interviews.
Participants asked to fill in gaps with knowledge about themselves and peers.
Phases of the workshop (4 min):
Data Mapping: To get participants familiarized with the kinds of data available, chat about what data could be useful for the challenge.
Ideation: First stab at coming up with ideas for a solution.
Short list of ideas, selecting the ones that students thought were most promising,
Prototyping, making paper graphic representations of the ideas,
Then discussing the prototypes and making new versions of them.
THREE TOOLS
On the workshops, there were three tools which, we believe, helped generate a lot of designs in a relatively short time.
Empathy Mapping: It is used often in human centered design. The more time is spent on this part of the process, the more participants are getting into the mental space of our user. If we had had the time, we would have had the students actually participate in the interviews, and maybe co-research, talking to other students to understand their challenges. Because of time constraints, we designed a shortcut, which both brought into the room the insights of the research team, and took advantage of the fact that the participant students were not very different from the interviewed students themselves; they could flesh out with the knowledge they had from themselves and their peers, and use this experience to theorize what kinds of solutions could be appropriate.
THREE TOOLS
Design cards: To make sure that the conversation and solutions stayed close to the existing data, and to upskill students to think about data like learning scientists, we created a set of cards. (go to next slide)
Blue cards - actions / verbs; what data may do to aid learning and insight.
Green - types of information and the inferences you can make from them; what data we have.
Orange - feelings; there to stimulate lateral thinking (looking at the combinations from a different perspective).
Students would randomly pair a blue and a green card, and if blocked could add an orange card. Each “set” was a “solution”: a description of what a solution could do, which inspired students to think of a “form” (what the solution would look like, what it does to achieve that result, and what the implications are).
Randomized cards can be a great way both to bring information into a discussion and to create serendipitous combinations that lead participants to discovery and insight.
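The card-pairing mechanic above can be sketched in a few lines of Python. Note that the card texts below are hypothetical placeholders for illustration only, not the actual workshop cards:

```python
import random

# Hypothetical card contents: illustrative placeholders, not the actual
# cards used in the NYU workshops.
BLUE_ACTIONS = ["remind", "compare", "visualize", "recommend"]       # what data may do
GREEN_DATA = ["assignment deadlines", "course activity levels",
              "grade impact of tasks", "resource views"]             # what data we have
ORANGE_FEELINGS = ["overwhelmed", "curious", "left behind", "motivated"]  # lateral prompts

def draw_prompt(add_feeling=False, rng=random):
    """Pair a random blue (action) card with a green (data) card.

    If participants are blocked, an orange (feeling) card can be added
    to stimulate lateral thinking."""
    prompt = {"action": rng.choice(BLUE_ACTIONS), "data": rng.choice(GREEN_DATA)}
    if add_feeling:
        prompt["feeling"] = rng.choice(ORANGE_FEELINGS)
    return prompt
```

Each drawn prompt is one “set”: an action applied to a data type, optionally reframed through a feeling, which participants then turn into a “form”.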
THREE TOOLS
In the workshops, there were three tools which, we believe, helped generate many designs in a relatively short time.
Live prototyping. Instead of students simply sharing ideas, both they and the designers developed images of prototypes. The designers' images would then be shared back with participants for immediate member checking and improvement. This led to multiple iterations on an idea within a single session.
This is a student’s first rendering of the idea of the “Hive”: a space where they could see multiple courses and resources, organized in one place.
Facilitators and designers then played with the idea alongside students, incorporating ideas such as “rate of completion” of different courses, or forms of comparing which courses had the most activity (which could signal the need for attention).
Students played with the idea further, and after the workshops, designers began playing with the concept. This is the first digital prototype; the bottom is a timeline with assignments, sized by “grade impact”.
This is a second iteration, which incorporates other tools that students designed, adding for example an emotional tracker and notifications with information from different courses.
FROM THE WORKSHOP
Three insights emerged:
Students seem to lean toward solutions with a strong social component. This is not surprising on reflection: they interact with social media daily. Moreover, in both the interviews and the workshops, they described experiences where a peer (a mentor, another student) provided information that helped them reframe situations and better face their academic and life challenges at the university.
However, many of the student-facing solutions the field has produced lack a strong social component, which hints at an opportunity for the field to combine analytics with social media.
While many of the solutions the field has produced center on academic needs, students saw their academic and life needs as intertwined. This is consistent with what we know about the student experience, and with the fact that what looks like an academic challenge is often actually a life challenge (adjusting to college, depression, loneliness, etc.). As we promote holistic approaches to learning, perhaps we should do the same with analytics-powered solutions.
Consistent with the two previous points, students' solutions did contain data and dashboard elements, but these were often only part of the solution. Data was there to help, but it was neither the only way to get help nor, often, the most important one. Students leaned toward systems that would do multiple things for them, as opposed to the dashboards often proposed as solutions.
Slowness – the value of space, time, and partnership
Importance – the value of directions we never would have gone otherwise