This document summarizes an interactive masterclass on the TESTA (Transforming the Experience of Students Through Assessment) programme approach. The masterclass discusses the rationale for taking a programme approach to assessment, including addressing modular problems, curriculum problems, and student alienation. Methods discussed include conducting a TESTA programme audit and using an Assessment Experience Questionnaire and student focus groups to gather data. Key themes covered are high summative assessment loads, disconnected feedback between assignments, and student confusion about assessment goals and standards. Strategies presented to improve assessment include increasing formative assessment, providing more dialogic feedback, and helping students internalize assessment criteria.
22 January 2018 HEFCE open event "Using data to increase learning gains and teaching excellence" - Bart Rienties
With the Teaching Excellence Framework being implemented across England, many higher education institutions have started to ask questions about what it means to be “excellent” in teaching. In particular, with the rich and complex data that all educational institutions gather that could potentially capture learning gains, what do we actually know about our students’ learning journeys? What kinds of data could be used to infer whether our students are actually making affective (e.g., motivation), behavioural (e.g., engagement), and/or cognitive learning gains? Please join us on 22 January 2018 in lovely Milton Keynes at a free OU- and HEFCE-supported event on Using data to increase learning gains and teaching excellence.
14.00-15.00 Measuring learning gains with (psychometric) questionnaires
Dr Sonia Ilie, Prof Jan Vermunt, Prof Anna Vignoles (University of Cambridge, UK): Learning gain: from concept to measurement
Dr Fabio Arico (University of East Anglia): Learning Gain and Confidence Gain Through Peer-instruction: the role of pedagogical design
Dr Paul Mcdermott & Dr Robert Jenkins (University of East Anglia): A Methodology that Makes Self-Assessment an Implicit Part of the Answering Process
15.00-15.45 Measuring employability learning gains
Dr Heike Behle (University of Warwick): Measuring employability gain in Higher Education. A case study using R2 Strengths
Fiona Cobb, Dr Bob Gilworth, David Winter (University of London): Careers Registration Learning Gain project
When Student Confidence Clicks - Using Student Response Systems - Fabio R. Arico'
In this presentation I illustrate the methodology used to measure the relationship between student attainment, engagement, and self-efficacy beliefs through Student Response Systems.
https://sites.google.com/site/fabioarico
Course-Adaptive Content Recommender for Course Authoring - Peter Brusilovsky
Developing online courses is a complex and time-consuming process that involves organizing a course into a sequence of topics and allocating the appropriate learning content within each topic. This task is especially difficult in complex domains like programming, due to the incremental nature of programming knowledge, where new topics extensively build upon domain concepts that were introduced in earlier lessons. In this paper, we propose a course-adaptive content-based recommender system that assists course authors and instructors in selecting the most relevant learning material for each course topic. The recommender system adapts to the deep prerequisite structure of the course as envisioned by a specific instructor, while unobtrusively deducing that structure from problem-solving examples that the instructor uses to present course concepts. We assessed the quality of recommendations and examined several aspects of the recommendation process by using three datasets collected from two different courses. While the presented recommender system was built for the domain of introductory programming, our course-adaptive recommendation approach could be used in a variety of other domains.
Presentation on large-scale e-Learning for Educators online professional development program and research with online training and courses by EdTech Leaders Online at EDC.
Toward an automated student feedback system for text-based assignments - Pete... - Blackboard APAC
As blended learning environments and digital technologies become integrated into the higher education sector, rich technologies such as analytics can help teaching staff identify students at risk, learning material that is not proving effective, and learning site designs that aid and facilitate improved learning. More recently, consideration has been given to automated essay scoring. Such systems can be used formatively, for example by providing feedback on initial assignment drafts, or summatively through the analysis of final assignment submissions. Further, providing students with quick formative feedback on written assignments opens the opportunity to improve learning outcomes.
This presentation details a current project developing a system to analyse text-based assignments. The project is being developed for broad application, but the findings focus on an undergraduate pilot subject: ‘Ideas that Shook the World’ (a compulsory first-year Bachelor of Arts subject taught on 5 campuses to more than 1000 students by 15 staff). Preliminary results of a first scan of assignments are presented, along with the issues raised in developing the system and an outline of additional work planned for the project. It is believed the work will have wide application where text-based assignments are used for assessment.
Higher Education & Game Principles: Context, Theory & Application - Daniel La... - Blackboard APAC
This presentation reports on the efficacy of a mobile learning intervention that combined ‘push notifications’ and game principles within a timed quiz app. An institutional interdisciplinary case study compared rates of student retention and academic performance with students’ usage of a purpose-designed learning app. In the lead-up to lectures, the app pushed daily quizzes to students’ personal mobile devices and rewarded them with feedback, points, badges and a position on a leaderboard. During this session, the findings of this study will be discussed and conclusions drawn about what they mean for future research into higher education learning enabled via mobile app technologies.
Personalized Online Practice Systems for Learning Programming - Peter Brusilovsky
Computer programming is quickly transitioning from being just a key competency in computer and information science majors to being a desired skill for students in a wide range of fields. Yet, it is also one of the most challenging subjects to learn. While learning by doing is a critical component in mastering programming skills, neither the traditional educational process nor standard learning support tools provide sufficient opportunities for programming practice. In this talk, I will present our research on personalized programming practice systems for Java, Python, and SQL, which attempt to bridge this known gap in learning programming. A programming practice system engages students in practicing programming skills beyond a relatively small number of graded assignments and exams. To support learning by doing, an online practice system offers a range of interactive “smart content” such as program animations, worked examples, and various kinds of programming problems with automatic assessment. The main challenges for online practice systems are to motivate students to practice and to guide them to the most appropriate smart content given their course goals and knowledge levels. In this talk, I will review a range of AI technologies, such as student modeling, navigation support, social comparison, and content recommendation, which support efficient programming practice. I will also discuss how personalized practice systems could support the COVID-19-influenced switch to online learning while maintaining the extensive level of feedback expected from an efficient learning process.
We examined predictors of Calculus II final grades within a sample of 84 college students enrolled in a hybrid course through WEPS. Predictors included “typical” psychological correlates, including math confidence, math anxiety, spatial skills and numerosity ability, as well as clickstream data from the students’ activity in the online course. Results showed the clickstream data were the best predictors of course performance, in that students who spent more time grading other students’ assignments, and students who took fewer quiz attempts, did better in the course. Math confidence and then math anxiety were the next best predictors, in that students with higher confidence and lower math anxiety performed better in the course. We will discuss how results might be dependent on the particular content of this course, and how we might use easy-to-collect psychological variables along with clickstream data to better understand, and potentially predict, course performance in online courses.
Empirical studies of adaptive annotation in the educational context have demonstrated that it can help students to acquire knowledge faster, improve learning outcomes, reduce navigational overhead, and encourage non-sequential navigation. Over the last 8 years we have explored a lesser-known effect of adaptive annotation: its ability to significantly increase student engagement in working with non-mandatory educational content. In the presence of adaptive link annotation, students tend to access significantly more learning content; they stay with it longer, return to it more often and explore a wider variety of learning resources. This talk will present an overview of our exploration of the addictive links effect in many course-long studies, which we ran in several domains (C, SQL and Java programming) and for several types of learning content (quizzes, problems, interactive examples). The first part of the talk will review our exploration of a more traditional knowledge-based personalization approach, and the second part will focus on more recent studies of social navigation and open social student modeling.
This details a successful data-driven redesign of Math 215, an online statistics concepts course at Franklin University. The redesigned course incorporated new interactive educational multimedia. This new design resulted in improved student retention, better student performance, and better satisfaction with the course.
24-hour Papers: The Open-Book Alternative to Exams for Online Assessment - David Hopkins
Common unit specifications covering delivery of subject-identical units across different courses, often with different delivery methods, are increasingly being implemented. The inclusion of a ‘coursework’ element of assessment allows for flexibility. Matters are different when an ‘exam’ is required: because students on a fully online course are unable to attend an exam centre, owing to differences in time zones and/or locations, the concept of an open-book exam is used. The exam paper is released to students through our VLE (Blackboard) at a time that is agreed and broadcast to students in advance. Students must submit their work within a 24-hour window by uploading their files to the VLE (using either the standard submission tool or Turnitin).
This presentation will draw upon the Bournemouth University’s substantial experience of presenting ‘Time-Constrained Papers’ to students studying at a distance and will consider the issues surrounding this approach. Particular consideration will be given to the importance of question design to limit scope for academic dishonesty and the University’s plans to modify this approach in the forthcoming academic year.
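The 24-hour release-and-submit window described above can be sketched in a few lines. This is a minimal illustration, not Bournemouth's actual implementation; the release time and the use of UTC are assumptions made for the example.

```python
from datetime import datetime, timedelta, timezone

WINDOW = timedelta(hours=24)

def within_window(release: datetime, submitted: datetime) -> bool:
    """True if a submission arrived inside the 24-hour window after release."""
    return release <= submitted <= release + WINDOW

# Paper released at 09:00 UTC; students in any time zone share the same deadline.
release = datetime(2024, 5, 1, 9, 0, tzinfo=timezone.utc)
on_time = release + timedelta(hours=23, minutes=59)
late = release + timedelta(hours=24, minutes=1)

print(within_window(release, on_time))  # True
print(within_window(release, late))     # False
```

Anchoring both the release time and the deadline to UTC sidesteps the time-zone differences that make a fixed exam sitting impractical for distance students.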
Fostering students’ engagement and learning through UNEDTrivial: a gamified s... - UNED
UNEDTrivial is a Moodle activity plugin that allows teachers to create spaced quizzes based on two principles from educational psychology:
• Testing effect: answering questions after study sessions is one of the best ways to consolidate knowledge.
• Spacing effect: spaced repetition of the same items, at specific intervals, increases long-term retention.
Participants enrolled in a UNEDTrivial activity receive daily email reminders of the questions they must answer. Through the feedback provided on each response attempt, students build their knowledge, correcting failures and reinforcing successes. Incorrectly answered questions are sent again after a delay set by the instructor, to check knowledge acquisition and increase long-term retention.
UNEDTrivial offers a complete analytics page where teachers can track the progress of their students. Furthermore, UNEDTrivial uses gamification to increase student engagement, such as a leaderboard where students can track their progress while competing with classmates. Moreover, UNEDTrivial is compatible with Moodle badges, so a badge can be awarded once a student has completed all questions in a UNEDTrivial activity.
UNEDTrivial is available from the official Moodle plugin repository:
https://moodle.org/plugins/mod_unedtrivial
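The resend-after-delay behaviour described above can be sketched as a simple scheduling rule. The function name and the default delays below are illustrative assumptions, not the plugin's actual algorithm:

```python
from datetime import date, timedelta

def next_review(answered_on: date, correct: bool,
                retry_delay_days: int = 2, spacing_days: int = 7) -> date:
    """Schedule a question's next appearance.

    Incorrectly answered questions come back after a short,
    instructor-set delay; correctly answered ones come back
    after a longer spacing interval to reinforce retention.
    """
    delay = spacing_days if correct else retry_delay_days
    return answered_on + timedelta(days=delay)

today = date(2024, 5, 1)
print(next_review(today, correct=False))  # 2024-05-03
print(next_review(today, correct=True))   # 2024-05-08
```

Real spaced-repetition schedulers typically grow the interval with each consecutive correct answer; the fixed two-delay rule here is the simplest form of the idea.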
An evidence-based model to enhance programme-wide assessment using technology: TESTA to FASTECH. Presented by Tansy Jessop and Yaz El-Hakim (University of Winchester) and Paul Hyland (Bath Spa University). Facilitated by Mark Russell (University of Hertfordshire).
Jisc conference 2011
Online Tests: Can we do them better? | Bopelo Boitshwarelo, Jyoti Vemuri, Han... | Blackboard APAC
The use of e-assessment methods to facilitate and evaluate learning is a growing trend in the higher education space. In particular, the use of online tests has increased rapidly, concomitant with the expansion of digital technologies for teaching purposes. Online tests, in the context of this presentation, refer to computer-assisted assessment where deployment and marking are automated, typically involving objective question types such as multiple choice questions (MCQs), true/false questions, matching questions and predetermined short-answer questions. The growing sophistication of Learning Management Systems (LMSs) such as Blackboard provides an increasing capacity for different types of online tests to be deployed, administered and marked efficiently. Additionally, most major textbook publishers and authors in certain disciplines provide online question banks that can easily integrate with LMSs, meaning less time is spent on creating tests from scratch.
With these trends in mind, questions arise around the efficacy of online tests in higher education.
In this presentation we will share findings of a study investigating practices around online tests. First, we will explore what the literature reveals about the role of online tests in higher education, particularly how online tests can support student learning through formative assessment processes and feedback practices. Secondly, the presentation will review the practices around online tests at the Charles Darwin University Business School and discuss emerging issues. Thirdly, the presentation will distil some preliminary guiding principles for designing, developing, administering and reviewing online tests for effective learning and assessment. Finally, ongoing and further research by the team on the topic of online tests will be highlighted.
2024.06.01 Introducing a competency framework for language learning materials ... - Sandy Millin
http://sandymillin.wordpress.com/iateflwebinar2024
Published classroom materials form the basis of syllabuses, drive teacher professional development, and have a potentially huge influence on learners, teachers and education systems. All teachers also create their own materials, whether a few sentences on a blackboard, a highly-structured fully-realised online course, or anything in between. Despite this, the knowledge and skills needed to create effective language learning materials are rarely part of teacher training, and are mostly learnt by trial and error.
Knowledge and skills frameworks, generally called competency frameworks, for ELT teachers, trainers and managers have existed for a few years now. However, until I created one for my MA dissertation, there wasn’t one drawing together what we need to know and do to be able to effectively produce language learning materials.
This webinar will introduce you to my framework, highlighting the key competencies I identified from my research. It will also show how anybody involved in language teaching (any language, not just English!), teacher training, managing schools or developing language learning materials can benefit from using the framework.
A Strategic Approach: GenAI in Education - Peter Windle
Artificial Intelligence (AI) technologies such as Generative AI, image generators and Large Language Models have had a dramatic impact on teaching, learning and assessment over the past 18 months. The most immediate threat AI posed was to academic integrity, with Higher Education Institutes (HEIs) focusing their efforts on combating the use of GenAI in assessment. Guidelines were developed for staff and students, and policies were put in place. Innovative educators have forged paths in the use of Generative AI for teaching, learning and assessment, leading to pockets of transformation springing up across HEIs, often with little or no top-down guidance, support or direction.
This Gasta posits a strategic approach to integrating AI into HEIs to prepare staff, students and the curriculum for an evolving world and workplace. We will highlight the advantages of working with these technologies beyond the realm of teaching, learning and assessment by considering prompt engineering skills, industry impact, curriculum changes, and the need for staff upskilling. In contrast, not engaging strategically with Generative AI poses risks, including falling behind peers, missed opportunities and failing to ensure our graduates remain employable. The rapid evolution of AI technologies necessitates a proactive and strategic approach if we are to remain relevant.
Synthetic Fiber Construction in lab.pptx - Pavel (NSTU)
Synthetic fiber production is a fascinating and complex field that blends chemistry, engineering, and environmental science. By understanding these aspects, students can gain a comprehensive view of synthetic fiber production, its impact on society and the environment, and the potential for future innovations. Synthetic fibers are integral to modern life, offering a range of benefits from cost-effectiveness and versatility to innovative applications and performance characteristics. While they pose environmental challenges, ongoing research and development aim to create more sustainable and eco-friendly alternatives. Understanding the importance of synthetic fibers helps in appreciating their role in the economy, industry, and daily life, while also emphasizing the need for sustainable practices and innovation.
The latest issue of The Challenger is here! We are thrilled to announce that our school paper has qualified for the National Schools Press Conference (NSPC) 2024. Thank you for your unwavering support and trust. Dive into the stories that made us stand out!
Model Attribute Check Company Auto Property - Celine George
In Odoo, the multi-company feature allows you to manage multiple companies within a single Odoo database instance. Each company can have its own configurations while still sharing common resources such as products, customers, and suppliers.
Palestine last event orientationfvgnh .pptxRaedMohamed3
An EFL lesson about the current events in Palestine. It is intended to be for intermediate students who wish to increase their listening skills through a short lesson in power point.
Instructions for Submissions thorugh G- Classroom.pptxJheel Barad
This presentation provides a briefing on how to upload submissions and documents in Google Classroom. It was prepared as part of an orientation for new Sainik School in-service teacher trainees. As a training officer, my goal is to ensure that you are comfortable and proficient with this essential tool for managing assignments and fostering student engagement.
Francesca Gottschalk - How can education support child empowerment.pptxEduSkills OECD
Francesca Gottschalk from the OECD’s Centre for Educational Research and Innovation presents at the Ask an Expert Webinar: How can education support child empowerment?
Introduction to AI for Nonprofits with Tapp NetworkTechSoup
Dive into the world of AI! Experts Jon Hill and Tareq Monaur will guide you through AI's role in enhancing nonprofit websites and basic marketing strategies, making it easy to understand and apply.
How to Make a Field invisible in Odoo 17Celine George
It is possible to hide or invisible some fields in odoo. Commonly using “invisible” attribute in the field definition to invisible the fields. This slide will show how to make a field invisible in odoo 17.
19. Summary of audit data
• Some context
• Number of summative assessments
• Number of formative assessments
• Varieties of assessment
• Proportion of exams
• Written feedback
• Speed of return of feedback
20. Assessment features across a 3-year UG degree (n=73)
Characteristic                     Range
Summative assessments              12–227
Formative assessments              0–116
Varieties of assessment            5–21
Proportion of examinations         0%–87%
Time to return marks & feedback    10–42 days
Volume of oral feedback            37–1,800 minutes
Volume of written feedback         936–22,000 words
21. Typical A&F patterns
73 programmes in 14 universities (Jessop and Tomas 2017)
Characteristic                   Low              Medium        High
Volume of summative assessment   Below 33         40–48         More than 48
Volume of formative only         Below 1          5–19          More than 19
% of tasks by examinations       Below 11%        22–31%        More than 31%
Variety of assessment methods    Below 8          11–15         More than 15
Written feedback in words        Less than 3,800  6,000–7,600   More than 7,600
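When auditing a new programme, its figures can be banded mechanically against the Jessop and Tomas (2017) patterns. A minimal Python sketch, assuming the cut-offs in the table above; the dictionary keys are hypothetical labels, and because the published bands leave small gaps (e.g. 33–40 for summative), this sketch simply treats anything between the two cut-offs as Medium:

```python
def band(value, low_cut, high_cut):
    """Classify one audit metric: below low_cut = Low, above high_cut = High,
    anything in between = Medium (the published bands leave small gaps)."""
    if value < low_cut:
        return "Low"
    if value > high_cut:
        return "High"
    return "Medium"

# Cut-offs taken from the Jessop and Tomas (2017) table above.
# The metric names are illustrative, not from the TESTA toolkit itself.
CUTS = {
    "summative": (33, 48),
    "formative_only": (1, 19),
    "pct_exams": (11, 31),
    "variety": (8, 15),
    "written_feedback_words": (3800, 7600),
}

def classify_programme(audit):
    """Map a programme's audit figures to Low/Medium/High per characteristic."""
    return {metric: band(value, *CUTS[metric])
            for metric, value in audit.items() if metric in CUTS}
```

For example, a programme with 60 summative tasks and 5% of tasks by examination would come out as High on summative and Low on examinations.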
22. Make sense of audit data
1) What is striking about the data?
2) What surprises or puzzles you?
3) What would you want to know more about?
Each table look at:
– 1 x university type, OR
– 1 x discipline
– Briefly discuss in relation to the questions
25. Theme 1: High summative with low formative
• Low formative-to-summative ratio of 1:8 (UK, NZ, Ireland)
• Summative as a ‘pedagogy of control’
• Formative weakly practised and understood
27. CONSEQUENCES OF HIGH SUMMATIVE
“A lot of people don’t do wider reading. You just focus on your essay question.”
“In Weeks 9 to 12 there is hardly anyone in our lectures. I'd rather use those two hours of lectures to get the assignment done.”
“It’s been non-stop assignments, and I’m now free of assignments until the exams – I’ve had to rush every piece of work I’ve done.”
28. The benefits of formative
“It was really useful. We were assessed on it but we weren’t officially given a grade, but they did give us feedback on how we did.”
“It didn’t actually count so that helped quite a lot because it was just a practice and didn’t really matter what we did and we could learn from mistakes, so that was quite useful.”
29. “If there weren’t loads of other assessments, I’d do it.”
“It’s good to know you’re being graded because you take it more seriously.”
BUT… “If there are no actual consequences of not doing it, most students are going to sit in the bar.”
“The lecturers do formative assessment but we don’t get any feedback on it.”
30. Formative is the hardest nut to crack…
Go to www.menti.com and use the code 97 97 66
Type in three reasons why students may be reluctant to invest time and energy in completing formative assessment tasks
31. Yet formative is vital
1) Low-risk way of learning from feedback (Sadler, 1989)
2) Fine-tune understanding of goals (Boud 2000; Nicol 2006)
3) Feedback to lecturers to adapt teaching (Hattie, 2009)
4) Cycles of reflection and collaboration (Biggs 2003; Nicol & Macfarlane-Dick 2006)
5) Encourages and distributes student effort (Gibbs 2004)
32. How you encourage formative
Go to www.menti.com and use the code 23 86 17
Choose your top three strategies for engaging students in formative assessment
…Or talk to each other about successful strategies
33. Case Study 1
• Systematic reduction of summative across the whole business school
• Systematic ramping up of formative
• All working to a similar script
• Whole-department shift, experimentation, less risky together
34. Case Study 2
• Problem: silent seminar, students not reading
• Public platform blogging
• Current academic texts
• In-class
• Threads and live discussion
• Linked to summative
35. Case Study 3
• Problem: lack of discrimination about sources
• Students bring 1 x book, 1 x chapter, 1 x journal article, 2 x pop culture articles to seminar
• Justify choices to the group
• Reach consensus about the five best sources
• Add to reading list
38. Your task
• In groups, identify five principles for making formative work. Write them down on flipchart paper.
• How could you use or adapt this on your course?
39. Principles to encourage formative
1. Rebalance summative and formative
2. Whole programme approach
3. Link formative and summative
4. Authentic, public domain tasks
5. Creative, collaborative, challenging tasks
6. Relational and conversational feedback
40. Break time! Before more nuts and bolts…
How does the AEQ work? What will it tell me about the programme?
What will I learn about students’ views of assessment from the focus group?
42. Why a questionnaire about assessment?
• Weaker NSS scores
• A weak NSS diagnostic?
• Quick, large-scale data
• Quantitative and qualitative
43. AEQ 3.3 (2003)
• Designed to measure the ‘conditions under which assessment supports learning’
• Based on theory and evidence + selected CEQ scales
• Robust factor structure and scale coherence – does it measure what it’s meant to be measuring?
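The speaker notes name Cronbach’s alpha as the internal-consistency check behind “scale coherence” – do all items on a scale measure the same construct? As an illustration only (not the AEQ’s actual analysis pipeline), a minimal sketch of the statistic in Python/NumPy:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items in the scale
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Values near 1 mean the items move together as one construct; a common rule of thumb treats alpha of about 0.7 or above as acceptable scale coherence.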
44. AEQ 4.0 (2018)
• Fill in the AEQ 4.0 from the vantage point of
being a student in one of your classes
• Paper or online?
• https://educ.sphinxonline.net/v4/s/ha2fbs
45. Comparing audit and AEQ data from one programme
In pairs or groups, explore programme audit and AEQ data from one programme. Does anything stack up? Are there loose ends, questions, contradictions?
47. STRUCTURAL
“The feedback is generally focused on the module.”
“Because it’s at the end of the module, it doesn’t feed into our future work.”
“It’s difficult because your assignments are so detached from the next one you do for that subject. They don’t relate to each other.”
“I read it and think ‘Well, that’s fine but I’ve already handed it in now and got the mark. It’s too late’.”
48. RELATIONAL
“It was like ‘Who’s Holly?’ It’s that relationship where you’re just a student.”
“Because they have to mark so many that our essay becomes lost in the sea that they have to mark.”
“Here they say ‘Oh yes, I don’t know who you are. Got too many to remember, don’t really care, I’ll mark you on your assignment’.”
50. Irretrievable breakdown…
“Your essay lacked structure and your referencing is problematic.”
“Your classes are boring and I don’t really like you.”
51. A way of thinking about assessment and feedback?
52. Ways to be dialogic
• Conversation: who starts the dialogue?
• Cycles of reflection across modules
• Quick generic feedback
• Feedback synthesis tasks
• Peer feedback (especially on formative)
• Technology: audio, screencast and blogging
• From feedback as ‘telling’…
• … to feedback as asking questions
53. And human…
• I use first and second person in feedback – a real person marked this!
• You are known
• I use plain, imaginative English – no techno-bot-speak allowed! So last century!
57. Have a go at triangulating data
• Read through audit, AEQ and focus group data from one programme
• Quick abstract/bullet points of what seems to be going on
• Discuss with your group/flipchart:
a) What are the stand-out themes?
b) What jigsaw pieces fit together?
c) What unresolved issues remain?
58. Main pointers for focus group
• Questions are broad themes
• Easy to complicated
• Sit in a circle
• It’s the discussion that matters
• Go with the flow
• But steer when off topic, direct, pass the ball
• Troubleshooting
• Ethics
60. Getting students to attend…
• Get the support of lecturers, programme team
• Explore using student researchers
• Use vouchers
• Food
• Between 3 and 8 students for one hour
• Ethics and confidentiality
61. What the data looks like: …and the intelligent transcript
Transcription tool: https://transcribe.wreally.com/
63. The tone of the case study
• Build a narrative thread
• Descriptive, non-evaluative tone
• Empathetic
• Surprises, puzzles, contradictions
• Balancing weak and strong features
• Admitting gaps, interpretation, errors
• Not prescriptive, but give a steer and create options
64. Theme 3: Confusion about goals and standards
• Consistently low scores on the AEQ for clear goals and standards
• Alienation from the tools
• Perceptions of marker variation, unfair standards and inconsistencies in practice
65. “We’ve got two tutors – one marks completely differently to the other and it’s pot luck which one you get.”
“They read the essay and then they get a general impression, then they pluck a mark from the air.”
“It’s like Russian roulette – you may shoot yourself and then get an A1.”
“They have different criteria, they build up their own criteria.”
66. “There are criteria, but I find them really strange. There’s ‘writing coherently, making sure the argument that you present is backed up with evidence’.”
68. Taking action: internalising goals and standards
Lecturers:
• Regular calibration exercises
• Team discussion and dialogue
Lecturers and students:
• Rewrite/co-create criteria
• Discussing exemplars
Students:
• Enter the secret garden – peer review
• Engage in drafting processes
71. References
Barlow, A. and Jessop, T. (2016) ‘“You can’t write a load of rubbish”: Why blogging works as formative assessment’, Educational Developments, 17(3), pp. 12–15. SEDA.
Boud, D. and Molloy, E. (2013) ‘Rethinking models of feedback for learning: The challenge of design’, Assessment & Evaluation in Higher Education, 38(6), pp. 698–712.
Gibbs, G. and Simpson, C. (2004) ‘Conditions under which assessment supports students’ learning’, Learning and Teaching in Higher Education, 1(1), pp. 3–31.
Harland, T., McLean, A., Wass, R., Miller, E. and Sim, K. N. (2014) ‘An assessment arms race and its fallout: High-stakes grading and the case for slow scholarship’, Assessment & Evaluation in Higher Education.
Jessop, T. and Tomas, C. (2017) ‘The implications of programme assessment on student learning’, Assessment and Evaluation in Higher Education.
Jessop, T. and Maleckar, B. (2016) ‘The influence of disciplinary assessment patterns on student learning: A comparative study’, Studies in Higher Education. Published online 27 August 2014.
Jessop, T., El Hakim, Y. and Gibbs, G. (2014) ‘The whole is greater than the sum of its parts: A large-scale study of students’ learning in response to different assessment patterns’, Assessment and Evaluation in Higher Education, 39(1), pp. 73–88.
Nicol, D. (2010) ‘From monologue to dialogue: Improving written feedback processes in mass higher education’, Assessment & Evaluation in Higher Education, 35(5), pp. 501–517.
O’Donovan, B., Price, M. and Rust, C. (2008) ‘Developing student understanding of assessment standards: A nested hierarchy of approaches’, Teaching in Higher Education, 13(2), pp. 205–217.
Sadler, D. R. (1989) ‘Formative assessment and the design of instructional systems’, Instructional Science, 18(2), pp. 119–144.
Tomas, C. and Jessop, T. (2018) ‘Struggling and juggling: A comparison of student assessment loads across research and teaching-intensive universities’, Assessment and Evaluation in Higher Education. Published online 18 April.
Wu, Q. and Jessop, T. (2018) ‘Formative assessment: Missing in action in both research-intensive and teaching-focused universities’, Assessment and Evaluation in Higher Education. Published online 15 January.
Editor's Notes
Disconnected seeing the whole degree in silos – my module, lecturer perspective (Elephant, trunk, ears, tusks etc) compared to student perspective of the whole huge beast. I realise that what we were saying is two per module
Language of ‘covering material’ Should we be surprised?
The TESTA report back of programme findings was by far the most significant meeting I have attended in ten years of sitting through many meetings at this university. For the first time, I felt as though I was a player on the pitch, rather than someone watching from the side-lines. We were discussing real issues.
(Senior Lecturer, Education)
Summative as a ‘pedagogy of control’
Teach Less, learn more. Assess less, learn more.
In the UK, assessment and feedback are primary areas of disquiet in the NSS. Provide very little diagnostic information to help course teams adopt more effective assessment strategies. Every year, routine charts red/green/orange – visual representation, accompanied by ritual humiliation of programmes, but not sure why or how to change! AEQ has been used in many countries
Looked at in the overview – student effort; intellectual challenge, focused on understanding rather than memorising or ‘sufficing’; clear about goals and stds; feedback is effective – students read it, understand it, use it to improve what they do next.
Cronbach’s Alpha – sounds like a disease to me but a test to measure the internal consistency of items – do all items measure the same construct?
Is anyone listening?
Use flip chart
Codes look at small units of meaning – a student says – it takes 4 weeks to get it back, so you’ve already handed in the next one, or the tasks are different one after the other, or I never bother at the end of the course because it’s over. All of these are partly to do with timing, and they contribute to the theme of students not using their feedback. Another sub reason is that students don’t use feedback is because they don’t trust it so they say things like If x marks it you’ll get a good grade, if y a bad, or if you get her on a good day, or it’s so subjective etc
What does it feel like to be a student? What does it feel like to be a lecturer at the end of this? An empathetic balanced reporting.
Students can increase their understanding of the language of assessment through their active engagement in: ‘observation, imitation, dialogue and practice’ (Rust, Price, and O’Donovan 2003, 152), Dialogue, clever strategies, social practice, relationship building, relinquishing power.