An evidence-informed approach to
enhancing programme-wide
assessment
TESTA to FASTECH
Dr Tansy Jessop & Yaz El Hakim, University of Winchester
Professor Paul Hyland, Bath Spa University
JISC Online Annual: 22 November 2011
Pre-Conference Activities
Pre-reading:
1) Gibbs & Simpson (2004) Conditions under which
assessment supports students' learning.
http://www2.glos.ac.uk/offload/tli/lets/lathe/issue1/articles/simpson.pdf
2) Gibbs, G. & Dunbar-Goddet, H. (2007) The effects of
programme assessment environments on student learning.
http://www.heacademy.ac.uk/assets/documents/teachingandresearch/gibbs_0506.pdf
3) Jessop, T., Smith, C. & El Hakim, Y. (2011) Programme-
wide assessment: doing ‘more with less’ from the TESTA
NTFS project. HEA Assessment & Feedback Briefing Paper.
http://www.heacademy.ac.uk/assets/documents/assessment/2011_Winchester_SS_Briefing_Report.pdf
Pre-conference questions
1) What conditions do you see as most important in student learning? (Paper 1)
2) What is your response to the idea of institutional and programme 'assessment environments' which influence assessment and feedback patterns? (Paper 2)
3) What are the main challenges and benefits of addressing assessment patterns on a whole programme? (Paper 3)
TESTA ‘Cathedrals Group’ Universities
Why TESTA has been compelling
1) The research methodology
2) It is conceptually grounded in assessment and
feedback literature
3) It’s about improving student learning
4) It is programmatic in focus
5) The change process is dialogic & developmental
Presentation Overview
1) The Research Methodology (Tansy)
2) Case study as a compelling narrative (Tansy)
3) Trends in assessment & feedback (Tansy)
Q&A
4) The student effort narrative (Yaz)
5) The bewildered student narrative (Yaz)
6) Systems-failure on feedback narrative (Yaz)
Q&A
7) A way forward: FASTECH (Paul)
Two Paradigms
Transmission
• Expert to novice
• Planned, packaged & ‘delivered’
• Feedback given by experts
• Feedback received by novices
• One way traffic
• Very little dialogue
• Emphasis on measurement
• Competition
Metaphor = mechanical system
Social constructivist model
• Participatory, democratic
• Messy and process-oriented
• Peer review
• Self-evaluation
• Social process
• Dialogue
• Emphasis on learning
• Collaboration
Metaphor = the journey
1) Research Methodology
• triangulates data from three sources
• presented in a case study
• complex, ambiguous, textured
• open to discussion - not the ‘final word’
• ‘before’ and ‘after’ data
Programme Audit
• How much summative assessment
• How much formative assessment (required, formal, generating feedback)
• How many varieties of assessment
• Proportion exams to coursework
• Word count of written feedback
• How much ‘formal’ oral feedback
• Criteria, learning outcomes, course docs
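To make the audit categories above concrete, here is a minimal, hypothetical sketch of how one programme's audit record might be captured for analysis. The field names and example values are illustrative assumptions, not the TESTA instrument itself; some figures are borrowed loosely from the case study later in the deck.

```python
from dataclasses import dataclass

@dataclass
class ProgrammeAudit:
    """Illustrative record of the audit categories listed above (names are assumptions)."""
    programme: str
    summative_tasks: int           # count of summative assessments across the degree
    formative_tasks: int           # required, formal tasks generating feedback
    assessment_varieties: int      # number of distinct assessment types
    exam_proportion: float         # exams as a fraction of all assessed tasks
    written_feedback_words: int    # total word count of written feedback
    oral_feedback_hours: float     # 'formal' oral feedback time
    documents_reviewed: list[str]  # criteria, learning outcomes, course docs

# Illustrative example only (values loosely echo figures quoted elsewhere in the deck)
example = ProgrammeAudit(
    programme="Case Study 1",
    summative_tasks=47,
    formative_tasks=3,
    assessment_varieties=15,
    exam_proportion=0.1,
    written_feedback_words=15412,
    oral_feedback_hours=6.0,
    documents_reviewed=["criteria", "learning outcomes", "module handbooks"],
)
print(example)
```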
Assessment Experience Questionnaire
version 3.3
• 28 questions
• 5 point Likert scale where 5 = strongly agree
• 9 scales and one overall satisfaction question
• Scales link to conditions of learning
• Examples:
– quantity and distribution of effort;
– use of feedback;
– quantity and quality of feedback;
– clear goals and standards
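As a rough illustration of how responses to an instrument like the AEQ can be summarised, the sketch below averages 5-point Likert items into scale scores. The item-to-scale mapping and the response data are invented for the example; the real AEQ v3.3 key for its 28 items and 9 scales is not reproduced here.

```python
from statistics import mean

# Hypothetical mapping of item numbers to scales (NOT the real AEQ v3.3 key)
SCALES = {
    "quantity_and_distribution_of_effort": [1, 2, 3],
    "use_of_feedback": [4, 5, 6],
    "quantity_and_quality_of_feedback": [7, 8, 9],
    "clear_goals_and_standards": [10, 11, 12],
}

def scale_scores(responses: dict[int, int]) -> dict[str, float]:
    """Average the 5-point Likert responses (5 = strongly agree) for each scale."""
    return {
        scale: round(mean(responses[item] for item in items), 2)
        for scale, items in SCALES.items()
        if all(item in responses for item in items)
    }

# One student's (made-up) responses, keyed by item number
student = {1: 4, 2: 3, 3: 5, 4: 2, 5: 2, 6: 3, 7: 4, 8: 4, 9: 3, 10: 2, 11: 3, 12: 2}
print(scale_scores(student))
```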
Focus groups
• What kinds of assessment
• How assessment influences your study
behaviour
• Whether you know what quality work looks like
• What feedback is like and how you use it
Research Methodology
• Assessment Experience Questionnaire (AEQ, n = 1200+)
• Focus groups (n = 50, with 301 students)
• Programme audit (n = 22 programmes)
Findings from all three sources are brought together at a programme team meeting.
2) The cases are surprising, complex, puzzling
Here is one case from the TESTA data……
Case Study 1
• Lots of coursework (47 tasks)
• Very varied forms (15 types of assessment)
• Very few exams (1 in every 10)
• Masses of written feedback on assignments
(15,412 words)
• Learning outcomes and criteria clearly
specified
….looks like a ‘model’ assessment environment
But students:
• Don't put in a lot of effort, and distribute their effort across few topics
• Don't think there is a lot of feedback or that it is very useful, and don't make use of it
• Don't think it is at all clear what the goals and standards are
……what is going on?
Your best guesses
A. Variety of assessment confuses students
B. Assessment is 'bunched' at certain times
C. The feedback is too late to be of any use
D. Teachers don’t share a common standard
E. Other
• Select your response from the buttons (A B C D E) at
the bottom-right of the list of participants
• Type any additional comments into the text-chat
What is going on?
• Teachers work hard, students less so
• Feedback is too late to be useful
• Teachers have varied standards
• Students see feedback as 'modular'
• Variety confuses students
• Formative tasks are assigned low priority
• Summative assessment drives effort
3) Trends in assessment and feedback
• High summative assessment, low formative
• High variety (average 11; range 7-17)
• Written feedback (average 7,153 words; range 2,869-15,412)
• Low oral feedback (average 6 hours)
• Watertight documents, tacit standards
• Huge institutional and programme variations:
o formative: summative ratios (134:1 cf 1:10)
o oral feedback (37 minutes to 30 hours)
Q&A
4) The effort narrative. TESTA data shows that:
• an average of 12 summative assessments per year
• across 24 teaching weeks, i.e. one every two weeks
• summative tasks end-loaded & bunched
• leading to patchy effort
• and surface learning
• with an average of three formative tasks a year…
The more you write the better you become at it…
and if we’ve only written 40 pieces over three years
that’s not a lot.
So you could have a great time doing nothing until
like a month before Christmas and you’d suddenly
panic. I prefer steady deadlines, there’s a gradual
move forward, rather than bam!
In the second year, I kept getting such good marks I
thought “If I’m getting this much without putting in
much effort that means I could do so much better if
I actually did do the hours” but it just goes up and
down really.
TESTA plus HEPI quiz
Which one is false?
A) 1 in 3 UK students study for 20 hours or less a week
B) Students on only 1 out of 7 TESTA programmes agreed
that they were working hard
C) Students work hardest when there is a high volume of
formative assessment and oral feedback
D) Students work hardest when there is a high volume of
summative assessment and written feedback
E) 1 in 3 UK students undertake > 6 hours of paid work a
week
Select your response from the buttons (A B C D E) at the
bottom-right of the list of participants
Chat box
What ideas might encourage students to put in
effort regularly on degree programmes?
• Type your responses in the text chat
Strategies to encourage student effort
Choose your top strategy to encourage effort:
A) Raise expectations in first year
B) Require more formative assessment
C) Link formative and summative tasks
D) Use more peer and self assessment
E) Design small, frequent assessed tasks
Select your response from the buttons (A B C D E) at the
bottom-right of the list of participants
Technologies that may help…
What technologies might work to spur on regular and distributed effort?
Type your responses in the text chat
5) The baffled student narrative
o The language of written criteria is difficult to understand
o Feedback does not always refer to criteria
o Students feel that marking standards vary and are subjective and arbitrary
o Students sometimes use criteria instrumentally
I’m not a marker so I can’t really think like them... I don’t
have any idea of why it got that mark.
They have different criteria, build up their own criteria.
Some of them will mark more interested in how you word
things.
You know who are going to give crap marks and who are
going to give decent marks.
Chat Box
What strategies might help students to
internalise goals and standards?
• Type your responses in the text chat
Strategies to help students know what
‘good’ is
Which strategy do you think helps most?
A) Showing students models of good work
B) Peer marking workshops
C) Lots of formative tasks with feedback
D) Plenty of interactive dialogue about standards
E) Self assessment activities
Select your response from the buttons (A B C D E) at the
bottom-right of the list of participants
6) System-wide features make it difficult for students to
use feedback and act on it
o feedback often arrives after a module, or after
submission of the next task
o tasks are not sequenced or connected across
modules, leading to lack of feed forward
o students sometimes receive grades electronically
before their feedback becomes available on
parchment in a dusty office
o technology has led to some depersonalised cut and
pasting
It’s rare that you’ll get it in time to help you on that same module.
You know that twenty other people have got the same sort of
comment.
I look on the Internet and say ‘Right, that’s my mark. I don’t need
to know too much about why I got it’.
I only apply feedback to that module because I have this fear
that if I transfer it to other modules it’s not going to transfer
smoothly.
You can’t carry forward most of the comments because you
might have an essay first and your next assignment might be a
poster.
Changes through TESTA
Structural
Thematic
Pedagogic
Module
Types of changes
1. Reduced summative assessment
2. Increased formative assessment
3. Streamlined variety
4. Raised expectations of student
workload
5. Sequenced and linked tasks across
modules
6. Practice based changes
www.testa.ac.uk
Q&A
FASTECH
Feedback and Assessment for Students with Technology
What is FASTECH?
• R&D Project (3 yrs): ‘R’ primarily with TESTA tools; ‘D’ in disciplines and universities.
• approach: teaching teams with students interpret ‘R’ data to determine goals of ‘D’.
• activities: to address QA and QE issues, optimize sector engagement (fastech.ac.uk)
• outputs: R&D findings, experiences & guides by teachers, students, others…
Pragmatic Principles?
• Fast: using readily-available technologies; quick to learn, easy to use …
• Efficient: after start-up period; saves time & effort (less paper), improves productivity …
• Effective: brings significant learning benefit to students, pedagogic impact …
FASTECH: a Pedagogical Goal
… the ability to manage one's own learning …
In each assessment culture, this entails:
• using technologies that help promote transparency & student participation in all processes, from design and management to feedback and revision (validity, reliability & fairness are not enough)
• a reshaping of teacher & student responsibilities
• processes that enhance, and create new, peer-learning activities & collaborations (in/out of class); self & peer assessment; recording, sharing & review of students' progress and achievements …
• teacher revision of pedagogies, based upon records of student progress & achievement in learning
• attuning of assessment to address individual & distinctive needs & aspirations …
Student baggage and blocks:
• ideas about the roles of students & teachers
• all can be strategic!
• …
Teacher baggage and blocks:
• ideas about the role of assessment
• unsure about the value of feedback
• assessment & marking conflated
• criteria & standards
• …
Finally,
for an excellent overview
of technologies and pedagogies
JISC, Effective Assessment in a Digital Age. Bristol: HEFCE, 2010.
Available at: www.jisc.ac.uk/digiassess (esp., pp. 14-15, 54-55)
For resources associated with this publication:
www.jisc.ac.uk/assessresource
Please contact us for more info about TESTA and FASTECH:
Tansy.Jessop@winchester.ac.uk
Yassein.El-Hakim@winchester.ac.uk
p.hyland@bathspa.ac.uk
Websites: www.testa.ac.uk & www.fastech.ac.uk (from January 2012)
Thank You
DISCUSSION
to be continued in the conference discussion forum
How do you think using technology in A&F will
improve students’ learning?
References
Black, P. & Wiliam, D. (1998) 'Assessment and Classroom Learning', Assessment in Education: Principles, Policy and Practice, 5(1): 7-74.
Bloxham, S. & Boyd, P. (2007) 'Planning a programme assessment strategy', Chapter 11 (pp. 157-175) in Developing Effective Assessment in Higher Education. Berkshire: Open University Press.
Boud, D. (2000) 'Sustainable Assessment: Rethinking assessment for the learning society', Studies in Continuing Education, 22(2): 151-167.
Gibbs, G. & Simpson, C. (2004) 'Conditions under which assessment supports students' learning', Learning and Teaching in Higher Education, 1(1): 3-31.
Gibbs, G. & Dunbar-Goddet, H. (2007) The effects of programme assessment environments on student learning. Higher Education Academy. http://www.heacademy.ac.uk/assets/York/documents/ourwork/research/gibbs_0506.pdf
Gibbs, G. & Dunbar-Goddet, H. (2009) 'Characterising programme-level assessment environments that support learning', Assessment & Evaluation in Higher Education, 34(4): 481-489.
Jessop, T., El Hakim, Y. & Gibbs, G. (2011) 'TESTA: Research inspiring change', Educational Developments, 12(4). In press.
Jessop, T., McNab, N. & Gubby, L. (2012, forthcoming) 'Mind the gap: An analysis of how quality assurance procedures influence programme assessment patterns', Active Learning in Higher Education, 13(3).
Knight, P. T. & Yorke, M. (2003) Assessment, Learning and Employability. Maidenhead: Open University Press.
Nicol, D. J. & Macfarlane-Dick, D. (2006) 'Formative Assessment and Self-Regulated Learning: A Model and Seven Principles of Good Feedback Practice', Studies in Higher Education, 31(2): 199-218.
Nicol, D. (2010) 'From monologue to dialogue: improving written feedback processes in mass higher education', Assessment & Evaluation in Higher Education, 35(5): 501-517.
Sambell, K. (2011) Rethinking Feedback in Higher Education. Higher Education Academy ESCalate Subject Centre Publication.
Editor's Notes
1. 22 x 18 = 400-ish modules, etc.