1. The TESTA project used assessment mapping, student surveys, and case studies of programmes to analyse patterns of assessment and their impact on student learning.
2. Case studies showed that programmes with many small summative assessments and unclear standards led to superficial student effort, while continuous formative assessment supported deep learning.
3. Most programmes had assessment patterns that did not encourage regular student effort, clear communication of expectations, or useful feedback that students could act on.
4. The TESTA project aimed to catalyse evidence-informed changes to assessment practice through structural change, improved assessment quality, and the development of a community of practice around assessment design.
An evidence-based model to enhance programme-wide assessment using technology: TESTA to FASTECH. Presented by Tansy Jessop and Yaz El-Hakim (University of Winchester) and Paul Hyland (Bath Spa University). Facilitated by Mark Russell (University of Hertfordshire).
Jisc conference 2011
TESTA - UNSW, Sydney Australia (September 2011)
1. TESTA: an evidence-informed approach to improving programme assessment
Dr Tansy Jessop, Senior Fellow
TESTA Project Leader
UNSW, Sydney
31 August 2011
2. TESTA background
• Conceptual framework (Gibbs & Simpson, 2004)
• Assessment Experience Questionnaire
• Programme-level assessment patterns (Gibbs and Dunbar-Goddet, 2007, 2009)
• University of Winchester L&T projects – exploring the idea of an assessment ‘culture’
3. What is the TESTA project?
• Funded 3-year HEA project (2009-12)
• Originally 4 partners, 7 programmes
• Programme assessment mapping
• Student voice assessment principles
• Quality frameworks
• Diagnosis – Intervention – Evaluation
4. Research Methodology (drawing on Gibbs and Dunbar-Goddet, 2008, 2009)
• Assessment Experience Questionnaire (AEQ, n = 1200+)
• Focus groups (n = 50, with 301 students)
• Programme audit (n = 22)
• Programme team meeting
• Case study
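The slides that follow report AEQ findings as counts of programmes whose students "agreed" with a condition (e.g. only 1 of 7 programmes). As an illustrative sketch only (TESTA does not publish its analysis scripts, and the data and names below are invented), one way such a programme-level figure could be derived from per-student Likert responses:

```python
# Hypothetical illustration of aggregating AEQ Likert responses to
# programme level. All programme names, scores, and the threshold
# are invented for the example; this is NOT the TESTA analysis code.

from statistics import mean

# programme -> per-student scale scores on one AEQ condition,
# e.g. "assessment encourages regular effort" (1 = strongly
# disagree ... 5 = strongly agree)
aeq_responses = {
    "History": [4.1, 3.9, 4.3, 3.8],
    "Media":   [2.4, 2.9, 2.1, 2.6],
    "Nursing": [3.1, 2.8, 3.0, 2.7],
}

AGREE_THRESHOLD = 3.5  # assumed cut-off for counting a programme as "agreeing"

def programmes_agreeing(responses, threshold=AGREE_THRESHOLD):
    """Return the programmes whose mean scale score exceeds the threshold."""
    return [prog for prog, scores in responses.items()
            if mean(scores) > threshold]

print(programmes_agreeing(aeq_responses))  # ['History'] -> 1/3 programmes agree
```

The per-programme mean, rather than the pooled student mean, is what matters here, because TESTA's unit of analysis is the programme assessment environment, not the individual student.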
6. TESTA Case Study 1: what’s going on?
• Lots of coursework, of very varied forms
• Very few exams
• Masses of written feedback on assignments
• Learning outcomes and criteria clearly specified
…looks like a ‘model’ assessment environment.
But students:
• Don’t put in a lot of effort, and distribute their effort across few topics
• Don’t think there is a lot of feedback or that it is very useful, and don’t make use of it
• Don’t think it is at all clear what the goals and standards are
7. TESTA Case Study 2: what’s going on?
• 35 summative assessments
• No formative assessment specified in documents
• Learning outcomes and criteria wordy and woolly
• Marking by global, tacit, professional judgements
• Teaching staff mainly part-time and hourly paid
…looks like a problematic assessment environment.
But students:
• Put in a lot of effort and distribute their effort across topics
• Have a very clear idea of goals and standards
• Are self-regulating and have a good idea of how to close the gap
8. Case Study 1:
• Staff do loads of work, and it doesn’t really work for students.
• Students are unable to evaluate their own performance.
• Students don’t take control of their own learning.
• Summative assessment drives effort, but not necessarily engagement and learning.
9. Case Study 2:
• Students do loads of work, and it works well as a programme.
• Students are continually engaged in evaluating their own and others’ performance.
• Students control and manage their own learning.
• Formative assessment drives effort, engagement and learning.
10. UK assessment trends in HE
• Ratio of formative to summative assessment
• Diversity of assessment methods (1 to 20)
• Innovation
• Fragmentation into modular courses
• Multiplicity of learning outcomes
• Shift from exams to coursework
• Emphasis on written criteria and guidance
• Resource constraints, large classes
• Wide differences between universities
11. “Conditions under which assessment supports student learning”
1. Assessed tasks capture sufficient student time and effort.
2. These tasks distribute effort evenly across topics and weeks.
12. Quantity of effort on TESTA
• Audit: mean of 36 summative assessments per programme
• AEQ: students from only 1/7 programmes agreed that assessment encouraged regular effort
• What students said:
“I think we could do with more assessments over the course of the year to make sure that people are actually doing stuff.”
“The more you write the better you become at it… and if we’ve only written 40 pieces over three years that’s not a lot.”
13.
• So you could have a great time doing nothing until like a month before Christmas and you’d suddenly panic. I prefer steady deadlines, there’s a gradual move forward, rather than bam!
• If it’s not going to be in the exam, and it is quite difficult, I wouldn’t bother with that.
• I personally think sometimes we get too much of this end or half way through the term essay type things. Continual assessments would be so much better.
• I prefer exams to be honest because I feel then you actually have to put the effort in and you have to revise everything that you’ve learnt, so it sticks with me a bit more.
14. Conditions and principles that support student learning
3. Assessment communicates clear and high expectations to students (Gibbs, 2005).
4. Students should:
(a) know what is expected,
(b) know how this relates to their actual performance, and
(c) have some information about how to close the gap (Sadler, 1989).
15. Clear goals and standards
• Audit: very clear written statements, mapped to outcomes; low formative feedback; huge variety of types
• AEQ: students on 0/7 programmes agreed that goals and standards were clear.
• What students say:
“There are criteria, but I find them really strange. There’s ‘writing coherently, making sure the argument that you present is backed up with evidence’, but that isn’t enough I’ve noticed.”
16.
“I’m not a marker so I can’t really think like them... I don’t have any idea of why it got that mark.”
“They have different criteria, build up their own criteria. Some of them will mark more interested in how you word things.”
“It’s easier to know if you’ve done a good essay than a creative piece... it’s very subjective.”
“You know who are going to give crap marks and who are going to give decent marks.”
17.
“I said ‘Oh how could I have got that to an A?’ Because I must have mentioned it when I was talking about my dissertation to her and she was like ‘But a B+ is really good’ and I was like ‘But it’s not an A!’ and she was like ‘But a B+ is really good. You’ve done really well to get that’ and I was like ‘But I’m not asking that. I’m asking why I didn’t get an A for it. What could I have put in that I could have got an A for it?’ But she was like ‘You should be really proud of yourself’.”
18.
19. Press the pause button
a) Fill in the AEQ from your own experience as a student.
b) One-minute written reflections:
• What’s the best bit of feedback you’ve ever received?
• Why was it useful?
• In small groups, extract principles of good feedback.
20. More conditions for learning (Gibbs & Simpson, 2004)
5) Sufficient feedback is provided, both often enough and in enough detail.
6) It is quick enough for students to act on it.
7) It focuses on learning rather than marks.
8) It is understandable.
9) Students attend to feedback and act on it to improve their learning.
21. What the research says about feedback
“High-level and complex learning is best developed when feedback is viewed as a relational process, that takes place over time, is dialogic, and is integral to teaching and learning” (Sambell, 2011).
Students’ dominant view of ‘feedback’ is largely confined to written tutor comments (Glover, 2006).
“Mass higher education is squeezing out dialogue, with the result that written feedback ...is now having to carry the burden of teacher-student interaction” (Nicol, 2010).
We need to shift away from feedback which does not allow for the possibility of a response (Boud, 1995).
22. Quantity and quality of feedback
• Audit: high volumes of written feedback, little oral feedback, delivered slowly
• AEQ: students from 0/7 programmes agreed that they received sufficient feedback on time
“It’s rare that you’ll get it in time to help you on that same module.”
“You struggle to find something in that comment that’s useful for you to carry on and improve your future pieces of work.”
23.
• I don’t find the written feedback that useful because you’ve only got a tiny bit of space to write. It gives you a rough idea but you need a tutorial really.
• It’s ‘Well done’ or ‘You’re doing well’... I’d rather that they ripped it to pieces so that I could go from there.
• You know that twenty other people have got the same sort of comment.
• It’s like they say ‘Oh yes, I don’t know who you are. Got too many to remember, don’t really care, I’ll mark you on your assignment’.
24. Use of feedback
• Audit: high volumes of written feedback, little oral feedback, delivered slowly
• AEQ: students from only 1/7 programmes agreed that feedback was useful
“When it comes to feedback on assignments I read it and think ‘Well that’s fine, but I’ve already handed it in now and got the mark. It’s too late’.”
“It was about nine weeks before we got it back. I’d forgotten what I’d written.”
25.
Once the deadline comes up to just look on the Internet and say ‘Right, that’s my mark. I don’t need to know too much about why I got it’.
I only apply feedback to that module because I have this fear that if I transfer it to other modules it’s not going to transfer smoothly.
You can’t carry forward most of the comments because you might have an essay first and your next assignment might be a poster.
At the end of the day, getting feedback from other students in my class, I can relate more to what they’re saying and take it on board. I think I’d just shut down if I was getting constant feedback from my lecturer.
26. TESTA: a catalyst for change
• Patterns have become routine
• TESTA offers a reflective possibility: a set of tools and evidence, based on student voice, to take more informed action and have more agency in determining assessment patterns
29. Structural examples
• Change the credit weighting of modules: ‘long thin’ rather than ‘short fat’ courses
• Reduce summative assessments and put in place one capstone assessment
• Change assessment regulations and validation processes
• Change feedback mechanisms
31. Pedagogic
• Using more peer and self-assessment and feedback
• Introducing more in-class formative tasks
• Assigning value to formative through social pressure and public production
• Using technology to link and enrich assessment tasks
32. Modular/unit/course level
• Ingenious ideas at a micro-level that have limited effect on the programme
• Lack of community-of-practice effects, team working, etc.
• Individualistic sparks of genius with no systemic influence
34. References
Boud, D. (2000) Sustainable Assessment: Rethinking assessment for the learning society. Studies in Continuing Education, 22(2): 151-167.
Gibbs, G. & Simpson, C. (2004) Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1(1): 3-31.
Gibbs, G. & Dunbar-Goddet, H. (2009) Characterising programme-level assessment environments that support learning. Assessment & Evaluation in Higher Education, 34(4): 481-489.
Glover, C. and Brown, E. (2006) Written feedback for students: too much, too detailed or too incomprehensible to be effective? Bioscience Education E-journal, 7-3.
Jessop, T., McNab, N. and Gubby, L. (2012) Mind the gap: An analysis of how quality assurance procedures influence programme assessment patterns. Active Learning in Higher Education, 13(3).
Nicol, D. (2010) From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education, 35(5): 501-517.
Sadler, D.R. (1989) Formative assessment and the design of instructional systems. Instructional Science, 18: 119-144.
Sambell, K. (2011) Rethinking Feedback in Higher Education. Higher Education Academy ESCalate Subject Centre Publication.
Editor's Notes
Einstein story/chauffeur etc.
Graham’s sabbatical – literature review, disciplinary takes on assessment. Graham’s a psychologist. Questionnaire – AEQ measures the extent of these conditions, bit like the CEQ, but focused on assessment – I’ll take you through an example later in the presentation. When Graham was Director of L&T at Oxford, after a lifetime working in Britain’s ‘new’ universities (or old polytechnics) he began to play with the idea that there are distinctive assessment environments in programmes and institutions, and used an early version of TESTA to test his hypothesis. My background very much as a teacher, teacher developer, and career researcher from the qualitative tradition, story/narrative/the power of voice etc, been doing some work at Winchester from this sort of touchy-feely patch on assessment.
Multi-level – institutional – touching various areas, including quality, management, structures
Robust, thematic, particular
22 x 18 = 400 ish modules etc
I know 94 of your courses have been subject to review, which is a bit like our programme audit, although there are some differences in emphasis; this is a qualitative representation of what seems to be going on in a programme of study assessment-wise. The two triangulating methodologies of the AEQ and focus groups are student experience data – student voice etc. Three legged stool. These three elements of data are compiled into a case profile which captures the interaction of an academic’s programme view, the ‘official line’ or discourse of assessment and how students perceive it. This is a very dynamic rendering because student voice is explanatory, but also probes some of our assumptions as academics about how students work and how assessment works for them etc. Finally the case profile is subject to discussion and contextualisation by insiders – the people who teach on the programme, who prioritise interventions.
Mass higher education; in the good old days – when Graham was a student – much more formative assessment. The unitisation of courses has led to much more summative – what you call courses, we call modules etc. Let a 1000 flowers bloom – diversification of varieties – wonderfully creative but can create problems – one of our programmes 17 varieties, 32 summative, no formative – how much practice do students get? One of the safety nets – written criteria guidelines – mass higher education thing, which means sometimes lack of dialogue and internalisation of those standards. Larger classes, students anonymous cyphers – David Nicol – students looking for a pedagogic relationship in their feedback – impoverished dialogue. Vast differences – Oxbridge – 134 formative opps to one summative for example. When you look at audit data you will see massive variations in practice.
We’ve had a look at some case snapshots, and how they reflect changing assessment patterns. Now what I’d like to do is take you through a thematic slicing of the data to show how these interact with Graham’s conditions for student learning, but also what the major themes are, and how they are reflected in the data.
Time on task principle. HEPI average number of self-directed hours per week – Media Studies 9 years to complete a degree if you equate with Bologna process. Australia – first year experience report certainly your students appear to be spending more time on self-directed learning than ours etc 2) you guys also ahead on sequencing/not bunching assessment – I see a lot of emphasis in the EMPA report on when the first task is set etc. Unitisation has done things to how we assess...
Holland/Alps diagram Philosophical tensions between creating sufficient scaffolding and independent learning/ time management etc..
Chickering and Gamson, Ivy League, 1987. What are the key things that create a good learning environment in HE. 7 principles.
Very difficult to communicate, tacit standards, the four quadrant matrix
Marker variation, how well embedded the criteria are, how well used, recognition of subjectivity, ‘hawks and sparrows’
Sad; a mixture of a quite instrumental student and a lecturer who can’t put into words how to improve. Or is that the problem? Psychologist Dweck. Symptomatic of the idea that ability/intelligence is fixed rather than that people learn and grow. A deep and subconscious idea lodged in many a lecturer’s head that a student is an A student, a B student etc. I think this may be at the heart of this miscommunication.
So really clear written criteria, reasonable quantities of written feedback, and students are still sniffing at a muddled cocktail of ingredients – because so many of the standards are tacit, and learned through social processes, dialogue, communities of practice, practice and feedback; initiation into disciplines rather than the bland words of a criteria statement. Many students miss Sadler’s first point about how you need firstly to know what you are aiming at...then understand the difference between your performance and the standard in order to close the gap.
The most powerful single influence on student achievement is feedback (Ramsden 1992).
Extraordinary influence on learning (Black and Wiliam study)
Change is inevitable - except from a vending machine
One of the hidden benefits of TESTA is the team discussion it engenders. Soccer pitch. Professional development about assessment – not such an issue in Australia at the forefront of L&T. Allows teams to identify where they need to target action and intervention. It’s a whole team approach – course level changes rarely impact whole programmes. Multi-layered – not just about how best to assess students on the ground, it’s also about how to find best pedagogic fit with institutional frameworks which govern assessment. Ultimately the outcome TESTA is aiming at, is improved student learning through assessment – where it’s not a summative conveyor belt, where there is a balance between assessment for learning and measurement, authentic assessment tasks etc.
It’s early days for us. Long view of change. Within the TESTA pack of 8 programmes, consisting of some 160-plus courses/modules, we have found teams do different things. Big-bang changes affect the degree structure and require revalidation – one programme has done this. Thematic changes slice through one area of data, like feedback or clear goals and standards, and address it across the whole programme. Pedagogic changes – the ‘aha’ – where the whole programme team suddenly ‘gets’ the value of formative assessment and takes a consistent line. Course level = micro-level changes.