1. Getting to grips with TESTA methods
• How do I marry up the data from the audit, AEQ and focus groups?
• What is the AEQ?
• What does the data tell me about the programme?
• What can I learn from a focus group?
2. Activity One: mock audit
• Programme Team Meeting
• Assessment Experience Questionnaire (AEQ)
• TESTA Programme Audit
• Student Focus Groups
3. The Audit: Caveats
1. Audit is not everything
2. Official discourse
3. Planned curriculum
4. Some better data, some weaker, some gaps
4. Summary of audit data
• Some context
• Number of summative assessments
• Number of formative assessments
• Varieties of assessment
• Proportion of exams
• Written feedback
• Speed of return of feedback
5. Mock Audit
To find out more: Download 8 steps to auditing a programme
https://www.testa.ac.uk/index.php/resources/research-tool-kits
6. Guesstimate for the whole degree, based on a six-month audit:
84 summative; 42 formative; 10-15 varieties; 25-30% by exam
7. Make sense of audit data
1) What is striking about the data?
2) What surprises or puzzles you?
3) What would you want to know more about?
Each table, look at:
• 1 x overview data OR 1 x discipline
• Briefly discuss in relation to the questions
8. Assessment features across a 3 year UG degree (n=73)
Characteristic                       Range
Summative                            12 - 227
Formative                            0 - 116
Varieties of assessment              5 - 21
Proportion of examinations           0% - 87%
Time to return marks & feedback      10 - 42 days
Volume of oral feedback              37 - 1,800 minutes
Volume of written feedback           936 - 22,000 words
9. Typical A&F patterns
73 programmes in 14 unis (Jessop and Tomas 2017)
Characteristic                       Low                Medium         High
Volume of summative assessment       Below 33           40-48          More than 48
Volume of formative only             Below 1            5-19           More than 19
% of tasks by examinations           Below 11%          22-31%         More than 31%
Variety of assessment methods        Below 8            11-15          More than 15
Written feedback in words            Less than 3,800    6,000-7,600    More than 7,600
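The low/medium/high cut-points in the table read like band splits across the 73 programmes. As a rough illustration only - the sample counts and the percentile choice below are invented, and the published thresholds come from the TESTA analysis itself - such a banding could be sketched as:

```python
import numpy as np

# Invented summative-assessment counts for a few programmes; the real
# TESTA dataset spans 73 programmes with a range of 12-227.
summative = np.array([12, 25, 33, 40, 44, 48, 60, 90, 227])

# One plausible way to derive low/medium/high bands: split at the
# 33rd and 67th percentiles of the observed distribution.
low_cut, high_cut = np.percentile(summative, [33, 67])

def band(value):
    """Classify a programme's count against the percentile cut-points."""
    if value < low_cut:
        return "low"
    if value <= high_cut:
        return "medium"
    return "high"

print(band(30), band(44), band(100))  # → low medium high
```

A programme team can then see at a glance whether their own audit numbers sit at the extremes of the sector distribution.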
11. “The scope of information that you need to know for that module is huge… so you’re having to revise everything - at the same time, you want to write an in-depth answer” (Student, TESTA data).
Heavy workloads lead to surface learning (Lizzio, Wilson and Simons 2002).
13. Why a questionnaire about assessment?
• Weak NSS scores on assessment and feedback
• Quick, large-scale data
• Quantitative and qualitative
14. AEQ 3.3 (2003)
• Designed to measure ‘conditions under which
assessment supports learning’
• Based on theory and evidence + selected CEQ scales
• Robust factor structure and scale coherence – does it measure what it’s meant to measure?
15. AEQ 4.0 (2018)
• Fill in the AEQ 4.0 from the vantage point of
being a student in one of your classes
• Paper or online?
• https://educ.sphinxonline.net/v4/s/ha2fbs
16. Comparing Audit and AEQ data
from one programme
In pairs or groups, explore programme
audit and AEQ data from one programme.
Does anything stack up? Are there loose
ends, questions, contradictions?
18. Have a go at triangulating data
• Read through audit, AEQ and focus group data from one
programme
• Write a quick abstract or bullet points of what seems to be going on
• Discuss with your group:
a) What are the stand out themes?
b) Which jigsaw pieces fit together?
c) What unresolved issues remain?
19. Main pointers for focus group
• Questions are broad themes
• Move from easy to complicated
• Sit in a circle
• It’s the discussion that matters
• Go with the flow
• But steer when off topic: redirect, pass the ball
• Troubleshooting
• Ethics
To find out more: How to run a focus group (under Resources > Research toolkits)
http://www.testa.ac.uk/index.php
20. Getting students to attend…
• Get the support of lecturers, programme team
• Explore using student researchers
• Use vouchers
• Food
• Between 3 and 8 students for one hour
• Ethics and confidentiality
In the UK, assessment and feedback are primary areas of disquiet in the NSS, yet the NSS provides very little diagnostic information to help course teams adopt more effective assessment strategies. Every year, routine charts of red/green/orange give a visual representation, accompanied by ritual humiliation of programmes, but with no clear sense of why or how to change. The AEQ has been used in many countries.
The overview looks at: student effort; intellectual challenge; a focus on understanding rather than memorising or ‘sufficing’; clarity about goals and standards; and whether feedback is effective – students read it, understand it, and use it to improve what they do next.
Cronbach’s alpha – sounds like a disease, but it is a test of the internal consistency of a set of items: do all the items measure the same construct?
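For concreteness, Cronbach’s alpha is computed from a respondents-by-items score matrix as α = k/(k−1) · (1 − Σ item variances / variance of total scores). The responses below are invented for illustration, not real AEQ data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses from six students on four items
responses = [
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 2, 3, 3],
]
print(round(cronbach_alpha(responses), 2))  # → 0.93
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, though the threshold is a rule of thumb rather than a hard test.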
Tansy – use flip chart
Codes look at small units of meaning. A student says: ‘it takes four weeks to get it back, so you’ve already handed in the next one’, or ‘the tasks are different one after the other’, or ‘I never bother at the end of the course because it’s over’. All of these are partly to do with timing, and they contribute to the theme of students not using their feedback. Another reason students don’t use feedback is that they don’t trust it, so they say things like ‘if x marks it you’ll get a good grade, if y a bad one’, or ‘it depends on whether you get her on a good day’, or ‘it’s so subjective’.
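The bookkeeping behind this coding step can be mimicked mechanically: tally each coded extract under its parent theme. Everything here (code labels, theme names, the extracts) is a made-up illustration of the mechanics, not TESTA’s actual coding frame:

```python
from collections import Counter

# Hypothetical mapping from focus-group codes to broader themes;
# the code labels paraphrase the student quotes discussed above.
theme_of = {
    "feedback returned after next hand-in": "timing",
    "tasks differ one after the other": "timing",
    "no point at end of course": "timing",
    "grade depends on the marker": "trust",
    "marking feels subjective": "trust",
}

# Extracts from a transcript, already tagged with codes
coded_extracts = [
    "feedback returned after next hand-in",
    "marking feels subjective",
    "tasks differ one after the other",
    "grade depends on the marker",
]

# Count how many extracts support each theme
theme_counts = Counter(theme_of[c] for c in coded_extracts)
print(theme_counts)
```

In practice the coding itself is a manual, interpretive step; the tally simply shows which themes carry the most evidence.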