Myths to Truths: Acting on evidence about student learning from assessment
1. Myths to Truths: Acting on evidence
about student learning from
assessment
Dr Tansy Jessop, Head of Learning & Teaching,
University of Winchester
Assessment for Learning Symposium
Durban University of Technology
9 October 2014
2. Myths about assessment and
feedback
Sisyphus rolls a boulder
up a hill
“an eternity of endless
labour, useless effort and
frustration”
Homer, 8th Century BC
3. 21st century equivalent
“You end up assessing for
assessment’s sake rather than
thinking about what the assessment
is for”.
Programme Leader, Winchester
(2008)
4. Three foundations
1) Assessment drives what students pay
attention to, and defines the actual
curriculum (Ramsden 1992).
2) Feedback is significant (Hattie, 2009; Black
and Wiliam, 1998)
3) Programme is central to influencing change.
6. Troubling prepositions
Assessment of Learning (Gipps 1994)
Assessment for Learning (Gipps 1994)
Assessment as learning (Torrance 2007)
7. Assessment for Learning
Formative assessment
The relationship between formative and summative
Feedback feeding forward
Helping students internalise goals and standards
More for less? Efficiencies?
Curriculum design
8. Three TESTA premises
1) Assessment drives what students pay
attention to, and defines the actual
curriculum (Ramsden 1992).
2) Feedback is significant (Hattie, 2009; Black
and Wiliam, 1998)
3) Programme is central to influencing change.
9. Today’s talk
TESTA: an evidence-based approach
Four areas of AFL
Evidence for common myths
Making AFL work – ideas, practices,
examples
10. TESTA: Transforming the Experience
of Students through Assessment
Higher Education Academy funded research project
(2009-12)
Evidence-based research and change process
Programme the central unit of change
Based on assessment for learning principles
12. Edinburgh
Edinburgh Napier
Greenwich
Canterbury Christchurch
Glasgow
Lady Irwin College, University of Delhi
Sheffield Hallam
University of West Scotland
13. TESTA
“…is a way of thinking
about assessment and
feedback”
Graham Gibbs
14. Based on assessment for learning
principles
Students need to distribute effort and spend time
on task
Tasks with challenging and high expectations
Internalising goals and standards
Prompt feedback
Detailed, high quality, developmental feedback
Dialogic cycles of feedback
Deep learning – beyond factual recall
15. TESTA Research Methods
(Drawing on Gibbs and Dunbar-Goddet, 2008, 2009)
2000 Assessment Experience Questionnaires
45 programme audits
90 focus groups
Programme team meetings
16. Myth 1: Modules and semesters help
students to learn better
The weak spot?
17. Is the module the right metaphor
for learning?
modulus (Latin): small measure
“interchangeable units”
“standardised units”
“sections for easy constructions”
“a self-contained unit”
18. How well does IKEA 101 packaging
work for Engineering 101?
Furniture: Bite-sized; Self-contained; Interchangeable; Quick and instantaneous; Standardised; Comes with written instructions; Consumption
Student learning: Long and complicated; Interconnected; Distinctive; Slow, needs deliberation; Varied, differentiated; Tacit, unfathomable, abstract; Production
19. What students say about the
system…
It’s difficult because your assignments are so detached from the next
one you do for that subject. They don’t relate to each other.
Because it’s at the end of the module, it doesn’t feed into our future
work.
We don’t get much time to think. We finish one assignment and the
other one is knocking at the door.
In the annual system there is more time to learn and do assignments.
In the semester system everything is so rushed up. In the annual
system the teachers say that they had more time to explain in detail.
20. …about shared practices
You’ll get really detailed, really commenting feedback from one
tutor and the next tutor will just say ‘Well done’.
Some of the lecturers are really good at feedback and others don’t
write feedback, and they seem to mark differently. One person will
tell you to reference one way and the other one tells you
something completely different.
21. …about shared standards
Every lecturer is marking it differently, which confuses people.
We’ve got two tutors- one marks completely differently to the
other and it’s pot luck which one you get.
They have different criteria, they build up their own criteria.
Q: If you could change one thing to improve what would it be?
A: More consistent marking, more consistency across everything
and that they would talk to each other.
23. TESTA changes based on evidence
1) Integrated assessment across modules
2) Multi-stage assessments (formative feedback feeding
forward)
3) Changes in quality assurance and degree validation processes – from Lego assembly of degrees module by module via email, to discussion and team-based development
4) Strengthening team approaches to marking
24. Myth 2: Assessment of learning is
more important because it counts
Hercules attacked the many
heads of the hydra, but as
soon as he smashed one
head, two more would
burst forth in its place!
Peisander 600BC
25. How much summative assessment is
taking place?
Range of UK summative assessment: 12–68 over three years
Indian and NZ universities: hundreds of small assessments – busywork, grading as ‘pedagogies of control’
Average in UK: about two per module, about 40 over three years
27. A student’s lecture to professors
The best approach from the student’s perspective is to focus on
concepts. I’m sorry to break it to you, but your students are not
going to remember 90 per cent – possibly 99 per cent – of what
you teach them unless it’s conceptual…. when broad, over-arching
connections are made, education occurs. Most details
are only a necessary means to that end.
http://www.timeshighereducation.co.uk/features/a-students-lecture-to-professors/2013238.fullarticle#.U3orx_f9xWc.twitter
28. What students say…
A lot of people don’t do wider reading. You just focus on your essay
question.
I always find myself going to the library and going ‘These are the books
related to this essay’ and that’s it.
Although you learn a lot more than you would if you were revising for an
exam, because you have to do wider research and stuff, you still don’t do
research really unless it’s directly related to essays.
Unless I find it interesting I will rarely do anything else on it because I
haven’t got the time. Even though I haven’t anything to do, I don’t have
the time, I have jobs to do and I have to go to work and stuff.
29. Effort Map: High Veld Vs Alps
(Chart: student effort across the semester, weeks 6 to 12, ranging from low to maximum effort)
31. TESTA changes based on evidence
Reducing summative assessment
Increasing and making formative assessment work
Linking formative tasks to summative
Setting tasks involving research, case studies and
authentic assessment
Focusing on assessment which builds up and is aligned
with learning on the module – constructive alignment
32. Myth 3: Formative assessment is
difficult to do, and not worth doing
33. Defining formative assessment
“Definitional fuzziness” Mantz Yorke (2003)
Basic idea is simple – to contribute to student learning
through the provision of information about
performance (Yorke, 2003).
A fine tuning mechanism for how and what we learn
(Boud 2000)
35. What students say about formative
tasks…
It was really useful. We were assessed on it but we weren’t officially given a
grade, but they did give us feedback on how we did.
It didn’t actually count so that helped quite a lot because it was just a
practice and didn’t really matter what we did and we could learn from
mistakes so that was quite useful.
I find more helpful the feedback you get in informal ways week by week, but
there are some people who just hammer on about what will get them a
better mark.
He’s such a better essay writer because he’s constantly writing. And we
don’t, especially in the first year when we really don’t have anything to do.
The amount of times formative assignments could have taken place…
36. What prevents students from doing
formative tasks…
If there weren’t loads of other assessments, I’d do it.
If there are no actual consequences of not doing it, most students are
going to sit in the bar.
It’s good to know you’re being graded because you take it more
seriously.
I would probably work for tasks, but for a lot of people, if it’s not
going to count towards your degree, why bother?
The lecturers do formative assessment but we don’t get any feedback
on it.
37. Why summative can’t do the
same job as formative
Grades: “the administrative device that actively diverted students from really learning anything” (Becker, 1968).
Feedback on summative tasks is more readily dismissed when there is a grade (Black and Wiliam, 1998; Orrell, 2006; Taras, 2002; 2008).
Timing of summative tasks, often too late to act on
feedback.
39. Good cop, bad cop?
False dichotomy is unhelpful
Rebuilding formative-summative relationship
Linked, integrated, multi-stage assessment
40. How can we improve the currency
and practice of formative tasks?
41. TESTA changes based on evidence
Increase formative assessment
Require formative tasks, using QA and validation processes
Public tasks to motivate students to undertake formative
tasks (presentations, posters, blogs)
Authentic and challenging tasks linked to research, case
studies and large projects
Multi-stage tasks – formative to summative
Set expectations about formative in first year
Be consistent as a programme
43. Ideas for embedding formative tasks
1. The Case of American Studies (Multi-stage linked
formative-summative tasks)
2. The Case of BA Primary (Blogs)
3. The Case of MA Education (Triads)
4. The Case of Sports Psychology (Expectation setting)
5. Formative MUST be programmatic – it won’t work if
one keen maverick lecturer does it.
44. Myth 4: Feedback is written
monologue from lecturer to student
Getting feedback from other students in my class
helps. I can relate to what they are saying and take it
on board. I’d just shut down if I was getting constant
feedback from my lecturer.
I read it and think “Well, that’s fine but I’ve already
handed it in now and got the mark. It’s too late”.
45. What students say…
I read through it when I get it and that’s about it really. They
all go in a little folder and I don’t look at them again most of
the time. It’s mostly the mark really that you look for.
I’m personally really bad at reading feedback. I’m the kind of
person, and I hate to admit it, but I’ll look at the mark and
then be like ‘well stuff it, I can’t do anything about it’.
50. TESTA changes based on evidence
Cycles of feedback through self and peer review of work
Developing dialogue through cover sheets
Students initiating feedback through questions
Using technology to personalise feedback
Getting students to give feedback to teachers –
formative evaluation
51. Impacts at Winchester
Improvements in NSS scores on A&F – from bottom
quartile in 2009 to top quartile in 2013
Three programmes with 100% satisfaction ratings post
TESTA
All TESTA programmes have some movement upwards
on NSS A&F scores
Programme teams are talking about A&F and pedagogy
Periodic review processes are changing for the better.
53. References
Becker, H. (1968) Making the Grade: The Academic Side of College Life.
Boud, D. (2000) Sustainable assessment: rethinking assessment for the learning society. Studies in Continuing Education. 22(2): 151-167.
Gibbs, G. & Simpson, C. (2004) Conditions under which assessment supports students' learning. Learning and Teaching in Higher Education. 1(1): 3-31.
Gibbs, G. & Dunbar-Goddet, H. (2009) Characterising programme-level assessment environments that support learning. Assessment & Evaluation in Higher Education. 34(4): 481-489.
Harland, T. et al. (2014) An assessment arms race and its fallout: high-stakes grading and the case for slow scholarship. Assessment and Evaluation in Higher Education. http://www.tandfonline.com/doi/abs/10.1080/02602938.2014.931927
Hattie, J. & Timperley, H. (2007) The power of feedback. Review of Educational Research. 77(1): 81-112.
Jessop, T. and Maleckar, B. (2014) The influence of disciplinary assessment patterns on student learning: a comparative study. Studies in Higher Education. Published online 27 August 2014. http://www.tandfonline.com/doi/abs/10.1080/03075079.2014.943170
Jessop, T., El Hakim, Y. and Gibbs, G. (2014) The whole is greater than the sum of its parts: a large-scale study of students' learning in response to different assessment patterns. Assessment and Evaluation in Higher Education. 39(1): 73-88.
Jessop, T., McNab, N. & Gubby, L. (2012) Mind the gap: an analysis of how quality assurance processes influence programme assessment patterns. Active Learning in Higher Education. 13(3): 143-154.
Jessop, T., El Hakim, Y. and Gibbs, G. (2011) Research inspiring change. Educational Developments. 12(4): 12-15.
Nicol, D. (2010) From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education. 35(5): 501-517.
Sadler, D.R. (1989) Formative assessment and the design of instructional systems. Instructional Science. 18: 119-144.
Editor's Notes
Students spend most time and effort on assessment. Assessment is the cue for student learning and attention. It is also the area where students show least satisfaction on the NSS. Scores on other factors return about 85% of good rankings, whereas only 75% of students find assessment and feedback ‘good’. We often think the curriculum is the knowledge, content and skills we set out in the planned curriculum, but from a students’ perspective, the assessment demands frame the curriculum. Looking at assessment from a modular perspective leads to myopia about the whole degree, the disciplinary discourse, and often prevents students from connecting and integrating knowledge and meeting progression targets. It is very difficult for individual teachers on modules to change the way a programme works through exemplary assessment practice on modules. It takes a programme team and a programme to bring about changes in the student experience. Assessment innovations at the individual module level often fail to address assessment problems at the programme-level, some of which, such as too much summative assessment and not enough formative assessment, are a direct consequence of module-focused course design and innovation.
What is assessment for? Exploring the meaning and the distinctive w
1) Context, connection, the big picture; 2) Pedagogies of control – domination by summative, grades oriented culture; 3) the struggle to value formative assessment; 4) feedback; 5) goals and standards
Huge appetite for programme-level data in the sector. Worked with more than 100 programmes in 40 universities internationally. The timing of TESTA – many universities revisiting the design of degrees, thinking about coherence, progression and the impact of modules on student learning. The confluence of modules with semesterisation, lack of slow learning, silo effects and pointlessness of feedback after the end of a module…
What started as a research methodology has become a way of thinking. David Nicol – changing the discourse, the way we think about assessment and feedback; not only technical, research, mapping, also shaping our thinking. Evidence, assessment principles
Based on robust research methods about whole programmes - 40 audits; 2000 AEQ returns; 50 focus groups. The two triangulating methodologies of the AEQ and focus groups are student experience data – student voice etc. Three legged stool. These three elements of data are compiled into a case profile which captures the interaction of an academic’s programme view, the ‘official line’ or discourse of assessment and how students perceive it. This is a very dynamic rendering because student voice is explanatory, but also probes some of our assumptions as academics about how students work and how assessment works for them etc. Finally the case profile is subject to discussion and contextualisation by insiders – the people who teach on the programme, who prioritise interventions.
Raise the question: are there problems with the packaging? Works for furniture – does it work for student learning? Assumptions of modularity: self-contained; disconnected; interchangeable. The next slide indicates some of the tensions of packaging learning in modules, and the tensions inherent in the metaphor.
Originally used for furniture and prefab and modular homes – how well does it suit educational purposes? I’m not taking issue with modules per se, but want to highlight that there have been some unintended consequences – some good, some bad – of using modular systems. Many programmes have navigated through them, some haven’t. Anyone who has built IKEA furniture knows that the instructions are far from self-evident – and we have translated a lot of our instructions, criteria, programme and module documents for students in ways that may be as baffling for them. Have we squeezed learning into a mould that works better for furniture?
Unintended consequences
One direction
Hierarchical
Performance
‘Pedagogies of control’
An assessment ‘Arms Race’
Why so much summative?
Content drives our view of curriculum
Tony Harland: “I’m sorry, but we can’t afford to stay here. We’re off to do our assignment.” Assessment as learning
Backwash effect
The case of the under-performing engineers (Graham, Strathclyde)
The case of the cunning (but not litigious) lawyers (Graham, somewhere)
The case of the silent seminar (Winchester)
The case of the lost accountants (Winchester)
The case of the disengaged Media students (Winchester)
TESTA Higher Education Academy NTFS project, funded for 3 years in 2009. 4 partner universities, 7 programmes – ‘cathedrals group’. Gather data on whole programme assessment, and feed this back to teams in order to bring about changes. In the original seven programmes, we collected before-and-after data.