An evidence-based model to enhance programme-wide assessment using technology: TESTA to FASTECH. Presented by Tansy Jessop and Yaz El-Hakim (University of Winchester) and Paul Hyland (Bath Spa University). Facilitated by Mark Russell (University of Hertfordshire).
Jisc conference 2011
An evidence-based model
1. An evidence-informed approach to enhancing programme-wide assessment
TESTA to FASTECH
Dr Tansy Jessop & Yaz El Hakim, University of Winchester
Professor Paul Hyland, Bath Spa University
JISC Online Annual: 22 November 2011
2. Pre-Conference Activities
Pre-reading: 1) Gibbs & Simpson (2004) Conditions under which assessment supports student learning.
http://www2.glos.ac.uk/offload/tli/lets/lathe/issue1/articles/simpson.pdf
2) Gibbs, G. & Dunbar-Goddet, H. (2007) The effects of programme assessment environments on student learning.
http://www.heacademy.ac.uk/assets/documents/teachingandresearch/gibbs_0506.pdf
3) Jessop, T., Smith, C. & El Hakim, Y. (2011) Programme-wide assessment: doing ‘more with less’ from the TESTA NTFS project. HEA Assessment & Feedback Briefing Paper.
http://www.heacademy.ac.uk/assets/documents/assessment/2011_Winchester_SS_Briefing_Report.pdf
3. 1)What conditions do you see as most important in student learning (Paper 1)?
2)What is your response to the idea of institutional and programme ‘assessment environments’ which influence assessment and feedback patterns? (Paper 2)
3)What are the main challenges and benefits of addressing assessment patterns on a whole programme? (Paper 3)
Pre-conference questions
5. Why TESTA has been compelling
1)The research methodology
2)It is conceptually grounded in assessment and feedback literature
3)It’s about improving student learning
4)It is programmatic in focus
5)The change process is dialogic & developmental
6. Presentation Overview
1)The Research Methodology (Tansy)
2)Case study as a compelling narrative (Tansy)
3)Trends in assessment & feedback (Tansy) Q&A
4)The student effort narrative (Yaz)
5)The bewildered student narrative (Yaz)
6)Systems-failure on feedback narrative (Yaz) Q&A
7)A way forward: FASTECH (Paul)
7. Two Paradigms
Transmission
•Expert to novice
•Planned, packaged & ‘delivered’
•Feedback given by experts
•Feedback received by novices
•One way traffic
•Very little dialogue
•Emphasis on measurement
•Competition
Metaphor = mechanical system
Social constructivist model
•Participatory, democratic
•Messy and process-oriented
•Peer review
•Self-evaluation
•Social process
•Dialogue
•Emphasis on learning
•Collaboration
Metaphor = the journey
8. 1)Research Methodology
•triangulates data from three sources
•presented in a case study
•complex, ambiguous, textured
•open to discussion -not the ‘final word’
•‘before’ and ‘after’ data
9. Programme Audit
•How much summative assessment
•How much formative (required, formal, with feedback)
•How many varieties of assessment
•Proportion exams to coursework
•Word count of written feedback
•How much ‘formal’ oral feedback
•Criteria, learning outcomes, course docs
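As a rough illustration of how the audit categories above could be tallied per programme, here is a minimal Python sketch; the field names and the sample numbers are invented for illustration and are not TESTA's actual instrument or data:

```python
from dataclasses import dataclass

@dataclass
class ProgrammeAudit:
    """One programme's assessment-environment profile.

    Field names are illustrative, not TESTA's actual schema.
    """
    summative_tasks: int
    formative_tasks: int
    assessment_varieties: int
    exams: int
    written_feedback_words: int
    oral_feedback_hours: float

    def formative_summative_ratio(self) -> float:
        # Formative tasks per summative task
        return self.formative_tasks / self.summative_tasks

# An invented example programme, not a real TESTA case
audit = ProgrammeAudit(
    summative_tasks=36, formative_tasks=3, assessment_varieties=11,
    exams=6, written_feedback_words=7153, oral_feedback_hours=6.0,
)
print(f"{audit.formative_summative_ratio():.2f}")
```

A structure like this makes the later comparisons (e.g. formative:summative ratios across programmes) a one-line calculation per audited programme.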
10. Assessment Experience Questionnaire version 3.3
•28 questions
•5 point Likert scale where 5 = strongly agree
•9 scales and one overall satisfaction question
•Scales link to conditions of learning
•Examples:
–quantity and distribution of effort;
–use of feedback;
–quantity and quality of feedback;
–clear goals and standards
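To make the scoring concrete: each AEQ scale is an average of its items' 1–5 Likert responses. Here is a minimal sketch; the item-to-scale mapping and the responses below are invented for illustration, the real mapping belongs to the published AEQ v3.3:

```python
from statistics import mean

# Hypothetical item numbers per scale -- illustrative only
SCALES = {
    "quantity_and_distribution_of_effort": [1, 5, 12],
    "use_of_feedback": [3, 9, 21],
}

def scale_scores(responses):
    """Average the 1-5 Likert responses for each scale's items."""
    return {name: mean(responses[i] for i in items)
            for name, items in SCALES.items()}

# One student's invented responses, keyed by item number
responses = {1: 4, 5: 3, 12: 5, 3: 2, 9: 3, 21: 2}
print(scale_scores(responses))
```

Programme-level results are then obtained by averaging these scale scores across all respondents on a programme.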
11. Focus groups
•What kinds of assessment
•How assessment influences your study behaviour
•Whether you know what quality work looks like
•What feedback is like and how you use it
12. Research Methodology
[Triangulation diagram: three data sources feeding into a Programme Team Meeting]
•Assessment Experience Questionnaire (AEQ, n = 1200+)
•Focus groups (n = 50 groups, with 301 students)
•Programme audits (n = 22)
13. 2) The cases are surprising, complex, puzzling
Here is one case from the TESTA data……
14. Case Study 1
•Lots of coursework (47 tasks)
•Very varied forms (15 types of assessment)
•Very few exams (1 in every 10)
•Masses of written feedback on assignments (15,412 words)
•Learning outcomes and criteria clearly specified
….looks like a ‘model’ assessment environment
15. But students:
•Don’t put in a lot of effort and distribute their effort across few topics
•Don’t think there is a lot of feedback or that it is very useful, and don’t make use of it
•Don’t think it is at all clear what the goals and standards are
……what is going on?
16. Your best guesses
A.Variety of assessment confuses students
B.Assessment is ‘bunched’ at certain times
C.The feedback is too late to be of any use
D.Teachers don’t share a common standard
E.Other
•Select your response from the buttons (A B C D E) at the bottom-right of the list of participants
•Type any additional comments into the text-chat
17. •Teachers work hard, students less so.
•Feedback is too late to be useful
•Teachers have varied standards
•Students see feedback as ‘modular’
•Variety confuses students
•Formative tasks are assigned low priority
•Summative assessment drives effort
What is going on?
18. 3) Trends in assessment and feedback
•High summative assessment, low formative
•High variety (average 11; range 7-17)
•Written feedback (average 7,153 words; range 2,869-15,412)
•Low oral feedback (average 6 hours)
•Watertight documents, tacit standards
•Huge institutional and programme variations:
oformative: summative ratios (134:1 cf. 1:10)
ooral feedback (37 minutes to 30 hours)
20. 4) The effort narrative. TESTA data shows that:
•average of 12 summative per year
•24 teaching weeks, one every two weeks
•summative tasks end-loaded & bunched
•leading to patchy effort
•and surface learning
•with an average of three formative tasks a year…
21. The more you write the better you become at it… and if we’ve only written 40 pieces over three years that’s not a lot.
So you could have a great time doing nothing until like a month before Christmas and you’d suddenly panic. I prefer steady deadlines, there’s a gradual move forward, rather than bam!
In the second year, I kept getting such good marks I thought “If I’m getting this much without putting in much effort that means I could do so much better if I actually did do the hours” but it just goes up and down really.
22. TESTA plus HEPI quiz
Which one is false?
A)1 in 3 UK students study for 20 hours or less a week
B)Students on only 1 out of 7 TESTA programmes agreed that they were working hard
C)Students work hardest when there is a high volume of formative assessment and oral feedback
D)Students work hardest when there is a high volume of summative assessment and written feedback
E)1 in 3 UK students undertake > 6 hours of paid work a week
Select your response from the buttons (A B C D E) at the bottom-right of the list of participants
23. Chat box
What ideas might encourage students to put in effort regularly on degree programmes?
•Type your responses in the text chat
24. Strategies to encourage student effort
Choose your top strategy to encourage effort:
A)Raise expectations in first year
B)Require more formative assessment
C)Link formative and summative tasks
D)Use more peer and self assessment
E)Design small, frequent assessed tasks
Select your response from the buttons (A B C D E) at the bottom-right of the list of participants
25. Technologies that may help…
What technologies might work to spur on regular and distributed effort?
Type your responses in the text chat
26. 5) The baffled student narrative
oThe language of written criteria is difficult to understand
oFeedback does not always refer to criteria
oStudents feel that marking standards vary and are subjective and arbitrary
oStudents sometimes use criteria instrumentally
27. I’m not a marker so I can’t really think like them... I don’t have any idea of why it got that mark.
They have different criteria, build up their own criteria. Some of them will mark more interested in how you word things.
You know who are going to give crap marks and who are going to give decent marks.
28. Chat Box
What strategies might help students to internalise goals and standards?
•Type your responses in the text chat
29. Strategies to help students know what ‘good’ is
Which strategy do you think helps most?
A)Showing students models of good work
B)Peer marking workshops
C)Lots of formative tasks with feedback
D)Plenty of interactive dialogue about standards
E)Self assessment activities
Select your response from the buttons (A B C D E) at the bottom-right of the list of participants
30. 6) System-wide features make it difficult for students to use feedback and act on it
oFeedback often arrives after a module, or after submission of the next task
oTasks are not sequenced or connected across modules, leading to a lack of feed-forward
oStudents sometimes receive grades electronically before their feedback becomes available on parchment in a dusty office
oTechnology has led to some depersonalised cut-and-pasting
31. It’s rare that you’ll get it in time to help you on that same module.
You know that twenty other people have got the same sort of comment.
I look on the Internet and say ‘Right, that’s my mark. I don’t need to know too much about why I got it’.
I only apply feedback to that module because I have this fear that if I transfer it to other modules it’s not going to transfer smoothly.
You can’t carry forward most of the comments because you might have an essay first and your next assignment might be a poster.
33. Types of changes
1.Reduced summative
2.Increased formative assessment
3.Streamlined variety
4.Raised expectations of student workload
5.Sequenced and linked tasks across modules
6.Practice based changes
36. FASTECH: Feedback and Assessment for Students with Technology
What is FASTECH?
•R&D Project (3 yrs): ‘R’ primarily with TESTA tools; ‘D’ in disciplines and universities.
•approach: teaching teams with students interpret ‘R’ data to determine goals of ‘D’.
•activities: to address QA and QE issues, optimize sector engagement (fastech.ac.uk)
•outputs: R&D findings, experiences & guides by teachers, students, others…
Pragmatic Principles?
•Fast: using readily-available technologies; quick to learn, easy to use …
•Efficient: after start-up period; saves time & effort (paper), productivity …
•Effective: brings significant learning benefit to students, pedagogic impact …
37. FASTECH: a Pedagogical Goal
Student baggage:
•all can be strategic!
and blocks:
•ideas about roles of students & teachers
•…
Teacher baggage and blocks:
•ideas about role of assessment
•unsure about value of feedback
•assessment & marking conflated
•criteria & standards
•…
The goal: students’ ability to manage their own learning. In each assessment culture, this entails using technologies that help promote transparency and student participation in all processes, from design and management to feedback and revision (validity, reliability & fairness are not enough); a reshaping of teacher and student responsibilities; processes that enhance and create new peer-learning activities and collaborations (in/out of class), self & peer assessment, and recording, sharing & review of students’ progress and achievements; teacher revision of pedagogies, based upon records of student progress & achievement in learning; and attuning of assessment to address individual & distinctive needs & aspirations…
38. Finally, for an excellent overview of technologies and pedagogies
JISC, Effective Assessment in a Digital Age. Bristol: HEFCE, 2010.
Available at: www.jisc.ac.uk/digiassess (esp. pp. 14-15, 54-55)
For resources associated with this publication:
www.jisc.ac.uk/assessresource
Please contact us for more info about TESTA and FASTECH:
Tansy.Jessop@winchester.ac.uk
Yassein.El-Hakim@winchester.ac.uk
p.hyland@bathspa.ac.uk
Websites: www.testa.ac.uk & www.fastech.ac.uk (from January 2012)
Thank You
39. DISCUSSIONto be continued in the conference discussion forum
How do you think using technology in A&F will improve students’ learning?
40. References
Black, P. & Wiliam, D. (1998) ‘Assessment and Classroom Learning’, Assessment in Education: Principles, Policy and Practice. 5(1): 7-74.
Bloxham, S. & P. Boyd (2007) Planning a programme assessment strategy. Chapter 11 (157-175) in Developing Effective Assessment in Higher Education. Berkshire. Open University Press.
Boud, D. (2000) Sustainable Assessment: Rethinking assessment for the learning society. Studies in Continuing Education. 22(2): 151-167.
Gibbs, G. & Simpson, C. (2004) Conditions under which assessment supports students' learning. Learning and Teaching in Higher Education. 1(1): 3-31.
Gibbs, G. & Dunbar-Goddet, H. (2007) The effects of programme assessment environments on student learning. Higher Education Academy. http://www.heacademy.ac.uk/assets/York/documents/ourwork/research/gibbs_0506.pdf
Gibbs, G. & Dunbar-Goddet, H. (2009) Characterising programme-level assessment environments that support learning. Assessment & Evaluation in Higher Education. 34(4): 481-489.
41. Jessop, T., El Hakim, Y. & Gibbs, G. (2011) TESTA: Research inspiring change, Educational Developments 12 (4). In press.
Jessop, T., McNab, N., and Gubby, L. (2012 forthcoming) Mind the gap: An analysis of how quality assurance procedures influence programme assessment patterns. Active Learning in Higher Education. 13(3).
Knight, P.T. and Yorke, M. (2003) Assessment, Learning and Employability. Maidenhead. Open University Press.
Nicol, D. J. and McFarlane-Dick, D. (2006) Formative Assessment and Self-Regulated Learning: A Model and Seven Principles of Good Feedback Practice. Studies in Higher Education. 31(2): 199-218.
Nicol, D. (2010) From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education. 35(5): 501-517.
Sambell, K. (2011) Rethinking Feedback in Higher Education. Higher Education Academy ESCalate Subject Centre Publication.