PGCAP Feedback on Summative Assessment: Group PowerPoint
 

This is the PowerPoint presentation on the limitations of summative assessment for our PGCAP Action Learning Set. © John Cocksedge, Jaime Pardo, Monica Casey and Tahira Majothi, University of Salford 2011.

  • Law and Watts 1977. Think about your own CPD Model.
  • http://www.heacademy.ac.uk/assets/York/documents/ourwork/rewardandrecog/ProfessionalStandardsFramework.pdf
    Areas of activity:
    1. Designing and planning learning activities
    2. Teaching and/or supporting student learning
    3. Assessment and giving feedback to learners
    6. Evaluation of practice and continuing professional development
    Core Knowledge:
    1. The subject material
    3. How students learn, both generally and in the subject
    4. The use of appropriate technologies
    5. Methods for evaluating the effectiveness of teaching (partly with student feedback, although looking to develop student observations)
    Professional Values:
    1. Respect for individual learners
    2. Commitment to incorporating the process and outcomes of relevant professional practice
    3. Commitment to development of learning communities
    4. Commitment to encouraging participation in higher education, acknowledging diversity and promoting equality of opportunity
    5. Commitment to CPD and evaluation of practice
  • These works have been issued under a Creative Commons Attribution-Non-commercial-No Derivatives 2.0 UK: England and Wales licence. Copyright resides with HEFCE on behalf of JISC.
  • ‘Alternative to the essay’, Richard Winter, The Guardian, Tuesday 10 June 2003. Richard Winter is professor of education at Anglia Polytechnic University. See also the Patchwork Text special issue of the SEDA journal Innovations in Education and Teaching International.
  • Assessments fall between formative and summative depending on the type of delivery and topic. Making use of a variety of assessments aids reflection and moves us closer to more inclusive learning for all. Therefore we need to work towards a “synergy between formative and summative aspects” of teaching and learning (Learning and Teaching Scotland chief executive Bernard McLeary, 2009).
  • I’m going to focus on using Clickers as a means of providing summative assessment and feedback. So, what are Clickers?
  • The aim of the information literacy strategy is to embed information literacy at all levels, in all programmes, to provide students with the skills to find, evaluate and use relevant information. Library inductions and other sessions feed into this. Like Tahira, I face the challenge of incorporating summative assessment into my role: most of the assessment and feedback that I do happens in one-off information literacy or library induction sessions. The summative assessment and feedback needs to take place within the session, as it is unlikely that I will see the students again. The students in my sessions vary, ranging from 18-year-olds straight in from college to mature students returning to study for CPD, and they can be at any level. They all come with their own experiences and there is no ‘typical student’, so I need to try to accommodate this.
    An example of how I use the clickers is in library induction sessions. The aim of these sessions is to provide an overview of library resources and how to access them. The inductions are very student centred: from an initial PowerPoint slide, the students choose what they want to learn about and what is relevant to them, and I do not necessarily go over all the elements of the slides, as there may be some topics the group is not interested in. The students then log on to the PCs and access some of these services through a task sheet; they are encouraged to work together if they prefer. Towards the end of the session I use the clickers to assess understanding of what we have done. Each student is given a clicker and I ask five or six multiple choice questions. The group use the clickers to respond, and the system feeds back to the group the percentage for each answer and then which answer is correct.
This helps me to gauge the group’s understanding and gives me the opportunity to feed back to the group as a whole and cover areas where there may have been confusion. In this instance, the technology really aids my understanding of what the group has learnt and what needs further input. The added benefits of using the clickers here are that it is not written feedback (as mentioned in Jaime’s Phil Race quote), and it is instant and interactive: I can explain things further, the students can ask questions, and I can adapt what I need to elaborate on depending on the group. It also tells me whether the session has met its aims. The assessment is aligned with both the learning outcomes and the tasks that the students carry out, and is therefore constructively aligned (Biggs 2003). That is, by the end of the session they should be familiar with the basic library services and resources available to them. The questions at the end are to confirm knowledge, so this is a low level on Bloom’s taxonomy, but hopefully the session gives them the skills to go away and start to apply this knowledge.
  • In terms of the learners, there are a number of positive elements to using the clickers. They are anonymous, and therefore unthreatening: Caldwell (2007) indicates that students like the anonymity but can still compare answers and know that ‘they are not alone even when wrong’. It is also suggested that ‘responding to questions encourages all students to think actively about a question and commit to a response even if they are uncertain. This kind of activation and challenging of learners’ current understanding is central for learning’ (McCune, no date). Students who may not want to speak up or raise their hands to answer questions in public, through fear of embarrassment, can participate. Feedback is immediate: the students know how they have done straight away and can respond to this if they want to. The technology enables the feedback to be accessible and inclusive; it helps to read the question out, go through the list of possible answers, and go over the results. This meets the needs of people who prefer to listen as well as people who are visual learners. The literature indicates that clickers ‘seem to increase attention through active participation’ (Julian and Benson 2008).
  • Student feedback from the sessions is always positive about the use of the clickers. Some of the feedback from the inductions includes:

Presentation Transcript

  • Action learning set 3 Feedback on summative assessment
    • Presentation outline
    • John Cocksedge – Using a Hybrid approach to feedback and summative assessment
    • Tahira Majothi – The impracticalities of summative assessment in careers guidance and planning
    • Jaime Pardo – Investigating feedback on summative assessment within MMP and exploring possible alternate approaches to provide better feedback to students
    • Monica Casey – Using Clickers for feedback on summative assessment in library sessions
  • Product design dept The Hybrid approach to feedback on summative assessment John Cocksedge
  • “Summative contrasts with formative assessment in that [the former] is concerned with summing up or summarizing the achievement status of a student, and is geared towards reporting at the end of a course of study especially for purposes of certification; it is essentially passive and does not normally have immediate impact on learning, although it often influences decisions which may have profound educational and personal consequences for the student” (Sadler 1989)
  • The nature of product design students
    • Designers
    • Produce novel, unexpected solutions
    • Tolerate uncertainty, working with incomplete information
    • Apply imagination and constructive forethought to practical problems
    • Modelling media as means of problem solving
    • Resolve ill-defined problems
    • Adopt solution-focussing strategies
    • Employ abductive/productive/appositional thinking
    • Use non verbal graphical/spatial modelling media
    • ‘The Nature and Nurture of Design Ability’ (Cross 1990)
  • So how do we assess and give feedback to product designers? “Whilst the value of process, personality traits and the social environment is clearly important, creative output is the final benchmark on which judgments are made and upon which consensus is achieved or disputed regarding the merit of the work” (Karl K. Jeffries, 2007)
  • We do feedback on summative assessment - BUT
    • Outgoing method is time consuming and produces assessment/feedback fatigue
    • Does not capture the individual learning journey
    • Does not capture/identify student diversity
    • Does not identify deep learning
    • Danger of influencing teaching methods/material
    • Could motivate students to only pass and not to learn
  • We use a hybrid approach of formative (feed forward) and summative assessment to produce feedback
    • Why?
    • To facilitate learning
    • To monitor learning in progress
    • Provide feedback/feed forward to learners
    • Provide feedback to colleagues
    • Diagnose learners’ needs or obstacles to learning
  • The hybrid approach and Kolb’s experiential learning cycle: concrete experiences → observation & reflection → forming abstract concepts → testing in new situations
  • Mapping the hybrid approach onto Kolb’s cycle: formative feedback / feed forward is tutor activity; in feedback & observation the learner considers the formative feedback received and decides what next; the learner then tries out the new approach (student activity)
  • “Formative assessment must be pursued for its main purpose of feedback into the learning process; it can also produce information which can be used to meet summative purposes” (Black 1995, cited in Brown 2007)
  • How do we do this in product design
    • Align our ILOs with the module plan and the assessable tasks (constructive alignment, Biggs 1999)
    • Atelier model of learning (Design Council, Creative and Cultural Skills, 2006) – Personalise the curriculum
    • Sequence the modules, tasks and ILOs along a consistent design process framework – Research, Ideation & verification
    • Weight the assessment tasks in relation to the ILOs – Focus
    • Sustained frequency of one to one feedback
    • Capture and record formative feedback – ‘Doctor’s notes’, consistency
    • Criterion referencing – ‘Detailed module maps’
    • Encourage Ipsative assessment – Self awareness
    • Encourage Diagnostic self assessment - Motivation
  • The formative and summative assessment engine – the Module Map A consistent framework and point of reference for student feedback
  • How does this help us with feedback
    • It allows us to assess work on the fly
    • It allows us to monitor the flow of the module and adjust accordingly
    • It allows students to have full sight of and plan for assessable tasks
    • It allows us to develop/plan for appropriate resources
    • It allows us to develop timely feedback
    • It allows us to give very specific feedback
    • It is non-threatening to students
    • It encourages students to ask questions / seek guidance
    • It allows students to experience success
    • It allows us to improve
  • Development in response to student feedback
  • What next? ‘As we use formative and summative assessment on our learners we must also use it on ourselves and our methods’
  • What next ?
    • Task mapping ‘power bulge’
    • Module maps
    • Exemplars
    • Feedback groups
    • Peer to peer
    • Self assessment (pre and post module)
    • Dynamic online self report diagnostics
    • Statement banks
    • Personalised development plans
    ‘As we use formative and summative assessment on our learners we must also use it on ourselves and our methods’
  • “The indispensable conditions for improvement are that the student comes to hold a concept of quality roughly similar to that held by the teacher, is able to monitor continuously the quality of what is being produced during the act of production itself, and has a repertoire of alternative moves or strategies from which to draw at any given point. In other words students have to be able to judge the quality of what they are producing and be able to regulate what they are doing during the doing of it” (Sadler 1989)
  • Tahira
  • Challenges of Summative Assessment in a Careers Context
    • Stand alone careers workshops
    • Singular interactions
    • Diversity and the diverse range of students
    • Limited input into formal assessments
    © mylot.com, Google images
  • SODT Model: Career Planning
    • Self awareness: Gain knowledge and understanding about your career-related interests, skills, aptitudes, preferences and goals.
    • Opportunity awareness: Identify sources of information and opportunities in training, education and work.
    • Decision-making: Evaluate opportunities, make decisions, action plan and set goals.
    • Transition learning: Implement your career decisions and put your plans into effect. Produce CVs, apply for jobs and gain work experience.
  • Assessment activities within Careers
    • Salford Student Life Award
    • 1:1 QQ or long appointments
    • Workshops
    • Filmed mock interviews
    • Graduate Gateway
    • Career planning exercises
    • MBTI/Belbin
    • How does this meet the UK Professional Standards Framework (Areas of activity, Core Knowledge and Professional Values)?
    © Salford Careers and Employability Service
  • Fluidity of assessments
    • Associative perspective (acquiring competence) – voting pads
    • Constructivist (learning as achieving understanding) – construct own learning, self reflection – SSLA, Graduate Gateway
    • Social constructivist (learning as achieving understanding) – workshops, peer learning
    • Situative (learning as social practice) – ‘learning as arising from participation in communities of practice’ e.g. GG placements, SIFE, employer-led assessments etc
    • JISC (2010), Effective Assessment in a Digital Age. (p9-11)
    © elated.com Google Images.
  • Future Plans: Patchwork Text (Winter 2003) Methodology
    • Employability modules/Bespoke delivery:
    • Blackboard/Elluminate/VDS
    • Camtasia/Meebo
    • YouTube
    • Peer reviews/student observations
    • Case studies
    • Work experience
    • Specific support for care leaver graduates
    • This will involve:
    • Variety of assessments
    • Small working groups
    • Little and often – assessments
    • “… online tools can support peer and self-assessment in any location and at times to suit learners – the value of peer and self-assessment in developing learners’ ability to regulate their own learning is increasingly recognised.” JISC (2010), Effective Assessment in a Digital Age.
    © Flickr. Nicky Perryman
  • Revision of Bloom’s Taxonomy
    • Remembering – recalling relevant knowledge
    • Understanding – constructing meaning
    • Applying – implementing
    • Analyzing – differentiating
    • Evaluating – critiquing, self reflection
    • Creating – putting elements together in coherent steps
    • Revised version of Bloom’s Taxonomy (Anderson and Krathwohl 2001, cited in Forehand 2010)
    © boohewerdinesblogthing.blogspot.com
  • Working towards Constructive Alignment: Biggs (1999). The chapter cited was taken from Houghton, Warren (2004) Engineering Subject Centre Guide: Learning and Teaching Theory for Engineering Academics. Loughborough: HEA Engineering Subject Centre. http://www.engsc.ac.uk/learning-and-teaching-theory-guide/constructive-alignment
  • References
    • Biggs, J (1999). ‘Teaching for Quality Learning at University’, in Houghton, W (ed) (2004)  Engineering Subject Centre Guide: Learning and Teaching Theory for Engineering Academics.  Loughborough: HEA Engineering Subject Centre.
    • Forehand, M. (2010) Emerging Perspectives on Learning, Teaching and Technology. University of Georgia website http://projects.coe.uga.edu/epltt/index.php?title=Bloom%27s_Taxonomy [Accessed 20/03/11]
    • JISC (2010), Effective Assessment in a Digital Age, A guide to technology-enhanced assessment and feedback . JISC pp9-11.
    • Law, B. and Watts, A.G. (1977) DOTS Model. London: Schools, Careers and Community. Church Information Office.
    • The Higher Education Academy (2006) The UK Professional Standards Framework for teaching and supporting learning in higher education.
    • Winter, R. (2003) ‘Alternative to the Essay’, on Guardian Education website http://www.guardian.co.uk/education/2003/jun/10/highereducation.uk [Accessed 23/03/11]
  • Jaime
  • “Feedback on paper is the most dangerous, most widely-used, yet least effective way of helping students to learn from their triumphs and disasters. Face-to-face feedback helps students to make sense of their thinking, aided by tone of voice, facial expression, body language, encouraging smiles, speed of speech, emphasis on particular words, and the ability to fine-tune the feedback on the basis of how it is being received. Paper-based feedback allows for none of these.” http://phil-race.co.uk/if-i-were-in-charge/
  • Equality & Diversity
    • According to the Subject Benchmark Statements from the Quality Assurance Agency for Higher Education:
    • “Research indicates that dyslexia is more prevalent amongst students of art and design than in other subjects…”
    • Umran Ali, Equality and Diversity Coordinator for the School of MMP:
    • “The percentage of students on support plans within the School of MMP has been as high as 30% but is usually somewhere around the 10% mark, compared to an average of around 4% across the University as a whole.”
    • Group: Students with learning difficulties such as dyslexia
    • Measures:
    • Use of a variety of different teaching methods, including workshops and one on one tutorials
    • Use of staged handouts to support verbal delivery (handouts throughout the lecture instead of one big clump at the end)
    • Blackboard & other electronic resources used for notes and exercises
    • One on one tutorials for support & guidance
    • Group: Students with physical difficulties
    • Measures:
    • One on one tutorials for support & guidance
    • Careful choice of room/access
    • Use of a variety of audio/visual/text based content (for visually/hearing impaired students)
    • Pre planning for external visits to ensure disabled access/support.
    • Group: Students with mental health/personality disorders
    • Measures:
    • ‘Opt out’ option for presentations, with an alternative provided (private or other form of assessment)
    • Small group presentations & gradual introduction of potentially difficult tasks (i.e. weekly practice of presentations building up to the final formatively assessed task)
    • Sensitivity to personal needs: not drawing accidental undue attention to a student by asking questions of individual students during lectures/seminars.
  • We know from week 6: “The dialogic feedback system puts the students at the centre of learning, providing them with a series of opportunities to act on feedback.” (Duhs, 2010, 5) Underlying my account is the view that: “The single, strongest influence on learning is surely the assessment procedures… even the form of an examination question or essay topics set can affect how students study… It is also important to remember that entrenched attitudes which support traditional methods of teaching and assessment are hard to change.” (Entwistle, 1996, pp. 111–12)
  • Student Feedback
    What was most useful?
    “Tutor support, comments and information on handouts was provided nice and early on.”
    “The tutor and peer help.”
    “The group discussions, well organised.”
    “The guidance throughout assignments.”
    Are there any changes you would recommend making to the module?
    “To be longer, the whole year perhaps?”
  • “ Emphasis is placed on active rather than passive uses of the tool to encourage an ethos of independent learning: students set up their own blog, invite others to join, and upload images and other digital resources to support one another in research activities.” (p3.) “… now marks recorded in Turnitin are only visible to the individual student and his or her tutor. Students are also more likely to return to the feedback they have been given: grades and feedback remain stored in the system and are not lost by the time of the next assignment. ” (p3.) JISC Case study 3: Supporting The Transition To Degree Level Study, Loughborough College. http://www.jisc.ac.uk/media/documents/programmes/elearning/digiassess_supportingtrans.pdf
  • “ While it is difficult to establish that oral feedback has a greater impact on students’ cognitive development than written feedback, students on the MSc Occupational Psychology course appear to be more attentive to spoken feedback; most respond positively to the intimacy of the spoken word and perceive tutors’ advice as being clearer and more detailed. Audio-recorded feedback is also helping to reduce the isolation of learning remotely; early evidence from course data suggests that there may have been a positive impact on retention rates, although this has yet to be empirically evaluated: ‘ Podcasts made me feel closer to my tutors and I think they help you to build a relationship with them.’ Student, MSc Occupational Psychology, University of Leicester” (p3.) JISC Case study 6: Enhancing The Experience of Feedback, University of Leicester http://www.jisc.ac.uk/media/documents/programmes/elearning/digiassess_enhancingfeedbk.pdf
  • Conclusion
    • “Feedback is a worthy focus of academic effort since it focuses students on what they need to improve.” (Blayney and Freeman, 2004: 2)
    • Written Feedback on Summative Assessment is widely used yet ineffective.
    • Technology can enhance the experience of feedback
        • Audio Feedback – podcasting
        • Use of blogs and e-portfolios
  • Monica
  • Using Clickers for summative assessment and feedback. What are Clickers? Clickers* are similar to the technology used on the TV programme “Who Wants To Be a Millionaire” during ‘ask the audience’. A teacher asks questions in class and students use a ‘clicker’ to respond. The students’ responses can be viewed immediately on the projector screen, and/or scores can be captured and reports generated for further analysis. *Clickers are also known as Personal Response Systems (PRS), Audience Response Systems (ARS), Electronic Response Systems (ERS), Student Response Systems (SRS), Interactive Response Systems (IRS), Electronic Voting Systems (EVS), Classroom Response Systems (CRS), Zappers, Voting Pads… and more. Taken from Dunleavy, C (no date)
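  • The tallying described above (responses collected, the percentage choosing each option shown on screen, then the correct answer revealed) can be sketched in a few lines. This is only an illustrative sketch: the function name, data and output format here are hypothetical and not taken from any particular clicker product’s software.

```python
from collections import Counter

def tally_responses(responses, correct):
    """Tally one multiple-choice question's clicker responses and
    return the percentage of the group choosing each option."""
    counts = Counter(responses)           # how many students chose each option
    total = len(responses)
    percentages = {opt: 100 * counts[opt] / total for opt in sorted(counts)}
    return percentages, correct

# Hypothetical session: 20 students answer one four-option question.
responses = ["A"] * 12 + ["B"] * 5 + ["C"] * 2 + ["D"] * 1
percentages, correct = tally_responses(responses, "A")
for opt, pct in percentages.items():
    flag = "  <- correct" if opt == correct else ""
    print(f"{opt}: {pct:.0f}%{flag}")   # e.g. "A: 60%  <- correct"
```

    A spread like this tells the tutor at a glance which options drew the group, mirroring how the tutor in the sessions above uses the displayed percentages to decide what needs further explanation.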
  • Using Clickers in Library Inductions
    • Context:
      • Information Literacy strategy aims to provide students with transferable skills
      • Wide range of students
      • ‘One shot’ sessions
    • Inductions:
      • Student centred
      • Clickers used for immediate summative assessment and feedback
      • (links with UK PSF Core Knowledge 4, ‘Use of appropriate learning technologies’)
  • What are the benefits for the learners?
    • Anonymous
      • Caldwell (2007) indicates they like to know they are not alone in their thinking
    • Responding to questions ‘encourages all students to think actively’ (McCune, no date)
    • Immediate face to face feedback
    • Enables feedback to be accessible and inclusive
  • Student Feedback
    • “ The voting pods were awesome”
    • “ Enjoyed the session with the interactive key pad and made me engage and learn more from the session”
    • “ Who knew being in a library could be so much fun!”
    • “Overall, clickers have the potential to improve classroom learning, especially in large classes. Students and instructors find their use stimulating, revealing, motivating, and – as an added benefit – just plain fun” (Caldwell, 2007, p19)
  • Implications for ongoing practice: ‘As we use formative and summative assessment on our learners we must also use it on ourselves and our methods’
    • Exemplars
    • Feedback groups
    • Peer to peer
    • Self assessment (pre and post module)
    • Dynamic online self report diagnostics
    • Statement banks
    • Personalised development plans
    • Use of technology for feedback/summative assessment
    • Student feedback
    • Hybrid formative/summative approach
    Feedback on assessment should be about putting students at the centre of their own learning and equipping them with the tools for lifelong engagement
  • References & Bibliography
    • Biggs, J (1999). ‘Teaching for Quality Learning at University’, in Houghton, W (ed) (2004) Engineering Subject Centre Guide: Learning and Teaching Theory for Engineering Academics. Loughborough: HEA Engineering Subject Centre.
    • Brown, S. (1997) ‘Using formative assessment to promote student learning’, www.ldu.leeds.ac.uk/news/events/documents/BrownPowerPoint.pdf [Accessed 09/03/11]
    • Cross, N.G. (1990) ‘The nature and nurture of design ability’, Design Studies, Vol. 11, No. 3, pp. 127–140
    • Jeffries, K. (2007) ‘Diagnosing the creativity of designers: individual feedback within mass higher education’, Design Studies, Vol. 28, Issue 5, pp. 485–497
    • Kolb, D.A. (1984) Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice-Hall
    • Sadler, D. R. (1989) ‘Formative assessment and the design of instructional systems’, Instructional Science, Vol. 18, pp. 119–144
  • Biggs, J. and Tang, C. (2007) Teaching for Quality Learning at University. Society for Research into Higher Education & Open University Press.
    • Duhs, R. (2010) ‘Please, no exam!’ Assessment strategies for international students, SEDA Educational Developments, Issue 11.4, Dec, pp. 3–6
    • Entwistle, N. (1996) ‘Recent research on student learning’, in Tait, J. and Knight, P. (eds) The Management of Independent Learning, pp. 97–112. London: Kogan Page
    • Hattie, J. and Timperley, H. (2007) ‘The power of feedback’, Review of Educational Research, 77, pp. 81–112
    • Knight, P. T. (2002) ‘Summative Assessment in Higher Education: Practices in disarray’, Studies in Higher Education, 27(3), pp. 275–286
    • JISC Case study 3: Supporting The Transition To Degree Level Study, Loughborough College. http://www.jisc.ac.uk/media/documents/programmes/elearning/digiassess_supportingtrans.pdf
    • JISC Case study 6: Enhancing The Experience of Feedback, University of Leicester. http://www.jisc.ac.uk/media/documents/programmes/elearning/digiassess_enhancingfeedbk.pdf
    • JISC Case study 8: Reflecting on Feedback, University of Westminster. http://www.jisc.ac.uk/media/documents/programmes/elearning/digiassess_rereflectingfdback.pdf
    • Subject Benchmark Statements, Art and Design (2008). http://www.qaa.ac.uk/academicinfrastructure/benchmark/honours/artanddesign.asp
  • Biggs, J.B. (2003) Teaching for Quality Learning at University (2nd edition). Buckingham: Open University Press
    • Caldwell, J. E. (2007) ‘Clickers in the large classroom: current research and best-practice tips’, CBE-Life Sciences Education, Vol. 6, Spring, pp. 9–20
    • Deleo, P., Eichenholtz, S. and Sosin, A. A. (2009) ‘Bridging the information literacy gap with clickers’, The Journal of Academic Librarianship, 35(5), pp. 438–444
    • Dunleavy, C. (no date) Enhancing face-to-face teaching with Clickers. <www.ldu.salford.ac.uk/html/tel/tools/clickers.html> [Accessed 20/03/2010]
    • Julian, S. and Benson, K. (2008) ‘Clicking your way to library instruction assessment’, C&RL News, May, pp. 258–260
    • McCune, V. (no date) ‘Effective use of clickers in the College of Science and Engineering’, on the College of Science and Engineering, Edinburgh University website. <www.scieng.ed.ac.uk/LTStrategy/clickers_effectiveUse.html> [Accessed 21/03/2010]