
Demolishing the Seven Pillars: a warning from research? Geoff Walton



Presented at LILAC 2010



  1. Geoff Walton, Academic Skills Tutor: Librarian, Staffordshire University. Demolishing the Seven Pillars: a warning from research?
  3. PLAN
     - Introduction: my motivation for doing the PhD
     - The research project
     - My intention to build a picture illustrating what students did during the learning interventions, and to reveal their thoughts and feelings at each stage
     - Implications for models of information literacy
     - A new assessment tool for measuring levels of information discernment
     - Concluding remarks
  4. INTRODUCTION
     - My original motivation: to prove the ‘correctness’ of information literacy as a model and ‘expose’ VLEs as the ‘emperor’s new clothes’
     - Having completed my PhD, my views have changed somewhat!
     - My own personal PhD journey
     - ‘Proof’ and ‘correctness’ are problematic
     - IL models don’t seem to be sensitive to the cognitive and affective processes involved in becoming information literate
     - VLEs, or e-learning more generally, offer very promising pedagogical opportunities
  5. RESEARCH PROJECT – RESEARCH QUESTION AND HYPOTHESIS
     - How do the psychological states associated with information behaviour and thinking help explain the learning processes in an information literacy blended learning and teaching intervention?
     - Hypothesis: students who participate in online social network learning (OSNL) will demonstrate (1) a greater degree of knowledge about e-resources and (2) a greater ability to evaluate information than those who do not receive this intervention
  6. RESEARCH PROJECT – METHODOLOGY
     - Participants: Sport & Exercise first-year undergraduates taking the first-year core module Effective Learning, Information and Communication Skills (ELICS)
     - Three tutor groups received the following delivery:
       - A. Face-to-face workshop plus interactive web tools plus online social network learning (n=12)
       - B. Face-to-face workshop plus interactive web tools (n=11)
       - C. Face-to-face workshop only (n=12)
  7. RESEARCH PROJECT – DATA-GATHERING INSTRUMENTS
     - Focus-group interviews (Group A)
     - Questionnaire (Group A)
     - Post-diagnostic test (all groups)
     - Assessment portfolio (all groups): annotated bibliography, reflective practice statements, reflective essay
  8. RESEARCH PROJECT – CODING FRAMEWORK FOR QUALITATIVE DATA
     - Codes based on:
       - Bloom’s taxonomy
       - Hepworth’s information behaviour model
       - Moseley et al.
  9. RESEARCH PROJECT – FACE-TO-FACE WORKSHOP
     - Overall context – roles and norms: students studying at university bring some prior knowledge with them
     - Task – a problem-based scenario, part of the portfolio assignment
     - Examples in the one-hour ‘hands-on’ activity-based workshop mirrored the problem-based scenario for the assessment (see p1 of handout)
  10. RESEARCH PROJECT – FACE-TO-FACE WORKSHOP
     - Behaviour – interaction with source
     - Source character: “looking at the library catalogue”, “search the library”
     - Source behaviour: “you can look for a certain subject [on the catalogue]”, “you [the tutor] showed us what to do”
  11. METACOGNITION
     - “I have realised that these sources contain information that can help me to develop while studying at university”
     - “You can get involved in the sessions yourselves; more hands-on, it was active”
     - “Being able to find out the books was really interesting”
  12. FACE-TO-FACE WORKSHOP
     - Knowledge
       - Factual knowledge: “it [the library catalogue] allowed me to see what books were available and where I could find them”
       - Process knowledge: “you don’t necessarily need to know what book you are looking for [on the catalogue], you can look for a certain subject”
     - New behaviour: “I [now] use the online library to search for e-books and books”
  13. ONLINE SOCIAL NETWORK LEARNING
     - Task: to find out how to evaluate web pages by discussing the issue online
     - Used a questioning approach, for example: “So, how would you decide what makes a good quality web page?”
     - Structure of process (see p2-3)
     - Behaviour – interaction with sources: peers, tutors, the Berkeley website on evaluating information, the Internet Detective, and the tutor summary. For example: “we had to feedback on each others. I remember I was commenting on his [posting], he was commenting on mine”
     - See examples of postings on p3-4
  14. METACOGNITION
     - “[The tutor summary] gave the whole group a bit of recognition [...] you read through what other people thought of URLs and took advice from other people not just the lecturer’s, it worked really well, it is a good way of reflecting what you’ve done”
  15. ONLINE SOCIAL NETWORK LEARNING
     - New knowledge: both in the postings to the discussion and in the tutor summary and final summary handout – synthesised output from the discussion (see p3-6)
     - Transition to a feeling of less uncertainty: “It makes you aware, a little bit more aware of what web sites are more useful to you than others, and there are quite a lot of web sites online and you don’t want to be writing stuff in your assignments that’s not true. [Before] I didn’t know what the things at the end like .ac, .org meant [...]”
     - (Changed) behaviour: “I have used [the evaluation criteria] actually, since we did it, for essays and stuff; since we did these things in Effective Learning it actually alerted me to what to look for when looking for a good web site and what to steer clear of.”
  16. COMMERCIAL BREAK
     - All this and more in the book!
     - Practical examples of learning and teaching interventions underpinned by theory
     - Face-to-face learning
     - Online learning
     - Based on empirical research
  17. ASSESSMENT
     - Task: answer the problem-based scenario (see p1)
     - Behaviour: found six resources – two books, two journal articles and two web pages – evaluated them, and then wrote about it
  18. ANALYSIS
     - Evidence via assessed-work transcripts (see examples on p7)
  19. ASSESSMENT
     - New knowledge: “I have acquired new knowledge on the Library Catalogue, Swetswise, e-journals and the Web. I now know how to look for e-journals and e-books on Swetswise and E-brary, something I did not know how to do before.”
     - Task completion: see assessed-work transcript on p7
     - Changed behaviour: “I will incorporate this new knowledge and skills in the future by using these skills when completing a new task e.g. I can use the online library to search for e-books and books.”
  20. OTHER EVIDENCE (QUANTITATIVE DATA)
     - Assessment (one-way ANOVA statistical test)
     - Annotated bibliography
       - Variety of evaluation criteria: the experimental group used a greater variety of evaluation criteria than either Group B or Group C, but the difference was not statistically significant
       - However, a large effect size (eta squared) was found – if this part of the study were repeated with nineteen subjects per condition, it would be expected to produce statistically significant results
       - Frequency of evaluation criteria: a similar result to the variety measure
       - Again, a large effect size – gaining a statistically significant result in a future study would require between twenty and twenty-five subjects per condition
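The effect-size reasoning on this slide can be made concrete. The sketch below shows how a one-way ANOVA F statistic and the eta-squared effect size are computed; the scores are invented for illustration (only the group sizes n=12, n=11, n=12 come from the slides), so the printed values are not the study's results.

```python
# Hedged sketch: one-way ANOVA F statistic and eta-squared effect size.
# The scores below are INVENTED for illustration only; the study's raw
# annotated-bibliography data are not reproduced in the slides.
group_a = [6, 5, 7, 6, 5, 8, 7, 6, 5, 7, 6, 8]  # hypothetical criterion counts
group_b = [4, 5, 4, 6, 5, 4, 5, 4, 6, 5, 4]
group_c = [4, 3, 5, 4, 4, 3, 5, 4, 3, 4, 5, 4]

groups = [group_a, group_b, group_c]
all_scores = [x for g in groups for x in g]
grand_mean = sum(all_scores) / len(all_scores)

# Partition the total sum of squares into between- and within-group parts
ss_total = sum((x - grand_mean) ** 2 for x in all_scores)
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
ss_within = ss_total - ss_between

df_between = len(groups) - 1
df_within = len(all_scores) - len(groups)
f_stat = (ss_between / df_between) / (ss_within / df_within)

# Eta squared = proportion of total variance explained by group membership.
# Cohen's rough benchmarks: .01 small, .06 medium, .14 large.
eta_squared = ss_between / ss_total
print(f"F = {f_stat:.2f}, eta squared = {eta_squared:.2f}")
```

A large eta squared with a non-significant F is exactly the pattern the slide describes: the study was underpowered, so a larger sample per condition would be expected to reach significance.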
  21. OTHER EVIDENCE (QUANTITATIVE DATA)
     - Post-delivery diagnostic test: 14 multiple-choice questions on the library catalogue, e-journals, referencing and evaluating web pages
     - Test scores for Group A (experimental group) and Group C (control group) are significantly different: t = 2.66, df = 22, p < .025
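As a minimal sketch of the comparison reported here (t = 2.66, df = 22), the pooled-variance t statistic for two independent groups can be computed as below. The test scores are invented; only the group sizes (n = 12 per group, giving df = 12 + 12 − 2 = 22) come from the slides.

```python
# Hedged sketch: independent-samples t statistic with pooled variance.
# The 14-item test scores below are INVENTED for illustration; only the
# group sizes (n = 12 per group) come from the slides.
import math

group_a = [11, 9, 12, 10, 8, 11, 10, 12, 9, 11, 10, 13]  # hypothetical scores out of 14
group_c = [8, 9, 7, 10, 8, 9, 7, 8, 10, 9, 8, 7]

def mean(xs):
    return sum(xs) / len(xs)

def sample_var(xs):  # sample variance (n - 1 denominator)
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

n1, n2 = len(group_a), len(group_c)
df = n1 + n2 - 2  # degrees of freedom, matching the slide's df = 22

# Pool the two group variances, then form the t statistic
pooled_var = ((n1 - 1) * sample_var(group_a) + (n2 - 1) * sample_var(group_c)) / df
t = (mean(group_a) - mean(group_c)) / math.sqrt(pooled_var * (1 / n1 + 1 / n2))
print(f"t = {t:.2f}, df = {df}")
```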
  22. IMPLICATIONS FOR MODELS OF INFORMATION LITERACY
     - They are too rigid and over-simplified
     - The step-by-step approach indicated by some models doesn’t necessarily reflect how a person becomes information literate
     - They don’t seem to reflect (or harness) the social nature of learning
     - The SCONUL model in particular does not recognise that potentially any student can synthesise information and produce new knowledge
     - They don’t take into account the affective dimension
     - They are ‘grand narratives’ which aren’t sensitive to context
  23. THREE SPHERES OF INFORMATION LITERACY
     - The spheres can occur in no particular order: find/access/locate; evaluate/discern; use/communicate/produce
     - Each sphere triggers its own set of behavioural, cognitive, metacognitive and affective states
     - Becoming information literate takes place in a wider social context determined by roles, norms and tasks
  24. NEW TOOL FOR ASSESSING INFORMATION DISCERNMENT
     - Level 1 – students are unaware of, or unconcerned about, the need to evaluate information and may tend to use information without checking its quality (see p7-8 for more detail)
     - Level 2 – students have an emerging awareness of the need to evaluate information, expressed weakly through notions of detail, suitability or quantity
     - Level 3 – students are aware of the need to evaluate information for quality but see the process in black-and-white, true-or-false, either/or terms
  25. NEW TOOL FOR ASSESSING INFORMATION DISCERNMENT
     - Level 4 – students are aware that evaluation isn’t simply a matter of black and white; they recognise the need to judge each source on its merits and talk about balance, deciding, and using a range of criteria in the evaluation process
     - Level 5 – Level 4 plus: students operating at this level can talk about the nature and relative value of evaluation criteria in a given setting
  26. CONCLUDING REMARKS – A THEORY (SEE P9)
     - Becoming information literate appears to be about an individual completing a task in a given context. This context leads to interaction with sources (e.g. databases, e-journals, books, e-books, peers and tutors) and in so doing brings about the interplay of an individual’s behavioural, cognitive, metacognitive and affective states. It is this interplay which determines the level of new knowledge learnt (or produced, or both) and the degree of changed behaviour (i.e. level of information literacy) exhibited.
  27. REFERENCES
     - Bloom, B. S., Engelhart, M. D., Furst, E. J., Krathwohl, D. R. and Hill, W. H. (1956). Taxonomy of educational objectives: the classification of educational goals. Handbook 1: cognitive domain. New York: David McKay Company Inc.
     - Hepworth, M. (2004). A framework for understanding user requirements for an information service: defining the needs of informal carers. Journal of the American Society for Information Science and Technology, 55 (8), pp. 695-708.
     - Hepworth, M. and Walton, G. (2009). Teaching information literacy for inquiry-based learning. Oxford: Chandos.
     - Moseley, D., Baumfield, V., Higgins, S., Lin, M., Newton, D., Robson, S., Elliot, J. and Gregson, M. (2004). Thinking skills frameworks for post-16 learners: an evaluation. A research report for the Learning & Skills Research Centre. Trowbridge: Cromwell Press.