Design and validation of an evaluative tool to measure information literacy competencies in Higher Education. Saurina

Presented at LILAC 2009

Design and validation of an evaluative tool to measure information literacy competencies in Higher Education
Elvira Saurina, Science and Technology Deputy Director, Library System, Pontificia Universidad Católica de Chile.
Fabiola Faúndez, Head of the Academic Evaluation Unit, Universidad de Talca, Chile.
David Preiss, Assistant Professor, Escuela de Psicología, Pontificia Universidad Católica de Chile.
The Librarians Information Literacy Annual Conference (LILAC), Cardiff University, 30th March - 1st April 2009
SESSION CONTENT
1. Context
2. Introduction
3. Project development
4. Development and validation of the evaluative strategy
5. Conclusions
6. Future steps
7. References
1. CONTEXT
Pontificia Universidad Católica de Chile
Santiago - Chile - South America
4 campuses
22,000 students
32 Academic Schools
9 libraries
2. INTRODUCTION
• Challenges facing current Higher Education institutions in Chile.
• Development of key information literacy (IL) competencies.
• Training efforts made evident the need for an innovative evaluative strategy that:
- Enhances the personalization of learning according to the different levels of IL skills that college students present.
- Complies with quality standards.
- Is implemented through a test that gives students an overview of their IL skills.
• If the test is used before an IL training programme begins, it serves as a diagnostic evaluation; used at the end of an IL training process, it ascertains the level of competencies achieved.
3. PROJECT DEVELOPMENT
• Objective
To design and develop a tool for assessing the skills that UC undergraduate students have with regard to searching, evaluating and using information.
• Stages
Stage 1: Design of the evaluative strategy.
Stage 2: Development and validation of the evaluative strategy.
3. PROJECT DEVELOPMENT
Stage 1 - Design of the evaluative strategy:
• Team composition.
• Review of the literature and analysis of IL tests developed by leading university libraries around the world.
• Search for and selection of the IL competencies to be developed.
• Determination of the IL sub-competencies related to the selected IL competencies.
• Determination of performance outcomes.
• Determination of the evaluation model to be used.
• Construction of the situations to be included in the test.
3. PROJECT DEVELOPMENT
Stage 2 - Development and validation of the evaluative strategy:
• Construction of the feedback associated with each situation.
• Implementation of the pilot test with 211 students.
• Statistical analysis of the pilot test findings (validity and reliability).
• Implementation of a second version of the test with 360 students.
• Statistical analysis of the second test's findings.
• Preparation of the final report.
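The slides do not say which reliability coefficient was computed during the statistical analysis. A minimal sketch, assuming dichotomously (0/1) scored situations and Cronbach's alpha (equivalent to KR-20 for 0/1 items), is shown below; the score matrix is random placeholder data, not the project's pilot results.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (students x items) matrix of 0/1 scores.

    For dichotomously scored items this is equivalent to KR-20.
    """
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of students' total scores
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Placeholder data: 211 pilot-test students x 115 situations, scored 0/1 at random,
# so alpha will be near zero here; real responses would replace this matrix.
rng = np.random.default_rng(0)
pilot_scores = (rng.random((211, 115)) > 0.5).astype(int)
print(f"alpha = {cronbach_alpha(pilot_scores):.2f}")
```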
LITERATURE REVIEW
• Most studies are based on ALA standards.
• There are few studies on how these tools are built and on the procedures used to validate them and establish their reliability.
• A doctoral thesis (Beile, 2005) assesses the level of development of the IL skills of students in the specific area of education.
• Another thesis (Critchfield, 2005) describes the development of a reliable and valid instrument to measure the IL skills of freshmen, based on the standards of the Association of College & Research Libraries (ACRL, 2000).
• Two U.S. tests, developed by the Educational Testing Service (ETS) and Project SAILS, have been widely used among Canadian and American universities.
STUDENT AND FACULTY SURVEYS
• Aimed at identifying the participants' perceptions regarding the IL competencies of undergraduates.
• An important input for the later construction of the evaluative strategy.
STUDENT AND FACULTY SURVEYS
SOME FINDINGS:
• The majority of students had a positive perception of their IL competencies.
• Students' perceptions did not match faculty perceptions of undergraduates' IL competencies.
• The difference may reflect the fact that students did not know which skills they had to exercise to find information successfully, and were therefore unable to diagnose their own problems.
STUDENT AND FACULTY SURVEYS
SOME FINDINGS:
• Students showed problems in aspects related to the optimal use of advanced resources, such as specialized databases, and to the recognition of intellectual property.
• The difference in perceptions between faculty and students also suggests that the latter may not be aware of their deficits.
IL DOMAINS, COMPETENCIES, AND PERFORMANCE OUTCOMES
• A profile of the key competencies that a UC undergraduate should have in order to be considered IL competent.
• Based on the expertise of information specialists, the institutional frame of reference, the literature, and significant results obtained from the surveys of students and faculty.
IL DOMAINS, COMPETENCIES, AND PERFORMANCE OUTCOMES
• Involved determining the domains, competencies, sub-competencies and outcomes associated with the performance of what is considered "a competent IL person".
• Based on the ALA, SCONUL, Australian & New Zealand, and Mexican standards.
IL DOMAINS, COMPETENCIES, AND PERFORMANCE OUTCOMES
DEFINITIONS
COMPETENCE: Knowing how to perform, which includes, depending on the situation, knowledge, know-how (procedures) and knowing how to be (attitudes).
OUTCOME OR PERFORMANCE: The manner through which people give evidence of their competence in a particular area.
DOMAIN: The set of actions that users express as competencies and that an expert user performs.
IL DOMAINS, COMPETENCIES, AND PERFORMANCE OUTCOMES
UC DOMAINS
DOMAIN 1: Access to information
DOMAIN 2: Evaluation of information
DOMAIN 3: Use of information
IL DOMAINS, COMPETENCIES, SUB-COMPETENCIES AND OUTCOMES: EXAMPLE
Domain 1: "Access to information"
Competence 1: "Recognises a need for information, defines and determines the nature and extent of information needed."
Sub-competence: "Identifies a need for information"
Performance indicator: "Identifies terms and concepts that describe the information need"
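The profile described here is a hierarchy of domains, competencies, sub-competencies and performance indicators. A minimal sketch of one way to represent that hierarchy, encoding the example above; the class names and numbering codes are illustrative and not taken from the project.

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:                 # performance indicator, e.g. "1.1.1"
    code: str
    description: str

@dataclass
class SubCompetence:
    code: str
    description: str
    outcomes: list[Outcome] = field(default_factory=list)

@dataclass
class Competence:
    code: str
    description: str
    sub_competences: list[SubCompetence] = field(default_factory=list)

@dataclass
class Domain:
    code: str
    name: str
    competences: list[Competence] = field(default_factory=list)

# The example from this slide, encoded in the hierarchy (codes are illustrative).
access = Domain("1", "Access to information", [
    Competence("1", "Recognises a need for information, defines and determines "
                    "the nature and extent of information needed.", [
        SubCompetence("1.1", "Identifies a need for information", [
            Outcome("1.1.1", "Identifies terms and concepts that describe the information need"),
        ]),
    ]),
])
print(access.competences[0].sub_competences[0].outcomes[0].description)
```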
EVALUATIVE STRATEGY MODEL
Why to evaluate: the need to know the initial IL status of undergraduate students before they become involved in an IL training activity.
What to evaluate: the competencies considered key in determining an initial IL state for undergraduates.
How to evaluate: an evaluative strategy consisting of a test made up of situations of varying complexity.
4. DEVELOPMENT AND VALIDATION OF THE EVALUATIVE STRATEGY
Construction of situations for the test:
• 115 situations reflecting real cases.
• Developed by information specialists and a methods expert.
• Validated by disciplinary experts and library student assistants.
Implementation of the evaluative strategy:
• A pilot test was applied to 211 students to validate the situations, their comprehensibility, and the degree to which they reflect the competencies they were intended to measure.
• An optimized version, based on the results of the pilot test, was applied to 360 undergraduates.
• Statistical analysis of the test: estimation of the validity and reliability of each of the situations.
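The slides mention estimating the validity and reliability of each situation but do not detail the item statistics used. A minimal sketch of two common classical indices, item difficulty (proportion correct) and item discrimination (point-biserial correlation of the item with the rest of the test), assuming 0/1 scoring; the response matrix is random placeholder data.

```python
import numpy as np

def item_statistics(scores: np.ndarray) -> list[dict]:
    """Classical item analysis for a (students x situations) matrix of 0/1 scores."""
    stats = []
    for j in range(scores.shape[1]):
        item = scores[:, j]
        rest = scores.sum(axis=1) - item           # total score excluding this situation
        stats.append({
            "situation": j,
            "difficulty": float(item.mean()),      # proportion of correct answers
            "discrimination": float(np.corrcoef(item, rest)[0, 1]),
        })
    return stats

# Placeholder data: 360 students x 37 situations, scored 0/1 at random.
rng = np.random.default_rng(1)
responses = (rng.random((360, 37)) > 0.4).astype(int)
for row in item_statistics(responses)[:3]:
    print(row)
```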
4. DEVELOPMENT AND VALIDATION OF THE EVALUATIVE STRATEGY
Population sample: 17 Schools
- Acting
- Agricultural and Forest Engineering
- Art
- Biology
- Communications
- Construction
- Law
- Geography
- History
- Engineering
- Administration and Economics
- Medicine
- Music
- Journalism
- Sociology
- Theater
DETERMINATION OF SITUATIONS
Inputs:
• UC IL user profile (competencies, sub-competencies, outcomes)
• Findings of the pilot and optimised tests
Output:
• The situations included in the final test
IL DOMAINS, COMPETENCIES, SUB-COMPETENCIES AND OUTCOMES: EXAMPLE OF A SITUATION
Domain 1: "Access to information"
Competence 1: "Recognises a need for information, defines and determines the nature and extent of information needed."
Sub-competence 1.2: "Defines needed information"
Performance indicator 1.2.1: "Explores general sources of information to increase familiarity with the topic"
Example of a situation covered by the test to evaluate domain 1, competence 1, sub-competence 1.2, outcome 1.2.1:
The situation presented is as follows:
"Suppose you have to write a paper on the free trade agreement between Chile and the United States. You have the following information: you know that there is a working paper published by the Central Bank of Chile entitled "The free trade treaty between Chile and the United States: A review of studies that quantify its impact", and that the Encarta Online Encyclopedia has an article on the American free trade agreement and another on Chile's request to join the treaty. In addition, you know that a graduate student, during the last year of his studies, wrote a thesis on "Free trade agreement between Chile and the United States: a model". Finally, you have found out that ECLAC has published a series of articles on this topic."
Example of a situation covered by the test to evaluate domain 1, competence 1, sub-competence 1.2, outcome 1.2.1:
The student has to choose among the following alternatives:
"Bearing this background in mind, and considering that your goal is to become familiar with the topic at a global level, which of the following sources of information would you choose?
I. ECLAC's magazine, because it delivers the latest developments in the negotiations.
II. A thesis produced by a graduate of your area, guided by a faculty expert on the subject.
III. The working paper of the Central Bank of Chile, since it lets you know the broad outlines of the treaty and the impacts it caused.
IV. The Encarta encyclopedia, because it lets you know the topic in general and its evolution.
Choose one option:
A. Only I
B. Only IV
C. I, II and III
D. I, II, III and IV"
Example of a situation covered by the test to evaluate domain 1, competence 1, sub-competence 1.2, outcome 1.2.1:
The feedback messages associated with each alternative are presented below:
A. The correct answer is B, because the Encarta encyclopedia provides a general introduction to the topic.
B. Very good, this is the correct answer, because the Encarta encyclopedia provides a general introduction to the topic.
C. The correct answer is B; the Encarta encyclopedia provides a general introduction to the topic.
D. The correct answer is B; the Encarta encyclopedia provides a general introduction to the topic.
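A minimal sketch of how a situation of this kind, with its alternatives and per-alternative feedback, could be represented and scored; the class names and the one-point scoring rule are assumptions for illustration, not the project's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Alternative:
    label: str        # "A", "B", ...
    text: str
    feedback: str     # message shown after the student answers

@dataclass
class Situation:
    outcome: str      # e.g. "1.2.1"
    stem: str
    alternatives: list[Alternative]
    key: str          # label of the correct alternative

    def answer(self, chosen: str) -> tuple[int, str]:
        """Return (score, feedback) for the chosen alternative."""
        alt = next(a for a in self.alternatives if a.label == chosen)
        return (1 if chosen == self.key else 0), alt.feedback

# The Encarta situation from the previous slides, with the feedback abbreviated.
encarta = Situation(
    outcome="1.2.1",
    stem="Which of the following sources of information would you choose "
         "to become familiar with the topic at a global level?",
    alternatives=[
        Alternative("A", "Only I", "The correct answer is B..."),
        Alternative("B", "Only IV", "Very good, this is the correct answer..."),
        Alternative("C", "I, II and III", "The correct answer is B..."),
        Alternative("D", "I, II, III and IV", "The correct answer is B..."),
    ],
    key="B",
)
print(encarta.answer("C"))    # -> (0, 'The correct answer is B...')
```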
5. CONCLUSIONS
We have designed and developed an evaluative strategy to measure the information literacy competencies of UC undergraduate students, which includes:
• A profile that contains IL domains, competencies, sub-competencies, and performance indicators appropriate for an undergraduate student from Chile.
• A database of situations that may be used and combined to measure the IL competencies of the students.
5. CONCLUSIONS
A test consisting of 37 psychometrically validated situations, applied to 360 students representing the target population, and covering all or part of the following three competencies:
• "Recognises a need for information, defines and determines the nature and extent of the information needed." (5 situations)
• "Analyzes alternative sources of information according to the potentialities and difficulties of each one of them." (8 situations)
• "Finds the required information effectively and efficiently by selecting appropriate methods and assessing their significance." (24 situations)
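Because the 37 final situations split 5/8/24 across these three competencies, per-competence sub-scores are a natural way to summarise a student's results. A minimal sketch, under the assumption that situations are grouped contiguously by competence; the grouping and reporting format are illustrative, not the project's reporting scheme.

```python
import numpy as np

# Final test layout from the slide: 37 situations across three competencies.
COMPETENCE_SIZES = {"Competence 1": 5, "Competence 2": 8, "Competence 3": 24}

def sub_scores(responses: np.ndarray) -> dict[str, float]:
    """Proportion correct per competence for one student's 37 scores (0/1)."""
    scores, start = {}, 0
    for name, size in COMPETENCE_SIZES.items():
        scores[name] = float(responses[start:start + size].mean())
        start += size
    return scores

# Placeholder responses for a single student.
rng = np.random.default_rng(2)
student = (rng.random(37) > 0.4).astype(int)
print(sub_scores(student))
```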
6. FUTURE STEPS
• Reviewing the situations in the database that did not achieve the required psychometric values, in order to improve them and develop a new version of the test.
• Automating the evaluative strategy so that it can be implemented at scale amongst all students of the University.
• Applying the test to undergraduate students in future IL training sessions.
• Sharing the methodology and results of this experience with other Higher Education institutions in Chile, since there is no record of similar studies in our environment.
7. REFERENCES
Association of College & Research Libraries [ACRL], Task Force on Information Literacy Competency Standards in Higher Education 2000, Information literacy competency standards for higher education. http://www.ala.org/ala/mgrps/divs/acrl/standards/informationliteracycompetency.cfm (Retrieved 4 February 2009)
Association of College & Research Libraries [ACRL] 2007, Objectives for information literacy instruction: a model statement for academic librarians. http://www.ala.org/ala/mgrps/divs/acrl/acrlstandards/objectivesinformation.cfm (Retrieved 4 February 2009)
7. REFERENCES
Beile, P.M. 2005, Development and validation of the Beile Test of Information Literacy for Education (B-TILED). PhD thesis, University of Central Florida, United States.
Breivik, P. 2000, Information literacy and the engaged campus: giving students and community members the skills to take on (and not be taken in by) the Internet. http://library.geneseo.edu/services/faculty/Instruction/breivik.shtml (Retrieved 4 February 2009)
7. REFERENCES
Bundy, A. (ed.) 2004, Australian and New Zealand information literacy framework, 2nd ed. Adelaide: Australian and New Zealand Institute for Information Literacy.
Critchfield, R. 2005, The development of an information literacy indicator for incoming college freshmen. PhD thesis, Nova Southeastern University, United States.
Eadie, T. 1992, Beyond immodesty: questioning the benefits of BI. Research Strategies, vol. 10, pp. 105-110.
7. REFERENCES
Educational Testing Service (ETS) 2008, iSkills content. http://www.ets.org/portal/site/ets/menuitem.1488512ecfd5b8849a77b13bc3921509/?vgnextoid=a05d0e3c27a85110VgnVCM10000022f95190RCRD&vgnextchannel=6e81a79898a85110VgnVCM10000022f95190RCRD (Retrieved 2 February 2009)
Grassian, E.S. & Kaplowitz, J.R. 2001, Information literacy instruction: theory and practice. New York: Neal-Schuman Publishers, Inc.
Hernon, P. & Dugan, R.E. 2004, Outcomes assessment in higher education: views and perspectives. Westport, CT: Libraries Unlimited.
7. REFERENCES
Lau, J. 2004, Directrices internacionales para la alfabetización informativa: propuesta. http://bivir.uacj.mx/DHI/DoctosNacioInter/Docs/Directrices.pdf (Retrieved 6 February 2009)
Project SAILS 2007, About Project SAILS. https://www.projectsails.org/sails/aboutSAILS.php?page=aboutSAILS (Retrieved 2 February 2009)
Rabine, J.L. & Cardwell, C. 2000, Start making sense: practical approaches to outcomes assessment for libraries. Research Strategies, vol. 17, no. 4, pp. 319-335.
7. REFERENCES
Saurina de Solanes, E., Faúndez, F. & Preiss, D. 2007, Diseño y validación de un test para evaluar las competencias de los alumnos de pregrado en la búsqueda, evaluación y uso de información: informe final. Pontificia Universidad Católica de Chile.
Society of College, National and University Libraries [SCONUL] 2004, Learning outcomes and information literacy. http://www.sconul.ac.uk/groups/information_literacy/papers/outcomes.pdf (Retrieved 2 February 2009)
7. REFERENCES
WASC/ACCJC 2004, Accreditation Reference Handbook. http://www.accjc.org/ACCJC_Publications.htm (Retrieved 2 February 2009)
Zurkowski, P.G. 1974, The information service environment: relationships and priorities. Washington, DC: National Commission on Libraries and Information Science.
