Diagnosing students’ information literacy skills: Building on the past
Class presentation on information literacy
  • If large scale.
  • UBC, Alberta, Brandon, Manitoba, Western, York, New Brunswick
    https://www.projectsails.org/files/Presentation.pdf
    https://www.projectsails.org/pubs/SAILSatWILU_May7_2006.pdf
    https://www.projectsails.org/files/sample_report_2009.pdf
  • Here are the results. For example, students majoring in business performed lower than those at other universities (their mean score differed from other institutions). The report does not say why.
  • Technical literacy: word processing, emailing, using the web
    IL: evaluating, organising, and presenting information
    How students perform
    Includes 14 short (3-5 minute) tasks and 1 longer (15 minute) task
  • Define: Formulate a research statement to facilitate the search for information
    Access: Find and retrieve information from a variety of sources
    Evaluate: Judge the usefulness and sufficiency of information for a specific purpose
    Manage: Organize information so as to find it later
    Integrate: Summarize or otherwise synthesize information from a variety of sources
    Create: Generate or adapt information to meet a need, expressing a main point and supporting information
    Communicate: Adapt information for a particular audience
  • Not everything is applicable, but generally speaking...
    Time commitment --> credit-based course
  • Here’s an example. This is a sample process of a credit IL course.
    Making a record: 4.2 (students can look back at what they did and revise their process & performance)
  • Think-aloud protocols can be used as well
  • Affective: Feelings, perceptions
    Behavioral: What students can do, have learned
    Cognitive: What students know
  • Slide 12: (e.g., Asaravala, 2005; Educational Testing Services [ETS], 2009; Katz, 2007; Somerville, Smith, & Macklin, 2007)
    Slide 13 & 14: Katz, I. R. (2007). Testing information literacy in digital environments: ETS’s iSkills assessment. Retrieved September 12, 2009, from http://www.ets.org/Media/Tests/ICT_Literacy/ppt/amla_plenary.ppt
    Slide 17, 19: (e.g., Ishimura, Howard, & Moukdad, 2007; Seamans, 2002)
    Slide 18: Ishimura, Howard, & Moukdad, 2007
    Slide 20, 21, 22: (e.g., Nutefall, 2004; Snavely & Wright, 2003; Sonley, Turner, Myer, & Cotton, 2007)
    Slide 23: (e.g., Nutefall, 2004; Snavely & Wright, 2003; Sonley, Turner, Myer, & Cotton, 2007)
    Slide 24: (e.g., Nutefall, 2004; Snavely & Wright, 2003; Sonley, Turner, Myer, & Cotton, 2007)
    Slide 25: Radcliff, C. J., Jensen, M. L., Salem Jr., J. A., Burhanna, K. J., & Gedeon, J. A. (2007). A practical guide to information literacy assessment for academic librarians. Westport, CT: Libraries Unlimited. (Page 20)

Diagnosing students’ information literacy skills: Building on the past (Presentation Transcript)

  • Diagnosing students’ information literacy skills: Building on the past Ph.D. Candidate Yusuke Ishimura September 17, 2009 1
  • Please note: All references used in this presentation are listed at the end of the slides (pp. 28-29) Y. Ishimura GLIS 619 September 17, 2009 2
  • Objectives • Become familiar with various existing information literacy (IL) assessment approaches • Understand types of evidence obtained from different assessment methods • Understand strengths & weaknesses of each assessment approach • Provide links to future assessment 3 Y. Ishimura GLIS 619 September 17, 2009
  • Formative assessment informs the student (improve performance; promoting student growth) and informs the teacher (improve instruction; making instructional decisions). Summative assessment informs the student (evaluating student achievement; recognize accomplishment) and informs the teacher (evaluating programs; modify program). 4 Y. Ishimura GLIS 619 September 17, 2009
  • Before Starting . . . Y. Ishimura GLIS 619 September 17, 2009 5
  • Large Scale Assessment 6 Y. Ishimura GLIS 619 September 17, 2009
  • Multiple-choice Tests (Overview) • Can compare IL skills across institutions/departments/individuals • Need time & money to develop reliable and valid instruments • Less time for marking • Good for testing students’ knowledge • Cannot assess higher-order skills • Content may be very generic Y. Ishimura GLIS 619 September 17, 2009 7
  • Multiple-choice Tests (Examples)
    SAILS: Developed by Kent State University; used in 75 US + 7 CAN universities (Total: 42,304 students); uses ACRL IL standards; 40 multiple-choice questions (Needs: 37 items; Access: 73 items; Evaluation: 20 items; Intellectual honesty: 27 items); multiple-choice format (Web or paper); $3 per student
    Quebec study: Initiated by Prof. Mittermeyer; distributed to 1st-year students in QC (Total: 3,003 students participated); uses ACRL IL standards; 20 multiple-choice questions (concept identification, search strategy, document types, search tools, use of results)
    Y. Ishimura GLIS 619 September 17, 2009 8
  • Multiple-choice Tests (Sample Question-1) In order to become familiar with a subject about which I know very little, first I consult: ☐ A journal ☐ An encyclopedia ☐ A database ☐ A book ☐ Other (please specify): ☐ Don’t know Y. Ishimura GLIS 619 September 17, 2009 9
  • Multiple-choice Tests (Sample Question-2) Which of the following search statements best utilizes nesting search operators for a communication research paper on speech anxiety? (Check only one answer.) ☐ Speech and talk and (anxiety or fear) ☐ Speech or talk and (anxiety and fear) ☐ Speech or talk and (anxiety or fear) ☐ (Speech or talk) and anxiety or fear ☐ (Speech or talk) and (anxiety or fear) Y. Ishimura GLIS 619 September 17, 2009 10
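The nesting logic this question tests can be made concrete with a short sketch (an illustrative example, not part of the original slides): if each document is treated as a set of terms, the correctly nested query "(speech OR talk) AND (anxiety OR fear)" requires at least one synonym from each parenthesised concept group.

```python
def matches(doc_terms, concept_groups):
    """True if the document contains at least one term from every concept group
    (i.e., the groups are ANDed together, the terms inside a group are ORed)."""
    return all(any(term in doc_terms for term in group) for group in concept_groups)

# (speech OR talk) AND (anxiety OR fear), expressed as two synonym groups
query = [{"speech", "talk"}, {"anxiety", "fear"}]

# A document covering both concepts matches; one covering only "speech" does not.
print(matches({"talk", "fear", "students"}, query))   # True
print(matches({"speech", "students"}, query))         # False
```

Flattening the nesting, as in the distractor "(Speech or talk) and anxiety or fear", would instead retrieve every document mentioning "fear", regardless of the speech concept.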
  • SAILS Results - Sample Report [Chart] Figure 3.3, Chart for Skill Set: Developing a Research Strategy, comparing Goldfinch Institution with all institutions (Institution Type: Doctorate) across the majors Business, Education, and Engineering & Applied Science Y. Ishimura GLIS 619 September 17, 2009 11
  • Multiple-choice Tests (Sample Findings) Undergraduate students demonstrated IL skills to some degree: • Demonstrated knowing that librarians can help • Strong understanding of AND, but not the OR operator or nested searches • Good at identifying search terms from research questions • Recognised the limitations of search engines • Lack of knowledge in deciphering citations Y. Ishimura GLIS 619 September 17, 2009 12
  • Online Test (Example)
    Features: IL skills + ICT literacy; uses ACRL IL standards; scenario-based assessment (no multiple-choice questions); Web-based; $22-33 per test; provides aggregated and individual data
    Criticisms: confusing interface; not related to disciplines; too much focus on technology content; can be a black box
    Y. Ishimura GLIS 619 September 17, 2009 13
  • Y. Ishimura GLIS 619 September 17, 2009 14
  • Y. Ishimura GLIS 619 September 17, 2009 15
  • [iSkills individual score report] Your Score: 600; Percentile: 75. Scores can range from 400 to 700. The midpoint of the scale (550) represents the average performance of all early 2006 test takers. The bracket represents the range of scores you might expect to receive if you take this test again. The percentile shows how you did compared with all the people who took the test early in 2006: for example, if you received a score in the 60th percentile, you did better than 60 percent of all test takers. Performance Feedback: the iSkills assessment measures seven skill areas of information and communication technology literacy (Define, Access, Evaluate, Manage, Integrate, Create, Communicate); the feedback describes your performance on the tasks you saw, organized by these skill areas, and is for your information only and is not predictive of future performance. Y. Ishimura GLIS 619 September 17, 2009 16
  • Small Scale Assessment Y. Ishimura GLIS 619 September 17, 2009 17
  • Interviews & Focus Groups (Overview) • Evidence obtained through individual or group interviews • Involve a small number of participants • Elicit subjective experience in depth • Analysis takes time • Based on self-assessment of information literacy skills • Need training and interpersonal skills Y. Ishimura GLIS 619 September 17, 2009 18
  • Interviews & Focus Groups (Sample Responses) “I think Google is much easier to use than Novanet [online catalogue]. If I enter keywords by separating spaces, I can easily research results. On the other hand, I cannot reach the results on Novanet as I expect. So, I have trouble using it.” “For example, if I am studying the environment, one report discusses the amount of exhaust gas and says that the amount of gas does not affect the earth’s environment. But, if I carefully look at the report, the organisation receives financial support from oil companies. So, I look at the data very carefully. They do not present false information. But, I recognise that those kinds of reports focus on only one thing without considering other aspects of the situation. In terms of data, they only look at one row of data because it looks good even if the data are inferior to others.” Y. Ishimura GLIS 619 September 17, 2009 19
  • Interviews & Focus Groups (Sample Findings) • Students demonstrated IL skills to some degree: 1. Time pressure is a key factor in selecting paper topics 2. Google is a popular place for finding information among students 3. Tend to search for information that they want to find 4. Evaluation criteria for information are not sophisticated 5. Recognise importance of legal and ethical issues, but the concept is difficult for students to fully understand • Librarians are not a part of students’ research process • Need more outreach to improve students’ IL skills Y. Ishimura GLIS 619 September 17, 2009 20
  • Portfolios (Overview) • Collecting pre-determined evidence during a given time frame (e.g., topic selection, searching, evaluating, and using information) • Often record students’ reflection on the research process • Significant time commitment needed from students, faculty, & librarians • Often used in credit-based IL courses or as a part of specific classes • Provide continuous feedback during the process • Can assess higher-order skills Y. Ishimura GLIS 619 September 17, 2009 21
  • Portfolios (Sample Process) A cycle of topic selection, search strategies, database selection, executing the search, identifying resources, and literature reviews, with feedback provided throughout Y. Ishimura GLIS 619 September 17, 2009 22
  • Portfolios (Sample Findings) • Obtain a holistic picture of students’ information literacy skills (e.g., search strategies, resources used, synthesis of information) • Visible research process and progress (for students, librarians, and faculty) • Provide more meaningful and substantial feedback to students • Facilitate students’ understanding of the research process Y. Ishimura GLIS 619 September 17, 2009 23
  • Observation & Screen Capture
    Method: one of three phases in the research; 86 hours of screen captures of students; observation of students’ activity during their search; conversations with students during their search
    Caveats: HUGE amount of time for analysis; provides an insight into information-seeking behaviour, but not significant findings
    Y. Ishimura GLIS 619 September 17, 2009 24
  • [Diagram] Factors to consider when choosing an assessment approach: time, cost, outside knowledge, selection criteria, faculty’s cooperation, recruitment, setting (classroom, programme, institution), and domain (affective, behavioural, cognitive) Y. Ishimura GLIS 619 September 17, 2009 25
  • Domain / Setting matrix
    Affective: Classroom (informal, classroom assessment techniques, surveys, portfolios); Programmatic (surveys, interviewing, focus groups, portfolios); Institutional (surveys, focus groups, portfolios)
    Behavioral: Classroom (informal, performance, portfolios); Programmatic (performance, portfolios); Institutional (portfolios)
    Cognitive: Classroom (informal, classroom assessment techniques, knowledge test, concept maps, portfolios); Programmatic (knowledge test, portfolios); Institutional (knowledge test, portfolios)
    Y. Ishimura GLIS 619 September 17, 2009 26
  • For Your Future Reference
    • Examine past findings “critically” (Past research will inform your path)
    • Decide what you want to assess (You need purposes & goals for assessment)
    • Remember you cannot assess everything (Think small)
    • Know the limitations of different assessment approaches (You may have to give up some evidence)
    • Use multiple data collection methodologies if applicable (You can triangulate your data)
    • Have passion for assessment (You will find cool things for future IL improvement for students)
    Y. Ishimura GLIS 619 September 17, 2009 27
  • Notes
    Slide 3: Jean Donham, Enhancing Teaching and Learning (New York: Neal-Schuman, 2005), 251.
    Slides 6 & 7: Tom Adam and Ilo-Katryn Maimets, "Information Literacy Evaluation: Fishing for Answers with SAILS," https://www.projectsails.org/pubs/SAILSatWILU_May7_2006.pdf.
    Kent State University, Project SAILS (Standardized Assessment of Information Literacy Skills) homepage, https://www.projectsails.org/index.php?page=home.
    Diane Mittermeyer and Diane Quirion, "Information Literacy: Study of Incoming First-Year Undergraduates in Quebec," http://www.crepuq.qc.ca/documents/bibl/formation/studies_Ang.pdf.
    Slide 8: Mittermeyer and Quirion, "Information Literacy."
    Slide 9: Adam and Maimets, "Information Literacy Evaluation."
    Slide 10: Kent State University, "Results of the Standardized Assessment of Information Literacy Skills (SAILS) for Goldfinch University," http://www.projectsails.org/pubs/SampleReport2007.pdf.
  Y. Ishimura GLIS 619 September 17, 2009 28
  • Notes
    Slide 12: Amit Asaravala, "Testing Your Tech Smarts," Wired News, April 8, 2005, http://www.wired.com/culture/lifestyle/news/2005/04/67156 (accessed April 4, 2009).
    Educational Testing Service, "iSkills," http://www.ets.org/portal/site/ets/menuitem.1488512ecfd5b8849a77b13bc3921509/?vgnextoid=159f0e3c27a85110VgnVCM10000022f95190RCRD&vgnextchannel=e5b2a79898a85110VgnVCM10000022f95190RCRD.
    Irvin R. Katz, "Testing Information Literacy in Digital Environments: ETS's iSkills Assessment," http://www.etsliteracy.org/Media/Tests/ICT_Literacy/ppt/amla_plenary.ppt.
    Mary M. Somerville, Lynn D. Lampert, Katherine S. Dabbour, Sallie Harlan, and Barbara Schader, "Toward Large Scale Assessment of Information and Communication Technology Literacy: Implementation Considerations for the ETS ICT Literacy Instrument," Reference Services Review 35 (2007): 8-20.
    Slides 13 & 14: Katz, "Testing Information Literacy."
    Slide 15: Educational Testing Service, "Individual Score Report," http://www.ets.org/Media/Tests/ICT_Literacy/pdf/iskills_individual_report.pdf.
    Slide 17: Yusuke Ishimura, Vivian Howard, and Haidar Moukdad, "Information Literacy in Academic Libraries: Assessment of Japanese Students' Needs for Successful Assignment Completion in Two Halifax Universities," Canadian Journal of Information and Library Science 31 (2008): 1-26.
    Nancy H. Seamans, "Student Perceptions of Information Literacy: Insights for Librarians," Reference Services Review 30 (2002): 112-23.
  Y. Ishimura GLIS 619 September 17, 2009 29
  • Notes
    Slide 18: Ishimura, Howard, and Moukdad, "Information Literacy in Academic Libraries," 1-26.
    Slide 19: Ishimura, Howard, and Moukdad, "Information Literacy in Academic Libraries," 1-26. Seamans, "Student Perceptions," 112-23.
    Slides 20, 21, & 22: Jennifer Nutefall, "Paper Trail: One Method of Information Literacy Assessment," Research Strategies 20 (2004): 89-98.
    Loanne L. Snavely and Carol A. Wright, "Research Portfolio Use in Undergraduate Honors Education: Assessment Tool and Model for Future Work," Journal of Academic Librarianship 29 (2003): 298-303.
    Valerie Sonley, Denise Turner, Sue Myer, and Yvonne Cotton, "Information Literacy Assessment by Portfolio: A Case Study," Reference Services Review 35 (2007): 41-70.
    Slide 23: K. Dunn, "Assessing Information Literacy Skills in the California State University: A Progress Report," Journal of Academic Librarianship 28 (2002): 26-35.
    Slides 24 & 25: Carolyn J. Radcliff, Mary Lee Jensen, Joseph A. Salem Jr., Kenneth J. Burhanna, and Julie A. Gedeon, A Practical Guide to Information Literacy Assessment for Academic Librarians (Westport, CT: Libraries Unlimited, 2007), 20-22.
  Y. Ishimura GLIS 619 September 17, 2009 30
  • Questions? Y. Ishimura GLIS 619 September 17, 2009 31