2. Please note: All references used in this presentation are listed at the end of the slides (pp. 28-29).
Y. Ishimura GLIS 619 September 17, 2009 2
3. Objectives
• Become familiar with the variety of existing information literacy (IL) assessment approaches
• Understand the types of evidence obtained from different assessment methods
• Understand the strengths & weaknesses of each assessment approach
• Provide links to future assessment
9. Multiple-choice Tests (Overview)
• Can compare IL skills across institutions/departments/individuals
• Need time & money to develop reliable and valid instruments
• Require less time for marking
• Good for testing students’ knowledge
• Cannot assess higher-order skills
• Content may be very generic
12. Multiple-choice Tests (Examples)
SAILS (Kent State University):
• Developed by Kent State University
• Used in 75 US + 7 CAN universities (Total: 42,304 students)
• Uses ACRL’s IL standards
• 40 multiple-choice questions (item counts by standard: Needs: 37 items; Access: 73 items; Evaluation: 20 items; Intellectual honesty: 27 items)
• Multiple-choice format (Web or paper)
• $3 per student
Quebec study (Mittermeyer):
• Initiated by Prof. Mittermeyer
• Distributed to 1st-year students in QC (Total: 3,003 students participated)
• Uses ACRL IL standards
• 20 multiple-choice questions covering: concept identification, search strategy, document types, search tools, use of results
13. Multiple-choice Tests (Sample Question-1)
In order to become familiar with a subject about which I know very little, first I consult:
☐ A journal
☐ An encyclopedia
☐ A database
☐ A book
☐ Other (please specify):
☐ Don’t know
14. Multiple-choice Tests (Sample Question-2)
Which of the following search statements best utilizes nesting search operators for a communication research paper on speech anxiety? CHECK ONLY ONE ANSWER.
☐ Speech and talk and (anxiety or fear)
☐ Speech or talk and (anxiety and fear)
☐ Speech or talk and (anxiety or fear)
☐ (Speech or talk) and anxiety or fear
☐ (Speech or talk) and (anxiety or fear)
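Why nesting matters here can be shown with a small sketch (my addition; the mini-corpus and term sets are hypothetical, not from the slides). Search engines typically give AND higher precedence than OR, so without parentheses a query like "speech or talk and (anxiety or fear)" matches any document containing "speech", on-topic or not:

```python
# Hypothetical mini-corpus: each document is represented by its set of terms.
docs = {
    "d1": {"speech", "anxiety"},  # on-topic
    "d2": {"talk", "fear"},       # on-topic
    "d3": {"speech"},             # off-topic: no anxiety/fear term
    "d4": {"fear"},               # off-topic: no speech/talk term
}

def nested(t):
    # (speech OR talk) AND (anxiety OR fear) -- parentheses group the synonyms
    return ("speech" in t or "talk" in t) and ("anxiety" in t or "fear" in t)

def unnested(t):
    # speech OR talk AND (anxiety OR fear) -- AND binds tighter than OR,
    # so "speech" alone is enough to match
    return "speech" in t or ("talk" in t and ("anxiety" in t or "fear" in t))

print(sorted(d for d, t in docs.items() if nested(t)))    # ['d1', 'd2']
print(sorted(d for d, t in docs.items() if unnested(t)))  # ['d1', 'd2', 'd3']
```

The fully nested form retrieves only the on-topic documents, which is why the last answer option is the best one.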
15. SAILS Results - Sample Report
[Chart: Figure 3.3, Skill Set: Developing a Research Strategy. Mean scores (axis 0-1000) by MAJOR (Business, Education, Engineering & Applied Science) for Goldfinch University versus all institutions of type Doctorate.]
16. Multiple-choice Tests (Sample Findings)
Undergraduate students demonstrated IL skills to some degree:
• Knew that librarians are available for help
• Showed a strong understanding of the AND operator, but not the OR operator or nested searches
• Were good at identifying search terms from research questions
• Recognised the limitations of search engines
• Lacked knowledge of how to decipher citations
19. Online Test (Example)
Features:
• IL skills + ICT literacy
• Uses ACRL IL standards
• Scenario-based assessment (no multiple-choice questions)
• Web-based
• $22-33 per test
• Provides aggregated and individual data
Criticisms:
• Confusing interface
• Not related to disciplines
• Too much focus on technology content
• Can be a black box
30. Online Test (Sample Score Report)
[Figure: ETS iSkills individual score report (SAMPLE watermark). Your Score: 600; Percentile: 75. Scores range from 400 to 700; the midpoint (550) represents the average performance of all early-2006 test takers. A bracket shows the range of scores you might expect to receive if you take the test again. The percentile shows how you did compared with all early-2006 test takers: a score in the 60th percentile means you did better than 60 percent of them. Performance Feedback: the iSkills assessment measures seven skill areas of information and communication technology literacy (DEFINE, ACCESS, EVALUATE, MANAGE, INTEGRATE, CREATE, COMMUNICATE), with a bar chart of percent correct per area. This feedback is for your information only and is not predictive of future performance.]
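The percentile logic in the score report can be sketched in a few lines (my addition; the cohort scores below are invented sample data, not ETS norming data):

```python
# Percentile rank: the percent of the norming cohort scoring strictly below you.
# The cohort list is made-up illustration data, not actual iSkills results.
def percentile_rank(score, cohort):
    below = sum(1 for s in cohort if s < score)
    return 100 * below / len(cohort)

cohort = [450, 500, 520, 550, 560, 580, 590, 600, 650, 700]
print(percentile_rank(590, cohort))  # 60.0 -> "better than 60 percent of test takers"
```

This matches the report's wording: a 60th-percentile score means you did better than 60 percent of the comparison group.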
36. Interviews & Focus Groups (Overview)
• Evidence obtained through individual or group interviews
• Involve a small number of participants
• Elicit subjective experience in depth
• Analysis takes time
• Based on students’ self-assessment of their information literacy skills
• Need training and interpersonal skills
39. Interviews & Focus Groups (Sample Responses)
“I think Google is much easier to use than Novanet [online catalogue]. If I
enter keywords by separating spaces, I can easily research results. On the
other hand, I cannot reach the results on Novanet as I expect. So, I have
trouble using it.”
“For example, if I am studying the environment, one report discusses the
amount of exhaust gas and says that the amount of gas does not affect the
earth’s environment. But, if I carefully look at the report, the organisation
receives financial support from oil companies. So, I look at the data very
carefully. They do not present false information. But, I recognise that those
kinds of reports focus on only one thing without considering other aspects of
the situation. In terms of data, they only look at one row of data because it
looks good even if the data are inferior to others.”
40. Interviews & Focus Groups (Sample Findings)
• Students demonstrated IL skills to some degree:
1. Time pressure is a key factor in selecting paper topics
2. Google is a popular place for students to find information
3. Students tend to search for information that they want to find
4. Evaluation criteria for information are not sophisticated
5. Students recognise the importance of legal and ethical issues, but the concepts are difficult for them to fully understand
• Librarians are not a part of students’ research process
• More outreach is needed to improve students’ IL skills
41. Portfolios (Overview)
• Collect pre-determined evidence during a given time frame (e.g., topic selection, searching, evaluating, and using information)
• Often record students’ reflections on the research process
• Significant time commitment needed from students, faculty, & librarians
• Often used in credit-based IL courses or as part of specific classes
• Provide continuous feedback during the process
• Can assess higher-order skills
42. Portfolios (Sample Process)
[Figure: a cyclical research process linking topic selection, search strategies, database selection, executing the search, identifying resources, and literature reviews, with feedback provided at the centre of the cycle.]
45. Portfolios (Sample Findings)
• Obtain a holistic view of students’ information literacy skills (e.g., search strategies, resources used, synthesis of information)
• Make the research process and progress visible (to students, librarians, and faculty)
• Provide more meaningful and substantial feedback to students
• Facilitate students’ understanding of the research process
46. Observation & Screen Capture
Method:
• One of three phases in the research
• 86 hours of screen captures of students’ behaviour
• Observation of students’ activity during their search
• Conversations with students during their search
Drawbacks:
• HUGE amount of time for analysis
• Provides insight into information-seeking behaviour, but no significant findings
63. Domain/Setting
• Affective (feelings, perceptions)
- Classroom: informal, surveys, classroom assessment techniques, interviewing, focus groups, portfolios
- Programmatic: surveys, focus groups, portfolios
- Institutional: surveys, portfolios
• Behavioral (what students can do)
- Classroom: informal, performance, portfolios
- Programmatic: performance, portfolios
- Institutional: portfolios
• Cognitive (what students know)
- Classroom: informal, classroom assessment techniques, knowledge test, concept maps, portfolios
- Programmatic: knowledge test, portfolios
- Institutional: knowledge test, portfolios
70. For Your Future Reference
• Examine past findings “critically” (past research will inform your path)
• Decide what you want to assess (you need purposes & goals for assessment)
• Remember you cannot assess everything (think small)
• Know the limitations of different assessment approaches (you may have to give up some evidence)
• Use multiple data collection methodologies if applicable (you can triangulate your data)
• Have passion for assessment (you will find cool things for future IL improvement for students)
71. Notes
Slide 3
Jean Donham, Enhancing Teaching and Learning (New York: Neal-Schuman, 2005), 251.
Slide 6 & 7
Tom Adam and Ilo-Katryn Maimets, “Information Literacy Evaluation: Fishing for Answers with SAILS,” https://www.projectsails.org/pubs/SAILSatWILU_May7_2006.pdf.
Kent State University, Project SAILS (Standardized Assessment of Information Literacy Skills) homepage, https://www.projectsails.org/index.php?page=home.
Diane Mittermeyer and Diane Quirion, “Information Literacy: Study of Incoming First-Year Undergraduates in Quebec,” http://www.crepuq.qc.ca/documents/bibl/formation/studies_Ang.pdf.
Slide 8
Mittermeyer and Quirion, “Information Literacy.”
Slide 9
Adam and Maimets, “Information Literacy Evaluation.”
Slide 10
Kent State University, “Results of the Standardized Assessment of Information Literacy Skills (SAILS) for Goldfinch University,” http://www.projectsails.org/pubs/SampleReport2007.pdf.
72. Notes
Slide 12
Amit Asaravala, “Testing Your Tech Smarts,” Wired News, April 8, 2005, http://www.wired.com/culture/lifestyle/news/2005/04/67156 (accessed April 4, 2009).
Educational Testing Service, “iSkills,” http://www.ets.org/portal/site/ets/menuitem.1488512ecfd5b8849a77b13bc3921509/?vgnextoid=159f0e3c27a85110VgnVCM10000022f95190RCRD&vgnextchannel=e5b2a79898a85110VgnVCM10000022f95190RCRD.
Irvin R. Katz, “Testing Information Literacy in Digital Environments: ETS’s iSkills Assessment,” http://www.etsliteracy.org/Media/Tests/ICT_Literacy/ppt/amla_plenary.ppt.
Mary M. Somerville, Lynn D. Lampert, Katherine S. Dabbour, Sallie Harlan, and Barbara Schader, “Toward Large Scale Assessment of Information and Communication Technology Literacy: Implementation Considerations for the ETS ICT Literacy Instrument,” Reference Services Review 35 (2007): 8-20.
Slide 13 & 14
Katz, “Testing Information Literacy.”
Slide 15
Educational Testing Service, “Individual Score Report,” http://www.ets.org/Media/Tests/ICT_Literacy/pdf/iskills_individual_report.pdf.
Slide 17
Yusuke Ishimura, Vivian Howard, and Haidar Moukdad, “Information Literacy in Academic Libraries: Assessment of Japanese Students’ Needs for Successful Assignment Completion in Two Halifax Universities,” Canadian Journal of Information and Library Science 31 (2008): 1-26.
Nancy H. Seamans, “Student Perceptions of Information Literacy: Insights for Librarians,” Reference Services Review 30 (2002): 112-23.
73. Notes
Slide 18
Ishimura, Howard, and Moukdad, “Information Literacy in Academic Libraries,” 1-26.
Slide 19
Ishimura, Howard, and Moukdad, “Information Literacy in Academic Libraries,” 1-26.
Seamans, “Student Perceptions,” 112-23.
Slide 20, 21, & 22
Jennifer Nutefall, “Paper Trail: One Method of Information Literacy Assessment,” Research Strategies 20 (2004): 89-98.
Loanne L. Snavely and Carol A. Wright, “Research Portfolio Use in Undergraduate Honors Education: Assessment Tool and Model for Future Work,” Journal of Academic Librarianship 29 (2003): 298-303.
Valerie Sonley, Denise Turner, Sue Myer, and Yvonne Cotton, “Information Literacy Assessment by Portfolio: A Case Study,” Reference Services Review 35 (2007): 41-70.
Slide 23
K. Dunn, “Assessing Information Literacy Skills in the California State University: A Progress Report,” Journal of Academic Librarianship 28 (2002): 26-35.
Slide 24 & 25
Carolyn J. Radcliff, Mary Lee Jensen, Joseph A. Salem Jr., Kenneth J. Burhanna, and Julie A. Gedeon, A Practical Guide to Information Literacy Assessment for Academic Librarians (Westport, CT: Libraries Unlimited, 2007), 20-22.
UBC, Alberta, Brandon, Manitoba, Western, York, New Brunswick
https://www.projectsails.org/files/Presentation.pdf
https://www.projectsails.org/pubs/SAILSatWILU_May7_2006.pdf
https://www.projectsails.org/files/sample_report_2009.pdf
Here are the results. For example, students majoring in business performed lower than at other universities (the mean differs from other institutions); the report does not tell why.
Technical literacy: word processing, emailing, using the web
IL: evaluating, organising, presenting information
How students perform
Includes 14 short (3-5 minute) tasks and 1 longer (15-minute) task
Define: Formulate a research statement to facilitate the search for information
Access: Find and retrieve information from a variety of sources
Evaluate: Judge the usefulness and sufficiency of information for a specific purpose
Manage: Organize information so as to find it later
Integrate: Summarize or otherwise synthesize information from a variety of sources
Create: Generate or adapt information to meet a need, expressing a main point and supporting information
Communicate: Adapt information for a particular audience
Not everything is applicable, but generally speaking...
Time commitment --> credit-based course
Here’s an example: a sample process from a credit IL course.
Making a record: 4.2 (students can look back at what they did and revise their process & performance)
Think-aloud protocols can be used as well
Affective: feelings, perceptions
Behavioral: what students can do, have learned
Cognitive: what students know
Slide 12: (e.g., Asaravala, 2005; Educational Testing Services [ETS], 2009; Katz, 2007; Somerville, Smith, & Macklin, 2007)
Slide 13 & 14: Katz, I. R. (2007).Testing information literacy in digital environments: ETS’s iSkills assessment. Retrieved September 12, 2009, from http://www.ets.org/Media/Tests/ICT_Literacy/ppt/amla_plenary.ppt. (Slide )
Slide 17, 19: (e.g., Ishimura, Howard, Moukdad, 2007; Seamans, 2002)
Slide 18: Ishimura, Howard, Moukdad, 2007
Slide 20, 21,22 (e.g., Nutefall, 2004; Snavely & Wright, 2003; Sonley, Turner, Myer, & Cotton, 2007)
Slide 23 (e.g., Nutefall, 2004; Snavely & Wright, 2003; Sonley, Turner, Myer, & Cotton, 2007)
Slide 24 (e.g., Nutefall, 2004; Snavely & Wright, 2003; Sonley, Turner, Myer, & Cotton, 2007)
Slide 25: Radcliff, C. J., Jensen, M. L., Salem Jr., J. A., Burhanna, K. J., & Gedeon, J. A. (2007). A practical guide to information literacy assessment for academic librarians. Westport, CT: Libraries Unlimited. (Page 20)