Design and validation of an online instrument to measure information skills
Amber Walraven, Joke Voogt, Jules Pieters
A recent study: '[W]e’ve come to use our laptops, tablets and smartphones as a form of external or transactive memory, where information is stored collectively outside of ourselves. … We are becoming symbiotic with our computer tools, growing into interconnected systems that remember less by knowing information than by knowing where information can be found.'
 
STARTING POINTS
- The IPS (information problem solving) process shows deficiencies in students as well as teachers (e.g., Byron, 2008; Kuiper, 2007; ten Brummelhuis, 2006; Walraven, 2008, 2009, 2010)
- Evaluating search results, information, and sources is problematic (e.g., Kuiper, 2007; Walraven, 2008)
- Instruction in these skills is rare
STARTING POINTS
Measuring information skills is common in research, using:
- Questionnaires
- Think-aloud protocols
- Eye tracking
- Paper-and-pencil tests
These methods are labor intensive and often expensive. And what can educational practice do with them?
THOUGHTS FOR A SOLUTION
Combine measurements into one online instrument that:
- Can eventually be used by teachers
- Provides a clear picture of information skills
- Covers the whole process, with searches on the actual Internet
- Makes use of self-assessment: self-assessment means more than students just grading their own work; rather, it engages them in determining what good work in a given situation is (Boud, 1995)
DIM: DIGITAL INFORMATION SKILLS MEASUREMENT
 
HOW DO WE MEASURE….
Evaluation skills:
- evaluation button (during the task and in an explicit task)
- questionnaire (most useful site, why you used information and why you did not)
Define:
- questionnaire (Did you check with yourself whether you already had knowledge on this topic that you could use with this task?)
Search:
- logging queries and the rank in the hit list of opened sites
- evaluation button
- questionnaire (Did you use multiple search terms in a search to find information?)
A sketch of what such logging could look like follows below.
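For illustration, the query and hit-list logging above could be captured as a simple event log. The following is a minimal sketch under assumed names (`SearchEvent`, `SiteOpenEvent`, `log_search`, and `log_site_open` are all hypothetical), not the actual DIM implementation:

```python
# Hypothetical event-log structure for DIM-style search logging.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class SearchEvent:
    student_id: str
    query: str                 # search terms the student entered
    timestamp: datetime = field(default_factory=datetime.now)

@dataclass
class SiteOpenEvent:
    student_id: str
    url: str
    rank_in_hit_list: int      # position of the opened result in the hit list
    timestamp: datetime = field(default_factory=datetime.now)

event_log = []

def log_search(student_id, query):
    event_log.append(SearchEvent(student_id, query))

def log_site_open(student_id, url, rank):
    event_log.append(SiteOpenEvent(student_id, url, rank))

# Example: one query, then the student opens the third result in the hit list
log_search("s01", "influence of gaming on language")
log_site_open("s01", "https://example.org/article", rank=3)
```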
HOW DO WE MEASURE….
Scan and process information:
- logging visited websites (and the time spent on those websites)
- use of the notepad to store information
- evaluation skills
Organize and present information:
- logging the answer and the notepad
- questionnaire (Did you think about what you were going to write in your answer?)
Regulation:
- questionnaire (What would you do the same the next time you receive a task like this, and why?)
A sketch of deriving time-on-site from the visit log follows below.
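Time spent per website need not be logged directly: it can be derived from the ordered visit log. A minimal sketch, assuming visits are recorded as (timestamp, url) pairs (an assumption for illustration, not the actual DIM code):

```python
# Derive time spent per site from a chronologically ordered visit log.
from datetime import datetime

def time_per_site(visits, task_end):
    """visits: ordered (timestamp, url) pairs; time on a site is the gap
    until the next visit, or until the task ended for the last visit."""
    totals = {}
    for i, (ts, url) in enumerate(visits):
        next_ts = visits[i + 1][0] if i + 1 < len(visits) else task_end
        totals[url] = totals.get(url, 0.0) + (next_ts - ts).total_seconds()
    return totals

visits = [
    (datetime(2011, 9, 1, 10, 0, 0), "https://example.org/a"),
    (datetime(2011, 9, 1, 10, 2, 30), "https://example.org/b"),
]
print(time_per_site(visits, task_end=datetime(2011, 9, 1, 10, 5, 0)))
# {'https://example.org/a': 150.0, 'https://example.org/b': 150.0}
```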
VALIDATION
Content validity:
- Making use of a validated IPS model, previously used tasks, and scoring protocols
- Focus groups with researchers in the field
- Focus group with the target group (usability)
Construct validity:
- 84 secondary school students used the DIM; results were compared with think-aloud protocols (TAP)
RESULTS: TASK
- Average time on task: 18:09 (SD 7:34); TAP: 23:09 (SD 7:1)
- Average search time: 8:21 (SD 6:27), remaining time spent writing the answer; TAP: 21:68 (SD 7:47)
- Average number of searches: 2.19 (SD 1.82); TAP: 10.3 (SD 7.36)
- Average number of sites visited: 4.27 (SD 3.89); TAP: 13.3 (SD 6.39)
- Average number of unique sites: 3.67 (SD 2.93)
- Average use of the evaluation button: 0.68 (SD 1.33); TAP: 7.54 (SD 6.30)
! Only 24 students made comments during their search!
(A sketch of computing such descriptives from session logs follows below.)
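Means and standard deviations like those above can be computed directly from per-student session summaries extracted from the logs. A small illustrative sketch (field names and values are made up):

```python
# Compute mean and SD of session measures; data here is illustrative only.
from statistics import mean, stdev

sessions = [
    {"searches": 2, "sites_visited": 5, "evaluation_clicks": 1},
    {"searches": 1, "sites_visited": 3, "evaluation_clicks": 0},
    {"searches": 4, "sites_visited": 6, "evaluation_clicks": 2},
]

for key in ("searches", "sites_visited", "evaluation_clicks"):
    values = [s[key] for s in sessions]
    print(f"{key}: mean {mean(values):.2f} (SD {stdev(values):.2f})")
```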
RESULTS: TASK -> EVALUATIONS DURING SEARCH
- Coded with a previously developed and validated coding scheme.
- Problem: no visual image of the search and evaluation process, which makes evaluations more difficult to interpret.
- Some evaluations were unclear: 'Good website with a nice introduction.'
- Some evaluations were not evaluations at all: 'It could also have a good influence on their language, because when you don’t know a word you look it up in a dictionary.'
RESULTS: TASK -> EVALUATIONS DURING SEARCH
Evaluations were mostly based on:

DIM:                  TAP:
Title/summary         Title/summary
Site known to user    Connection to task
Appearance            Language
Kind of information   Kind of information
Connection to task    Amount
RESULTS: EXPLICIT EVALUATIONS
What do students comment on?

DIM:                    TAP1:                 TAP2:
Appearance of the site  Connection to task    Connection
Connection to task      Language              More sites
Language                Kind of information   Appearance
Kind of information     Amount                Language
Amount of information   Speed                 Amount/Author/Kind/Reputation

Many comments merely mention aspects without an actual evaluation. This is comparable to our think-aloud protocols (316 utterances were scored as undefined evaluations) and to other research. (A tallying sketch follows below.)
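Once comments have been coded, the frequency of each criterion can be tallied automatically. An illustrative sketch (the coded data here is invented; the labels follow the tables above):

```python
# Tally coded evaluation comments per criterion; data is invented.
from collections import Counter

coded_comments = [
    ("appearance", "nice layout"),
    ("connection to task", "fits my question"),
    ("connection to task", "exactly about gaming"),
    ("undefined", "it has information"),  # mentions an aspect, no judgement
]

counts = Counter(label for label, _ in coded_comments)
for criterion, n in counts.most_common():
    print(f"{criterion}: {n}")
```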
CONCLUSION
The instrument gives insight into how students search for and evaluate information. To further validate the instrument, the following steps will be taken:
- Score the answers on the task
- Score explicit criteria against a model answer
- Analyse the questionnaire
- Collect data with respondents we expect to score higher, to see whether the instrument is sensitive enough
- Perform a study with an experimental group, using prompts
Dr. Amber Walraven
Universiteit Twente
Faculteit Gedragswetenschappen
Vakgroep Curriculumontwerp & Onderwijsinnovatie
[email_address]
http://amberwalraven.edublogs.org
Twitter: @amberwalraven

ECER 2011
