Students On Line: Digital Technologies and Performance, by OECD (June 2011)

    • PISA 2009 Results: Students On Line: Digital Technologies and Performance (Volume VI)
    • This work is published on the responsibility of the Secretary-General of the OECD. The opinions expressed and arguments employed herein do not necessarily reflect the official views of the Organisation or of the governments of its member countries.

Please cite this publication as: OECD (2011), PISA 2009 Results: Students on Line: Digital Technologies and Performance (Volume VI), http://dx.doi.org/10.1787/9789264112995-en

ISBN 978-92-64-11291-9 (print)
ISBN 978-92-64-11299-5 (PDF)

The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of such data by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank under the terms of international law.

Photo credits:
Getty Images © Ariel Skelley
Getty Images © Geostock
Getty Images © Jack Hollingsworth
Stocklib Image Bank © Yuri Arcurs

Corrigenda to OECD publications may be found on line at: www.oecd.org/publishing/corrigenda.

PISA™, OECD/PISA™ and the PISA logo are trademarks of the Organisation for Economic Co-operation and Development (OECD). All use of OECD trademarks is prohibited without written permission from the OECD.

© OECD 2011

You can copy, download or print OECD content for your own use, and you can include excerpts from OECD publications, databases and multimedia products in your own documents, presentations, blogs, websites and teaching materials, provided that suitable acknowledgment of OECD as source and copyright owner is given. All requests for public or commercial use and translation rights should be submitted to rights@oecd.org. Requests for permission to photocopy portions of this material for public or commercial use shall be addressed directly to the Copyright Clearance Center (CCC) at info@copyright.com or the Centre français d’exploitation du droit de copie (CFC) at contact@cfcopies.com.
    • Foreword

One of the ultimate goals of policy makers is to enable citizens to take advantage of a globalised world economy. This is leading them to focus on the improvement of education policies, ensuring the quality and sustainability of service provision, a more equitable distribution of learning opportunities and stronger incentives for greater efficiency in schooling.

Such policies all hinge on reliable information on how well education systems prepare students for life. Most countries monitor students’ learning and the performance of schools. But in a global economy, the yardstick for success is no longer improvement by national standards alone, but how education systems perform internationally. The OECD has taken up that challenge by developing PISA, the Programme for International Student Assessment, which evaluates the quality, equity and efficiency of school systems in some 70 countries that, together, make up nine-tenths of the world economy. PISA represents a commitment by governments to monitor the outcomes of education systems regularly within an internationally agreed framework, and it provides a basis for international collaboration in defining and implementing educational policies.

The results from the PISA 2009 assessment reveal wide differences in education outcomes, both within and across countries. The education systems that have been able to secure strong and equitable learning outcomes, and to mobilise rapid improvements, show others what is possible to achieve. Naturally, GDP per capita influences educational success, but it explains only 6% of the differences in average student performance; the other 94% reflect the potential for public policy to make a difference. The stunning success of Shanghai-China, which tops every league table in this assessment by a clear margin, shows what can be achieved with moderate economic resources and in a diverse social context.
In mathematics, more than a quarter of Shanghai’s 15-year-olds can conceptualise, generalise, and creatively use information based on their own investigations and modelling of complex problem situations. They can apply insight and understanding and develop new approaches and strategies for addressing novel situations. In the OECD area, just 3% of students reach that level of performance.

While better educational outcomes are a strong predictor of economic growth, wealth and spending on education alone are no guarantee of better educational outcomes. Overall, PISA shows that an image of a world divided neatly into rich and well-educated countries and poor and badly-educated countries is out of date.

This finding represents both a warning and an opportunity. It is a warning to advanced economies that they cannot take for granted that they will forever have “human capital” superior to that in other parts of the world. At a time of intensified global competition, they will need to work hard to maintain a knowledge and skill base that keeps up with changing demands.

PISA underlines, in particular, the need for many advanced countries to tackle educational underperformance so that as many members of their future workforces as possible are equipped with at least the baseline competencies and skills that enable them to participate in social and economic development. Otherwise, the high social and economic cost of poor educational performance in advanced economies risks becoming a significant drag on economic development. At the same time, the findings show that poor skills are not an inevitable consequence of low national income – an important outcome for countries that need to achieve more with less.

But PISA also shows that there is no reason for despair. Countries from a variety of starting points have shown the potential to raise the quality of educational outcomes substantially.
Korea’s average performance was already high in 2000, but Korean policy makers were concerned that only a narrow elite achieved levels of excellence in PISA. Within less than a decade, Korea was able to double the share of students demonstrating excellence in reading literacy.

PISA 2009 Results: Students On Line – Volume VI © OECD 2011 3
    • A major overhaul of Poland’s school system helped to dramatically reduce performance variability among schools, reduce the share of poorly performing students and raise overall performance by the equivalent of more than half a school year. Germany was jolted into action when PISA 2000 revealed below-average performance and large social disparities in results, and has been able to make progress on both fronts. Israel, Italy and Portugal have moved closer to the OECD average, and Brazil, Chile, Mexico and Turkey are among the countries with impressive gains from very low levels of performance.

But the greatest value of PISA lies in inspiring national efforts to help students to learn better, teachers to teach better, and school systems to become more effective. A closer look at high-performing and rapidly improving education systems shows that these have much in common that transcends differences in their history, culture and economic evolution.

First, while most nations declare their commitment to education, the test comes when these commitments are weighed against others. How do they reward teachers compared to the way they pay other highly skilled workers? How are education credentials weighed against other qualifications when people are being considered for jobs? Would you want your child to be a teacher? How much attention do the media pay to schools and schooling? Which matters more: a community’s standing in the sports leagues or its standing in the student academic achievement league tables? Are parents more likely to encourage their children to study longer and harder, or to want them to spend more time with their friends or playing sports? In the most successful education systems, the political and social leaders have persuaded their citizens to make the choices needed to show that they value education more than other things.
But placing a high value on education will get a country only so far if the teachers, parents and citizens of that country believe that only some subset of the nation’s children can or need to achieve world-class standards. This report shows clearly that education systems built around the belief that students have different pre-ordained professional destinies, to be met with different expectations in different school types, tend to be fraught with large social disparities. In contrast, the best-performing education systems embrace the diversity in students’ capacities, interests and social background with individualised approaches to learning.

Second, high-performing education systems stand out with clear and ambitious standards that are shared across the system, focus on the acquisition of complex, higher-order thinking skills, and are aligned with high-stakes gateways and instructional systems. In these education systems, everyone knows what is required to get a given qualification, in terms both of the content studied and the level of performance that has to be demonstrated to earn it. Students cannot go on to the next stage of their life – be it work or further education – unless they show that they are qualified to do so. They know what they have to do to realise their dream and they put in the work that is needed to achieve it.

Third, the quality of an education system cannot exceed the quality of its teachers and principals, since student learning is ultimately the result of what goes on in classrooms.
Corporations, professional partnerships and national governments all know that they have to pay attention to how the pool from which they recruit is established; how they recruit; the kind of initial training their recruits get before they present themselves for employment; how they mentor new recruits and induct them into their service; what kind of continuing education they get; how their compensation is structured; how they reward their best performers and how they improve the performance of those who are struggling; and how they provide opportunities for the best performers to acquire more status and responsibility. Many of the world’s best-performing education systems have moved from bureaucratic “command and control” environments towards school systems in which the people at the frontline have much more control of the way resources are used, people are deployed, the work is organised and the way in which the work gets done. They provide considerable discretion to school heads and school faculties in determining how resources are allocated, a factor which the report shows to be closely related to school performance when combined with effective accountability systems. And they provide an environment in which teachers work together to frame what they believe to be good practice, conduct field-based research to confirm or disprove the approaches they develop, and then assess their colleagues by the degree to which they use practices proven effective in their classrooms.

Last but not least, perhaps the most impressive outcome of world-class education systems is that they deliver high-quality learning consistently across the entire education system, such that every student benefits from excellent learning opportunities.
To achieve this, they invest educational resources where they can make the greatest difference, they attract the most talented teachers into the most challenging classrooms, and they establish effective spending choices that prioritise the quality of teachers.
    • These are, of course, not independently conceived and executed policies. They need to be aligned across all aspects of the system, they need to be coherent over sustained periods of time, and they need to be consistently implemented. The path of reform can be fraught with political and practical obstacles. Moving away from administrative and bureaucratic control toward professional norms of control can be counterproductive if a nation does not yet have teachers and schools with the capacity to implement these policies and practices. Pushing authority down to lower levels can be just as problematic if there is no agreement on what students need to know and should be able to do. Recruiting high-quality teachers is not of much use if those who are recruited are so frustrated by what they perceive to be a mindless system of initial teacher education that they will not participate in it and turn to another profession. Thus a country’s success in making these transitions depends greatly on the degree to which it succeeds in creating and executing plans that, at any given time, produce the maximum coherence in the system.

These are daunting challenges, and devising effective education policies will become ever more difficult as schools need to prepare students to deal with more rapid change than ever before, for jobs that have not yet been created, to use technologies that have not yet been invented and to solve economic and social challenges that we do not yet know will arise. But those school systems that do well today, as well as those that have shown rapid improvement, demonstrate that it can be done. The world is indifferent to tradition and past reputations, unforgiving of frailty and complacency and ignorant of custom or practice. Success will go to those individuals and countries that are swift to adapt, slow to complain and open to change. The task of governments will be to ensure that countries rise to this challenge.
The OECD will continue to support their efforts.

***

The report is the product of a collaborative effort between the countries participating in PISA, the experts and institutions working within the framework of the PISA Consortium, and the OECD Secretariat. This volume of the report was drafted by a team led by Juliette Mendelovits, with guidance from the PISA Reading Expert Group and the OECD PISA team, led by Andreas Schleicher. Contributing authors were Alla Berezner, John Cresswell, Miyako Ikeda, Irwin Kirsch, Dominique Lafontaine, Tom Lumley, Christian Monseur, Johannes Naumann, Soojin Park and Jean-François Rouet. Editorial and analytical support were provided by Francesca Borgonovi, Michael Davidson, Maciej Jakubowski, Guillermo Montt, Oscar Valiente, Sophie Vayssettes, Elisabeth Villoutreix and Pablo Zoido of the OECD PISA team. Further advice was provided by Marilyn Achiron, Simone Bloem, Marika Boiron, Simon Breakspear, Henry Braun, Nihad Bunar, Jude Cosgrove, Aletta Grisay, Tim Heemsoth, Donald Hirsch, David Kaplan, Henry Levin, Barry McCrae, Dara Ramalingam, Wolfgang Schnotz, Eduardo Vidal-Abarca and Allan Wigfield. Administrative support was provided by Juliet Evans and Diana Tramontano.

The PISA assessment instruments and the data underlying the report were prepared by the PISA Consortium, under the direction of Raymond Adams at the Australian Council for Educational Research (ACER) and Henk Moelands from the Dutch National Institute for Educational Measurement (CITO). The expert group that guided the preparation of the reading assessment framework and instruments was chaired by Irwin Kirsch.

The development of the report was steered by the PISA Governing Board, which is chaired by Lorna Bertrand (United Kingdom), with Beno Csapo (Hungary), Daniel McGrath (United States) and Ryo Watanabe (Japan) as vice-chairs.
Annex C of the volumes lists the members of the various PISA bodies, as well as the individual experts and consultants who have contributed to this report and to PISA in general. Intel Corporation provided a generous financial contribution towards publishing this volume.

Angel Gurría
OECD Secretary-General
    • Table of Contents

Executive Summary .......... 19
Introduction to PISA .......... 23
Reader’s Guide .......... 29

Chapter 1: Context of the PISA Digital Reading Assessment .......... 31
New technologies for text, new ways of reading .......... 32
  • Differences in the readability and usability of text .......... 33
  • New features of digital texts .......... 34
Impact of digital texts on reading literacy .......... 36
  • Which aspects of reading are affected by digital text? .......... 36
Some issues for assessing digital reading .......... 37
Conclusions .......... 38

Chapter 2: Student Performance in Digital and Print Reading .......... 39
Digital reading .......... 40
  • Texts .......... 40
  • Cognitive processes .......... 42
  • Situation .......... 44
How the PISA 2009 reading results are reported .......... 44
  • How the PISA 2009 digital reading tests were designed, analysed and scaled .......... 44
What students can do in digital reading .......... 49
  • Students reaching the different levels of proficiency on the digital reading scale .......... 49
  • Average level of proficiency .......... 51
  • Gender differences in performance on the digital reading scale .......... 52
Examples of digital reading items from the PISA 2009 assessment .......... 54
  • IWANTTOHELP .......... 54
  • SMELL .......... 60
  • JOB SEARCH .......... 66
Similarities and differences between digital and print reading assessment .......... 71
  • Framework characteristics and test construct .......... 71
  • Test design and operational characteristics .......... 73
A comparison of performance in digital and print reading .......... 74
  • Students reaching the different levels of proficiency .......... 74
  • Average level of proficiency .......... 76
  • Gender differences in performance on the digital and print reading scales .......... 78
A composite scale for digital and print reading .......... 80
  • Students reaching the different levels of proficiency on the composite reading scale .......... 82
  • Average level of proficiency .......... 83
  • Gender differences in performance on the composite reading scale .......... 85
Conclusions .......... 86
Chapter 3: Navigation in the PISA 2009 Digital Reading Assessment .......... 89
General patterns in the relationship between navigation and performance in digital and print reading .......... 90
  • Relevance of pages .......... 91
  • Indicators used to describe navigation .......... 91
  • Distribution of navigation indices at the country level .......... 93
  • Relationships among navigation, print and digital reading .......... 97
  • Correlations between navigation and performance .......... 97
  • Regression of digital reading performance on print reading and navigation .......... 98
  • Non-linear effects of navigation on digital reading performance .......... 100
Case studies: navigation behaviour of students in selected digital reading tasks .......... 102
  • Tasks analysed in the case studies .......... 103
  • IWANTTOHELP .......... 105
  • SMELL .......... 113
  • JOB SEARCH .......... 117
Conclusions .......... 120

Chapter 4: Relationships between Digital Reading Performance and Student Background, Engagement and Reading Strategies .......... 123
Family background .......... 124
  • Socio-economic background .......... 124
  • Immigrant status .......... 127
  • Languages spoken at home .......... 127
  • Performance differences within and between schools .......... 128
Student engagement and attitudes .......... 128
  • Engagement in reading and digital reading proficiency .......... 129
  • Do students who enjoy reading read better on line? .......... 131
  • The association between the diversity of print material students read and digital reading proficiency .......... 132
  • Online reading practices .......... 133
  • Gender differences in online reading practices .......... 134
  • Online reading practices and digital reading proficiency .......... 135
Reading strategies .......... 138
  • Awareness of strategies to understand and remember information .......... 138
  • Awareness of effective strategies to summarise information .......... 138
Model for the relationship between reading performance and student background characteristics .......... 139
  • Parents’ occupation .......... 139
  • Parents’ education .......... 139
  • Number of books in the home .......... 139
  • Cultural possessions .......... 139
  • Home educational resources .......... 139
Conclusions .......... 140

Chapter 5: Students’ Familiarity with Information and Communication Technologies .......... 143
Students’ access to ICT .......... 144
  • The number of students who have never used a computer .......... 144
  • Students’ access to a computer and the Internet at home .......... 146
  • Students’ access to computers and the Internet at school .......... 150
How students use technology at school and at home .......... 157
  • Students’ use of ICT at home .......... 157
  • Students’ use of ICT at school .......... 162
Students’ attitudes towards and self-confidence in using computers .......... 167
  • Students’ attitudes towards using computers .......... 167
  • Students’ confidence in computer use and technical proficiency .......... 170
Conclusions .......... 175
Chapter 6  Students' use of information and communication technologies and their performance in digital reading ..... 177
Access to and use of computers and performance ..... 178
• Access to and use of computers at home ..... 178
• Computer access and use at school ..... 179
Different types of computer use and performance ..... 180
• Use of computers at home and performance ..... 180
• Use of computers at school and performance ..... 185
Relationship between selected computer activities and performance in digital reading, in detail ..... 189
• Computer use at home ..... 189
• Computer use at school ..... 190
• Navigation and computer use at home and at school ..... 191
Students' self-confidence in doing ICT tasks ..... 195
• Students' self-confidence in using computers and performance ..... 195
• Students' self-confidence in doing ICT tasks and activities ..... 197
Conclusions ..... 197

Chapter 7  Some aspects related to digital reading proficiency ..... 201
Variation in student reading performance ..... 203
Socio-economic aspects ..... 203
• Student socio-economic background ..... 203
• Mean school socio-economic background ..... 204
Attitudes towards reading ..... 204
• Enjoyment of reading ..... 204
• Diversity of reading materials ..... 205
Use of computers ..... 205
• Computer use at home ..... 205
• Computer use at school ..... 205
Online reading practices ..... 205
• Searching-information activities ..... 206
• Social activities ..... 206
Learning strategies ..... 206
• Awareness of strategies to understand and remember information ..... 206
• Awareness of effective strategies to summarise information ..... 206
Gender ..... 206
Variation explained by the model ..... 207
Conclusions ..... 207

Policy implications ..... 209
Helping students develop effective skills in reading digital texts ..... 209
Addressing underperformance of boys ..... 210
Improving access to ICT ..... 210
Enabling effective use of ICT in schools ..... 210

References ..... 213

Annex A  Technical background ..... 217
Annex A1a: Construction of digital reading scales and indices from the student, school and ICT questionnaires ..... 218
Annex A1b: Construction of navigation indices ..... 228
Annex A2: The PISA target population, the PISA samples and the definition of schools ..... 233
Annex A3: Standard errors, significance tests and sub-group comparisons ..... 247
Annex A4: Quality assurance for the digital reading assessment ..... 249
Annex A5: Development of the PISA assessment instruments for print and digital reading ..... 251
Annex A6: Tables showing the relationships between ICT activities and performance in print reading, mathematics and science ..... 254

Annex B  Tables of results ..... 255
Annex B1: Results for countries and economies ..... 256
Annex B2: Results for regions within countries ..... 385

Annex C  The development and implementation of PISA – a collaborative effort ..... 389

This book has... StatLinks: a service that delivers Excel® files from the printed page. Look for the StatLinks at the bottom left-hand corner of the tables or graphs in this book. To download the matching Excel® spreadsheet, just type the link into your Internet browser, starting with the http://dx.doi.org prefix. If you're reading the PDF e-book edition, and your PC is connected to the Internet, simply click on the link. You'll find StatLinks appearing in more OECD books.
Boxes
Box VI.A  Key features of PISA 2009 ..... 26
Box VI.3.1  Example of navigation indices ..... 91
Box VI.3.2  How the findings are organised ..... 92
Box VI.4.1  A cycle of engagement in reading activities, reading strategies and reading performance ..... 129
Box VI.4.2  The association between reading engagement, awareness of reading strategies and reading performance ..... 130
Box VI.4.3  Interpreting PISA indices ..... 130
Box VI.4.4  Relationship between online reading, print reading and enjoyment of reading ..... 137
Box VI.5.1  How information on students' familiarity with ICT was collected ..... 145
Box VI.5.2  Indices to analyse frequency of ICT use ..... 157
Box VI.6.1  Labels for each group of students: Students' use of computers ..... 180
Box VI.6.2  Relationship between ICT activities and performance in print reading, mathematics and science ..... 192
Box VI.6.3  Labels for each group of students: Students' self-confidence in using computers ..... 195

Figures
Figure VI.A  A map of PISA countries and economies ..... 27
Figure VI.1.1  Comparison of print and digital texts ..... 33
Figure VI.2.1  Digital reading tasks by environment ..... 40
Figure VI.2.2  Digital reading tasks by text format ..... 41
Figure VI.2.3  Digital reading tasks by text type ..... 41
Figure VI.2.4  Digital reading tasks by aspect ..... 42
Figure VI.2.5  Relationship between text processing and navigation in digital reading tasks ..... 43
Figure VI.2.6  Digital reading tasks by situation ..... 44
Figure VI.2.7  Relationship between questions and students on a proficiency scale ..... 45
Figure VI.2.8  Summary descriptions for four levels of proficiency in digital reading ..... 46
Figure VI.2.9  Map of selected digital reading questions in PISA 2009, illustrating the proficiency levels ..... 48
Figure VI.2.10  How proficient are students in digital reading? ..... 49
Figure VI.2.11  Comparing countries' performance in digital reading ..... 51
Figure VI.2.12  Where countries rank in digital reading performance ..... 52
Figure VI.2.13  Gender differences in digital reading performance ..... 53
Figure VI.2.14  How proficient are girls and boys in digital reading? ..... 53
Figure VI.2.15  Distribution of score points in digital and print reading assessments, by text format ..... 71
Figure VI.2.16  Distribution of score points in digital and print reading assessments, by text type ..... 72
Figure VI.2.17  Distribution of score points in digital and print reading assessments, by aspect ..... 73
Figure VI.2.18  Similarities and differences between digital and print reading assessments in PISA 2009 ..... 73
Figure VI.2.19  A comparison of performance levels on the digital and print reading scales ..... 75
Figure VI.2.20  Percentage of students at each proficiency level on the digital and print reading scales ..... 76
Figure VI.2.21  Comparison of mean performance in digital and print reading ..... 77
Figure VI.2.22  Where countries rank in digital and print reading performance ..... 78
Figure VI.2.23  Comparison of gender gaps in digital and print reading ..... 79
Figure VI.2.24  Alignment between the described levels for digital and print reading and composite reading ..... 80
Figure VI.2.25  Summary descriptions for the composite reading scale (digital and print combined) ..... 81
Figure VI.2.26  How proficient are students on the composite reading scale? ..... 82
Figure VI.2.27  Comparing countries' performance on the composite reading scale ..... 84
Figure VI.2.28  Where countries rank on the composite reading scale ..... 84
Figure VI.2.29  Gender differences on the composite reading scale ..... 85
Figure VI.2.30  How proficient are girls and boys on the composite reading scale? ..... 86
Figure VI.3.1  Illustration of the relationship between number of relevant pages visited and digital reading performance ..... 93
Figure VI.3.2  Distribution of the number of pages and visits, aggregated across OECD countries ..... 94
Figure VI.3.3  Relationship between the number of relevant pages visited and digital reading performance ..... 94
Figure VI.3.4  Relationship between the number of visits to relevant pages and digital reading performance ..... 95
Figure VI.3.5  Relationship between the number of page visits and digital reading performance ..... 95
Figure VI.3.6  Relationship between standard deviation and mean of the number of relevant pages visited ..... 96
Figure VI.3.7  Relationship between standard deviation of the number of relevant pages visited and digital reading performance ..... 97
Figure VI.3.8  Relationship between the number of visits to relevant pages (centred) and digital reading performance, OECD average ..... 101
Figure VI.3.9  Summary of characteristics of digital reading tasks analysed in this section ..... 104
Figure VI.3.10  Relevant pages for IWANTTOHELP – Question 4 ..... 109
Figure VI.3.11  Extremes of student behaviour for IWANTTOHELP – Question 4 ..... 112
Figure VI.4.1  Strength of socio-economic gradient and reading performance ..... 126
Figure VI.4.2  Student performance in digital reading and immigrant status ..... 127
Figure VI.4.3  Variation in performance in digital and print reading explained by students' and schools' socio-economic backgrounds ..... 128
Figure VI.4.4  Relationship between enjoyment of reading and digital reading performance ..... 131
Figure VI.4.5  Relationship between diversity of reading and digital reading performance ..... 133
Figure VI.4.6  Index of online searching-information activities, by gender ..... 134
Figure VI.4.7  Index of online social activities, by gender ..... 135
Figure VI.4.8  Relationship between online searching-information activities and digital reading performance ..... 136
Figure VI.4.9  Relationship between online social activities and digital reading performance ..... 136
Figure VI.4.10  Single-level model to explain performance in digital and print reading, OECD average-16 ..... 139
Figure VI.5.1  Percentage of students who reported that they have never used a computer, by socio-economic background ..... 145
Figure VI.5.2  Percentage of students who reported having a computer at home in PISA 2000 and 2009 ..... 146
Figure VI.5.3  Percentage of students who reported having a computer at home, by socio-economic background ..... 146
Figure VI.5.4  Change in the percentage of students who reported having a computer at home between 2000 and 2009, by socio-economic background ..... 147
Figure VI.5.5  Percentage of students who reported having access to the Internet at home in 2000 and 2009 ..... 149
Figure VI.5.6  Percentage of students who reported having access to the Internet at home, by socio-economic background ..... 149
Figure VI.5.7  Change in the percentage of students who reported having access to the Internet at home between 2000 and 2009, by socio-economic background ..... 150
Figure VI.5.8  Computers-per-student ratio in 2000 and 2009 ..... 151
Figure VI.5.9  Percentage of students with access to computers at school ..... 152
Figure VI.5.10  Percentage of students with access to the Internet at school ..... 152
Figure VI.5.11  Percentage of students who reported using a computer at home and at school ..... 153
Figure VI.5.12  Percentage of students who reported using a computer at home and at school, by socio-economic background ..... 154
Figure VI.5.13  Percentage of students who reported using the Internet at home and at school ..... 155
Figure VI.5.14  Percentage of students in schools where the principal reported shortage or inadequacy of computers for instruction, by socio-economic background ..... 156
Figure VI.5.15  Percentage of students who reported that they did the following activities at home for leisure at least once a week, OECD average-28 ..... 158
Figure VI.5.16  Index of computer use at home for leisure, by gender and socio-economic background ..... 159
Figure VI.5.17  Percentage of students who reported that they did the following activities at home for schoolwork at least once a week, OECD average-29 ..... 160
Figure VI.5.18  Index of computer use at home for schoolwork-related tasks, by gender and socio-economic background ..... 161
Figure VI.5.19  Percentage of students who reported that they did the following activities at school at least once a week, OECD average-29 ..... 163
Figure VI.5.20  Index of computer use at school, by gender and socio-economic background ..... 164
Figure VI.5.21  Percentage of students who reported that they use a computer during regular classroom lessons at least some time during a typical week, OECD average-29 ..... 165
Figure VI.5.22  Intensity of computer use during language-of-instruction lessons ..... 166
Table of Contents

Figure VI.5.23  Percentage of students who reported using laptops at school ...... 167
Figure VI.5.24  Percentage of students who reported positive attitudes towards computers, OECD average-28 ...... 168
Figure VI.5.25  Index of attitudes towards computers, by gender and socio-economic background ...... 169
Figure VI.5.26  Percentage of students who reported being able to do each of the following tasks very well by themselves or with help from someone, OECD average-29 ...... 170
Figure VI.5.27  Index of self-confidence in ICT high-level tasks, by gender and socio-economic background ...... 171
Figure VI.5.28  Percentage of students who reported being able to create a multi-media presentation ...... 172
Figure VI.5.29  Percentage of students who reported being able to use a spreadsheet to plot a graph ...... 173
Figure VI.5.30  Percentage of students who reported being able to do the following tasks very well by themselves or with help from someone, in 2003 and 2009 ...... 174
Figure VI.6.1  Difference in digital reading scores between students who use a computer at home and those who do not ...... 178
Figure VI.6.2  Difference in digital reading scores between students who use a computer at school and those who do not ...... 180
Figure VI.6.3  Computer use at home for leisure, and digital reading performance, OECD average-15 ...... 181
Figure VI.6.4  Index of computer use at home for leisure, and digital reading performance, by gender, OECD average-15 ...... 182
Figure VI.6.5a  Index of computer use at home for leisure, and digital reading performance, by socio-economic background (Japan) ...... 183
Figure VI.6.5b  Index of computer use at home for leisure, and digital reading performance, by socio-economic background (Chile) ...... 183
Figure VI.6.6  Computer use at home for schoolwork, and digital reading performance, OECD average-15 ...... 184
Figure VI.6.7  Computer use at school and digital reading performance, OECD average-15 ...... 186
Figure VI.6.8  Index of computer use at school, and digital reading performance, by gender, OECD average-15 ...... 186
Figure VI.6.9  Index of computer use at school, and digital reading performance, by socio-economic background (Japan) ...... 187
Figure VI.6.10a  Intensity of computer use in school lessons, and digital reading performance, OECD average-15 ...... 188
Figure VI.6.10b  Prevalence of computer use in school lessons, and difference in digital reading performance according to intensity of computer use in school lessons ...... 188
Figure VI.6.11  Frequency of computer use at home for leisure and schoolwork, and digital reading performance before and after accounting for print reading performance, OECD average-15 ...... 190
Figure VI.6.12  Frequency of computer use at school, and digital reading performance before and after accounting for print reading performance, OECD average-15 ...... 191
Figure VI.6.A  Index of computer use at home for leisure, and performance in print reading, digital reading, mathematics and science, OECD average-15 ...... 192
Figure VI.6.B  Index of computer use at home for leisure, and performance in print reading, mathematics and science, by gender, OECD average-15 ...... 193
Figure VI.6.13a  Index of the number of relevant pages visited, by frequency of computer use at home for leisure, OECD average-15 ...... 194
Figure VI.6.13b  Index of the number of relevant pages visited, by frequency of computer use at home for schoolwork and computer use at school, OECD average-15 ...... 194
Figure VI.6.14  Self-confidence in ICT high-level tasks, and digital reading performance, OECD average-15 ...... 196
Figure VI.6.15  Index of self-confidence in ICT high-level tasks, and digital reading performance, by gender, OECD average-15 ...... 196
Figure VI.6.16  Frequency of computer use at home and school, and index of self-confidence in high-level ICT tasks, OECD average-15 ...... 198
Figure VI.7.1  Illustration of the relationship between students’ socio-economic background and student performance ...... 202
Figure VI.7.2  Score point differences in digital reading associated with variables in the multilevel regression models, OECD average-15 ...... 204
Figure A1a.1  Differences between students who participated in the digital reading assessment and all students, for print and digital reading ...... 220
Figure VI.A3.1  Labels used in a two-way table ...... 247

TABLES

Table VI.A  An overview of performance in digital reading, navigation and computer use ...... 21
Table A1a.1  Performance in digital and print reading for the group of students who participated in the digital reading assessment and all students ...... 219
Table A1a.2  Student socio-economic background (ESCS) for the group of students who participated in the digital reading assessment and all students ...... 221
Table A1a.3  Levels of parental education converted into years of schooling ...... 224
Table A1a.4  Rotated component pattern ...... 225
Table A1b.1  Correlations of navigation indices (standardised per test) with digital reading scores (WLEs), by country ...... 229
Table A1b.2  Correlations of navigation indices (standardised per test) with print reading scores (WLEs), by country ...... 229
Table A1b.3  Regression of digital reading scores (WLEs) on print reading scores (WLEs) and the number of relevant pages visited (standardised per test) ...... 230
Table A1b.4  Regression of digital reading scores (WLEs) on print reading scores (WLEs) and the number of visits to relevant pages (standardised per test) ...... 230
Table A1b.5  Regression of digital reading scores (WLEs) on print reading scores (WLEs) and the number of page visits (standardised per test) ...... 231
Table A1b.6  Regression of digital reading scores (WLEs) on print reading scores (WLEs) and the number of page visits (standardised per test) including a quadratic trend for the number of page visits ...... 231
Table A1b.7  Regression of digital reading scores (WLEs) on print reading scores (WLEs) and the number of visits to relevant pages (standardised per test) including a quadratic trend for the number of relevant page visits ...... 232
Table A1b.8  Regression of digital reading scores (WLEs) on print reading scores (WLEs) and the number of relevant pages visited (standardised per test) including a quadratic trend for the number of relevant pages visited ...... 232
Table A2.1  PISA target populations and samples (paper-based assessment) ...... 235
Table A2.2  Exclusions (paper-based assessment) ...... 237
Table A2.3  Response rates (paper-based assessment) ...... 239
Table A2.4a  Percentage of students at each grade level ...... 242
Table A2.4b  Percentage of students at each grade level, by gender ...... 243
Table A2.5  Student response rates (digital reading assessment) ...... 245
Table A2.6  School response rates (digital reading assessment) ...... 246
Table A5.1  Distribution of items by the dimensions of the PISA framework for the assessment of print reading ...... 252
Table A5.2  Distribution of items by the dimensions of the PISA framework for the assessment of digital reading ...... 252
Table VI.2.1  Percentage of students at each proficiency level on the digital, print and composite reading scales ...... 256
Table VI.2.2  Percentage of boys at each proficiency level on the digital, print and composite reading scales ...... 257
Table VI.2.3  Percentage of girls at each proficiency level on the digital, print and composite reading scales ...... 258
Table VI.2.4  Mean score, variation and gender differences in student performance on the digital, print and composite reading scales ...... 259
Table VI.3.1  Descriptive statistics for the number of relevant pages visited, the number of visits to relevant pages and the number of page visits ...... 260
Table VI.3.2  Correlations of navigation indices with digital reading scores (WLEs) ...... 261
Table VI.3.3  Correlations of navigation indices with print reading scores (WLEs) ...... 261
Table VI.3.4  Regression of digital reading scores (WLEs) on print reading scores (WLEs) and the number of relevant pages visited ...... 262
Table VI.3.5  Regression of digital reading scores (WLEs) on print reading scores (WLEs) and the number of visits to relevant pages ...... 262
Table VI.3.6  Regression of digital reading scores (WLEs) on print reading scores (WLEs) and the number of page visits ...... 263
Table VI.3.7  Regression of digital reading scores (WLEs) on print reading scores (WLEs) and the number of page visits including a quadratic trend for the number of page visits ...... 263
Table VI.3.8  Regression of digital reading scores (WLEs) on print reading scores (WLEs) and the number of visits to relevant pages including a quadratic trend for the number of visits to relevant pages ...... 264
Table VI.3.9  Regression of digital reading scores (WLEs) on print reading scores (WLEs) and the number of relevant pages visited including a quadratic trend for the number of relevant pages visited ...... 264
Table VI.3.10  IWANTTOHELP Question 1. Summary of student performance ...... 264
Table VI.3.11  IWANTTOHELP Question 1. Number of pages visited ...... 265
Table VI.3.12  IWANTTOHELP Question 1. Students with full credit: digital reading performance, by number of pages visited ...... 265
Table VI.3.13  IWANTTOHELP Question 1. Students with full credit: reading performance, by number of page visits ...... 265
Table VI.3.14  IWANTTOHELP Question 2. Digital reading performance, by visits to P25 ...... 265
Table VI.3.15  IWANTTOHELP Question 4. Summary of student performance ...... 266
Table VI.3.16  IWANTTOHELP Question 4. Time on task ...... 266
Table VI.3.17  IWANTTOHELP Question 4. Relationship of page visits to digital reading performance ...... 266
Table VI.3.18  IWANTTOHELP Question 4. Variations in time spent on the task and pages visited ...... 266
Table VI.3.19  IWANTTOHELP Question 4. Student performance according to initial navigation sequences ...... 267
Table VI.3.20  IWANTTOHELP Question 4. Number of page visits for students obtaining no credit ...... 267
Table VI.3.21  SMELL Question 1. Summary of student performance ...... 267
Table VI.3.22  SMELL Question 1. Digital reading performance and time spent on P02 ...... 267
Table VI.3.23  SMELL Question 1. Students with full credit: number of page visits and time spent on P02 ...... 268
Table VI.3.24  SMELL Question 1. Digital reading performance, by number of visits to P02 ...... 268
Table VI.3.25  SMELL Question 3. Summary of student performance ...... 268
Table VI.3.26  SMELL Question 3. Visits to pages with information relevant to SMELL tasks ...... 268
Table VI.3.27  SMELL Question 3. Digital reading performance, by visits to relevant pages ...... 269
Table VI.3.28  JOB SEARCH Question 2. Summary of student performance ...... 269
Table VI.3.29  JOB SEARCH Question 2. Differences in digital and print reading performance ...... 269
Table VI.3.30  JOB SEARCH Question 2. Digital reading performance, by navigation sequence ...... 269
Table VI.3.31  JOB SEARCH Question 2. Students with full credit: digital reading performance, by number of visits to P03 ...... 270
Table VI.3.32  JOB SEARCH Question 2. Digital reading performance of students who did and did not visit P03 ...... 270
Table VI.3.33  JOB SEARCH Question 2. Digital reading performance, by number of irrelevant page visits ...... 270
Table VI.4.1  Performance groups in reading and socio-economic background ...... 271
Table VI.4.2  PISA index of economic, social and cultural status and reading performance, by national quarters of this index ...... 273
Table VI.4.3  Relationship between students’ reading performance and the index of economic, social and cultural status (ESCS) ...... 275
Table VI.4.4  Percentage of students, reading performance and difference in the index of economic, social and cultural status (ESCS), by students’ immigrant background ...... 277
Table VI.4.5  Percentage of students and reading performance, by language spoken at home ...... 279
Table VI.4.6  Decomposition of the gradient of the index of economic, social and cultural status (ESCS) into between-school and within-school components ...... 280
Table VI.4.7  Students’ enjoyment of reading and digital reading performance ...... 282
Table VI.4.8  Relationship between enjoyment of reading and digital reading performance, by gender ...... 283
Table VI.4.9  Students’ diversity of reading materials and digital reading performance ...... 284
Table VI.4.10  Relationship between diversity of reading and digital reading performance, by gender ...... 285
Table VI.4.11  Students’ level of online searching-information activities and digital reading performance ...... 286
Table VI.4.12  Students’ level of online social activities and digital reading performance ...... 287
Table VI.4.13  Relationship between online searching-information activities and digital reading performance, by gender ...... 288
Table VI.4.14  Relationship between online social activities and digital reading performance, by gender ...... 288
Table VI.4.15  Relationship between the index of understanding and remembering and reading proficiency ...... 289
Table VI.4.16  Percentage of students with low levels of understanding in different reading proficiency levels ...... 291
Table VI.4.17  Relationship between the index of summarising and reading proficiency ...... 292
Table VI.4.18  Percentage of students with low levels of summarising in different reading proficiency levels ...... 294
Table VI.4.19  Relationship between some student-level aspects and performance in reading ...... 295
Table VI.4.20  Relationships between online reading practices, enjoyment of reading and diversity of reading ...... 298
Table VI.5.1  Percentage of students who reported that they have never used a computer, by gender and socio-economic background ...... 299
Table VI.5.2  Percentage of students who reported having a computer at home in 2000 and 2009, by gender ...... 300
Table VI.5.3  Percentage of students who reported having a computer at home, by gender and socio-economic background ...... 301
Table VI.5.4  Percentage of students who reported having a computer at home in 2000 and 2009, by socio-economic background ...... 302
Table VI.5.5  Percentage of students who reported having access to the Internet at home in 2000 and 2009, by gender ...... 303
Table VI.5.6  Percentage of students who reported having access to the Internet at home, by gender and socio-economic background ...... 304
Table VI.5.7  Percentage of students who reported having access to the Internet at home in 2000 and 2009, by socio-economic background ...... 305
Table VI.5.8a  Ratio of computers to the number of students in the modal grade of 15-year-olds ...... 306
Table VI.5.8b  Ratio of computers to the number of students in school in 2000 and 2009 ...... 306
Table VI.5.9  Percentage of students with access to computers and the Internet at school ...... 307
Table VI.5.10a  Percentage of students who reported using a computer at home and at school, by socio-economic background ...... 308
Table VI.5.10b  Percentage of students who reported using a computer at home and at school ...... 309
Table VI.5.11  Percentage of students who reported using the Internet at home and at school ...... 310
Table VI.5.12  Percentage of students in schools whose principals reported shortage or inadequacy of computers for instruction ...... 311
Table VI.5.13  Percentage of students who reported that they did the following activity at home for leisure at least once a week ...... 312
Table VI.5.14  Index of computer use at home for leisure and reading performance ...... 313
Table VI.5.15  Percentage of students who reported that they did the following activity at home for schoolwork at least once a week ...... 315
Table VI.5.16  Index of computer use at home for schoolwork, and reading performance ...... 316
Table VI.5.17  Percentage of students who reported that they did the following activity at school at least once a week ...... 318
Table VI.5.18  Index of computer use at school, and reading performance ...... 319
Table VI.5.19  Percentage of students who attend regular lessons, by time spent on using a computer during classroom lessons in a typical school week ...... 321
Table VI.5.20  Percentage of students, by time spent on using a computer during foreign-language lessons in a typical school week ...... 322
Table VI.5.21  Percentage of students who reported using laptops at school ...... 323
Table VI.5.22  Percentage of students, by their attitudes towards computers ...... 324
Table VI.5.23  Index of attitude towards computers and reading performance ...... 325
Table VI.5.24  Percentage of students, by level of self-confidence in ICT high-level tasks ...... 327
Table VI.5.25  Index of self-confidence in ICT high-level tasks, and reading performance ...... 328
Table VI.5.26  Percentage of students, by level of self-confidence in creating a multi-media presentation ...... 330
Table VI.5.27  Percentage of students, by level of self-confidence in using a spreadsheet to plot a graph ...... 331
Table VI.5.28  Percentage of students who reported being able to do some ICT high-level tasks in 2003 and 2009, by gender ...... 332
Table VI.5.29  Percentage of students who reported being able to do some ICT high-level tasks in 2003 and 2009, by socio-economic background ...... 335
Table VI.6.1  Digital reading performance, by access to a computer at home ...... 338
Table VI.6.2  Digital reading performance, by computer use at home ...... 338
Table VI.6.3  Digital reading performance, by access to a computer at school ...... 339
Table VI.6.4  Digital reading performance, by computer use at school ...... 339
Table VI.6.5a  Digital reading performance, by index of computer use at home for leisure ...... 340
Table VI.6.5b  Digital reading performance, by computer use at home for playing one-player games ...... 341
Table VI.6.5c  Digital reading performance, by computer use at home for playing collaborative online games ...... 341
Table VI.6.5d  Digital reading performance, by computer use at home for sending e-mail ...... 342
Table VI.6.5e  Digital reading performance, by computer use at home for chatting on line ...... 342
Table VI.6.5f  Digital reading performance, by computer use at home for browsing the Internet for fun ...... 343
Table VI.6.5g  Digital reading performance, by computer use at home for downloading music, films, games or software from the Internet ...... 343
Table VI.6.5h  Digital reading performance, by computer use at home for publishing and maintaining a personal page, weblog or blog ...... 344
Table VI.6.5i  Digital reading performance, by computer use at home for participating in online forums, virtual communities or spaces ...... 344
Table VI.6.6a  Digital reading performance, by index of computer use at home for schoolwork ...... 345
Table VI.6.6b  Digital reading performance, by computer use at home for browsing the Internet for schoolwork ...... 346
Table VI.6.6c  Digital reading performance, by computer use at home for sending e-mail to communicate with other students about schoolwork ...... 346
Table VI.6.6d  Digital reading performance, by computer use at home for sending e-mail to communicate with teachers about schoolwork ...... 347
Table VI.6.6e  Digital reading performance, by computer use at home for downloading, uploading or browsing material from the school’s website ...... 347
Table VI.6.6f  Digital reading performance, by computer use at home for checking the school’s website for announcements ...... 348
Table VI.6.7a  Digital reading performance, by index of computer use at school ...... 349
Table VI.6.7b  Digital reading performance, by computer use at school for chatting on line ...... 350
Table VI.6.7c  Digital reading performance, by computer use at school for sending e-mail ...... 350
Table VI.6.7d  Digital reading performance, by computer use at school for browsing the Internet for schoolwork ...... 351
Table VI.6.7e  Digital reading performance, by computer use at school for downloading, uploading or browsing material from the school’s website ...... 351
Table VI.6.7f  Digital reading performance, by computer use at school for posting work on the school’s website ...... 352
Table VI.6.7g  Digital reading performance, by computer use at school for playing simulations at school ...... 352
Table VI.6.7h  Digital reading performance, by computer use at school for practicing and drilling ...... 353
Table VI.6.7i  Digital reading performance, by computer use at school for doing individual homework on a school computer ...... 353
Table VI.6.7j  Digital reading performance, by computer use at school for group work and communication with other students ...... 354
Table VI.6.8a  Digital reading performance, by time spent using a computer in language-of-instruction lessons ...... 355
Table VI.6.8b  Print reading performance, by time spent using a computer in language-of-instruction lessons ...... 356
Table VI.6.8c  Reading performance, by computer use in language-of-instruction lessons ...... 358
Table VI.6.8d  Digital reading and mathematics performance, by computer use in mathematics lessons ...... 359
Table VI.6.8e  Digital reading and science performance, by computer use in science lessons ...... 360
Table VI.6.9a  Digital reading performance by computer use at home for playing collaborative online games, before and after accounting for print reading performance ...... 361
Table VI.6.9b  Digital reading performance by computer use at home for browsing the Internet for fun, before and after accounting for print reading performance ...... 361
Table VI.6.9c  Digital reading performance by computer use at home for browsing the Internet for schoolwork, before and after accounting for print reading performance ...... 362
Table VI.6.9d  Digital reading performance by computer use at home for sending e-mail to communicate with other students about schoolwork, before and after accounting for print reading performance ...... 362
Table VI.6.10a  Digital reading performance by computer use at school for browsing the Internet for schoolwork, before and after accounting for print reading performance ...... 363
Table VI.6.10b  Digital reading performance by computer use at school for practicing and drilling, before and after accounting for print reading performance ...... 363
Table VI.6.11a  Index of the number of relevant pages visited, by computer use at home for playing collaborative online games ...... 364
Table VI.6.11b  Index of the number of relevant pages visited, by computer use at home for browsing the Internet for fun ...... 364
Table VI.6.11c  Index of the number of relevant pages visited, by computer use at home for browsing the Internet for schoolwork ...... 365
Table VI.6.11d  Index of the number of relevant pages visited, by computer use at home for sending e-mail to communicate with other students about schoolwork ...... 365
Table VI.6.11e  Index of the number of relevant pages visited, by computer use at school for browsing the Internet for schoolwork ...... 366
Table VI.6.11f  Index of the number of relevant pages visited, by computer use at school for practicing and drilling ...... 366
Table VI.6.12a  Digital reading performance, by index of self-confidence in ICT high-level tasks ...... 367
Table VI.6.12b  Digital reading performance, by students’ self-confidence in editing digital photographs or other graphic images ...... 368
Table VI.6.12c  Digital reading performance, by students’ self-confidence in creating a database ...... 368
Table VI.6.12d  Digital reading performance, by students’ self-confidence in using a spreadsheet to plot a graph ...... 369
Table VI.6.12e  Digital reading performance, by students’ self-confidence in creating a presentation ...... 369
Table VI.6.12f  Digital reading performance, by students’ self-confidence in creating a multi-media presentation ...... 370
Table VI.6.13a  Index of self-confidence for high-level ICT tasks, by computer use at home for playing one-player games ......
370table VI.6.13b Index of self-confidence for high-level ICt tasks, by computer use at home for playing collaborative online games ....... 371table VI.6.13c Index of self-confidence for high-level ICt tasks, by computer use at home for sending e-mail .............................................. 371table VI.6.13d Index of self-confidence for high-level ICt tasks, by computer use at home for chatting on line............................................. 372 PISA 2009 ReSultS: StudentS on lIne – Volume VI © OECD 2011 17
    • Table of ConTenTs table VI.6.13e Index of self-confidence for high-level ICt tasks, by computer use at home for browsing the Internet for fun .................... 372 table VI.6.13f Index of self-confidence for high-level ICt tasks, by computer use at home for downloading music, films, games or software from the Internet ........................................................................................................................................................................... 373 table VI.6.13g Index of self-confidence for high-level ICt tasks, by computer use at home for publishing and maintaining a personal page, weblog or blog .................................................................................................................................................................... 373 table VI.6.13h Index of self-confidence for high-level ICt tasks, by computer use at home for participating in online forums, virtual communities or spaces ........................................................................................................................................................................ 374 table VI.6.14a Index of self-confidence for high-level ICt tasks, by computer use at home for browsing the Internet for schoolwork ......... 374 table VI.6.14b Index of self-confidence for high-level ICt tasks, by computer use at home for sending e-mail to communicate with other students about schoolwork ......................................................................................................................................................... 375 table VI.6.14c Index of self-confidence for high-level ICt tasks, by computer use at home for sending e-mail to communicate with teachers about schoolwork .................................................................................................................................................................... 
375 table VI.6.14d Index of self-confidence for high-level ICt tasks, by computer use at home for downloading, uploading or browsing material from the school’s website ................................................................................................................................................................ 376 table VI.6.14e Index of self-confidence for high-level ICt tasks, by computer use at home for checking the school’s website for announcements ............................................................................................................................................................................................. 376 table VI.6.15a Index of self-confidence for high-level ICt tasks, by computer use at school for chatting on line ........................................... 377 table VI.6.15b Index of self-confidence for high-level ICt tasks, by computer use at school for sending e-mail............................................. 377 table VI.6.15c Index of self-confidence for high-level ICt tasks, by computer use at school for browsing the Internet for schoolwork ....... 378 table VI.6.15d Index of self-confidence for high-level ICt tasks, by computer use at school for downloading, uploading or browsing material from the school’s website ................................................................................................................................................................ 378 table VI.6.15e Index of self-confidence for high-level ICt tasks, by computer use at school for posting work on the school’s website ....... 379 table VI.6.15f Index of self-confidence for high-level ICt tasks, by computer use at school for playing simulations at school ................. 379 table VI.6.15g Index of self-confidence for high-level ICt tasks, by computer use at school for practicing and drilling .............................. 
380 table VI.6.15h Index of self-confidence for high-level ICt tasks, by computer use at school for doing individual homework on a school computer ........................................................................................................................................................................................ 380 table VI.6.15i Index of self-confidence for high-level ICt tasks, by computer use at school for group work and communication with other students ............................................................................................................................................................................................. 381 table VI.7.1a Within- and between-school variation in digital reading performance, and variation explained by the multilevel regression model without print reading performance ............................................................................................................................. 381 table VI.7.1b multilevel regression model for digital reading performance, before accounting for print reading performance ............... 382 table VI.7.2a Within- and between-school variation in digital reading performance, and variation explained by the multilevel regression model with print reading performance.................................................................................................................................... 383 table VI.7.2b multilevel regression model for digital reading performance, after accounting for print reading performance ................... 384 table S.VI.a Percentage of students at each proficiency level on the digital, print and composite reading scales ...................................... 385 table S.VI.b Percentage of boys at each proficiency level on the digital, print and composite reading scales ............................................. 
386 table S.VI.c Percentage of girls at each proficiency level on the digital, print and composite reading scales .............................................. 387 table S.VI.d mean score, variation and gender differences in student performance on the digital, print and composite reading scales ....... 38818 © OECD 2011 PISA 2009 ReSultS: StudentS on lIne – Volume VI
Executive Summary

PISA defines reading literacy as understanding, using, reflecting on and engaging with written texts in order to achieve one's goals, develop one's knowledge and potential, and participate in society. This definition applies to both print and digital reading.

Some 8% of students in the 16 participating OECD countries reached the highest level of digital reading performance.

Students proficient at Level 5 or above can evaluate information from several web-based sources, assess the credibility and utility of what they read, and navigate across pages of text autonomously and efficiently. But there is considerable variation across countries: more than 17% of students in Korea, New Zealand and Australia perform at this level, while fewer than 3% in Chile, Poland and Austria do.

At the same time, all participating countries and partner economies, except Korea, have significant numbers of low-performing students. In Chile, Austria, Hungary and Poland, more than one-quarter of students perform below Level 2 on the digital reading scale, and in the partner country Colombia, nearly 70% of students perform below this level. This does not mean that such students have no proficiency in digital reading; many students performing at this level can scroll and navigate across web pages, as long as explicit directions are provided, and can locate simple pieces of information in a short block of hypertext. Nevertheless, these students are performing at levels below those that allow them full access to educational, employment and social opportunities in the 21st century.

Korea is the top-performing country in digital reading by a significant margin, with a mean score of 568.

Korea is followed by New Zealand and Australia, both at 537 score points, Japan (519 score points), the partner economy Hong Kong-China (515 score points), Iceland (512 score points), Sweden (510 score points), Ireland (509 score points) and Belgium (507 score points). The partner country Colombia's mean score (368 score points) is well below those of the other participating countries and economies.

In most countries, student performance in digital and print reading is closely related.

On average, 7.8% of students in the 16 participating OECD countries perform at Level 5 or above on the digital reading scale, while a slightly higher percentage (8.5%) performs at Level 5 or 6 in print reading. On average, 16.9% of students perform below Level 2 in digital reading, while a similar percentage (17.4%) performs below the baseline Level 2 on the print reading scale.

However, in Poland, Hungary, Chile, Austria, Denmark, the partner economy Hong Kong-China and the partner country Colombia, students perform significantly better, on average, in print than in digital reading. Conversely, in Korea, Australia, New Zealand, Ireland, Sweden, Iceland and the partner economy Macao-China, students perform significantly better, on average, in digital than in print reading. Countries that perform well in both media tend to do better in digital reading, while lower-performing countries tend to be stronger in print reading, although Hong Kong-China is an exception.

In all participating countries and economies, the gender gap in performance is narrower in digital reading than in print reading.

Girls outperform boys in digital reading by an average of 24 score points, compared to an average of 39 score points in print reading. The gender gap in digital reading is widest in New Zealand (a difference of 40 score points),
Norway (35), Ireland (31), Iceland (30), Poland (29), Australia (28) and Sweden (26). When comparing boys and girls with similar levels of print reading proficiency and similar student and school characteristics, boys achieve higher scores in digital reading than girls in Denmark (a 22 score-point difference), Austria (17), Poland (11), Hungary (11), Sweden (8), Korea (7), Spain (6), Iceland (6), Australia (5) and the partner economies Hong Kong-China (17) and Macao-China (10).

Proficient digital readers tend to know how to navigate effectively and efficiently.

Navigation is a key component of digital reading, as readers "construct" their text through navigation. Thus, navigational choices directly influence what kind of text is eventually processed. Stronger readers tend to choose strategies that are suited to the demands of the individual tasks. Better readers tend to minimise their visits to irrelevant pages and locate necessary pages efficiently. However, PISA results show that even when guidance on navigation is explicit, significant numbers of students still cannot locate crucial pages. The digital reading assessment offers powerful evidence that today's 15-year-olds, the "digital natives", do not automatically know how to operate effectively in the digital environment, as has sometimes been claimed.

Students' attitudes towards reading and their socio-economic backgrounds and immigrant status seem to have similar associations with both print and digital reading proficiency.

In most countries, the average difference in digital reading performance between the students who are most and least enthusiastic about reading is a striking 88 score points. On average, the least enthusiastic students are twice as likely to perform poorly in digital reading as the most enthusiastic readers; and in most countries, this finding holds for both boys and girls. Engaging in certain online activities also has an impact on digital reading performance. In each of the 19 countries that took part in the digital reading assessment, the more frequently students search for information on line, the better their performance in digital reading. Being unfamiliar with online social practices, such as e-mailing and chatting, seems to be associated with low digital reading proficiency; but students who frequently e-mail and chat on line also perform less well than students who are only moderately involved in these activities.

Access to ICT has grown significantly in recent years and, as a result, fewer than 1% of students across OECD countries reported that they had never used a computer; but a digital divide in the use of ICT is still evident between and within countries.

On average across the OECD countries that took part in the PISA 2000 and 2009 surveys, the percentage of students who reported having at least one computer at home increased from 72% in 2000 to 94% in 2009. The increase in access to a home computer during this period was larger among socio-economically disadvantaged students (37 percentage points) than among advantaged students (7 percentage points). In addition, the proportion of students in OECD countries who reported having access to the Internet at home doubled from 45% to 89% during the same period.

While at least 95% of students in 16 OECD countries, the partner country Liechtenstein, and the partner economies Macao-China and Hong Kong-China reported that they use a computer at home, those proportions are significantly lower in Japan (76%), Chile (73%) and Turkey (60%). In Japan, students often use mobile phones, rather than personal computers, for e-mailing and accessing the Internet.

In all 27 OECD countries for which data are available for both PISA 2000 and 2009, there was an increase in the computer-student ratio at school during that period – evidence of substantial investment in ICT resources. But the proportion of students who reported using a computer at school varies substantially across countries and economies.

Within countries, the digital divide is often linked to students' socio-economic background. Students from socio-economically advantaged backgrounds have higher levels of computer and Internet access at home; however, in some countries, inequalities in the level of computer use at home are narrowed when disadvantaged students are given more opportunities to use a computer at school.

Using a computer at home is related to digital reading performance in all 17 participating countries and economies, but that is not always true for computer use at school.

The relationship between the frequency of computer use at home, for leisure and for schoolwork, and digital reading performance is not linear, but rather mountain-shaped: in other words, moderate users attain higher scores in digital reading than both rare and intensive users. In contrast, the relationship between students' computer use at school and performance in digital reading tends to be negative with a slight curve, which means that more intensive use is
associated with lower scores. Students who use computers intensively at school may require additional assignments to catch up to other students or may need more time to complete their studies.

After accounting for students' academic abilities, the frequency of computer use at home, particularly computer use for leisure, is positively associated with navigation skills and digital reading performance, while the frequency of computer use at school is not. These findings suggest that students are developing digital reading literacy mainly by using computers at home to pursue their interests.

• Table VI.A •
An overview of performance in digital reading, navigation and computer use

In the original table, cell shading marks values of higher quality or equity than the OECD average, at the OECD average (no statistically significant difference), or of lower quality or equity than the OECD average.

Columns:
(1) Digital reading performance, mean score
(2) Gender difference in digital reading scores between boys and girls, score dif.
(3) Index of the number of relevant pages visited (navigation skills), mean index
(4) Percentage of students who use a computer at home, %
(5) Computer use at home: percentage difference between top and bottom quarters of the PISA index of economic, social and cultural status, % dif.
(6) Difference in digital reading scores between students who use and who do not use a computer at home, score dif.
(7) Percentage of students who use a computer at school, %
(8) Computer use at school: percentage difference between top and bottom quarters of the PISA index of economic, social and cultural status, % dif.
(9) Difference in digital reading scores between students who use and who do not use a computer at school, score dif.

                      (1)    (2)    (3)    (4)    (5)    (6)    (7)    (8)    (9)
OECD average          499    -24   46.3   92.3   16.0     80   74.2    0.3      9

OECD countries
Korea                 568    -18   52.8   87.5   19.5     49   62.7    3.5    2.1
New Zealand           537    -40   49.7   92.5   20.2     90   83.4    6.4     20
Australia             537    -28   49.6   96.7    7.8     84   91.6    5.6     42
Japan                 519    -23   50.1   75.9   38.6     48   59.3    2.6     14
Iceland               512    -30   47.5   99.1    1.2     74   79.5    5.1     22
Sweden                510    -26   47.8   97.7    4.7    105   89.1    4.7     28
Ireland               509    -31   47.4   93.2   10.9     60   62.9    0.4     -3
Belgium               507    -24   47.7   96.9      9    102   62.8   -1.1      9
Norway                500    -35   46.9   98.7    2.7     77   93.0    2.5     25
France                494    -20   46.1      m      m      m      m      m      m
Denmark               489     -6   47.2   98.8    2.8     79   93.0    1.8      6
Spain                 475    -19   44.2   92.6   14.4     78   65.5   -4.0     11
Hungary               468    -21   41.6   91.8   23.6    102   69.3   -8.9    -27
Poland                464    -29   42.0   92.1   22.9     84   60.6   -9.1     -8
Austria               459    -22   43.3   98.2    3.7     94   84.1   -3.2     -6
Chile                 435    -19   37.7   73.2   60.3     69   56.8   -2.0      2

Partner countries and economies
Hong Kong-China       515     -8   48.1   96.4    5.2     33   82.6    0.2      3
Macao-China           492    -12   46.5   96.4    5.2     61   80.1   -1.0      4
Colombia              368     -3   31.5      m      m      m      m      m      m

Notes: Values that are statistically significant are indicated in bold in the original table (see Annex 3). "m" indicates that data are not available.
Source: OECD, PISA 2009 Database, Tables VI.2.4, VI.3.1, VI.5.1, VI.5.10a, VI.6.2 and VI.6.12.
http://dx.doi.org/10.1787/888932436670
Introduction to PISA

The PISA surveys

Are students well prepared to meet the challenges of the future? Can they analyse, reason and communicate their ideas effectively? Have they found the kinds of interests they can pursue throughout their lives as productive members of the economy and society? The OECD Programme for International Student Assessment (PISA) seeks to answer these questions through its triennial surveys of key competencies of 15-year-old students in OECD member countries and partner countries/economies. Together, the group of countries participating in PISA represents nearly 90% of the world economy.1

PISA assesses the extent to which students near the end of compulsory education have acquired some of the knowledge and skills that are essential for full participation in modern societies, with a focus on reading, mathematics and science.

PISA has now completed its fourth round of surveys. Following the detailed assessment of each of PISA's three main subjects – reading, mathematics and science – in 2000, 2003 and 2006, the 2009 survey marks the beginning of a new round with a return to a focus on reading, but in ways that reflect the extent to which reading has changed since 2000, including the prevalence of digital texts.

PISA 2009 offers the most comprehensive and rigorous international measurement of student reading skills to date. It assesses not only reading knowledge and skills, but also students' attitudes and their learning strategies in reading. PISA 2009 updates the assessment of student performance in mathematics and science as well.

The assessment focuses on young people's ability to use their knowledge and skills to meet real-life challenges. This orientation reflects a change in the goals and objectives of curricula themselves, which are increasingly concerned with what students can do with what they learn at school and not merely with whether they have mastered specific curricular content.

PISA's unique features include its:

• Policy orientation, which connects data on student learning outcomes with data on students' characteristics and on key factors shaping their learning in and out of school in order to draw attention to differences in performance patterns and identify the characteristics of students, schools and education systems that have high performance standards.

• Innovative concept of "literacy", which refers to the capacity of students to apply knowledge and skills in key subject areas and to analyse, reason and communicate effectively as they pose, interpret and solve problems in a variety of situations.

• Relevance to lifelong learning, which does not limit PISA to assessing students' competencies in school subjects, but also asks them to report on their own motivations to learn, their beliefs about themselves and their learning strategies.

• Regularity, which enables countries to monitor their progress in meeting key learning objectives.

• Breadth of geographical coverage and collaborative nature, which, in PISA 2009, encompasses the 34 OECD member countries and 41 partner countries and economies.2
The relevance of the knowledge and skills measured by PISA is confirmed by studies tracking young people in the years after they have been assessed by PISA. Longitudinal studies in Australia, Canada and Switzerland display a strong relationship between performance in reading on the PISA assessment at age 15 and future educational attainment and success in the labour market (see Volume I, Chapter 2).3

The frameworks for assessing reading, mathematics and science in 2009 are described in detail in PISA 2009 Assessment Framework: Key Competencies in Reading, Mathematics and Science (OECD, 2009b).

Decisions about the scope and nature of the PISA assessments and the background information to be collected are made by leading experts in participating countries. Governments guide these decisions based on shared, policy-driven interests. Considerable efforts and resources are devoted to achieving cultural and linguistic breadth and balance in the assessment materials. Stringent quality-assurance mechanisms are applied in designing the test, in translation, sampling and data collection. As a result, PISA findings are valid and highly reliable.

Policy makers around the world use PISA findings to gauge the knowledge and skills of students in their own country in comparison with those in other countries. PISA reveals what is possible in education by showing what students in the highest-performing countries can do in reading, mathematics and science. PISA is also used to gauge the pace of educational progress by allowing policy makers to assess to what extent performance changes observed nationally are in line with performance changes observed elsewhere. In a growing number of countries, PISA is also used to set policy targets against measurable goals achieved by other systems, to initiate research and peer-learning designed to identify policy levers and to reform trajectories for improving education. While PISA cannot identify causal relationships between inputs, processes and educational outcomes, it can highlight key features in which education systems are similar and different, sharing those findings with educators, policy makers and the general public.

The first report from the 2009 assessment

This volume is the last of six volumes that provide the first international report on results from the PISA 2009 assessment. It explains how PISA measures and reports student performance in digital reading and analyses what students in the 19 countries and economies participating in this assessment are able to do. The other volumes cover the following issues:

• Volume I, What Students Know and Can Do: Student Performance in Reading, Mathematics and Science, summarises the performance of students in PISA 2009, starting with a focus on reading, and then reporting on mathematics and science performance. It provides the results in the context of how performance is defined, measured and reported, and then examines what students are able to do in reading. After a summary of reading performance, it examines the ways in which this performance varies on subscales representing three aspects of reading. It then breaks down results by different formats of reading texts and considers gender differences in reading, both generally and for different reading aspects and text formats. Any comparison of the outcomes of education systems needs to take into consideration countries' social and economic circumstances and the resources they devote to education. To address this, the volume also interprets the results within countries' economic and social contexts. The chapter concludes with a description of student results in mathematics and science.

• Volume II, Overcoming Social Background: Equity in Learning Opportunities and Outcomes, starts by closely examining the performance variation shown in Volume I, particularly the extent to which the overall variation in student performance relates to differences in results achieved by different schools. The volume then looks at how factors such as socio-economic background and immigrant status affect student and school performance, and the role that education policy can play in moderating the impact of these factors.

• Volume III, Learning to Learn: Student Engagement, Strategies and Practices, explores the information gathered on students' levels of engagement in reading activities and attitudes towards reading and learning. It describes 15-year-olds' motivations, engagement and strategies to learn.

• Volume IV, What Makes a School Successful? Resources, Policies and Practices, explores the relationships between student-, school- and system-level characteristics, and educational quality and equity. It explores what schools and school policies can do to raise overall student performance and, at the same time, moderate the impact of socio-economic background on student performance, with the aim of promoting a more equitable distribution of learning opportunities.
    • inTroduCTion To pisa• Volume V, Learning Trends: Changes in Student Performance Since 2000, provides an overview of trends in student performance in reading, mathematics and science from PISA 2000 to PISA 2009. It shows educational outcomes over time and tracks changes in factors related to student and school performance, such as student background and school characteristics and practices.All data tables referred to in the analysis are included at the end of the respective volume. A Reader’s Guide is alsoprovided in each volume to aid in interpreting the tables and figures accompanying the report.technical annexes that describe the construction of the questionnaire indices, sampling issues, quality assuranceprocedures and the process followed for developing the assessment instruments, and information about reliabilityof coding are posted on the oeCd PISA website (www.pisa.oecd.org). many of the issues covered in the technicalannexes are elaborated in greater detail in the PISA 2009 Technical Report (oeCd, forthcoming).The pisa sTudenT populaTionIn order to ensure the comparability of results across countries, PISA devoted a great deal of attention to assessingcomparable target populations. differences between countries in the nature and extent of pre-primary educationand care, in the age of entry to formal schooling, and in the structure of the education system do not allow schoolgrade levels to be defined so that they are internationally comparable. Valid international comparisons of educationalperformance, therefore, need to define their populations with reference to a target age. 
PISA covers students who areaged between 15 years 3 months and 16 years 2 months at the time of the assessment and who have completed at least6 years of formal schooling, regardless of the type of institution in which they are enrolled, whether they are in full-timeor part-time education, whether they attend academic or vocational programmes, and whether they attend public orprivate schools or foreign schools within the country. For an operational definition of this target population, see thePISA 2009 Technical Report (oeCd, forthcoming). the use of this age in PISA, across countries and over time, allowsthe performance of students to be compared in a consistent manner before they complete compulsory education.As a result, this report can make statements about the knowledge and skills of individuals born in the same year whoare still at school at 15 years of age, despite having had different educational experiences, both in and outside school.Stringent technical standards were established to define the national target populations and to identify permissibleexclusions from this definition (www.pisa.oecd.org). the overall exclusion rate within a country was required tobe below 5% to ensure that, under reasonable assumptions, any distortions in national mean scores would remainwithin plus or minus 5 score points, i.e. typically within the order of magnitude of two standard errors of sampling(see Annex A2). exclusion could take place either through schools that participated or students who participatedwithin schools. there are several reasons why a school or a student could be excluded from PISA. Schools mightbe excluded because they are situated in remote regions and are inaccessible or because they are very small, orbecause of organisational or operational factors that precluded participation. 
Students might be excluded because of intellectual disability or limited proficiency in the language of the test.

In 29 out of the 65 countries participating in the paper-based PISA 2009 assessment, the percentage of school-level exclusions amounted to less than 1%; it was less than 5% in all countries. When the exclusion of students who met the internationally established exclusion criteria is also taken into account, the exclusion rates increase slightly. However, the overall exclusion rate remains below 2% in 32 participating countries, below 5% in 60 participating countries, and below 7% in all countries except Luxembourg (7.2%) and Denmark (8.6%). In 15 out of 34 OECD countries, the percentage of school-level exclusions amounted to less than 1%, and it was less than 5% in all countries. When student exclusions within schools are also taken into account, 9 OECD countries remained below 2% and 25 countries below 5%. Restrictions on the level of exclusions in PISA 2009 are described in Volume I.

The specific sample design and size for each country aimed to maximise sampling efficiency for student-level estimates. In OECD countries, sample sizes ranged from 4 410 students in Iceland to 38 250 students in Mexico. Countries with large samples have often implemented PISA at both national and regional/state levels (e.g. Australia, Belgium, Canada, Italy, Mexico, Spain, Switzerland and the United Kingdom). This selection of samples was monitored internationally and adhered to rigorous standards for the participation rate, both among schools selected by the international contractor and among students within these schools, to ensure that the PISA results reflect the skills of 15-year-old students in participating countries. Countries were also required to administer the test to students in identical ways, to ensure that students received the same information prior to and during both the paper-based and the digital reading assessments (for details, see Annex A4).
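The overall exclusion rate discussed above combines school-level exclusions with within-school student exclusions. A minimal sketch of how the two levels combine is given below; the figures are hypothetical illustrations, not actual PISA data, and the function name is our own.

```python
def overall_exclusion_rate(school_level_rate, within_school_rate):
    """Combine school-level and within-school exclusion rates.

    Both inputs are proportions of the target population (0-1).
    Within-school exclusions can only occur among students whose
    schools were not already excluded, hence the (1 - school) factor.
    """
    return school_level_rate + (1 - school_level_rate) * within_school_rate

# Hypothetical example: 1% of students attend excluded schools,
# and 2% of the remaining students are excluded within schools.
rate = overall_exclusion_rate(0.01, 0.02)
print(f"{rate:.2%}")  # 2.98% -- well under the 5% ceiling
```

This illustrates why adding student-level exclusions increases the overall rate only "slightly" when school-level exclusions are already small.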
Detailed information about the samples for the digital reading assessment is presented in Annex A2.

PISA 2009 Results: Students On Line – Volume VI, © OECD 2011
    • inTroduCTion To pisa Box VI.A Key features of pisa 2009 Content • the main focus of PISA 2009 was reading. the survey also updated performance assessments in mathematics and science. PISA considers students’ knowledge in these areas not in isolation, but in relation to their ability to reflect on their knowledge and experience and to apply them to real-world issues. the emphasis is on mastering processes, understanding concepts and functioning in various contexts within each assessment area. • For the first time, the PISA 2009 survey also assessed 15-year-old students’ ability to read, understand and apply digital texts. this part of the survey was optional. Methods • Around 470 000 students completed the paper-based assessment in 2009, representing about 26 million 15-year-olds in the schools of the 65 participating countries and economies. Some 50 000 students took part in a second round of this assessment in 2010, representing about 2 million 15-year-olds from 10 additional partner countries and economies. • each participating student spent two hours carrying out pencil-and-paper tasks in reading, mathematics and science. In 19 countries, students were given additional questions via computer to assess their capacity to read digital texts. • the assessment included tasks requiring students to construct their own answers as well as multiple-choice questions. the latter were typically organised in units based on a written passage or graphic, much like the kind of texts or figures that students might encounter in real life. • Students also answered a questionnaire that took about 30 minutes to complete. this questionnaire focused on their background, learning habits, attitudes towards reading, and their involvement and motivation. • School principals completed a questionnaire about their school that included demographic characteristics and an assessment of the quality of the learning environment at school. 
Outcomes PISA 2009 results provide: • a profile of knowledge and skills among 15-year-olds in 2009, consisting of a detailed profile for reading and an update for mathematics and science; • contextual indicators relating performance results to student and school characteristics; • an assessment of students’ engagement in reading activities, and their knowledge and use of different learning strategies; • a knowledge base for policy research and analysis; and • trend data on changes in student knowledge and skills in reading, mathematics, science, changes in student attitudes and socio-economic indicators, and in the impact of some indicators on performance results. Future assessments • the PISA 2012 survey will return to mathematics as the major assessment area, PISA 2015 will focus on science. thereafter, PISA will turn to another cycle beginning with reading again. • Future tests will place greater emphasis on assessing students’ capacity to read and understand digital texts and solve problems presented in a digital format, reflecting the importance of information and computer technologies in modern societies.26 © OECD 2011 PISA 2009 ReSultS: StudentS on lIne – Volume VI
    • inTroduCTion To pisa • Figure VI.A • a map of pisa countries and economies oEcd countries Partner countries and economies in PisA 2009 Partner country in previous PisA surveys Australia Japan Albania mauritius* macedonia Austria Korea Argentina miranda-Venezuela* Belgium luxembourg Azerbaijan moldova* Canada mexico Brazil montenegro Chile netherlands Bulgaria netherlands-Antilles* Czech Republic new Zealand Colombia Panama denmark norway Costa Rica* Peru estonia Poland Croatia Qatar Finland Portugal Georgia* Romania France Slovak Republic Himachal Pradesh-India* Russian Federation Germany Slovenia Hong Kong-China Serbia Greece Spain Indonesia Shanghai-China Hungary Sweden Jordan Singapore Iceland Switzerland Kazakhstan tamil nadu-India* Ireland turkey Kyrgyzstan Chinese taipei Israel united Kingdom latvia thailand Italy united States liechtenstein trinidad and tobago lithuania tunisia macao-China uruguay malaysia* united Arab emirates* * these partner countries and economies carried out malta* the assessment in 2010 instead of 2009. Notes1. the GdP of countries that participated in PISA 2009 represents 87% of the 2007 world GdP. Some of the entities representedin this report are referred to as partner economies. this is because they are not strictly national entities.2. thirty-one partner countries and economies originally participated in the PISA 2009 assessment and ten additional partnercountries and economies took part in a second round of the assessment.3. marks, G.n (2007); Bertschy, K., m.A. Cattaneo and S.C. Wolter (2009); oeCd (2010c). PISA 2009 ReSultS: StudentS on lIne – Volume VI © OECD 2011 27
Reader's Guide

Data underlying the figures
The data referred to in this volume are presented in Annex B and, in greater detail, on the PISA website (www.pisa.oecd.org).

Five symbols are used to denote missing data:
a  The category does not apply in the country concerned. Data are therefore missing.
c  There are too few observations or no observations to provide reliable estimates (i.e. there are fewer than 30 students or fewer than five schools with valid data).
m  Data are not available. These data were not submitted by the country or were collected but subsequently removed from the publication for technical reasons.
w  Data have been withdrawn or have not been collected at the request of the country concerned.
x  Data are included in another category or column of the table.

Country coverage
The Programme for International Student Assessment encompasses 65 countries and economies, including all 34 OECD countries and 31 partner countries and economies (see Figure VI.A). The data from another nine partner countries were collected one year later and will be published in 2011. This publication features data on 19 countries and economies for the digital reading assessment, including 16 OECD countries, and 45 countries for the ICT questionnaire, including 29 OECD countries.

The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of such data by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank under the terms of international law.

Calculating international averages
An OECD average was calculated for most indicators presented in this report. The OECD average corresponds to the arithmetic mean of the respective country estimates. The OECD average is used to compare performance across education systems. In the case of some countries, data may not be available for specific indicators, or specific categories may not apply.
Readers should, therefore, keep in mind that the term "OECD average" refers to the OECD countries included in the respective comparisons.

In this volume, different OECD averages have been calculated, depending on the number of OECD countries participating in the digital reading assessment (16 OECD countries), in the ICT questionnaire (29 OECD countries), or in both (15 OECD countries). The OECD average in the tables is presented as "OECD average-xx", with "xx" corresponding to the number of countries taken into account in this average. Some tables include the OECD average without any number of countries; this means that the OECD average does not take into account the same number of countries for the different columns. In this case, the number of countries encompassed in the OECD average is indicated in the title of the corresponding columns.

The OECD average is computed based on available data. However, sometimes there are no data available for certain categories. In these cases, the OECD average difference is not equal to the difference between the OECD averages of the two categories in question.

Rounding figures
Because of rounding, some figures in tables may not exactly add up to the totals. Totals, differences and averages are always calculated on the basis of exact numbers and are rounded only after calculation.
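As described above, the OECD average is an unweighted arithmetic mean over the countries with available data, which is why an average of differences need not equal the difference of two averages when coverage differs. A minimal illustration, using made-up scores rather than actual PISA estimates:

```python
def oecd_average(estimates):
    """Arithmetic mean over countries with available data (None = missing)."""
    values = [v for v in estimates.values() if v is not None]
    return sum(values) / len(values)

# Made-up scores for two categories; country C is missing in category B.
cat_a = {"A": 500, "B": 480, "C": 520}
cat_b = {"A": 490, "B": 470, "C": None}

# The average of per-country differences uses only countries with both values...
diffs = {k: cat_a[k] - cat_b[k] for k in cat_a if cat_b[k] is not None}
avg_diff = oecd_average(diffs)  # (10 + 10) / 2 = 10.0

# ...so it need not equal the difference between the two category averages.
print(oecd_average(cat_a) - oecd_average(cat_b))  # 500.0 - 480.0 = 20.0
print(avg_diff)                                   # 10.0
```

Here country C's high score inflates the category A average but drops out of category B entirely, producing the 20-versus-10 discrepancy the text warns about.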
    • reader’s guide All standard errors in this publication have been rounded to one or two decimal places. Where the value 0.00 is shown, this does not imply that the standard error is zero, but that it is smaller than 0.005. Reporting student data the report uses “15-year-olds” as shorthand for the PISA target population. PISA covers students who are aged between 15 years 3 months and 16 years 2 months at the time of assessment and who have completed at least 6 years of formal schooling, regardless of the type of institution in which they are enrolled, whether they are in full-time or part-time education, whether they attend academic or vocational programmes, and whether they attend public or private schools or foreign schools within the country. Reporting school data the principals of the schools in which students were assessed provided information on their schools’ characteristics by completing a school questionnaire. Where responses from school principals are presented in this publication, they are weighted so that they are proportionate to the number of 15-year-olds enrolled in the school. Focusing on statistically significant differences this volume discusses only statistically significant differences or changes. these are denoted in darker colours in figures and in bold font in tables. See Annex A3 for further information. Categorising student performance this report uses a shorthand to describe students’ levels of proficiency in the subjects assessed by PISA: top performers are those students proficient at levels 5 or 6 of the assessment. strong performers are those students proficient at level 4 of the assessment. moderate performers are those students proficient at level 2 or 3 of the assessment. lowest performers are those students proficient below level 2 of the assessment. Abbreviations used in this report Corr. Correlation dif. 
difference eSCS PISA index of economic, social and cultural status GdP Gross domestic product ISCed International Standard Classification of education PPP Purchasing power parity Further documentation For further information on the PISA assessment instruments and the methods used in PISA, see the PISA 2009 Technical report (oeCd, forthcoming) and the PISA website (www.pisa.oecd.org). this report uses the oeCd’s Statlinks service. Below each table and chart is a url leading to a corresponding excel workbook containing the underlying data. these urls are stable and will remain unchanged over time. In addition, readers of the e-books will be able to click directly on these links and the workbook will open in a separate window, if their Internet browser is open and running.30 © OECD 2011 PISA 2009 ReSultS: StudentS on lIne – Volume VI
1. Context of the PISA Digital Reading Assessment

Computer use has grown exponentially since the invention of the microcomputer three decades ago; as of mid-2010, almost one-third of the world's population used the Internet. Digital technologies have changed the ways texts are produced and displayed, and those changes have had an impact on how students read. This chapter focuses on how new kinds of texts have transformed reading.
    • 1 Context of the PISA dIgItAl reAdIng ASSeSSment Since the invention of the microcomputer some 30 years ago, the number of computers in use worldwide has been growing at an exponential rate. By mid-2010, it was estimated that almost two billion people, or 29% of the world population, were using the Internet, with percentages ranging from 77% in north America to about 11% in Africa (miniwatts marketing Group, 2010). on average in oeCd countries in June 2010, around 25% of the population had a subscription for fixed-line broadband (oeCd Broadband Portal : www.oecd.org/sti/ict/broadband). the past decade has also seen the explosion of mobile technologies, with laptops, digital pads, smart phones and other portable digital devices being sold in increasingly large numbers. only around 8% of the global population is connected to fixed-line broadband, but mobile broadband connection is estimated at 14%, pointing to the growing importance of mobile Internet access in non-oeCd countries (Itu Statistics: www.itu.int/ict/statistics). Information and communication devices based on digital technologies are used in a wide range of contexts and for many different purposes. their most important common characteristic is that they all permit the display and perusal of text. Indeed, most applications of computer technologies, including videogames, involve some type of textual information. As a result, whatever their purposes, tasks or goals, users of computers and networked digital technologies are compelled to read digital texts. moreover, digital technologies deeply affect the shape, content and life-cycle of texts and, consequently, the very nature of reading. It is important for governments and societies to understand these changes as they have begun to affect, in turn, almost every aspect of life in society, including government, education, work, commerce and civic life. 
To cite just a few examples: more and more taxpayers fill in online forms; students search the web for information; jobseekers look up ads on employment websites; consumers order goods in online stores; and people build and maintain social communities on line. All these activities, and many others, require the production, dissemination and reading of some type of text.

This chapter begins with a review of the impact of digital technologies on the production and display of text. The potential consequences of these changes for defining reading skills and reading literacy are then discussed, stressing a number of features and processes that are characteristic of digital reading, and listing a number of important questions that are addressed in the PISA 2009 digital reading assessment. This chapter is not concerned with an analysis of how digital texts may affect instruction, such as lesson-based teaching and learning strategies, or social networking. The focus is on the act of reading and how reading is transformed by new forms of texts and textual devices. For more extended discussions of this and related topics, see Coiro et al., 2008; Dillon, 2004; Mayer, 2005; and Rouet, 2006.

New technologies for text, new ways of reading

From the invention of the cathode ray tube to the latest mobile communication devices, the advent of digital technologies has had a profound impact on the design, production, dissemination and uses of text. From a linguistic standpoint, a text is usually defined as a passage forming a "unified whole" (Halliday and Hasan, 1976). Linguists agree that textual "unity" is not conferred through strict criteria of length or grammatical rules, but rather through the communication act that the text fulfils. Texts originate from a source and are intended for an audience. They are meant to perform a specific communicative act, for instance, to tell, describe, explain or persuade.
the extent to which sets of linguistic utterances can indeed perform those acts depends on their compliance with a set of principles or “standards of textuality” (de Beaugrande and dressler, 1981). For instance, texts can only communicate effectively to the extent that they are coherent, cohesive, informative, relevant and acceptable. the general principles that define textuality are arguably similar across media. However, printed and digital technologies each possess some unique features that result in important differences in the way texts are produced, displayed, organised and connected to other texts. Furthermore, whereas printed texts have a relative permanence, digital texts are potentially dynamic and can be constantly completed, edited and updated. these differences have consequences for the access, comprehension and uses of text in a wide variety of situations, ranging from education to work to personal and civic purposes. It is therefore crucial to understand and assess the new forms of reading literacy that come with the practice of reading on digital displays (Coiro, 2009). Although digital text is often associated with microcomputing, information societies are replete with devices that display digital texts, without the reader having to manipulate a computer. examples include videoprojected slides used during conferences, electronic advertisements or public communication signs, information booths in railway32 © OECD 2011 PISA 2009 ReSultS: StudentS on lIne – Volume VI
stations, shopping centres and airports, but also the displays of iPods, mobile phones, digital pads and many more. Throughout the past decade, the list of these new devices has been continually expanded and updated.

The growing practice of displaying text digitally is having a deep impact on the shape and contents of the texts themselves. Digital texts differ from printed texts in readability and usability, and also in the social and economic processes that drive the creation, dissemination and multi-dimensional uses of text.

Differences in the readability and usability of text
Superficially, texts displayed digitally may seem very similar to those that are printed on paper. They use the same basic sign systems (for example, the Roman alphabet or Japanese Kanji, and punctuation marks), the same syntax and, to some extent, the same rules for composing passages and signalling structure (margins, paragraphs, headings and so forth). However, a closer examination reveals important differences. One prominent difference is the physical size of the display area or "page". A 15-inch computer screen is about the physical size of an A4 or US letter page, which is smaller than printed newspapers, catalogues or supermarket flyers. And in recent years, electronic gadgets with much smaller displays, such as digital pads and smartphones, have become increasingly popular.

In addition, the combination of smaller size and poorer display quality means that the reader of digital text must generally cope with reduced readability and piecemeal presentation of information. A simple illustration is provided in Figure VI.1.1, which shows the amount of text featured on a printed and a digital page of a newspaper. The excerpt of the printed page roughly corresponds to the display size of the web page.
• Figure VI.1.1 • Comparison of print and digital texts
[Side-by-side reproduction: a press clipping and a screen grab from www.theage.com.au of the story "Taking the Road to Greatness", by Megan Backhouse / Fairfax Media publication]

However, digital texts should not be regarded as mere impoverished versions of printed texts. Digital technologies are constantly being improved and may eventually be comparable to high-quality printing technologies. In addition, designers of digital documents have created new publishing standards to cope with the limitations inherent in the digital medium (consider, for instance, the increasingly popular web-based applications tailored to small screens). Digital technologies have also introduced new ways to represent and organise information, some of which result in clear benefits for the reader compared to printed texts.
    • 1 Context of the PISA dIgItAl reAdIng ASSeSSment New features of digital texts From static pages to dynamic windows and frames digital texts provide new ways for the reader to move within and across pages of text. Some of these have to do with the limitations of digital displays reviewed above; others are original inventions that have brought readers new ways of accessing and navigating through texts. In order to fully appreciate the impact of these new devices on digital reading literacy, one must keep in mind a few essential differences between printed and digital text in terms of page composition and arrangement into volumes. In printed texts, the content is intrinsically connected to the physical artefact. A passage of text exists both as a verbal message and as a concrete artefact: the page, the chapter, or the volume. Printed texts can and must be stored and indexed, like any collections of material objects – hence, since the 16th century at least, the use of numbering systems to order books in libraries and page numbers in books (Platteaux, 2008). In both cases, the number always represents the serial position of the item in the respective set. As a consequence, tables of contents and indexes have emerged as universal cataloguing techniques for printed artefacts. In digital texts, however, the physical storage of the information is independent of its organisation as it appears to the reader. Pages of digital texts are also independent from the particular display that is used to visualise them. For example, one can view a particular web page using a 21-inch desktop monitor, a 15-inch laptop or a smartphone. most often, the pages are larger than the actual display screen or window. this is a major difference from printed text in which the text frame is most often equal to the physical page, and sometimes smaller, such as in newspaper pages. 
Because of the virtual nature of page contents and formats, designers have had to replace page composing and numbering with other indexing and retrieval techniques. these techniques have been continually revised over the past two decades, and navigation devices are continuously updated in new versions of web browsers. to cite just one example, the “new tab” function appeared after 2000, even though these devices did not require any advanced technology. the reason why older versions of browsers did not include this and other useful features is unclear, but it may be that the excitement raised by multiple-window operating systems in the early 1990s overshadowed for a while the serious usability issues that came along with reading on line. digital texts come with devices that let the reader navigate within and across pages of digital texts. In the past decade, common devices used to navigate digital pages were the vertical and horizontal scroll bars, index tabs and expandable menu frames. none of these devices has ever had any meaning in the world of printed text. their mastery and use is a component of the so-called “new literacies” (Coiro, et al., 2008) typical of the electronic age. From linear arrangement to networking and hyperlinking even more dramatic differences between printed and digital displays can be found at the level of multitext compounds, such as electronic books or websites. designers of digital documents have created various techniques to represent the contents of those compounds and to let the reader move from page to page. one of the earliest indexing techniques used in digital documents is the menu, or list of page headings, from which the reader is invited to make a choice. the digital menu resembles a table of contents except that there are usually no page numbers. 
Instead, the reader selects an option by clicking directly on the item or a symbol that represents it, which results in the display of the selected page instead of or on top of the menu page (that is, in a new window or tab). Since there are no page numbers, however, once the page is displayed, the reader has no direct clue about its position among the set that makes up the electronic book. Such clues have to be provided indirectly through analogical symbols (for example, a micropage within a series of micropages at the bottom of the screen) or through path-type expressions, such as “Habitats – marine – open waters – mediterranean open waters – Common skate” (example adapted from nilsson and mayer, 2002). menus can be made hierarchical, which means that selecting a menu item causes another, more specific, menu to be displayed. Alternatively they may be presented as separate pages, or as part of multitext pages. In the context of web pages, menus are more and more frequently presented in a frame at the top or to the left of the display window. the rest of the window can be updated with the menu remaining constant, which can help the reader to keep a sense of his or her location in the document set.34 © OECD 2011 PISA 2009 ReSultS: StudentS on lIne – Volume VI
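The hierarchical menus and path-type location cues described above can be pictured as a tree of pages, with the "path" being the chain of headings from the root down to the currently displayed page. A minimal sketch follows; the category names echo the adapted example from Nilsson and Mayer (2002), but the data structure and function are purely illustrative.

```python
# A hypothetical menu tree: each node maps a heading to its sub-menu.
menu = {
    "Habitats": {
        "Marine": {
            "Open waters": {
                "Mediterranean open waters": {"Common skate": {}},
            },
        },
        "Freshwater": {},
    },
}

def path_to(tree, target, trail=()):
    """Depth-first search for a page heading; returns the breadcrumb trail."""
    for heading, subtree in tree.items():
        current = trail + (heading,)
        if heading == target:
            return current
        found = path_to(subtree, target, current)
        if found:
            return found
    return None

print(" – ".join(path_to(menu, "Common skate")))
# Habitats – Marine – Open waters – Mediterranean open waters – Common skate
```

A menu frame that stays visible while the content window updates is, in effect, showing the reader which branch of such a tree he or she is currently in.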
    • 1 ConTexT oF The pisa digiTal reading assessmenTthe issue of designing effective menu systems for digital information systems has been revived lately with the adventof mobile devices that can display vast amounts of multimedia information (see, for example, St Amant, et al., 2007).other active areas of research and development are the design of “hands-free” menu systems guided by eyemovements or speech.one of the most distinctive features of digital texts is the hypertext link, a technique that appeared in the 1980sas a means of connecting pages of information in large electronic documents (Koved and Shneiderman, 1986).the hypertext link or hyperlink is a piece of information (usually a word or a phrase) that is logically connected toanother piece of information (usually a page). Clicking a hyperlink results in the display of a new page instead of oron top of the page previously displayed.Hyperlinks may be presented in separate lists (also called menus) or embedded within content pages. Whenembedded, hyperlinks are generally marked using a specific colour or typography.the use of hyperlinks allows for the creation of multipage documents with a networked structure. unlike lists orhierarchies, the arrangement of pages in a networked structure is not systematic. Rather, it follows the semanticrelationships across pages. It is up to the author of a multipage digital document to link a page with another pageby inserting a hyperlink.the hyperlink has contributed to the popularisation of digital documents (hypertexts) whose overall organisation isunlike that of traditional documents. In some early studies, hypertexts were praised as a means to “free” the readerfrom the supposedly cumbersome constraints of linear texts. But scientific studies of hypertext reading have foundthat network-like document organisation frequently results in disorientation and cognitive overload (Conklin, 1987;Rouet and levonen, 1996). 
navigation and orientation within nonlinear structures seems to rely on the reader’sability to mentally represent the top-level structure of the hypertext. Global organisers that accurately representthe overall structure of the information space made up by the hypertext document, such as structured menus andcontent maps, are usually of some help, provided that such organisers use symbols and metaphors that are alreadyfamiliar to the reader (Rouet and Potelle, 2005).In summary, skilled reading, navigation and information search in digital texts requires the reader to be familiar withexplicit and embedded hyperlinks, nonlinear page structures, and global content representation devices and tools.empirical evidence so far indicates that navigating digital texts is far from trivial, and may pose some challenges tocertain categories of users, such as the elderly (lin, 2004).From illustrated text to multimedia and augmented realitydigital technologies have also introduced new ways of integrating verbal texts with other forms of representation.online pictures and graphics can be clicked on to reveal descriptions and comments. text can also be integratedwith animated pictures, graphics and even video materials. Augmented reality allows one to integrate an actualenvironment (say, a Renaissance castle) with explanations and comments presented on a digital device. 
At the timeof writing (January 2011), the use of multimedia presentations on fixed and mobile digital devices was booming, andwas assisting individuals in moving around city streets, visiting museums and exhibitions, and learning professionalskills in domains ranging from mechanics to surgery.these innovations were still too marginal to be incorporated in the 2009 edition of the PISA digital readingassessment, but they will progressively be integrated in future PISA assessments.From authored texts to online discussion and social networksAnother prominent feature of digital texts is the shift from so-called authored texts to message-based discussionforums, social networks and Web 2.0. the spread of the Internet, combined with the interactivity of electronicdisplays, have made it possible to create new forms of communication that lie between traditional written textsand spoken conversations. Receiving and sending e-mail or short text messages, participating in discussion groupsor engaging in social relationships through the web is becoming more and more common (Pew Research Center,2010a). these activities require a mastery of reading comprehension and written skills, even though the genresand forms of texts that are involved appear relatively new. Research on the impacts of these new forms of textualcommunication on skill acquisition is warranted. (For a recent review of the state of the art, see Kemp, 2011;light, 2011 and, in particular, Coe and oakhill, 2011.) PISA 2009 ReSultS: StudentS on lIne – Volume VI © OECD 2011 35
    • 1 Context of the PISA dIgItAl reAdIng ASSeSSment ImPACt of dIgItAl textS on reAdIng lIterACy this section outlines the new literacy demands and opportunities that are associated with digital texts. (For more extended reviews, see Britt and Gabrys, 2000; Coiro, et al., 2008; Kemp, 2011; Reinking, 1994; Rouet, 2006; Warschauer, 1999.) Some types of reading are still mostly done using printed materials, while others are specific to the electronic medium. For instance, even experienced computer users read novels and extended informational texts on paper (see study of medical school students printing, martin and Platt, 2001). on the other hand, the activity of reading search engine lists is almost exclusive to reading on line, as is reading a personal blog (a genre that seems to have been born with the new millennium: Blood, 2000) or the comprehension of an online job-application form. thus, digital reading cannot always be strictly compared to print reading. this is, in fact, the best evidence in support of the design of a new framework and new assessment procedures for digital reading. However, a wide range of reading activities can be performed using both types of texts. Popular examples include reading news, informational texts, texts with a practical purpose such as buying goods or getting directions. However, because the digital versions of these texts differ – sometimes dramatically – from their printed counterparts, it is useful to consider how they affect reading skills and reading literacy. A powerful illustration of this is found in the area of literacy-assessment research itself, where so-called test-mode effects have been found with computerised versions of tests, resulting in better or worse performance than when printed versions are used (Clariana and Wallace, 2002). Which aspects of reading are affected by digital text? 
Independent of the particular reading situation or purpose, there is a need to identify those components of reading literacy that are relatively preserved and those that are the most affected by digital texts. Low-level processes such as word identification or syntactic parsing are presumably very similar in printed and digital reading, aside from the general surface readability issues discussed in the previous section. The processes involved in building a mental representation of the text, such as identifying the referents of anaphoric expressions or maintaining coherence locally and globally, would also appear to be relatively unaffected. These processes may simply be hindered in the case of lengthy texts displayed on line, because the reader will have more trouble referring to a previously read section (for a discussion see Foltz, 1996). Differences between print and digital reading are more apparent when considering macro-aspects of reading, such as accessing texts of interest, integrating information across texts, or evaluating texts for quality and credibility.

Access to text

Printed texts require the reader to locate a material artefact, and to use categorisation schemes and organisers to locate information of interest within that artefact. Digital texts require the reader to search for phrases, scan heterogeneous links, and use navigation devices. The latter procedures call upon the reader's ability to generate vocabulary, assess the relevance of verbal expressions (and disregard distractors), and understand the hierarchical structuring of information in menu trees. The skilled reader of digital texts must be familiar with the use of navigation devices and tools. He or she must also be able to mentally represent the movement of the window over the text page, so as to be able to move in the correct direction. This includes an ability to overcome apparent discrepancies, for example the fact that the downward-pointing arrow on the scroll bar actually moves the text upwards.
As early as 1989, Foss noted that some users tended to get lost in the maze of windows that ended up covering each other on their computer screen; early human-factors experiments often concluded that just two side-by-side windows seemed to be a good compromise for most readers (Wiley, 2001; Wright, 1993). The opening, layout and closing of multiple windows is arguably a skill in itself. There is indeed some evidence that reading complex digital texts relies on visuo-spatial abilities as much as on language-processing abilities (Pazzaglia et al., 2008; see also Naumann et al., 2008).

Integration across texts

Integration, defined as comparing and relating different pieces of texts, calls upon similar processes, whatever the medium. However, because digital texts do not follow any stable categorisation scheme, and because the digital medium makes it so easy to cross-reference texts, readers are much more likely to find themselves jumping across
different texts within a single reading episode. Furthermore, the web offers readers the possibility of compiling a large number of different sources on any given topic. Therefore, the accumulation of information across multiple passages is becoming typical of the sustained reading of digital texts. Integration across texts requires sophisticated reading skills and strategies, which are not spontaneously mastered by young readers (Britt and Rouet, forthcoming). Even though these skills are not specific to digital reading, they may explain a significant portion of readers' digital reading proficiency.

Evaluation of text

Readers of web-based documents are faced with a wide array of materials, given the open, unregulated nature of web publishing. Current retrieval systems are mostly based on the semantic match between the query and the contents, regardless of any indication of genre, accuracy, authority or trustworthiness. It is up to the reader to find out not just what the text is about, but also who wrote it, who published it, when, for what purpose and with what potential biases. In the printed world, a range of perceptual and contextual cues (what the text looks like and where it is found), as well as the presence of human mediators (for example, the librarian, the bookseller, the critic), often facilitates these attributions.
On the web, however, most of these cues and mediations are missing, and the reader has to resort to deeper levels of reasoning to evaluate the quality of the text (Britt and Gabrys, 2000). There is mounting evidence that evaluating web information is indeed a difficult aspect of digital reading for most teenagers, even though they rely more and more on the web to acquire new information about subjects of interest (Dinet et al., 2003; Darroch et al., 2005; Kuiper et al., 2005).

SOME ISSUES FOR ASSESSING DIGITAL READING

The PISA digital reading assessment addresses a number of important issues that arise from the differences between print and digital reading outlined above.

First, it considers whether print and digital reading belong to the same construct. The PISA 2009 reading framework (OECD, 2009b) points out that, while many of the skills required for print and digital reading are similar, digital reading demands some new emphases and strategies to be added to the reader's repertoire. "Gathering information on the Internet requires skimming and scanning through large amounts of material and immediately evaluating its credibility. Critical thinking, therefore, has become more important than ever in reading literacy" (Halpern, 1989; Shetzer and Warschauer, 2000; Warschauer, 1999). It is important to find out which specific dimensions of tasks and students' characteristics explain students' proficiency in digital reading, accounting for print reading proficiency. Data from the digital reading assessment will allow for investigating whether the specific features of digital texts, such as nonlinearity, navigation, intertextuality, and uncertainty regarding the quality of information, explain a specific share of the variance in student performance.
Some of these issues are beyond the scope of this report, but the characteristics of the tasks and students' navigation behaviour are the subjects of Chapters 2 and 3, respectively, of this volume.

The results of the digital reading assessment also make it possible to explore the extent to which a student's social, cultural and economic background is associated with proficiency in digital reading. These associations are explored in Chapter 4, as is the relationship of digital reading proficiency with malleable characteristics, such as students' engagement in print and digital reading activities and their awareness of reading strategies.

Over the past ten years, there has been a discussion as to whether the people who have been exposed to information technology from a young age, so-called "digital natives", might readily possess the skills and abilities required to make use of digital devices, compared to older people, the so-called "digital immigrants" (Prensky, 2001). There is mounting evidence that mere exposure to technology is not sufficient for becoming a skilled user. As time elapses, the gap in technology use between generations is progressively decreasing. The Pew Research Center (2010b) has found that even though "millennials" (people who were between 5 and 20 years old at the turn of the 21st century) are more likely than older generations to use mobile digital devices and social networks, they are no longer dominant in other types of digital activities, such as looking up government websites or financial information. Of particular interest is an investigation of prior exposure to and familiarity with digital technologies, and the extent to which they explain students' proficiency in digital reading tasks. Results of the information and communication technologies (ICT) familiarity survey, an international option in PISA 2009 implemented in 45 countries, are provided in Chapter 5 of this report.
Chapter 6 presents an analysis of the relationship between digital reading proficiency and ICT familiarity and use, mainly for the 17 countries that participated in both options in PISA 2009.
Chapter 7 expands this theme by presenting an analysis of the combined influence on digital reading proficiency of a range of variables, including print reading proficiency, gender, online and print reading engagement, reading strategies and selected socio-cultural variables, as well as ICT experience.

Access to technology is necessary but certainly not sufficient in itself to acquire digital reading literacy. As noted by Warschauer (1999), overcoming the "digital divide" is not only a matter of developing access to online technology, but also of enhancing people's abilities to access and make use of information through electronic devices. Indeed, recent studies show a wide range of proficiency levels among groups of "digital natives" (Kennedy et al., 2008). A growing number of experts call for "a more nuanced understanding of students' technology experiences", to use the words of Bennett and Maton (2010).

CONCLUSIONS

The advent of information and communication technologies has sparked a revolution in the design and dissemination of texts. Online reading is becoming increasingly important in information societies. Even though the core principles of textuality and the core processes of reading and understanding text are similar across media, there are good reasons to believe that the specific features of digital texts call for specific text-processing skills. The PISA 2009 digital reading assessment was designed to investigate students' proficiency at tasks that require the access, comprehension, evaluation and integration of digital texts across a wide range of reading contexts and tasks. The rest of this report presents the results of this first attempt to obtain a large-scale picture of digital reading skills among today's 15-year-olds.
2. Student Performance in Digital and Print Reading

This chapter examines the particular features of digital texts and analyses how well students can read those texts. It also discusses the similarities and differences between print and digital reading, and compares the results of the two reading assessments by merging them into a single scale. Results presented throughout the chapter are also analysed by gender.
What does it mean to be a proficient reader in the digital medium? This chapter examines how well students around the world can read digital texts and whether there are any differences between boys and girls as digital readers. It also discusses the relationship between digital and print reading and presents a comparison of results among the 19 countries that participated in both digital and print reading assessments in PISA 2009. The chapter concludes by presenting countries' results in the two assessments merged into a single reading scale, and analyses these results further by gender.

DIGITAL READING

PISA defines reading literacy as understanding, using, reflecting on and engaging with written texts, in order to achieve one's goals, develop one's knowledge and potential, and participate in society. This broad definition refers to the texts that we read, the processes of reading and the purposes for which we read. It is as applicable to digital reading as it is to print reading. This section describes the main features of the reading framework as it relates to digital reading, and the way in which those features have been operationalised in the 2009 digital reading assessment.

Texts

Digital texts are conceived of as a subset of written texts. For the purposes of PISA 2009, digital text is synonymous with hypertext: a text or texts with navigation tools and features that allow the reader to move from one page or site to another. They are texts composed predominantly of language rendered in a graphic form. While non-verbal graphic elements, such as illustrations, photographs, icons and animations can, and typically do, constitute part of a digital text in PISA, oral language, such as an audio recording or the soundtrack of a film, is not included in this definition of text. Many kinds of hypertexts were included in PISA 2009 in order to represent the digital medium as fully as possible.
The characteristics of digital texts in PISA are specified in terms of environment, format and type, and navigation tools and features.

The environment variable comprises two categories: authored and message-based. Authored texts are those with which readers are expected to engage receptively. Message-based texts are those with which readers are invited to interact. A small number of tasks that require reading both authored and message-based texts with equal attention are categorised as mixed. Figure VI.2.1 shows the distribution, by environment, of all tasks in the 2009 digital reading assessment, and examples of each category are provided in the coloured section later in this chapter.

• Figure VI.2.1 •
Digital reading tasks by environment

Environment     % of tasks   Sample tasks
Authored        66%          IWANTTOHELP – task 3; SMELL – tasks 1, 2 and 3; JOB SEARCH – tasks 1 and 3
Message-based   28%          IWANTTOHELP – tasks 1 and 2; JOB SEARCH – task 2
Mixed           6%           IWANTTOHELP – task 4

Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435378

In order to approximate the experience of reading message-based texts, some of the tasks based on these texts require the test-takers to respond as if interacting with the text, for example by "replying" to an e-mail message (see the sample IWANTTOHELP task 4).

The second text characteristic defined for digital reading in PISA is text format, which comprises four categories: continuous, non-continuous, mixed and multiple. Figure VI.2.2 shows the distribution, by text format, of all tasks in the 2009 digital reading assessment. Examples of each category are provided in the coloured section later in this chapter.
• Figure VI.2.2 •
Digital reading tasks by text format

Text format      % of tasks   Sample tasks
Continuous       7%           IWANTTOHELP – task 1
Non-continuous   10%          JOB SEARCH – task 1
Mixed            7%           JOB SEARCH – task 3
Multiple         76%          IWANTTOHELP – tasks 2, 3 and 4; SMELL – tasks 1, 2 and 3; JOB SEARCH – task 2

Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435378

Given the assessment's intention to represent the experience of navigating across multiple pages and sites that is typical of digital reading, the weighting towards multiple texts is strong, with over three-quarters of the tasks in that category. Only tasks that focus on a single digital page are classified as continuous, non-continuous or mixed. Nevertheless, many of the tasks classified as multiple are based on sets of continuous, non-continuous and mixed format material.

The third text classification is text type, which has six categories: argumentation, description, exposition, instruction, narration and transaction. Four of the six are represented in the digital assessment: argumentation, description, exposition and transaction. While narrative texts were sought for the assessment, no suitable material of an appropriate length and quality was found; the test development phase for PISA 2009 pre-dated the rise of e-books. Instructional texts are also absent from the PISA 2009 assessment – a matter of space limitations rather than deliberate exclusion. Figure VI.2.3 shows the distribution by text type of all tasks in the 2009 digital reading assessment. Examples of tasks representing three of the categories are found in the coloured section later in this chapter.
• Figure VI.2.3 •
Digital reading tasks by text type

Text type       % of tasks   Sample tasks
Argumentation   21%          IWANTTOHELP – task 3
Description     31%          IWANTTOHELP – tasks 1 and 2; JOB SEARCH – tasks 1, 2 and 3
Exposition      31%          SMELL – tasks 1, 2 and 3
Transaction     14%          –
Mixed           3%           IWANTTOHELP – task 4

Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435378

The text type of one of the sample tasks, IWANTTOHELP task 4, is classified as mixed because, while the end point is a response to a transactional text (an e-mail), the text that the reader needs to consult also includes substantial pieces of both argumentation and description.

Important distinguishing characteristics of digital texts are the navigation tools and features that help readers to negotiate their way into, around and across texts. While there are parallels in the print medium, such as tables of contents, headings and page numbers, many navigation tools and features are unique to the digital medium, and they are indeed part of the definition of hypertext.

Some navigation tools and features allow the reader to move the reading window over the text page – using scroll bars, buttons, index tabs and so forth – so that the whole of the digital page can be viewed, even though only part of it is visible at any one time. Other tools and features, such as hyperlinks and menus, allow the reader to move from one page or site to another, or – in the case of pop-ups – to call up additional, superimposed information.
A third type of navigation feature is global organisers, such as structured menus and content maps, which represent the relational structure of pages and links. They are used to help orient the reader to what is available on a site beyond the visible page, allowing readers to gauge the full scope of a text.

Digital reading requires familiarity with explicit and embedded hyperlinks, non-sequential page structures and global content representation devices. Consequently, in the PISA 2009 digital reading assessment a range of navigation tools and structures is included as one important component in measuring proficiency in digital reading. The tools and features include: scroll bars for moving up and down a page; tabs for different websites; lists of hyperlinks displayed in a row, in a column or as a drop-down menu; embedded hyperlinks – that is, hyperlinks included in paragraphs, tables of information or a list of search results; and site maps.

Cognitive processes

Aspects

The definition of reading in PISA includes the words understanding, using and reflecting (see Chapter 1 of this volume and PISA 2009 Results: What Students Know and Can Do: Student Performance in Reading, Mathematics and Science [Volume I]). These are the cognitive skills involved in processing texts and they are at the heart of both digital and print reading. In the PISA reading framework and in the tasks built to reflect the framework, these terms are further defined in relation to three aspects: access and retrieve, integrate and interpret, and reflect and evaluate. A fourth aspect category, complex, has been added specifically to accommodate those digital reading tasks that involve multiple demands. Figure VI.2.4 shows the distribution, by aspect, of all tasks in the 2009 digital reading assessment, and indicates the examples of tasks provided later in this chapter.
A little over one-third of all the tasks are categorised as integrate and interpret, with the rest spread fairly evenly across the other three categories.

• Figure VI.2.4 •
Digital reading tasks by aspect

Aspect                    % of tasks   Sample tasks
Access and retrieve       24%          IWANTTOHELP – tasks 1 and 2
Integrate and interpret   35%          IWANTTOHELP – task 3; SMELL – tasks 1 and 3; JOB SEARCH – task 2
Reflect and evaluate      21%          SMELL – task 2; JOB SEARCH – tasks 1 and 3
Complex                   21%          IWANTTOHELP – task 4

Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435378

Text processing and navigation

In the digital medium, the cognitive processes of accessing, retrieving, interpreting, integrating, reflecting and evaluating are called upon for both text processing and navigation.

Text processing in the digital medium is in many ways similar to the constellation of skills and strategies typically associated with print reading. Confronted with a chunk of digital text, the reader may need to locate key pieces of information, interpret nuances of language, integrate different elements of the text, draw on prior knowledge of textual and linguistic structures and features, make judgements about the cogency of an argument or the appropriateness of the style, and reflect on the relationship between the content and his or her own experience or knowledge of the world.

Navigation involves moving around the digital medium to access the information that is needed. A set of cognitive skills parallel to those required for text processing is drawn upon – though the structures and features that need to be negotiated are different, and therefore the kinds of mental activities required also vary. Typically, in navigating the digital medium there is a strong emphasis on predicting, and on evaluating and integrating information.
Accessing and retrieving information may require traversing several pages or sites, predicting the likely content of a series of unseen screens, based on visible text information, in order to efficiently locate the required information.
Readers integrating and interpreting in the digital medium use the traditional repertoire of constructing meaning from continuous and non-continuous texts, but their task is often complicated by the fact that the relevant text is not immediately visible in its entirety. Readers need to make decisions about which links and menus to use to access material from different pages within the same website, or they may need to use tabs to view and compare information from different websites. The reader needs to navigate to survey what is available, to compare, contrast and filter the material, and to synthesise information. Predicting what is relevant and appropriate to a search requires the reader to reflect and evaluate, as does deciding on the authority, relevance and utility of a text, once it is accessed.

Navigation as described here is part of the cognitive process of digital reading, not merely a set of technical manoeuvres such as clicking on links or scrolling. However, because navigation is manifested in behaviours like these, in a way that is mostly unobservable during print reading (other than through page-turning or through laboratory techniques, such as eye- or brain-scanning), it offers new opportunities for insights into the cognitive processes of reading. Some of these opportunities are explored in Chapter 3 of this volume.

Both navigation and text-processing skills are required to complete most digital reading tasks. Some tasks place more emphasis on navigation and others on text processing. The relationship between the two skills in the tasks included in the PISA 2009 digital reading assessment, based on the judgement of expert raters, is represented in Figure VI.2.5.
The horizontal axis represents the cognitive load that comes from processing the text, while the vertical axis represents the cognitive load that comes from the navigation required to successfully complete the task. Each task is represented by one plot (or, in the case of tasks with both full- and partial-credit scoring, by two plots). The position of the plot indicates the relative contribution of text processing and navigation to the task. The data points for the tasks described in the coloured section in this chapter are numbered from 1 to 12.

• Figure VI.2.5 •
Relationship between text processing and navigation in digital reading tasks

[Scatter plot: the horizontal axis shows the text-processing demand of each task (0 to 4.5) and the vertical axis shows its navigation demand (0 to 4.5). The numbered points are:]

1   IWANTTOHELP – task 1
2   IWANTTOHELP – task 2
3   IWANTTOHELP – task 3
4   IWANTTOHELP – task 4 (partial credit)
5   IWANTTOHELP – task 4 (full credit)
6   SMELL – task 1
7   SMELL – task 2
8   SMELL – task 3
9   JOB SEARCH – task 1
10  JOB SEARCH – task 2 (partial credit)
11  JOB SEARCH – task 2 (full credit)
12  JOB SEARCH – task 3

Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435378

Tasks that require low levels of both skills – requiring little or no navigation and minimal text processing – appear at the bottom left corner of the graph, close to the origin. The task closest to this description among the sample tasks is IWANTTOHELP task 1. In this task, the required information is in a prominent position in a short text, and it is explicit. The page on which the information appears is presented to the reader at the beginning of the task; in other words, no navigation is required. Tasks that require high levels of both navigation and text processing appear in the top right corner of the graph: the further from the origin, the more complex the task.
    • 2 sTudenT perFormanCe in digiTal and prinT reading In between the two extreme cases, tasks represent different combinations of the two variables. A digital reading assessment might include tasks that require high levels of navigation, but low levels of text processing. these tasks would be represented in the top left corner of the graph. this kind of task might require the use of multiple strategies to navigate between web pages, such as the use of embedded links or drop-down menus, but involve web pages with little other text on them, therefore requiring a low level of text processing but high levels of navigation. no tasks in the 2009 digital reading text that had low levels of text processing combined with high demand in navigation; the closest to this description among the sample tasks is IWANTTOHELP task 4 (partial credit). to gain even partial credit for this task, readers need to negotiate several web pages, sometimes with explicit direction but also using text-based clues to predict which links will lead to relevant information. While the task demands that the reader traverse several pages of text, no more than superficial processing of any of the encountered texts is required for a partial credit score. tasks that require high levels of text processing, but low levels of navigation, appear at the bottom right of the graph. A task of this kind might involve, for example, dealing with a text that is dense or complex, therefore requiring a high level of text processing, but that is immediately visible to the reader in its entirety, thus requiring no navigation. the task closest to this description among the sample tasks is JOB SEARCH task 3. this task requires no navigation apart from scrolling on the presented page. the text itself is not particularly dense or complex; however, the task does require drawing inferences from the text and relating them to knowledge from beyond the text. 
Therefore, it depends more heavily on text processing than on navigation. It was considered necessary to include a small number of tasks of this kind because although they do not require the skills that are unique to digital reading, they do represent one kind of task that might be required in the real-life digital environment. If this kind of task were excluded, the differences between digital and print reading would be artificially inflated.

Ideally, an assessment of digital reading would show tasks distributed fairly evenly across the space defined in Figure VI.2.5. As the mapping shows, the actual distribution of tasks in PISA 2009 approaches this ideal.

Situation

Situation is used in PISA to classify texts and their associated tasks, and refers to the contexts and uses for which the author constructed the text. By sampling texts across a variety of situations the intent is to maximise the diversity of content included in the PISA reading literacy survey. Each set of stimuli is assigned to one of the four identified situations – educational, occupational, personal and public – according to the likely audience and purpose for which it is intended. Figure VI.2.6 shows the distribution, by situation, of all tasks in the 2009 digital reading assessment and indicates the situation category of the material provided in the coloured section later in this chapter.

• Figure VI.2.6 •
Digital reading tasks by situation

Situation      % of tasks   Sample tasks
Educational    10%          –
Occupational   24%          IWANTTOHELP; JOB SEARCH
Personal       21%          –
Public         45%          SMELL

Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435378

HOW THE PISA 2009 READING RESULTS ARE REPORTED

How the PISA 2009 digital reading tests were designed, analysed and scaled

The development of the PISA 2009 digital reading assessment was co-ordinated by a consortium of educational research institutions under the auspices of the OECD Secretariat, and under the guidance of a group of international reading experts, several of whom were included because of their research interest in digital reading. Consortium test-development centres and some participating countries submitted stimulus material and questions.
    • 2 sTudenT perFormanCe in digiTal and prinT readingthe material was refined iteratively over the three years leading up to the administration of the assessment in2009. the development process included several rounds of commentary from participating countries, as well aspiloting with small groups of 15-year-olds, and a formal field trial in which 15-year-olds from all of the countriesparticipating in this international option. the reading expert group recommended the final selection of tasks,which was made based on the technical quality of the tasks, assessed according to how they performed in thefield trial, and their cultural appropriateness and interest for 15-year-olds, as judged by the participating countries.the set of tasks also needed to represent the required framework balance, reflecting the various categories oftext, aspect and situation. In addition, the selection sought to ensure that tasks varied in their emphasis on textprocessing and navigation, and that they ranged widely in difficulty, allowing for an accurate assessment of all15-year-old students, from the least proficient to the most able in digital reading.twenty-nine digital reading tasks yielding 38 score points were used in PISA 2009, but each student in the samplesaw only some of these tasks because different sets of tasks were given to different students. the tasks were organisedinto three 20-minute clusters, with each sampled student administered two of the clusters. each student was thusgiven a forty-minute digital reading assessment, with an additional 10 minutes for orientation and practice questionsat the beginning of the testing session. 
The clusters were rotated in six forms so that each cluster was paired with the other two and appeared in both first and second position in the pairing.

This design makes it possible to construct a single scale of digital reading proficiency, in which each question is associated with a particular point on the scale that indicates its difficulty, and each student's performance is associated with a particular point on the same scale that indicates his or her estimated proficiency. A description of the modelling technique used to construct this scale can be found in the PISA 2009 Technical Report (OECD, forthcoming).

The relative difficulty of tasks in a test is estimated by considering the proportion of test-takers who answer each question correctly. The relative proficiency of students taking a particular test is estimated by considering the proportion of test questions they answer correctly. A single continuous scale shows the relationship between the difficulty of questions and the proficiency of students. By constructing a scale that shows the difficulty of each question, it is possible to locate the level of digital reading literacy that the question represents. By showing the proficiency of each student on the same scale, it is possible to describe the student's level of digital reading literacy.

• Figure VI.2.7 •
Relationship between questions and students on a proficiency scale

[Diagram: the digital reading scale, with Items I to VI ordered from relatively low to relatively high difficulty, and Students A, B and C placed at relatively high, moderate and relatively low proficiency, respectively.]
- Student A, with relatively high proficiency: it is expected that Student A will be able to complete Items I to V successfully, and probably Item VI as well.
- Student B, with moderate proficiency: it is expected that Student B will be able to complete Items I, II and III successfully, will have a lower probability of completing Item IV, and is unlikely to complete Items V and VI successfully.
- Student C, with relatively low proficiency: it is expected that Student C will be unable to complete Items II to VI successfully, and will also have a low probability of completing Item I successfully.

PISA 2009 Results: Students On Line – Volume VI © OECD 2011
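The item–student relationship that Figure VI.2.7 depicts can be sketched with a one-parameter logistic (Rasch) model, in which a single item difficulty and a single student proficiency on a shared scale determine the probability of success. This is a simplified illustration of the idea, not the actual PISA scaling model (which is described in the PISA 2009 Technical Report); the item and student values below are hypothetical logits, not PISA score points.

```python
import math

def p_correct(proficiency: float, difficulty: float) -> float:
    """Probability of answering an item correctly under a Rasch model:
    success depends only on the gap between student and item."""
    return 1.0 / (1.0 + math.exp(-(proficiency - difficulty)))

# Hypothetical values: Items I..VI ordered from easy to hard, Students A > B > C.
items = {"I": -2.0, "II": -1.2, "III": -0.4, "IV": 0.4, "V": 1.2, "VI": 2.0}
students = {"A": 2.5, "B": 0.0, "C": -2.5}

for name, theta in students.items():
    probs = {item: round(p_correct(theta, b), 2) for item, b in items.items()}
    print(f"Student {name}: {probs}")
```

A student located exactly at an item's difficulty has a 50% chance of success; the further the student sits above (or below) the item on the scale, the higher (or lower) the probability — the pattern described for Students A, B and C.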
Estimates of student proficiency reflect the kinds of tasks that students would be expected to perform successfully. This means that students are likely to be able to complete questions successfully at or below the difficulty level associated with their own position on the scale (but they may not always do so). Conversely, they are unlikely to be able to successfully complete questions above the difficulty level associated with their position on the scale (but they may sometimes do so). Figure VI.2.7 illustrates how this probabilistic model works.

How digital reading proficiency levels are defined in PISA 2009

PISA 2009 provides an overall scale drawing on all the questions in the digital reading assessment. The metric for the digital reading scale was set so that the mean and the standard deviation of the 16 equally weighted OECD countries that participated in the digital reading assessment are the same as those for the same group of countries' print reading mean and standard deviation. This mean was 499 score points, with a standard deviation of 90.

To help in interpreting what students' scores mean in substantive terms, the scale is divided into levels, based on a set of statistical principles, and then descriptions are generated, based on the tasks that are located within each level, to describe the kinds of skills and knowledge needed to successfully complete those tasks.

Given the relatively small number of items in the pool for PISA 2009, the range of difficulty of digital reading tasks allows for the description of four levels of reading proficiency: Level 2, Level 3, Level 4 and Level 5 or above. Below Level 2 there is a "place-holder" region of the scale, with too few items to support level descriptions. This area is called "Below Level 2". It is anticipated that items reflecting this low level of proficiency will be developed for future PISA surveys.
Similarly, tasks may be added to the top of the scale to allow for the description of a Level 6.

Students with a proficiency within the range of Level 2 are likely to be able to successfully complete tasks within that band of difficulty, but are unlikely to be able to complete tasks at higher levels. Students with scores within the range of Level 4 are likely to be able to successfully complete tasks located at that level and at the lower levels.

PISA applies a standard methodology for constructing proficiency scales. Based on a student's performance on the tasks in the test, his or her score is generated and located in a specific part of the scale, thus allowing the score to be associated with a defined proficiency level. The level at which the student's score is located is the highest level for which he or she would be expected to successfully answer most of a random selection of questions within the same level.

• Figure VI.2.8 •
Summary descriptions for four levels of proficiency in digital reading

Level 5 or above (lower score limit: 626; 7.8% of students, on OECD average, can perform tasks at this level or above)
Tasks at this level typically require the reader to locate, analyse and critically evaluate information, related to an unfamiliar context, in the presence of ambiguity. They require generating criteria to evaluate the text. Tasks may require navigation across multiple sites without explicit direction, and detailed interrogation of texts in a variety of formats.

Level 4 (lower score limit: 553; 30.3% of students at this level or above)
Tasks at this level may require the reader to evaluate information from several sources, navigating across several sites comprising texts in a variety of formats, and generating criteria for evaluation in relation to a familiar, personal or practical context. Other tasks at this level demand that the reader interpret complex information according to well-defined criteria in a scientific or technical context.

Level 3 (lower score limit: 480; 60.7% of students at this level or above)
Tasks at this level require that the reader integrate information, either by navigating across several sites to find well-defined target information, or by generating simple categories when the task is not explicitly stated. Where evaluation is called for, only the information that is most directly accessible or only part of the available information is required.

Level 2 (lower score limit: 407; 83.1% of students at this level or above)
Tasks at this level typically require the reader to locate and interpret information that is well-defined, usually relating to familiar contexts. They may require navigation across a limited number of sites and the application of web-based navigation tools such as drop-down menus, where explicit directions are provided or only low-level inference is called for. Tasks may require integrating information presented in different formats, recognising examples that fit clearly defined categories.

Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435378
Thus, for example, in an assessment composed of tasks spread uniformly across Level 4, students with a score located within this level would be expected to complete at least 50% of the tasks successfully. Because a level covers a range of difficulties and proficiencies, success rates across the band vary. Students near the bottom of the level would be likely to succeed in just over 50% of the tasks spread uniformly across the level, while students at the top of the level would be likely to succeed in well over 70% of the same tasks.

Figure VI.2.8 provides details of the nature of the skills, knowledge and understanding required at each level of the digital reading scale.

A profile of PISA reading questions

In order to establish reliable trends in PISA, a sufficient number of questions must be retained from year to year. Other questions are publicly released after the survey to illustrate how performance was measured. A selection of the released questions for the 2009 reading assessment is presented in the coloured section of this chapter to illustrate the framework characteristics and the levels of proficiency described above.

Four variables that influence the difficulty of digital reading tasks have been identified:

• Characteristics of text. This variable relates to the features of the texts that need to be processed to complete a task. Tasks based on texts with unfamiliar content in formal or technical language will, on average, be more difficult than tasks based on short texts with familiar, everyday content expressed in idiomatic language. The complexity of text structure, the vocabulary and the layout all influence the ease with which a text-based task can be completed. Moreover, the sheer quantity of text influences difficulty: the longer the text, and the more pages of digital text that must be consulted, the more difficult a task is likely to be.

• Complexity of navigation.
A digital reading task may focus on information that is immediately visible on the starting page of the task, it may require scrolling on that page, or it may require the reader to visit several pages or sites. Tasks become more difficult when the information needed to complete the task is not immediately visible. Complexity of navigation also depends on the quantity, prominence, consistency and familiarity of navigation tools and structures on the available pages. When moving between pages is required, the reader is likely to find the task more difficult if there are many hyperlinks or menu items to choose from than if there are only one or two. A task is made easier if there are prominently placed links in a conventional location on the screen; a task is more difficult if links are embedded in the text or are in an otherwise unconventional or inconspicuous location. Finally, the degree of direction in navigating influences task difficulty: even when the reader needs to consult several pages, explicit directions about the pages that must be visited and the navigation structures to use can make the task relatively easy.

• Explicitness of task demands. This variable relates to the specificity of direction in completing the task: how much the reader needs to infer the scope and substance of what is required for the response. Difficulty is influenced by the relationship between the task and the text that must be processed. If the question uses the same or similar terminology to that used in the text, the task will be easier than if the terms used are different. When the criteria for responding are not explicitly stated in the task, so that readers have to generate their own criteria, difficulty increases. In this context, task formats in which the student selects a response from a limited list, such as multiple-choice items, tend to be easier than those for which the student needs to construct the response.
(This variable does not reflect the specificity of guidance for navigation, which is accounted for in the complexity of navigation variable.)

• Nature of response. This variable relates to the kind of mental processing that the reader has to undertake to complete the task. Where the reader needs to generate concepts from within the text, rather than having them supplied, the task is likely to be more demanding. Where the reader needs to make a series of inferences, to evaluate and reflect, or to construct relationships, such as causation or contrast, among elements of the text, the task is typically more difficult than one in which processing the text only requires a simple transfer or basic identification of material. Further, a task that focuses on abstract concepts will be more difficult than one in which concrete information is the focus.

The difficulty of the digital reading tasks is varied by manipulating these four variables. Figure VI.2.9 shows an item map of the digital reading tasks that are presented later in this chapter. The 12 locations on the map represent the 10 tasks, with two of the tasks yielding two locations each because they have both full-credit and partial-credit scoring. The item map shows the score for each location, with a brief general description of the nature of the task. It also shows, for each location, difficulty ratings made by expert judges in relation to each of the four variables described above, on a scale of 1 to 4, with 1 designating the least demand and 4 the greatest.
• Figure VI.2.9 •
Map of selected digital reading questions in PISA 2009, illustrating the proficiency levels

Each entry gives the task (and score), the nature of the task, and the expert ratings, on a scale of 1 (least demand) to 4 (greatest), for the four variables: quality of text (T), complexity of navigation (N), explicitness of task demand (D) and nature of response (R).

Level 5 or above (lower score limit: 626)
- SMELL, Task 2 (657) [T 4, N 2, D 3.5, R 4]: Evaluate a web page in terms of credibility/trustworthiness of information after following an explicitly directed link from search results, generating own criteria for evaluation. Scroll to read the full text, which includes some specialised (scientific) language.

Level 4 (lower score limit: 553)
- JOB SEARCH, Task 2.2, full credit (624) [T 2, N 3.5, D 2, R 3]: Analyse a list of options in a descriptive text related to employment, using predefined criteria. Follow two links using explicit instructions, and scroll. Select four options from drop-down menus, combining prior knowledge with information integrated from a second page.
- SMELL, Task 1 (572) [T 3.5, N 2, D 3, R 3]: Distinguish between the main idea and subsidiary ideas in an expository scientific text, in the presence of strong distracting information. Follow a link from search results to a web page using a literal match, scrolling to read the full text.
- IWANTTOHELP, Task 4.2, full credit (567) [T 3, N 4, D 3, R 3]: Integrate and reflect upon information from several web pages by comparing short texts on multiple pages of a website about community work with criteria referred to on a personal blog; explain a choice based on this comparison. Follow a series of at least four links, using explicit instructions.
- JOB SEARCH, Task 3 (558) [T 1.5, N 1, D 4, R 3]: Hypothesise about the reason for including a condition in a job advertisement. Support the explanation using prior knowledge and information from the text. No navigation required.

Level 3 (lower score limit: 480)
- IWANTTOHELP, Task 4.1, partial credit (525) [T 3, N 4, D 2, R 2]: Integrate information by comparing a short text on one website about community work with criteria referred to on a personal blog. Follow a series of at least four links, using explicit instructions.
- SMELL, Task 3 (485) [T 3, N 3, D 2, R 2]: Synthesise information from two websites, following links from search results guided by explicit directions. Identify a generalisation common to information on the two sites using low-level inference.

Level 2 (lower score limit: 407)
- JOB SEARCH, Task 1 (463) [T 1.5, N 2, D 2, R 2]: Select a job suitable for a student from a list of four search results comprising short descriptions of jobs.
- IWANTTOHELP, Task 3 (462) [T 1.5, N 2, D 2, R 2]: Recognise the main purpose of a website dealing with a community activity from a short description on its home page. Follow a single link with explicit directions.
- JOB SEARCH, Task 2.1, partial credit (462) [T 2, N 2, D 2, R 1.5]: Analyse a list of options in a descriptive text related to employment, using predefined criteria. Follow two links using explicit instructions. Select three suitable options from drop-down menus.
- IWANTTOHELP, Task 2 (417) [T 1, N 2, D 1, R 1.5]: Locate explicitly stated personal information on a page of a personal blog, following one explicitly directed link and using two literal matches between task and text.

Below Level 2
- IWANTTOHELP, Task 1 (362) [T 1, N 1, D 1.5, R 1.5]: Locate explicitly stated information in a personal blog. Find a synonymous match between the task and the text. No navigation required.

Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435378
What students can do in digital reading

PISA summarises student performance on a scale that provides an overall picture of students' accumulated digital reading skills, knowledge and understanding at age 15. Results for this overall digital reading performance measure are presented in the following part of the chapter, covering both the average level of reading performance in each country and the distribution of reading proficiency.

Students at the different levels of proficiency on the digital reading scale

This section describes performance in terms of the four levels of proficiency that have been constructed for reporting digital reading in PISA 2009. Figure VI.2.8 shows the cumulative percentage of students in all participating OECD countries who are proficient at each of the four levels. The distribution of student performance across these proficiency levels in each participating country is shown in Figure VI.2.10. Table VI.2.1 shows the percentage of students at each proficiency level on the digital reading scale, with standard errors.

• Figure VI.2.10 •
How proficient are students in digital reading?

[Chart: for each country, the percentage of students below Level 2 and at Levels 2, 3, 4 and 5 or above. Countries shown: Korea, Japan, Australia, Hong Kong-China, New Zealand, Macao-China, Ireland, Iceland, Sweden, Norway, Belgium, Denmark, France, the OECD average-16, Spain, Poland, Hungary, Austria, Chile and Colombia, ranked in descending order of the percentage of students at Levels 2, 3, 4, 5 or above.]
Source: OECD, PISA 2009 Database, Table VI.2.1.
StatLink: http://dx.doi.org/10.1787/888932435378

Proficiency at Level 5 or above (scores higher than 626)

Students proficient at Level 5 on the digital reading scale are skilled readers in this medium. They are able to evaluate information from several web-based sources, assessing the credibility and utility of what they read using criteria that they have generated themselves.
They are also able to work out a pathway across multiple sites to find information without explicit direction: that is, they are able to navigate autonomously and efficiently. These two capabilities – critical evaluation and expertise in locating relevant information – are key skills in a medium in which there is virtually unlimited material available, and in which the integrity of the sources is often dubious. Dealing with semi-technical material as well as with more popular and idiomatic texts, students performing at Level 5 or above assimilate the broad sense of the material they encounter and also notice fine distinctions in the detail of the texts, allowing them to draw inferences and form plausible hypotheses.

Those performing at Level 5 or above can be regarded as "top performers" in digital reading. Across the 16 OECD countries that participated in the digital reading assessment in 2009, 8% of students performed at this level. But there is considerable variation across the countries, from over 17% in Korea, New Zealand and Australia to fewer than 3% in Chile, Poland and Austria. The partner country Colombia and partner economy Macao-China also had very small percentages of students at Level 5 or above.
Proficiency at Level 4 (scores higher than 553 but lower than or equal to 626)

Students at this level can perform challenging reading tasks in the digital medium. They evaluate the authority and relevance of sources of information when provided with support, and can explain the criteria on which their judgements are based. They can locate and synthesise information across several sites when navigation between the sites requires the exercise of low-level inference. Dealing with a range of text formats and text types, including those in more formal registers and written in technical language, students at this level are able to compare and contrast the information they find on different sites, and to hypothesise and form opinions about what they read, drawing on information from everyday life. Students proficient at Level 5 or above can also successfully complete Level 4 tasks.

Across the participating OECD countries, 30% of students are proficient at Level 4 or above. For the majority of these countries, and for the partner economy Hong Kong-China, about one-fifth to one-quarter of students perform within this level. A notable exception is Korea, where over 40% of students perform within Level 4. Taken together with the students performing at Level 5 or above, over 60% of Korean students are proficient at Level 4 or above – a proportion larger than that of any other country. The next highest-performing countries are Australia and New Zealand, both with 46% of students proficient at least at Level 4. Belgium, Japan, Iceland, Sweden and Ireland and the partner economy Hong Kong-China all have over 30% of students proficient at Level 4 or above. The proportion of students in Chile proficient at that level is less than 10%, and in the partner country Colombia it is less than 2%.
Proficiency at Level 3 (scores higher than 480 but lower than or equal to 553)

Students performing at this level can cope with digital reading tasks of moderate complexity. They respond to digital texts in both authored and message-based environments. When given explicit guidance, they navigate across several pages to locate relevant material, and compare and contrast information from a number of web-based texts when the criteria for comparison or contrast are clearly stated. They evaluate information in terms of its usefulness for a specified purpose or in terms of personal preference.

Across the 16 participating OECD countries, a majority (61%) of 15-year-olds are proficient at Level 3 or above. In most of these countries, this is the modal level of highest attainment; only in Korea, Australia and New Zealand is the modal level of performance higher (Level 4), while in Chile, the modal level is lower (Level 2). Among partner economies, students in both Hong Kong-China and Macao-China also most commonly perform at Level 3, while the modal performance of students in the partner country Colombia is below the described levels. In all participating countries except Chile and Colombia, then, it can be inferred that the majority of young people are capable of dealing with many everyday digital reading tasks, although they are unlikely to be able to manage more challenging tasks, such as finding information entirely by themselves or critically evaluating sources to ascertain their authenticity and their relevance to the reader.

Proficiency at Level 2 (scores higher than 407 but lower than or equal to 480 points)

Students proficient at this level navigate successfully using conventional navigation tools and features. When provided with explicit instructions, they locate links even when they are not prominent, and scroll to find required information. Using predefined criteria, they select relevant material from a list of search results or a drop-down menu.
They can locate several pieces of information in one text and transfer them to another format (such as an order form). They form generalisations, such as recognising the intended audience of a website, or figuring out a common requirement of two correspondents in an e-mail exchange.

Across participating OECD countries, more than four-fifths of students (83%) are proficient at Level 2 or above. In Australia and Japan, this proportion rises to over 90%, and in Korea to 98%. All participating countries and partner economies, except Korea, have significant numbers of students performing below the defined levels for the digital reading scale. In the OECD countries Chile, Poland, Austria and Hungary, more than one-quarter of students perform below Level 2, and in Colombia, nearly 70% of students perform below this level. This does not mean that such students have no proficiency in digital reading. Many students performing at this level can scroll and navigate across web pages, as long as explicit directions are provided, and can locate simple pieces of information in a short block of hypertext. Nevertheless, although the digital reading skills of these students are not necessarily negligible, they are performing at levels that are not likely to allow them full access to educational, employment and social opportunities in the 21st century.
Average level of proficiency

Another way of summarising the differences between countries is to consider their mean performance. Since only about half of the OECD countries participated in the PISA 2009 digital reading assessment option, the mean and standard deviation for the pooled data set of the 16 OECD countries in digital reading were arbitrarily set at the same values as this group of countries' mean (499) and standard deviation (90) for print reading in 2009.¹ These values establish the benchmark against which each country's digital reading performance in PISA 2009 is compared.

Figure VI.2.11 shows each country's mean score for digital reading. For each country shown in the middle column, the list in the right-hand column shows countries whose mean scores are not sufficiently different to be distinguished with at least 95% certainty. For all other cases, one country has higher performance than another if it is above it in the list in the middle column, and lower performance if it is below. For example, Hong Kong-China's performance, which comes fifth on the list, is not significantly different from that of Japan, which comes fourth, Iceland, which comes sixth, Sweden (seventh) and Ireland (eighth). The dark band in the middle shows the participating countries Norway and France, whose performances are not statistically significantly different from the OECD average.
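The metric-setting described above is, in effect, a linear transformation of the underlying proficiency estimates. The sketch below illustrates the idea on invented values; it is a simplification, since the actual scaling equally weights the 16 countries rather than simply pooling students.

```python
import statistics

def rescale(scores, target_mean=499.0, target_sd=90.0):
    """Linearly transform raw proficiency estimates so that their mean
    and standard deviation match a chosen metric -- here the 16 OECD
    countries' print reading mean (499) and standard deviation (90)."""
    mean = statistics.fmean(scores)
    sd = statistics.pstdev(scores)
    return [target_mean + target_sd * (x - mean) / sd for x in scores]

# Hypothetical raw proficiency estimates (invented for illustration):
raw = [-1.3, -0.4, 0.0, 0.2, 0.9, 1.7]
scaled = rescale(raw)
print(round(statistics.fmean(scaled)), round(statistics.pstdev(scaled)))  # 499 90
```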
• Figure VI.2.11 •
Comparing countries' performance in digital reading

For each comparison country, the mean score is followed by the countries whose mean score is not statistically significantly different from that of the comparison country.

568 Korea
537 New Zealand: Australia
537 Australia: New Zealand
519 Japan: Hong Kong-China
515 Hong Kong-China: Japan, Iceland, Sweden, Ireland
512 Iceland: Hong Kong-China, Sweden, Ireland, Belgium
510 Sweden: Hong Kong-China, Iceland, Ireland, Belgium
509 Ireland: Hong Kong-China, Iceland, Sweden, Belgium
507 Belgium: Iceland, Sweden, Ireland
500 Norway: France
494 France: Norway, Macao-China, Denmark
492 Macao-China: France, Denmark
489 Denmark: France, Macao-China
475 Spain: Hungary
468 Hungary: Spain, Poland, Austria
464 Poland: Hungary, Austria
459 Austria: Hungary, Poland
435 Chile
368 Colombia

Norway and France are not statistically significantly different from the OECD average; countries above them are significantly above it, and countries below them significantly below it.
Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435378

Korea is the top-performing country by a significant margin, with a mean score of 568. This indicates that, on average, 15-year-olds in Korea perform at Level 4 in digital reading. New Zealand and Australia are in second and third positions, both at 537. Japan (519) and the partner economy Hong Kong-China (515) are in the next rank, together with Iceland (512) and Sweden (510). Two additional European countries have mean scores significantly higher than the OECD average: Ireland (509) and Belgium (507). Norway (500) and France (494) have means not significantly different from the OECD average. Denmark (489) and the partner economy Macao-China (492) have means not significantly different from that of France, though they are below the OECD average. On average in all of these countries except Korea, 15-year-olds perform at PISA proficiency Level 3 in digital reading.
Students in the remaining five OECD countries perform, on average, at Level 2: Spain (475), Hungary (468), Poland (464), Austria (459) and Chile (435). The partner country Colombia's mean score (368) is well below those of the other participating countries, indicating that, on average, Colombian 15-year-olds perform below the described levels of digital reading. As mentioned above, however, this does not signify a complete lack of skills.

Because the figures are derived from samples, it is not possible to determine a precise rank of a country's performance among the participating countries. It is possible, however, to determine, with 95% likelihood, a range of ranks in which the country's performance lies, as shown in Figure VI.2.12.
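Whether two countries' mean scores can be distinguished can be approximated from the reported means and standard errors: under independence, the difference is significant at the 95% level when it exceeds 1.96 times the combined standard error. This is a simplified sketch of that idea, not the official procedure, which rests on the full methodology described in Annex A3.

```python
import math

def mean_difference_significant(m1, se1, m2, se2, z=1.96):
    """Approximate test of whether two country means differ at the
    95% confidence level, treating the two samples as independent."""
    se_diff = math.sqrt(se1**2 + se2**2)  # standard error of the difference
    return abs(m1 - m2) > z * se_diff

# Means (standard errors) as reported in Figure VI.2.12:
print(mean_difference_significant(568, 3.0, 537, 2.3))  # Korea vs. New Zealand: True
print(mean_difference_significant(512, 1.4, 510, 3.3))  # Iceland vs. Sweden: False
```

Consistent with Figure VI.2.11, Korea's mean is distinguishable from New Zealand's, while Iceland's and Sweden's are not.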
• Figure VI.2.12 •
Where countries rank in digital reading performance

For each country: mean score (standard error), then the range of ranks (upper–lower) among OECD countries and among all participating countries/economies.

Korea: 568 (3.0); OECD rank 1–1; all countries 1–1
New Zealand: 537 (2.3); OECD 2–3; all 2–3
Australia: 537 (2.8); OECD 2–3; all 2–3
Japan: 519 (2.4); OECD 4–4; all 4–5
Hong Kong-China: 515 (2.6); all 4–7
Iceland: 512 (1.4); OECD 5–7; all 5–8
Sweden: 510 (3.3); OECD 5–8; all 5–9
Ireland: 509 (2.8); OECD 5–8; all 6–9
Belgium: 507 (2.1); OECD 6–8; all 7–9
Norway: 500 (2.8); OECD 9–10; all 10–11
France: 494 (5.2); OECD 9–11; all 10–13
Macao-China: 492 (0.7); all 11–13
Denmark: 489 (2.6); OECD 10–11; all 11–13
Spain: 475 (3.8); OECD 12–13; all 14–15
Hungary: 468 (4.2); OECD 12–14; all 14–16
Poland: 464 (3.1); OECD 13–15; all 15–17
Austria: 459 (3.9); OECD 14–15; all 16–17
Chile: 435 (3.6); OECD 16–16; all 18–18
Colombia: 368 (3.4); all 19–19

Note: See Annex A3 for a detailed description of how the range of ranks is computed.
Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435378

Inequality of learning outcomes

The gap between the means of the highest- and lowest-performing OECD countries (Korea and Chile) is 133 points – one-and-a-half standard deviations and almost two full proficiency levels. While the disparities between countries are evident, an equally large disparity in performance exists between the highest- and lowest-performing students within some of the countries. This is the case in Hungary, Austria and Belgium, where 141, 137 and 133 score points, respectively, separate the mean performance levels of the top and bottom quarters of the 15-year-old population. This finding is of particular concern.
There is growing consensus that not only does such inequality reflect a reduced possibility for those on the lower rungs either to contribute to society or to benefit from its capital, but inequality within countries (compared to that between countries) is more likely to be perceived as unfair, because the disparities are local and obvious; and that, in turn, could sap a collective sense of well-being or lead to social unrest (Friedman, 2005; Pickett & Wilkinson, 2009).

A wide disparity in performance within countries is not inevitable, and relatively narrow gaps between the highest and lowest performance are not associated with any particular level of overall proficiency. With the average gap between the top and bottom quarter of students at 120 score points across the participating OECD countries, the Asian countries and economies, whose mean scores range from average to very high, all have distribution ranges well below the OECD mean. The interquartile range (the difference between the first and third quartiles) in these two countries and two economies is 88 for Korea, 89 for Macao-China, 95 for Japan and 103 for Hong Kong-China. The comparable figure for Colombia, the lowest-performing country, was also below the OECD average difference (113), while for one of the best-performing countries, New Zealand, the difference is 131 score points. Chapters 4, 6 and 7 examine some of the factors that may explain these variations in performance.

Gender differences in performance on the digital reading scale

Girls have outperformed boys in print reading in every OECD and partner country and economy – except in Israel and the partner country Peru in PISA 2000 – since PISA's first reading assessment was administered in 2000 (OECD, 2003). Does the same hold true for digital reading? The brief answer is "almost". Figure VI.2.13 shows gender differences in reading performance for each country; Tables VI.2.2, VI.2.3 and VI.2.4 provide further details.
The mean difference between boys' and girls' performance in digital reading is 24 score points in favour of girls. In all but one country the difference is statistically significant. The exception is Colombia, where girls outperform boys by an average of only three score points. Except for Poland, the greatest gender differences are all in either
English-speaking or Nordic countries: New Zealand (40-point difference), followed by Norway (35), Ireland (31), Iceland (30), Poland (29), Australia (28) and Sweden (26). Denmark is alone among Nordic and English-speaking countries in having a below-average gap between boys' and girls' performance.

Figure VI.2.14 shows the percentages of boys and girls performing at each proficiency level and the percentage below the lowest level.

• Figure VI.2.13 •
Gender differences in digital reading performance

[Chart: for each participating country, boys', all students' and girls' mean scores on the digital reading scale, and the gender difference (girls – boys) in score points. Gender differences that are statistically significant are marked in a darker tone. Countries are ranked in ascending order of the gender score-point difference (girls – boys).]
Source: OECD, PISA 2009 Database, Table VI.2.4.
StatLink: http://dx.doi.org/10.1787/888932435378

• Figure VI.2.14 •
How proficient are girls and boys in digital reading?

Percentage of students at each level (OECD average-16):
Level 5 or above: boys 6.3%, girls 9.3%
Level 4: boys 20.0%, girls 25.1%
Level 3: boys 29.4%, girls 31.4%
Level 2: boys 23.6%, girls 21.0%
Below Level 2: boys 20.7%, girls 13.1%
Source: OECD, PISA 2009 Database, Tables VI.2.2 and VI.2.3.
StatLink: http://dx.doi.org/10.1787/888932435378

As shown in Figure VI.2.14, the modal proficiency level for both boys and girls across the participating OECD countries is Level 3, and the percentages of boys and girls performing at this level are quite similar (29% and 31%, respectively). However, the next most common level of performance for boys is Level 2 (24% of boys), while for girls it is Level 4 (25%); in both cases, around one-quarter of students perform at that level.
In other words, on average, over half the boys in participating OECD countries perform at Levels 2 and 3, whereas a similar percentage of girls performs at Levels 3 and 4. Again, there is substantial variation among countries. At one end of the proficiency spectrum, more girls in Korea, Australia and New Zealand perform at Level 4 than at any other level, whereas only in Korea do more boys perform at that level than at any other. At the other end of the spectrum, only in Chile do more girls perform at Level 2 than at any other level, while more boys in both Chile and Poland perform at Level 2 than at any other level.
Examples of Digital Reading Items from the PISA 2009 Assessment

IWANTTOHELP

IWANTTOHELP – Question 1
Situation: Occupational
Environment: Message-based
Text format: Continuous
Text type: Description
Aspect: Access and retrieve – Retrieve information
Question format: Multiple choice
Difficulty: 362 (below Level 2)
(Proficiency scale cut-off scores: Level 5 or above, 626; Level 4, 553; Level 3, 480; Level 2, 407)

Read Maika's blog entry for January 1. What does the entry say about Maika's experience of volunteering?
A. She has been a volunteer for many years.
B. She only volunteers in order to be with her friends.
C. She has done a little volunteering but would like to do more.
D. She has tried volunteering but does not think it is worthwhile.

Scoring
Full credit: C. She has done a little volunteering but would like to do more.

Comment
The first page that students see in this unit is the home page of the blog (Life begins at 16) of a young person named Maika. This page contains two entries from the blog, for January 1 and January 6. Although this kind of text often appears on a social networking site, the specific content describes Maika's interest in and plans for doing voluntary work, so this question (and later questions in this unit) are classified as falling within the occupational context. Fifteen-year-old students may not have much experience of volunteering, but the concept is quite concrete, and the text is made accessible by the use of language that is relatively simple and colloquial ("Just a quick post", "(seriously)"), and addressed directly to the audience who may be reading it ("share my New Year's resolution with you", "You may remember", "has anyone else used this site?").
The page contains features typical of social networking sites, with four links available within the site ("About", "Contact", "Read my complete profile", "Comments") and one link to an external site (www.iwanttohelp.org).
This task requires the reader to identify information about Maika's experience of volunteering. Students need to read the short text entry for January 1 in order to locate the answer. It is not necessary to scroll down to see the remainder of the entry for January 6, nor is any other kind of navigation required. The second and third sentences of the text give an indication of Maika's desire to work as a volunteer, which discounts option D and guides the reader towards the second part of the key ("would like to do more"). The key is a simple paraphrase of two pieces of information in the following sentence: "… last year I did a couple of short term voluntary jobs …, but this year I'd like a long-term position …". Given the relative prominence of the information in this short text, the direct and relatively simple language, the lack of need to navigate, and the straightforward way in which terms in the question and key are related to expressions in the text, this has all the features of an easy question.

IWANTTOHELP – Question 2
Situation: Educational
Environment: Message-based
Text format: Multiple
Text type: Description
Aspect: Access and retrieve – Retrieve information
Question format: Multiple choice
Difficulty: 417 (Level 2)

Go to Maika's "About" page.
What kind of work does Maika want to do when she leaves school?
A. Photography.
B. Web design.
C. Banking.
D. Social work.

Scoring
Full credit: B. Web design.

Comment
This question also starts on the home page of the blog, but the question directs students to navigate to a second page.
Therefore, in contrast to all print reading tasks, the information needed to answer the question cannot be obtained from the material initially presented: the student needs to locate an additional text by clicking on the link. In this instance, selecting the correct link from the five available is easy because there is a literal match between the term in the task and the name of the link ("About"), and because the link is prominent.
Once students click on this link, a second text appears, hiding the first text – this is one of the strongest distinctions between print and digital texts. This new text is very brief, containing a small amount of background information about the personal life of the writer of the blog. It can be considered as dealing with information of a kind likely to be fairly familiar to most 15-year-olds. There is minor distracting information in option A, with its reference to "PhotoSet" in the text, while option D is also plausible, given the information in the first text (the home page) about Maika's expressed desire to do voluntary work and to make a difference to someone's life. Answering this question relies on making a literal match between the key and one of the terms in the text, "web design". The brevity of the text, its simple language, and the literal matches make this question relatively comprehensible; it appears that the need for one navigation step adds an element of difficulty, making it slightly more difficult than the previous question.

IWANTTOHELP – Question 3
Situation: Educational
Environment: Authored
Text format: Multiple
Text type: Argumentation
Aspect: Integrate and interpret – Form a broad understanding
Question format: Multiple choice
Difficulty: 462 (Level 2)

Open the link that Maika refers to in her January 1 post. What is the main function of this website?
A. To encourage people to buy iwanttohelp products.
B. To encourage people to give money to people in need.
C. To explain how you can make money by volunteering.
D. To provide people with information about ways to volunteer.
E. To tell people in need where they can find help.

Scoring
Full credit: D. To provide people with information about ways to volunteer.
Comment
In this task students are required to recognise the main idea of a text, but in order to do this, they first need to find the text. In order to view the necessary text, they have to click on a link, as indicated in the task. Only one of the hyperlinks on this page occurs within the blog entry for January 1, so the direction in the task is explicit, but four other links available on the page act as distractors. Clicking on the correct link takes the reader not only to a new page, but also to an entirely new website, the home page for an organisation called iwanttohelp. This page opens in a new tab, so that it is possible for students to click on the tab "Maika's blog" if they wish to return to the first text, although that is not necessary for this task. The content of the new website is more abstract, employing terms that may be relatively unfamiliar to students, such as "non-profit organisation", "opportunity" and ".org", and is addressed to a large anonymous audience rather than operating at the personal level of a blog.
This text is classified as argumentation because it encourages readers to take action, either by contacting other organisations ("Find an Opportunity Now") or by making donations ("We rely on public donations"). Four links to other parts of the website are available on this page if students wish to explore the site in order to obtain a broader picture of the organisation. This, however, would be time-consuming and inefficient. Such opportunities always exist for anyone reading material on the Internet, so one feature of reading in this environment is being able to judge when it is necessary to open new links, thus expanding the number of available texts.

In this case, in order to answer this broad-understanding question, students need to read the short description of the organisation provided in the box on the left of the home page, supported by the prominent question and link above the photograph. It is not possible to make any literal matches between the task and the key: some (relatively low) level of inference is needed to recognise that this site provides information explaining how people could volunteer. The distractors all have some degree of plausibility, because of their references to the iwanttohelp site, to money and people in need, to volunteering, and to giving information about help.

This task is somewhat harder than the previous task, although it is still relatively easy.
The comparative difficulty is explained by the need to navigate to the text with the required information using the correct link; the amount of potentially distracting information available through irrelevant links on the web pages; the somewhat abstract and unfamiliar information and language used; and the need for a level of inference to answer the question.

IWANTTOHELP – Question 4
Situation: Educational
Environment: Mixed
Text format: Multiple
Text type: Mixed
Aspect: Complex
Question format: Constructed response
Difficulty: Full credit 567 (Level 4); Partial credit 525 (Level 3)

Read Maika's blog for January 1. Go to the iwanttohelp site and find an opportunity for Maika. Use the e-mail button on the "Opportunity Details" page for this opportunity to tell Maika about it. Explain in the e-mail why the opportunity is suitable for her. Then send your e-mail by clicking on the "Send" button.

Scoring
Full credit: Selects Graphic Artist or Upway Primary School and writes a message in the e-mail text box with a relevant explanation that matches Maika's criteria.

E-mail message for Graphic Artist
Refers to ongoing position or future or web design or art.
• You're a great artist and it is ongoing – you said you wanted a longer type of work right?
• It's ongoing and it would help you get experience for your future.
• You are obviously interested in graphic design, and want to pursue this when you finish school, and you would also love to volunteer.
This would be a great opportunity to do both these things, and will look great on your CV too!
OR
E-mail message for Upway Primary School
Refers to ongoing position or making a difference.
• This would be a good job – ongoing and you get to help some kids.
• Here's a job where you'll really make a difference.

Partial credit: Selects Graphic Artist or Upway Primary School and writes a message in the e-mail text box with no explanation or an irrelevant explanation.

E-mail message for Graphic Artist
Gives insufficient or vague answer.
• You'd like it.
Shows inaccurate comprehension of the opportunity or gives an implausible or irrelevant answer.
• You'd be working with kids a lot. [Irrelevant, not one of Maika's criteria.]
• It gives you a chance to get out and about.
OR
E-mail message for Upway Primary School
Gives insufficient or vague answer.
• You need an hour a week but it sounds like this could be what you're looking for. [Lacks reference to job criteria, repeats part of stem.]
• You'd like it.
Shows inaccurate comprehension of the opportunity or gives an implausible or irrelevant answer.
• It gives you a chance to get out and about.

Comment
This is an example of a complex task, which involves all three aspects of reading. It also has a substantial navigation requirement. This complexity highlights a number of differences between print and digital reading tasks. The overall task requires students to construct a short e-mail message after integrating and reflecting upon information located in several texts. The text type has not been specified because the task requires the reader to integrate information from several types of text: argumentation (the iwanttohelp website), description (Maika's blog) and transaction (the e-mail). Beginning with an interpretation of information given on Maika's blog, students are then required to locate a number of pages on the iwanttohelp website, evaluate information on these pages in relation to what they have read on the blog, and use the evaluation to send Maika a simple message. There is no single pathway for navigation, and two different texts can be used to formulate responses that receive credit. This variability is typical of navigation in the digital environment.

The task requires students to navigate from the starting page, Maika's blog, to the Latest Opportunities page shown below. To see the whole page, scrolling is required. This page offers four opportunities for students to evaluate on Maika's behalf, each with links providing additional information. Students may open as many of the links as they consider necessary.
The page for the Upway Primary School opportunity is shown below.
This text is fairly short, but relatively dense, with quite complex vocabulary ("an innovative approach", "a more diverse population", "foster the academic development", "academic support"). Having located the opportunities, students need to compare descriptions of the opportunities with the criteria given on Maika's blog. They may click on the tab to re-read her entry for January 1, where she refers to wanting "a long-term position" in which she can "make a difference". A broad understanding of the Upway Primary School text would support the evaluation that working here would fit Maika's criteria. This interpretation is supported by expressions such as "The volunteer meets with the student … for a minimum of one year" and "through academic support, positive role-modelling, and a one-to-one friendship, students will succeed".

Some students may also use the link "Read my complete profile" or "About", which refers to her interest in "a future in web design" and to her "artwork". The information here supports the selection of the Graphic Artist opportunity. Students may use the "Back" and "Forward" buttons, the links on each page and the scroll bar to navigate back and forth between descriptions of various opportunities until they have selected the one that they judge to be most suitable. In each case it is necessary to scroll down to see a full description of the opportunity.

Once students have chosen an opportunity, they need to construct an e-mail message to send to Maika. They do this by opening yet another link, "E-mail opportunity details to a friend", in accordance with the task instructions. The page where they do this has the e-mail address and subject lines already completed, together with the beginning of a message: "Thought you'd be interested in this volunteer opportunity because...". To receive credit, students must select either the Graphic Artist or the Upway Primary School opportunity.
Students who recommend the Graphic Artist opportunity receive full credit if they refer to the fact that this opportunity is an ongoing position, or comment that it is relevant to her future or to her interest in web design or art. Students who recommend Upway Primary School receive full credit if they refer either to the fact that this is an ongoing position or to the idea of making a difference.

Students who select one of these two opportunities but do not write a message that refers to the criteria Maika is seeking nevertheless receive partial credit for having successfully completed much of this complex task: accessing relevant information, comparing information from different texts and making a judgment about which opportunity is suitable.

In summary, in order to obtain full credit for this task, students need to go through a series of processes, involving multiple navigation steps to access a series of texts. Some of the navigation steps are made explicit in the task instructions, but readers need to make multiple evaluations of the available links to decide which ones would allow the most efficient way of completing the task. Students need to make multiple interpretations of texts, from Maika's blog as well as various pages on the iwanttohelp website, and to compare ideas and information across these texts, in support of the reflection and evaluation that the task requires.
SMELL

SMELL – Question 1
Situation: Educational
Environment: Authored
Text format: Multiple
Text type: Exposition
Aspect: Integrate and interpret – Form a broad understanding
Question format: Multiple choice
Difficulty: 572 (Level 4)

Go to the "Smell: A Guide" web page. Which of these statements best expresses the main idea on this page?
A. Smell can interfere with normal patterns of behaviour.
B. Smell warns humans and animals of danger.
C. The primary purpose of smell is to help animals to find food.
D. The development of smell takes place early in life.
E. The basic function of smell is recognition.

Scoring
Full credit: E. The basic function of smell is recognition.

This question presents a list of six search results for the term "smell". Only the first four are immediately visible. If students wish to see the full list of six they need either to scroll down or to click on the "maximise" button in the top right corner of the browser. The screen shot below shows what the students see if they click the "maximise" button.
Comment
The question first directs students to navigate to a second web page, "Smell: A Guide", and to identify the main idea of the text on this page. The information needed to answer the question cannot be obtained from the material presented in the search results. Links are available from the search results page to several other pages. There are a maximum of four available tabs in this task: the Global Search page, Smell: A Guide, Food in the news, and Psychology Now. The links to the remaining three results lead to a page that states, "This page has no content available." and has a link back to the search results page. Selecting the correct link from the six available is easy, because there is a literal match between the term in the task and the name of the link ("Smell: A Guide"), and because the link is the first in the list, and hence the most prominent.

Once students click on this link, a second text appears, in a new tab. This is a relatively long and dense expository text dealing with the role of smell. Students can identify that it is published by a research and teaching department (as indicated by the link "Current Research Projects" on the left of the page, and the headings "Teaching" and "Research and teaching information") in a university (the URL for the page is www.biology.litternuni.edu.au/smell/index.html). The text examines the everyday concept of smell in a scientific way. It contains multiple references to everyday concepts, relating the abstract notion of the role of smell to these concepts in concrete ways (for example, "potential danger" is illustrated by "smoke indicates fire"; "Elephants' sense of smell" is related to how humans harness it in tracking poachers; babies' reactions to unpleasant smells are described).
Consistent with its origin and purpose, the text includes some specialised (scientific) language ("identity of other living creatures", "uniquely identifiable", "land mammal", "foraging ants", "facial expressions that indicate rejection", "a putrid smelling substance") that requires careful reading and good vocabulary knowledge for complete understanding.

Students need to use the scroll bar to view the full text, and scrolling is probably necessary for this question, which focuses on the main idea of the text. Distracting navigational features are provided by top and side menus.

The first four options contain strongly distracting information of various kinds. Option A includes the ideas both of interference and of patterns of behaviour, plausible in this scientific context, except that the text does not support a link between them. Option B (chosen by over 25% of students) is possibly the strongest distractor because it appeals to common sense, and offers a simple paraphrase of an example of how smell is used, an idea presented in the second sentence in the text ("Sometimes our sense of smell can warn of potential dangers."); however, this idea is not consistently discussed through the text. Option C involves a misinterpretation of another sentence in the same paragraph, which describes another example of the use of smell ("sometimes"; "for example"), not its primary purpose. Option D presents a literal match ("early in life") with an idea presented in the text, but a detail rather than the main idea. Students can be expected to need to skim the entire text in order to relate the terms "basic function" and "recognition" in option E with a global interpretation of the text. The idea of "basic function" is hinted at in the opening sentence of the text ("the role of smell"), but it would be premature to link "information about the environment" from the text with "recognition".
It is the repetition of descriptions of functions of smell and examples of how these relate to recognition, scattered through the text ("potential dangers … smoke … fire"; "distinguish … twins … siblings";
"elephants … track poachers"; "ants … know when to leave the nest"; "babies … rejection"), that allows the reader to identify this option as the key. Despite this repetition of ideas, the item is relatively difficult, most likely as a result of the combination of the length of the text, the use of specialist (scientific) language, and the plausibility of the information provided in the distractors.

SMELL – Question 2
Situation: Public
Environment: Authored
Text format: Multiple
Text type: Exposition
Aspect: Reflect and evaluate – Reflect on and evaluate content of text
Question format: Open constructed response
Difficulty: 657 (Level 5 and above)

Go to the "Food in the news" web page. Would this web page be a suitable source for you to refer to in a school science assignment about smell? Answer Yes or No and refer to the content of the "Food in the news" web page to give a reason for your answer.

Scoring
Full credit
Answers (or implies) No and gives a plausible supporting explanation, referring to the trivial or sensational nature of the website content, or the popularisation of the issues by journalists, or the site's failure to explicitly give its sources of information.
• No, it's just trying to popularise science and has almost certainly oversimplified the original research.
• No, it just offers sensational news. Look at the superficial issues covered in this site.
• No, it is obviously from a popular news magazine not a scholarly source.
• No, it has loads of silly links that show it's not a serious site.
• No, not suitable because it is just written by journalists not scientists.
OR
Answers (or implies) Yes and indicates that the site would be helpful as a secondary source, leading to more reputable sources.
• Yes, it would help me to find the original research.
• Yes, I would use it to look and see if more serious publications said the same thing.
OR
Answers (or implies) Yes and gives a plausible supporting explanation, referring to the article's sources of information or the level of detail provided.
• Yes, because it is a review of real research.
• Yes, because it talks about several real studies.
• Yes, they're talking about a study that won a Nobel Prize, so it must be true.
• Yes, the study is described in detail so I don't think they would make it up.

No credit
Gives insufficient or vague answer.
• Yes, the Food in the news page was convincing because the results that they were showing did not seem opinionated and sounded reliable. [vague]
• I don't think it's reliable because it's about the power behind our sense of smell. [vague]
• Yes, it's a long article. Why would they make all that up?
• No, my teacher would not be impressed.
OR
Shows inaccurate comprehension of the material or gives an implausible or irrelevant answer.
• Yes, because it's by a motoring organisation, which really matters. [irrelevant]
• I think it would be reliable because it describes how smell can affect your mood. [irrelevant]
Comment
The text is again quite lengthy, and scrolling is required to view it in its entirety. This question asks students to open a different link from the previous question. Again, the relevant link is simple to identify: there is a literal match with the question stem, and no scrolling is required to see the link, "Food in the news".

Task summary: Evaluate two web pages in terms of credibility/trustworthiness of information. Follow a link from search results to a web page using a literal match. Scrolling is needed to read the full text. Contextual support relevant to the response includes other links on the page. The text includes some specialised (scientific) language.

Students are required to evaluate the web page, "Food in the news", in terms of its suitability as a reference for a school assignment. This kind of task may be considered representative of an issue that students face very frequently when completing school tasks involving Internet-based research. The page has many features which may contribute to the students' evaluation. The page itself has numerous links that indicate this is a commercial site (carrying the URL "whatsinthenews.com"), which clearly has a populist pitch that intends to reach a wide audience ("Entertainment", "TV Guide", "Shopping", "Advertise with us"), and which has few, if any, pretensions to academic seriousness. There are also links to related sites ("Travel in the news", etc.), and to a series of related stories which have a rather sensationalist flavour ("Cheese makers to use electronic nose for market gain?", "Anyone can learn to love vegetables"). These features would tend to make the text accessible to students.
The lack of academic pretension is reinforced by the fact that the article on this page carries a somewhat sensationalist title, "The smell of pizza can change people's behaviour", and is not credited to any specific source. There is reference to "a leading European motoring organisation" as the source for a "review of research", but no reason is offered for why such an organisation should concern itself with the diverse findings related to smell that are included here. All of these features may be considered relevant to a view that the site could be considered unsuitable as a source for a school assignment.

On the other hand, the article presented here does refer to the results of a range of scientific findings from several studies; it does name the author of the review ("Conrad King"); it refers to some of the researchers by name ("researchers Richard Axel and Linda Buck") and offers them credibility by citing the fact that they were recipients of "a Nobel Prize in 1994 for their ground-breaking research". Some of the detail presented lends credibility to the article, most notably the paragraphs which begin, "Smell, which essentially dictates the incredible complexity of food tastes, has always been the least understood of our senses" and "However, the way genes regulate smell differs from person to person". These paragraphs contain technical (scientific) language appropriate to the topic and information presented ("family of 1,000 olfactory genes"; "olfactory genes which are switched on in some people and not in others"; "nearly every human being displays a different pattern of active and inactive odour-detecting receptors"); these also add to the difficulty of the text. These features could be referred to in support of a claim that the site would provide suitable information for a school assignment on smell.
Students have to make their own evaluation of the web page for the stated purpose, using one or more of the ideas discussed here, and then express this evaluation. Evaluating the suitability of something as abstract as a text page for a hypothetical purpose is a complex psychological task, perhaps especially so when, as here, there are multiple arguments to be made in support of or against a position. Students need to form a mental image of a hypothetical school science assignment, including the process of conducting research, then consider whether the information here would be suitable for it. There is no direction as to whether they should consider the content, the style or any other specific features of the web page. The challenge posed by the specialist nature of some of the scientific language; the length of the text, including the wide range of ideas it covers; and the requirement to refer to the content of the web page, rather than just to talk in vague terms about notions of suitability or its lack, all contribute to the difficulty of this task. It seems that this kind of evaluative task, critical though it probably is for 15-year-old students, is not easily managed, as this task is in the high range of difficulty.

SMELL – Question 3
Situation: Public
Environment: Authored
Text format: Multiple
Text type: Exposition
Aspect: Integrate and interpret – Develop an interpretation
Question format: Multiple choice
Difficulty: 485 (Level 3)

There is information about the smell of lemon on the pages "Food in the news" and "Psychology Now". Which statement summarises the conclusions of the two studies about the smell of lemon?
A. Both studies suggested that the smell of lemon helps you work quickly.
B. Both studies suggested that most people like the smell of lemon.
C. Both studies suggested that the smell of lemon helps you to concentrate.
D.
Both studies suggested that females are better at detecting the smell of lemon than males.

Scoring
Full credit
Code 1: C. Both studies suggested that the smell of lemon helps you to concentrate.

Task summary: Synthesise information from two web pages. Follow links from search results to two websites using a literal match. Identify a generalisation common to the information on both sites.
Comment
As with the previous tasks in this unit, students cannot answer this question from the Global Search page initially presented. Instead, students need to locate and read multiple texts. This task introduces a third text, again accessible using literal matches between the question and the search result links. Students are required to compare ideas in this new text, "Psychology Now", and the one seen in the previous task, "Food in the news". Three tabs are open, and students need to switch between two of them, possibly multiple times, in order to synthesise information in the texts. Because only one text can be visible at one time, demands are placed on students' memory in a way that is unlikely when all relevant information is presented on a single page. The new text is shorter than the other two students have read, with no scrolling required. The page contains a series of links on the left, but these are not strong distractors, as there are no terms in them which match expressions or ideas in the question. Students need to locate within the two pages references to studies about lemon. In each case they can make a literal match on the word "lemon".
In“Psychology Now”, this is easily found in the second paragraph, but the term is much harder to locate in “Food inthe news”, as students need to scroll down until they see it in the penultimate paragraph.The options offer distracting information in the form of ideas included in one of the texts.The ease of locating the term “lemon” in both the pages is very likely the key reason why this task proved to berelatively easy.The paragraph about lemon in the text “Psychology Now” is about work (option A); the paragraph also mentions“smell of lemon” and “54%”, which could lead to association with option b (“most people like the smell of lemon”);option D receives support from the sentence, ” Women are generally better at identifying smells than men.”, whichprovides a generalisation that goes beyond the specific issue of lemon, the focus of the question. Students need tosynthesise information spread throughout the paragraph to infer a link between a reduction in typing errors in theworkplace and the idea that the smell of lemon helps concentration.In the text, “Food in the news”, the reference to smell is not at all prominent, being found in the sixth paragraph.Once students have located this, though, it is relatively easy to relate terms in the text (“concentration levels …Similarly … smells of lemon … promote … mental focus”) with the key. PISA 2009 ReSultS: StudentS on lIne – Volume VI © OECD 2011 65
JOB SEARCH

Screen shots are used to illustrate parts of the stimulus relevant to each question. The digital version of this unit and other released tasks are available at www.erasq.acer.edu.au.

JOB SEARCH – QUESTION 1

Situation: Occupational
Environment: Authored
Text format: Non-continuous
Text type: Description
Aspect: Reflect and evaluate – Reflect on and evaluate content of text
Question format: Multiple choice
Difficulty: 463 (Level 2)
[Proficiency band graphic: Level 5 or above from 626 score points; Level 4 from 553; Level 3 from 480; Level 2 from 407; Below Level 2 under 407]

This is a page from a job search website. Which job in this list is most suitable for school students? Click on the button next to the job.

Scoring
Full credit
Code 1: B. Juice Bar team members.

Comment
The context for this question is a website that helps people to find and apply for jobs. The page that students see is a list of four available jobs, listed as “Today’s Jobs”. Initially the first two are fully visible, and students can see the full list by either scrolling down or clicking on the “maximise” button in the top right corner of the browser. The screen shot below shows what the students see if they click the “maximise” button.
The text is fairly short, organised in list form, and uses fairly simple language that should be familiar to students, even if they have had no experience of employment or seeking a job. In order to determine which job is most suitable for school students, readers need to use clues related to time and availability. The expressions “for weekdays”, “full time” and “9am to 5pm” will allow them to reject the first and the last two options; “part-time job” and “from 5pm” indicate that the second option is likely to suit school students.
Distracting information is included in the reference to “secondary school” in the third job listed, while the kinds of jobs listed, “café staff” (the first job in the list) and “retail assistant” (the third job), are the kind of job that many students may think of as suitable for school students.
There is no need to click on any links, or to explore the Job Search website, in order to find the information needed to answer this question. The combination of the fact that the text is fairly simple and the lack of navigation needed probably contributes to the relative facility of this question, which about two-thirds of students answered successfully.
JOB SEARCH – QUESTION 2

Situation: Occupational
Environment: Message-based
Text format: Multiple
Text type: Description
Aspect: Integrate and interpret – Develop an interpretation
Question format: Complex multiple choice
Difficulty: Full credit 624 (Level 4); Partial credit 462 (Level 2)
[Proficiency band graphic: Level 5 or above from 626 score points; Level 4 from 553; Level 3 from 480; Level 2 from 407; Below Level 2 under 407]

You have decided to apply for the Juice Bar job. Click on the link and read the requirements for this job. Click on “Apply now” at the bottom of the Juice Bar job details to open your résumé page. Complete the “Relevant Skills and Experience” section of the “My Résumé” page by choosing four experiences from the drop-down lists that match the requirements of the Juice Bar job.

Scoring
Question intent:
• Integrate and interpret – Develop an interpretation
• Analyse a list of options using predefined criteria
Scoring comment: Initially each part is coded separately. Final scoring combines codes as shown below.

Full credit
Selects the following four experiences (in any order):
• Efficient at cleaning dishes: working at Corner Restaurant
• Good at following instructions: followed kitchen safety regulations daily
• Knowledge of food handling and preparation experience: work at Corner Restaurant
• Work well with team: won the 2007 sports team player award

Partial credit
Selects any three of the following four experiences (in any order):
• Efficient at cleaning dishes: working at Corner Restaurant
• Good at following instructions: followed kitchen safety regulations daily
• Knowledge of food handling and preparation experience: work at Corner Restaurant
• Work well with team: won the 2007 sports team player award

Comment
The task in the second question of this unit is for students to open a job advertisement, then adopt the role of someone applying for this job, and to decide which of a list of qualifications and experiences from their résumé are relevant to the job described in the advertisement. They are not required to write a job application, nor to have experience of working or applying for jobs, as all the information needed is supplied in the texts.
This question opens on a different page from the first, although it is still part of the same Job Search website. The open tab “Today’s Jobs” from the previous question has been replaced here with “Current Job”, and the main text displayed here is the prominent link “View details of job: Juice Bar Team Members”, which is the job from the list displayed in the previous question that is suitable for school students. The link also explains that if it is clicked, a new tab will open.
This question requires relatively complex navigation, in that students need to open several pages, and to compare multiple pieces of information on two of them. The question provides explicit instructions to students about these navigation steps, directing them to click on the prominent link on the page that is open to view the job advertisement, then on a link on that page, and finally to select four options in the drop-down menus available on the third page that will open. They are also directed to use information from the job advertisement to inform their selection.
The screen shot below shows the job advertisement, and how the tab for the “Job Search” page remains open. The “Apply Now” link referred to in the task instructions is only visible if students scroll down or maximise the page.
The text of the “Juice Bar” advertisement is written in a way designed to appeal to young people, using terms such as “energetic”, “vibrant”, “HEAPS OF ENERGY”, “FUN” and “a bit of extra cash”, and aims to make the job appear accessible to people even if they do not have relevant experience (“preferred but not essential”). The absence of specialist language, and the list format for qualifications of Juice Bar workers and of the available shifts, mean that the reading demand should be fairly low, as each idea is expressed in a minimum number of words.
The screen shot below shows how the third page opens in a third tab. Students are therefore able to move between the key pages, the “Juice Bar” advertisement and the “My Résumé” page, by clicking on these tabs.
In order to view all the options in the drop-down menus, students need to scroll down using the arrows under “My Relevant Skills and Experience”. The screen shots below illustrate the first six options that students see when they first click on one of the arrows, and then the last six of the available ten, when they scroll down within the window. The list of ten options is the same for each drop-down menu.
A number of irrelevant links can be clicked, but these are at the bottom of the “Job Search” pages and are not prominent. Clicking on the tabs on the “Job Search” page allows students alternative (if slightly longer) pathways for navigating to the “Juice Bar” and “My Résumé” pages.
Students are directed to refer to the Juice Bar advertisement for the job specifications, in order to inform their choices when completing the drop-down lists. It is to be expected that many students will switch between these two pages several times, to be sure they have obtained all the information they need.
Because navigation for this task requires a number of steps, including comparing information on two pages and making choices in four different drop-down menus, it can be seen as relatively complex. The first three relevant options are visible without scrolling down. Although a level of common sense may assist in making choices relevant to a job in a juice bar, only reference to the advertisement can confirm the nature of the job requirements. It is relatively easy for students to obtain partial credit by correctly selecting three relevant options. The fourth option relevant to the job advertisement is the final one in the list, and selecting it requires making an inference to link winning a “Sports Team Player of the Year award”, which is not immediately relevant to working in a juice bar, with the job requirement, “co-operates in a group”. In order to obtain full credit for this item, students need to select all four relevant options. The combination of the multiple navigation steps, the multiple drop-down menus, and the need for an inference to select the fourth relevant option contributes to the relative difficulty of obtaining full credit for this item.

JOB SEARCH – QUESTION 3

Situation: Occupational
Environment: Authored
Text format: Mixed
Text type: Description
Aspect: Reflect and evaluate – Reflect on and evaluate content of text
Question format: Open constructed response
Difficulty: 558 (Level 4)
[Proficiency band graphic: Level 5 or above from 626 score points; Level 4 from 553; Level 3 from 480; Level 2 from 407; Below Level 2 under 407]

“Note: Successful applicants can work a maximum of two shifts per week.”
Why do you think the employer has made this rule?
Scoring
Question intent:
• Reflect and evaluate – Reflect on and evaluate content of text
• Hypothesise about the reason for including a condition in a job advertisement using prior knowledge and information from the text

Full credit
Code 1: Refers (explicitly or implicitly) to a benefit or protection for the employer OR employee. Must be consistent with the stipulation of not working more than two shifts and with working a fixed two shifts. May refer to the flexibility, reliability or effectiveness of the (pool of) employees, or to the employer’s concerns about employee welfare.
• It is safer that way because the business can still operate OK if someone is off work for a few weeks.
• Students often have other priorities at those times. [“Those times” refers to the shifts in the advertisement. Implies benefit to employee.]
• It is unlikely most students can do more than 2 shifts a week.
• They don’t want to rely on any one person. [Implied protection from risk.]
• They say that at the start in case you’re not very good.
• They want lots of different people working there.
• They want lots of happy faces.
• They don’t want you to get tired.
• Because it’s a tough job, and they don’t want you to get tired and quit.
• Because they want a big staff in case someone quits or gets sick.
• Because the chaos at the Juice Bar is too much for anyone more often than twice a week.
• Because the best workers are people with other interests/hobbies than the job, and they want you to keep doing what you like.
• So students and other people who may be studying or holding down other jobs can still work casually but don’t have the restrictions of working all day every day.

Comment
The final question in this unit requires no navigation beyond scrolling down the open page (the advertisement for the Juice Bar) to view the sentence referred to in the instructions, that students may work no more than two shifts a week.
Students need to draw on their world knowledge as well as ideas presented in the advertisement to understand why such a restriction might be included. They receive credit for answers that consider the interests of either the employer or the employee. Clues in the advertisement that may be relevant include the reference to a team (numerous workers), the busy, energetic nature of the work, and the need to present a happy face. The requirement for students to make plausible links between these ideas in the text and their potential implications in the real world, seeing the viewpoint of either employers or employees rather than considering only their own situation, is likely to play a major role in the relative difficulty of this item.
SIMILARITIES AND DIFFERENCES BETWEEN DIGITAL AND PRINT READING ASSESSMENTS

The framework for reading (see Box VI.1.1 and OECD, 2010a, Chapter 2) treats digital and print reading as a single domain, while acknowledging that there are some intrinsic differences. A key distinction that underpins many consequential differences is the fact that, in the digital medium, the reader is generally unable to see the physical extent of the available text at any given moment, while at the same time he or she has almost immediate access to a nearly infinite array of material via the Internet. The differences reflected in the framework were built into the design of the two assessments and the tests themselves. In this section, the construct and balance of the digital and print assessment instruments are compared in relation to the assessment framework, and then the design and operational similarities and differences of the two assessments are reviewed.

Framework characteristics and test construct
The intent in developing and extending the framework for reading to include digital reading was both to acknowledge the unitary nature of reading, regardless of medium, and to respect the differences between digital and print. Two main framework variables, text and aspect, shape the development of both digital and print reading assessments.2

Text
The PISA framework for reading describes four text characteristics: medium (print or digital), environment, format and text type. The text environment category – authored or message-based – is only applicable to digital reading.
The main categories of text format in print reading are continuous and non-continuous, reflecting the fact that in print, readers often have access to and encounter only a single text at a particular time.
While in everyday life readers often need to consult several print texts, PISA makes only minimal use of such tasks for practical reasons. By contrast, a computer-based reading assessment makes presentation of multiple texts a practical possibility, as well as reflecting the reality that in the world of hypertext, on which the PISA digital reading assessment is focused, there is almost unlimited access to texts; and reading in this medium usually involves referring to several pages, and often to several texts from different sources, composed by different authors and appearing in different formats. The distribution of tasks by text format in the two media thus reflects both typical reading practices and a better opportunity for large-scale assessment to measure readers’ capacity to access, sort and selectively use several texts.
Figure VI.2.15 shows the number and percentage of score points by text format on the PISA 2009 digital and print reading scales. The numbers and percentages quoted in this and the following similar figures relate to score points rather than individual task numbers. This allows for a more accurate representation of the relative weighting of these categories in the instruments.

• Figure VI.2.15 •
Distribution of score points in digital and print reading assessments, by text format

Text format        PISA 2009 digital: score points (% of total)    PISA 2009 print: score points (% of total)
Continuous         2 (5)                                           87 (62)
Non-continuous     4 (11)                                          41 (29)
Mixed              2 (5)                                           7 (5)
Multiple           30 (79)                                         5 (4)
Total              38 (100)                                        140 (100)

Source: OECD, PISA 2009 Database.
http://dx.doi.org/10.1787/888932435378

Text type refers to the rhetorical structure of a text. The category transactional was introduced into the PISA 2009 framework to reflect such texts as e-mails and text messages, which are the predominant type of text encountered by many digital readers (see Chapters 4 and 5).
While transactional texts also exist in the print medium, in personal letters and notes, for example, they are not as prominent. Conversely, the category narration is more prominent in the print reading assessment, representing its importance in print reading behaviour. The substance of narration is social and personal experience and imaginative life, in the form of literature, history, biography and memoir. These texts are typically an important part of school curricula, and they are also valued types of reading by many individuals beyond school. Narration in the digital medium, in the form of e-books, was not yet common when the 2009 assessment was being developed in 2006-07. Figure VI.2.16 shows the number and percentage of score points by text type in the PISA 2009 digital and print reading assessments.
• Figure VI.2.16 •
Distribution of score points in digital and print reading assessments, by text type

Text type        PISA 2009 digital: score points (% of total)    PISA 2009 print: score points (% of total)
Argumentation    8 (21)                                          30 (21)
Description      11 (29)                                         32 (23)
Exposition       11 (29)                                         44 (31)
Narration        0 (0)                                           22 (16)
Instruction      0 (0)                                           12 (9)
Transaction      6 (16)                                          0 (0)
Not specified    2 (5)                                           0 (0)
Total            38 (100)                                        140 (100)

Source: OECD, PISA 2009 Database.
http://dx.doi.org/10.1787/888932435378

Aspects
Three cognitive processes, or aspects, are common to digital and print reading: access and retrieve, integrate and interpret, and reflect and evaluate.
The aspect access and retrieve involves orienting and searching, using knowledge of the medium’s structures and features to find information. In print reading, readers apply their skills in accessing and retrieving in a concrete space, while in digital reading they do so in a more abstract space. In addition, the sequence in which information is presented in print is more or less fixed, while in the digital medium readers construct their own sequences of information to a greater extent. As a result, the cognitive load of access and retrieve tasks in digital reading is generally greater than that in print reading. However, in the digital reading assessment, the degree to which readers have to construct their own sequence of information retrieval is often controlled by the task directives, such as “Click on the link …, then go to the page …” (for details, see Chapter 3).
The aspect integrate and interpret covers a very wide variety of cognitive tasks, including inferring the connection between one part of the text and another, processing the text to form a summary of the main ideas, identifying the distinction between principal and subordinate elements, finding a specific instance in the text of something earlier described in general terms, and comparing, contrasting and understanding figurative and nuanced language. All of these cognitive processes are common to digital and print reading. The main difference lies in what needs to be integrated. The number and diversity of the texts that can be drawn upon are usually much greater in the digital medium, and this is reflected in the PISA assessments. Integrating in the digital assessment requires the reader to consult multiple texts, sometimes in different formats, while integrate tasks in print reading usually focus on a single piece of stimulus.
The aspect reflect and evaluate involves thinking about the form and the content of texts, both in relation to personal experience and to more extrinsic standards. While predictive reading and critical evaluation are important in both media, readers of digital texts are more often required not only to predict what will be useful and relevant, because there is so much information to choose from, but also to judge the credibility of the content, given that publication is often not subject to any editorial filter between the author and the reader. This fact is reflected in the larger proportion of tasks in the assessment that focus on students’ capacity to evaluate what they read.
The percentage of tasks devoted to each of the aspects varies between the digital and print reading assessments. In print reading, tasks reflecting the integrate and interpret aspect occupy about half of the assessment, while access and retrieve and reflect and evaluate tasks each account for roughly one-quarter of the assessment.
Tasks in the digital reading assessment are more evenly spread across these three aspects. Moreover, in some digital reading tasks, readers must draw on all three aspects, for example in navigating to and between multiple texts, in sequences that may vary substantially. The digital reading assessment therefore adds another aspect, complex, to acknowledge the fact that the complexity of some tasks cannot be represented by any one of the three previously established aspects. Figure VI.2.17 shows the number and percentage of score points, by aspect, in the PISA 2009 digital and print reading assessments.
• Figure VI.2.17 •
Distribution of score points in digital and print reading assessments, by aspect

Aspect                   PISA 2009 digital: score points (% of total)    PISA 2009 print: score points (% of total)
Access and retrieve      7 (18)                                          34 (24)
Integrate and interpret  11 (29)                                         69 (49)
Reflect and evaluate     8 (21)                                          37 (26)
Complex                  12 (32)                                         0 (0)
Total                    38 (100)                                        140 (100)

Source: OECD, PISA 2009 Database.
http://dx.doi.org/10.1787/888932435378

Test design and operational characteristics
In addition to differences in the constructs of the two reading assessments, there were differences in how they were administered. Figure VI.2.18 sets out the major similarities and differences in the design and delivery of the PISA 2009 digital and print reading assessments.

• Figure VI.2.18 •
Similarities and differences between digital and print reading assessments in PISA 2009

Mode of delivery and data collection – Digital: computer-based delivery system; Print: pencil and paper.
Number of countries participating in the assessment – Digital: a subset of 19 (16 OECD countries and 3 partner countries/economies); Print: 65 (34 OECD countries and 31 partner countries/economies).
Required number of students per country – Digital: 1 500; Print: 4 500.
Actual average number of students per country that administered the assessment – Digital: OECD countries 1 944, partner countries/economies 1 820; Print: OECD countries 8 800, partner countries/economies 5 700.
Average number of students per school that administered the assessment – Digital: 10; Print: 30.
Number of items – Digital: 29; Print: 131.
Number of score points – Digital: 38; Print: 140.
Average test administration time per student – Digital: 40 minutes; Print: 65 minutes.
Average number of score points yielded per student – Digital: 25; Print: 33.
Scale construction – Digital: single digital reading scale; Print: single print reading scale and subscales based on aspects and text formats.

Source: OECD, PISA 2009 Database.
http://dx.doi.org/10.1787/888932435378

Mode of delivery and data collection, and implications for participation and sample numbers
The immediately obvious difference between the digital and print reading assessments is that the former was delivered and completed on a computer and the latter was delivered and completed with pen and paper. Because computer-based assessment is relatively new, technically challenging and requires substantial resources, many of the early attempts to assess digital reading and other computer-based knowledge and skills have used a paper-based format. In some instances, a hybrid model was used, in which the stimulus is delivered via computer but the responses are captured on paper. Conversely, as computer-based assessments become more common and cheaper, print reading is beginning to be assessed on line, with print-style texts represented digitally. For PISA, it was judged important to use computers both for delivering the tasks and for collecting students’ responses. This approach reflects the nature of the digital reading texts, thus allowing measurement of students’ activated knowledge about and skills in using texts in the medium. It also allows for collecting evidence about 15-year-olds’ performance on reading tasks in a way that reflects the definition of reading as entailing the capacity to “use … written texts”: for example, students respond to some digital tasks by selecting from drop-down menus (in the case of selected-response items), or in the form of a blog or e-mail message (in constructed-response items). These response formats provide an added dimension of authenticity.
The decision to use a digital mode of delivery and data collection had resource implications, which undoubtedly contributed to the fact that only 19 of the 65 PISA countries opted to participate in the 2009 digital reading assessment. The need to make computers available also influenced the decision to administer the digital reading assessment to a smaller sample than usual within the participating countries. One-third of the students in each sampled school who undertook the paper-based assessment were selected for the digital reading assessment. All of the students in the digital reading sample had also been assessed in print reading, so that comparisons between performances on digital and print reading can be made with confidence. For further details about implementing both assessments, see the PISA 2009 Technical Report (OECD, forthcoming).

Number of items and score points in digital and print reading
Just as the sample of students who participated in the digital reading assessment is smaller per country than the sample who participated in the paper-based assessment, so the pool of items used in 2009 is also comparatively small: 29 digital reading tasks compared with 131 print reading tasks. A larger proportion of the digital items has partial-credit scoring, however, which means that the ratio between the pooled score points for digital and print reading (38 compared with 140) is higher than that between items. These differences narrow further when considering the measures of student proficiency. Each student sampled for the paper-based assessment in PISA was administered a test of 120 minutes. Within this time all students in the sample spent between 30 and 120 minutes on reading tasks, with an average of 65 minutes of reading. (The students’ remaining time was dedicated to mathematics and/or science assessment tasks.)
All students in the subsample for the digital reading assessment were delivered 40 minutes of test material. In effect, while the whole item pool is much smaller for digital than for print reading, at the student level there was much less difference between the amounts of assessment data collected per student: on average, 33 score points for print reading and 25 score points for digital reading. As a result, the precision and reliability of the measurement of student performance in the two media are similar.
Nonetheless, from the perspective of framework coverage and reporting on subscales, the difference between print and digital reading in the numbers of items and score points is significant. In print reading, framework coverage is well supported by the comparatively large pool of items, and three aspect subscales (access and retrieve, integrate and interpret, and reflect and evaluate) and two text format subscales (continuous and non-continuous), as well as a single scale for print reading, were constructed and reported upon (see OECD, 2010b, Chapter 2). While the pool of 29 digital reading items allows for a light sampling of almost all of the categories of each of the major framework variables, yielding a single digital reading scale, there are insufficient data to support any subscale construction.

A COMPARISON OF PERFORMANCE IN DIGITAL AND PRINT READING
Overall, the correlation between digital and print reading performance is 0.83, with correlations for individual countries ranging from 0.71 to 0.89. By way of comparison, the correlations of print reading with mathematics and science (average for the 16 OECD countries) are 0.83 and 0.88, respectively; the correlations of digital reading with mathematics and science are 0.76 and 0.79, respectively. Though there is clearly a strong relationship between performance in print reading and digital reading, the correlation statistic also indicates some performance differences between the two types of reading.
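To make the summary statistics used in this section concrete, the sketch below computes a Pearson correlation between two sets of scores and applies the kind of linear rescaling that puts a score distribution onto a target mean and standard deviation. The scores are invented for illustration; PISA's operational scaling is based on item response theory modelling, not on this simple transformation.

```python
import statistics

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length score lists.
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def rescale(scores, target_mean=499.0, target_sd=90.0):
    # Linear transformation giving a score set the target mean and
    # standard deviation (here, the mean of 499 and SD of 90 reported
    # for the 16 participating OECD countries).
    m = statistics.fmean(scores)
    sd = statistics.pstdev(scores)
    return [(s - m) / sd * target_sd + target_mean for s in scores]
```

A correlation of 0.83, as reported between digital and print reading, indicates a strong but imperfect linear relationship: students' relative standings on the two scales largely, but not entirely, coincide.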
The scales for the two reading assessments were constructed in a similar way so that, when considering only the 16 OECD countries that participated in the digital reading assessment, the mean and standard deviation for both digital and print reading are 499 and 90, respectively (the digital scale having been constructed to match the PISA 2009 results in print reading of the 16 participating OECD countries). Therefore, country comparisons of reading performance in the two media are valid.

Students reaching the different levels of proficiency
In order to facilitate comparison, the proficiency levels for digital reading – Level 5 or above, Level 4, Level 3 and Level 2 – are aligned with the same proficiency levels for print reading. The comparison is limited by the fact that the number of digital reading items administered in 2009 was small, so that while print reading has seven described levels (Level 6 as the highest level and Level 1b as the lowest level), digital reading has only four. A comparison between the digital and print proficiency levels, and the percentage of students at each of the four parallel described levels, are provided in Figure VI.2.19.
    • 2 sTudenT perFormanCe in digiTal and prinT readingFigure VI.2.19 indicates that across oeCd countries the percentage of students performing at any given level indigital reading is similar to the percentage performing at the equivalent level in print reading. However, there arenotable differences at the country level. Figure VI.2.20 shows the distribution of students in each participatingcountry over the four described proficiency levels for digital reading and the parallel levels in print reading. theupper bar for each country shows the distribution of performance on the digital reading scale and the lower barshows the distribution on the print reading scale. • Figure VI.2.19 • a comparison of performance levels on the digital and print reading scales Digital reading Print reading lower Percentage of students able Percentage of students able score to perform tasks at this level or above to perform tasks at this level or abovelevel limit (oEcd average) (oEcd average) 5 7.8% 8.5% of students in the 16 participating oeCd countries can perform tasks or at least at level 5 on the reading scaleabove 626 30.5% of students in the 16 participating oeCd countries can perform tasks 4 30.3% at least at level 4 on the reading scale 553 59.6% of students in the 16 participating oeCd countries can perform tasks 3 60.7% at least at level 3 on the reading scale 480 82.6% of students in the 16 participating oeCd countries can perform tasks 2 83.1% at least at level 2 on the reading scale 407Source: oeCd, PISA 2009 Database.12 http://dx.doi.org/10.1787/888932435378Given that the digital reading scale was constructed to match the mean and standard deviation of the print readingscale, it follows that the oeCd average for performance is level 3 for digital and print reading; both bands spanthe score point range of 480 to 552, and most individual countries show the same results for their mean highestproficiency levels: level 3. 
An exception is Chile, where, on average, students are proficient at Level 2 for both digital and print reading. A few countries have different modal levels for digital and print reading. In Korea, New Zealand and Australia, Level 4 is the modal level in digital reading, while Level 3 is the modal level in print reading, and the proportion of students who reached Level 5 is greater in digital reading than in print reading. In other words, in these countries larger proportions of students can be described as "strong performers" in the digital medium than in the print medium. In contrast, in the partner economy Hong Kong-China, the modal level in digital reading is Level 3, while in print reading it is Level 4. The partner country Colombia has a similar disparity in performance between digital and print reading, with a modal level in print reading (Level 2) higher than that in digital reading (below Level 2).

On average, 7.8% of students in the participating OECD countries perform at Level 5 or above on the digital reading scale, while a slightly higher percentage (8.5%) performs at Level 5 or 6 in print reading. At the country level, there are three OECD countries in which more than 15% of students are proficient in digital reading at Level 5 or above: Korea (19.2%), New Zealand (18.6%) and Australia (17.3%); whereas only one country, New Zealand, has a comparable percentage of students performing at Level 5 or 6 in print reading (15.7%). The country with the second-highest percentage of students performing at Level 5 or 6 in print reading is Japan (13.4%), while only 5.7% of Japanese students are proficient at Level 5 or above in digital reading.

PISA's shorthand description of "lowest performers" applies to those performing below Level 2 of the (print and digital) reading, mathematics and science assessments.
On average across the 16 OECD countries that participated in the 2009 digital reading assessment, 16.9% of students performed below Level 2 in digital reading, while a similar percentage (17.4%) performed below the baseline Level 2 on the print reading scale. Although there is wide variation across countries, about the same percentages of students within most countries are proficient below the baseline level in digital and print reading; that is, the proportions of low-performing students in digital and print reading are within five percentage points of each other. Ireland and Japan are the only countries in which there is a substantially larger proportion of low-performing students in print reading. In Ireland, 17.2% are low performers in print reading compared with 12.1% in digital reading; in Japan, 13.6% perform below Level 2 in print compared
with 6.7% below Level 2 in digital reading. The picture is reversed in Chile, Hungary, Poland and the partner country Colombia, where there are substantially larger groups of low performers in digital than in print reading. The percentage of low performers in digital reading in Chile is 37.7%, while the percentage of low performers in print reading, while still substantial, is smaller, at 30.6%. In Hungary and Poland, the disparity is greater: the percentage of low performers in digital reading in Hungary is 26.8%, while in print reading it is only 17.6%; in Poland, 26.3% perform below the baseline level in digital reading, but only 15.0% do so in print reading. The partner country Colombia shows the greatest disparity: just over one-third of students (34.2%) perform below the baseline level in print reading, but two-thirds (68.4%) are below the baseline level in digital reading.

• Figure VI.2.20 •
Percentage of students at each proficiency level on the digital and print reading scales
[Paired horizontal bar chart: for each country, the upper bar shows digital reading and the lower bar print reading, with segments for Level 5 or above, Level 4, Level 3, Level 2 and below Level 2. Countries shown: Korea, Japan, Australia, Hong Kong-China, New Zealand, Macao-China, Ireland, Iceland, Sweden, Norway, Belgium, Denmark, France, OECD average-16, Spain, Poland, Hungary, Austria, Chile, Colombia.]
Countries are ranked in descending order of the percentage of students at Level 2 or above in digital reading.
Source: OECD, PISA 2009 Database, Table VI.2.1.
http://dx.doi.org/10.1787/888932435378

Average level of proficiency

Another way of summarising the differences between countries is to compare their mean performances in the two reading media. A mean of 499 and a standard deviation of 90, respectively – the pooled averages for the 16 participating OECD countries – are the benchmarks against which each country's digital and print reading performances in PISA 2009 are compared.
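The benchmarking idea described above can be illustrated with a small sketch: a linear rescaling that anchors a reference group of scores to a mean of 499 and a standard deviation of 90, mirroring how the digital reading scale was anchored to the print reading results of the 16 participating OECD countries. This is an illustration only, not the actual PISA scaling procedure (which works with item response theory and plausible values); the raw values below are invented.

```python
# Illustrative only: linear rescaling so that a chosen reference group has
# mean 499 and standard deviation 90. The raw estimates below are invented;
# this is NOT the actual PISA scaling method, only the anchoring idea.
from statistics import mean, pstdev

TARGET_MEAN, TARGET_SD = 499, 90

def rescale(scores, reference_scores):
    """Map scores so the reference group has the target mean and SD."""
    m, s = mean(reference_scores), pstdev(reference_scores)
    return [TARGET_MEAN + TARGET_SD * (x - m) / s for x in scores]

# Hypothetical raw (logit-like) proficiency estimates for a pooled sample:
reference = [-1.2, -0.4, 0.0, 0.3, 1.3]
rescaled = rescale(reference, reference)
print(round(mean(rescaled), 1), round(pstdev(rescaled), 1))  # 499.0 90.0
```

Because the transformation is linear, relative distances between students are preserved, which is why country comparisons on the anchored scale remain valid.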
Figure VI.2.21 shows each country's mean scores for digital and print reading. Statistically significant differences are highlighted.

The figure shows that only a handful of individual countries – Japan, France, Belgium, Norway and Spain – have a similar mean for digital and print reading.

• Figure VI.2.21 •
Comparison of mean performance in digital and print reading

                     Digital reading       Print reading         Difference (digital – print)
                     Mean score  S.E.      Mean score  S.E.      Mean dif.  S.E.
OECD
  Australia          537        (2.8)      515        (2.3)       21.70     1.81
  Austria            459        (3.9)      470        (2.9)      -11.70     2.98
  Belgium            507        (2.1)      506        (2.3)        1.45     1.61
  Chile              435        (3.6)      449        (3.1)      -14.85     2.41
  Denmark            489        (2.6)      495        (2.1)       -5.99     1.91
  Spain              475        (3.8)      480        (3.1)       -4.95     2.79
  France             494        (5.2)      496        (3.4)       -1.35     4.82
  Hungary            468        (4.2)      494        (3.2)      -25.84     2.92
  Ireland            509        (2.8)      496        (3.0)       13.27     2.64
  Iceland            512        (1.4)      500        (1.4)       11.56     0.94
  Japan              519        (2.4)      520        (3.5)       -0.63     2.91
  Korea              568        (3.0)      539        (3.5)       28.31     1.99
  Norway             500        (2.8)      503        (2.6)       -3.28     2.00
  New Zealand        537        (2.3)      521        (2.4)       16.48     1.70
  Poland             464        (3.1)      500        (2.6)      -36.96     2.20
  Sweden             510        (3.3)      497        (2.9)       12.90     2.11
  OECD average-16    499        (0.8)      499        (0.7)        0.01     0.63
Partners
  Colombia           368        (3.4)      412        (3.6)      -43.06     2.64
  Hong Kong-China    515        (2.6)      533        (2.1)      -18.36     2.40
  Macao-China        492        (0.7)      487        (0.9)        5.29     0.84

Note: Values that are statistically significant are indicated in bold (see Annex A3).
Source: OECD, PISA 2009 Database.
http://dx.doi.org/10.1787/888932435378

In Poland, Hungary, Chile, Austria, Denmark, the partner economy Hong Kong-China and the partner country Colombia, students perform significantly better, on average, in print than in digital reading. In Korea, Australia, New Zealand, Ireland, Sweden, Iceland and the partner economy Macao-China, students perform significantly better, on average, in digital than in print reading.
There is a tendency for the higher-performing countries in both media to do better in digital media, while the lower-performing countries perform more strongly in print media, although Hong Kong-China is an exception.

Another way of comparing countries' performance is to look at their ranking. Because the figures are derived from samples, it is not possible to determine a precise rank among the participating countries. It is possible, however, to determine, with 95% likelihood, a range of ranks in which the country's performance level lies. Figure VI.2.22 shows the relative ranking of the participating countries in digital and print reading.

Figure VI.2.22 shows that Korea ranks first among OECD countries in both digital and print reading, and Chile ranks last. The partner economy Hong Kong-China is ranked at the same level as Korea in print reading, but is below it by several ranks in digital reading. At the other end of the scale, the partner country Colombia is ranked last among all the participating countries on both scales. Around the middle of the ranking, near the OECD average, there is a wide band of possible ranks in both media. For example, Denmark ranks between ninth and thirteenth among OECD countries for print reading and between tenth and eleventh for digital reading. France's position is even more difficult to ascertain: it ranks anywhere between seventh and thirteenth for print reading and between ninth and eleventh for digital reading. For these countries, there is no clear difference in relative position on the two scales. However, for other countries, the ranking does shed light on relative performance on the two scales. Spain and the partner economy Macao-China rank higher on the digital reading scale than on the print reading scale. Ireland and Australia also show this pattern, but for these two countries, possible ranks overlap.
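The idea of a 95% range of ranks can be sketched by simulation: draw each country's "true" mean from a normal distribution defined by its estimated mean and standard error, rank the draws, and take the 2.5th and 97.5th percentiles of each country's simulated rank. This is a simplified stand-in for the method described in Annex A3, not a reproduction of it; the means and standard errors below are a subset taken from Figure VI.2.21 (digital reading).

```python
# Simplified sketch of a rank-range estimate: simulate plausible mean scores
# from each country's mean and standard error (digital reading values from
# Figure VI.2.21), then take the 2.5th/97.5th percentiles of each country's
# simulated rank. This mimics, but does not reproduce, the Annex A3 method.
import random

means = {"Korea": (568, 3.0), "New Zealand": (537, 2.3),
         "Australia": (537, 2.8), "Japan": (519, 2.4),
         "Iceland": (512, 1.4), "Sweden": (510, 3.3)}

def rank_ranges(means, n_sims=10_000, seed=1):
    rng = random.Random(seed)
    ranks = {c: [] for c in means}
    for _ in range(n_sims):
        draws = {c: rng.gauss(m, se) for c, (m, se) in means.items()}
        for r, c in enumerate(sorted(draws, key=draws.get, reverse=True), 1):
            ranks[c].append(r)
    return {c: (sorted(r)[int(0.025 * n_sims)], sorted(r)[int(0.975 * n_sims)])
            for c, r in ranks.items()}

print(rank_ranges(means)["Korea"])        # clearly first: (1, 1)
print(rank_ranges(means)["New Zealand"])  # overlaps Australia: (2, 3)
```

Countries with identical or near-identical means (here New Zealand and Australia, both at 537) get overlapping rank ranges, which is exactly why a single precise rank cannot be assigned.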
• Figure VI.2.22 •
Where countries rank in digital and print reading performance

Shading in the original figure indicates whether a country is statistically significantly above, not statistically significantly different from, or statistically significantly below the OECD average. Ranges of rank are shown separately for OECD countries and for all participating countries/economies.

                    Digital reading scale                        Print reading scale
                    Mean score    Range of rank                  Mean score    Range of rank
                    (S.E.)        OECD      All countries        (S.E.)        OECD      All countries
Korea               568 (3.0)     1–1       1–1                  539 (3.5)     1–1       1–2
New Zealand         537 (2.3)     2–3       2–3                  521 (2.4)     2–3       3–4
Australia           537 (2.8)     2–3       2–3                  515 (2.3)     3–4       4–5
Japan               519 (2.4)     4–4       4–5                  520 (3.5)     2–4       3–5
Hong Kong-China     515 (2.6)     –         4–7                  533 (2.1)     –         1–2
Iceland             512 (1.4)     5–7       5–8                  500 (1.4)     6–10      7–11
Sweden              510 (3.3)     5–8       5–9                  497 (2.9)     7–13      8–14
Ireland             509 (2.8)     5–8       6–9                  496 (3.0)     8–13      9–14
Belgium             507 (2.1)     6–8       7–9                  506 (2.3)     5–7       6–8
Norway              500 (2.8)     9–10      10–11                503 (2.6)     5–9       6–10
France              494 (5.2)     9–11      10–13                496 (3.4)     7–13      8–14
Macao-China         492 (0.7)     –         11–13                487 (0.9)     –         15–15
Denmark             489 (2.6)     10–11     11–13                495 (2.1)     9–13      10–14
Spain               475 (3.8)     12–13     14–15                481 (2.0)     14–14     16–16
Hungary             468 (4.2)     12–14     14–16                494 (3.2)     9–13      9–14
Poland              464 (3.1)     13–15     15–17                500 (2.6)     5–11      6–12
Austria             459 (3.9)     14–15     16–17                470 (2.9)     15–15     17–17
Chile               435 (3.6)     16–16     18–18                449 (3.1)     16–16     18–18
Colombia            368 (3.4)     –         19–19                413 (3.7)     –         19–19

Note: See Annex A3 for a detailed description of how the range of ranks is computed.
Source: OECD, PISA 2009 Database.
http://dx.doi.org/10.1787/888932435378

Gender differences in performance on the digital and print reading scales

The mean difference between boys' and girls' performance in digital reading is 24 score points, in favour of girls, while the mean gender difference in print reading for the same 16 OECD countries is 38 score points.
There is still a marked difference in performance in favour of girls in digital reading, but it is less extreme than the disparity between boys' and girls' performance in print reading. Figure VI.2.23 shows the scores for boys and girls in digital and print reading, ranked by the gender difference in digital reading performance. In all participating countries and economies the gender gap was wider in print than in digital reading.

The variations in the size of the gender gap among countries do not seem to be associated with the absolute levels of performance. In the highest-performing country in both digital and print reading, Korea, the gender gaps in both media are close to the respective OECD averages, while in one of the other top-performing countries, New Zealand, the gender gaps in both media are among the widest of all countries. Among countries performing below the OECD average in digital and print reading, Austria has a substantially narrower gap between boys and girls in digital reading (22 points) than in print reading (41 points), while the gaps between Chilean boys and girls in digital and print reading are almost the same (19 points and 22 points, respectively).

Of the 19 countries and economies that participated in the digital reading assessment, those with the widest gender gaps in digital reading tend to have comparatively wide gender gaps in print reading as well. New Zealand, for example, shows a large gender gap in digital reading (40 points) and in print reading (46 points). Ireland and Australia show a similar pattern. In these countries, whatever factors might explain the performance differences between boys and girls in the digital medium seem to be the same as, or at least have a similar effect to, those that underpin performance differences in the print medium.
Like these predominantly English-speaking countries, three of the Nordic countries – Norway, Iceland and Sweden – have above-average gender gaps in digital reading performance (girls outperform boys by 35, 30 and 26 score points, respectively). However, unlike in New Zealand, Ireland and Australia, these three Nordic countries have much wider gender gaps in print reading than in digital reading: girls in Norway, Iceland and Sweden outperform boys by 47, 44 and 46 score points, respectively, in print reading. Poland also has an above-average gap (29 points) between girls' and boys' performance in digital reading, and a very large gap of 50 points in print reading.
• Figure VI.2.23 •
Comparison of gender gaps in digital and print reading
[For each country, the chart shows mean scores for boys and girls on the digital and print reading scales, alongside the gender difference (girls – boys) in score points. Countries shown, ranked in descending order of the gender difference in digital reading performance: New Zealand, Norway, Ireland, Iceland, Poland, Australia, Sweden, OECD average-16, Belgium, Japan, Austria, Hungary, France, Spain, Chile, Korea, Macao-China, Hong Kong-China, Denmark, Colombia.]
Note: Gender differences that are statistically significant are marked in a darker tone.
Source: OECD, PISA 2009 Database, Table VI.2.4.
http://dx.doi.org/10.1787/888932435378

Conversely, countries with narrow gender gaps in digital reading tend to have narrow gender gaps in print reading as well. In some cases, the differences in the gaps, measured in score points, are quite small. For example, the partner country Colombia shows no significant gender gap in digital reading proficiency and a gap of only eight score points between boys and girls in print reading. The OECD countries Chile, Spain and Belgium also show relatively small differences in the gender gap in performance for both digital and print reading.

In another group of countries with below-average gaps between boys' and girls' performance in digital reading, there is a much more substantial gender gap in print reading. In the two partner economies, Macao-China and Hong Kong-China, the gap between boys and girls in digital reading is only 12 and 8 points, respectively, while the gender gap in print reading proficiency is just a little below the OECD average of 38 points, at 34 and 33 points, respectively.
Denmark has a gap of just six points between boys and girls in digital reading proficiency; but while the gender gap in print reading proficiency is below average, it is still a substantial 29 score points. For these two economies and one country, it would appear that the factors influencing boys' and girls' digital reading proficiency are different from those that affect their proficiency in print reading.

While girls are generally more proficient readers in both media, on average, girls score seven points lower in digital reading than in print reading, and boys score seven points higher. It was noted above that a handful of individual countries – Japan, France, Belgium, Norway and Spain – have a similar mean for digital and print reading. For some
of these countries, however, the apparent similarity in performance across the two media masks significant gender differences. France and Norway, the only two countries whose performance in both digital and print reading was not significantly different from the OECD average, offer illustrations. Their "average" performance masks the fact that French girls scored 11 points lower in digital reading than they did in print reading, while French boys scored 9 points higher in digital than in print reading. Similarly, Norwegian girls scored 10 points lower in digital than in print reading, while Norwegian boys performed about the same on the two assessments. Two other countries, Japan and Denmark, and the partner economy Macao-China, also show girls scoring lower in digital reading than in print reading, while boys attain higher scores. In Sweden, Iceland and Korea, both boys and girls performed better in digital than in print reading, but boys improved by the wider margin. In contrast, in Poland, Austria, Hungary and the partner economy Hong Kong-China, both boys and girls performed worse in digital than in print reading, but in Poland and Hong Kong-China girls performed much worse in digital reading. In these countries, policy makers might consider developing strategies specifically to improve girls' familiarity with and skills in reading digital texts.

In summary, then, it is clear that, on average, the gap between boys' and girls' proficiency that has been such a constant feature of print reading performance is narrower in digital reading, but in every country except one it has not disappeared. It is clear, too, that there is a good deal of variation across countries in the relative sizes of the gaps in performance between boys and girls across the two media.
The variations do not appear to be associated with the absolute levels of performance, but there are some interesting patterns among countries with cultural and/or linguistic similarities that would reward further investigation. Some of the possible explanations are explored in succeeding chapters.

A composite scale for digital and print reading

Because readers today need to handle texts in both digital and print media, it is useful to consider reading proficiency as a single measure. Accordingly, PISA has developed a composite reading scale. The scale is based on equal weighting of results from the two assessments: an arithmetic average. The equal weighting is justified in measurement terms because both of the assessments estimate student proficiency reliably. It is justified in construct terms because proficiency in both digital and print reading is essential for citizens of the 21st century (for further details, see Annex A1a). The distribution of the digital reading items on a single scale is similar to the distribution of the print reading items, and when the two sets of items are calibrated together, the estimates of the difficulty of each item are similar to their estimates on the separate scales.

Since the same methodology was used to construct the scales for digital and print reading proficiency, with the hierarchy of levels set at the same cut-points on the respective scales and the level bands of the same widths, it is possible to align the descriptions of results for those levels in digital reading where there are sufficient data. In generating descriptions for the composite levels, the combined set of items from the two separate scales was again inspected, and the main common features were identified as characteristics of the new composite level. The descriptions also include some elements specifically relating to navigation, consistent with items within the level.
Thus, the construction of a composite scale provides an overall picture of reading proficiency that is both qualitatively and quantitatively consistent with the two separate scales. Figure VI.2.24 shows the match between the digital and print reading levels. The numerical terms used to describe proficiency in print reading have been adopted for the composite reading scale to allow the full range of descriptions, though the absence of digital reading items at the highest and lowest levels means that the descriptions at the extremes are confined to print reading.

• Figure VI.2.24 •
Alignment between the described levels for digital and print reading and composite reading

Lower score limit   Digital reading        Print reading            Composite reading
698                 Level 5 or above       Level 6                  Level 6
626                   (from 626 upward)    Level 5                  Level 5
553                 Level 4                Level 4                  Level 4
480                 Level 3                Level 3                  Level 3
407                 Level 2                Level 2                  Level 2
335                 Below Level 2          Level 1a                 Level 1a
262                   (undescribed)        Level 1b                 Level 1b
(below 262)                                Below Level 1b           Below Level 1b
                                           (undescribed)            (undescribed)

Source: OECD, PISA 2009 Database.
http://dx.doi.org/10.1787/888932435378
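As a back-of-the-envelope illustration of the equal weighting described above, a student's composite result can be thought of as the arithmetic average of the two scale scores, which is then mapped to a described level using the cut-points in Figure VI.2.24. This is a simplification: the actual PISA composite is derived from jointly calibrated item responses, not by averaging two reported scores, and the student scores below are invented.

```python
# Simplified illustration of the equal-weighting idea behind the composite
# scale: average the digital and print results, then read off the described
# level using the cut-points from Figure VI.2.24. (PISA actually derives the
# composite from a joint calibration; the student scores here are invented.)

def composite_score(digital: float, print_score: float) -> float:
    """Equal-weighted composite of digital and print reading scores."""
    return (digital + print_score) / 2

def composite_level(score: float) -> str:
    """Map a composite score to its described level (Figure VI.2.24 cut-points)."""
    cuts = [(698, "Level 6"), (626, "Level 5"), (553, "Level 4"),
            (480, "Level 3"), (407, "Level 2"), (335, "Level 1a"),
            (262, "Level 1b")]
    for lower, label in cuts:
        if score > lower:
            return label
    return "Below Level 1b"

print(composite_level(composite_score(537, 515)))  # 526 -> Level 3
```

Note that each band is open at its lower limit and closed at its upper limit ("scores higher than 480 but lower than or equal to 553" for Level 3), which the `score > lower` comparison reproduces.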
• Figure VI.2.25 •
Summary descriptions for the composite reading scale (digital and print combined)

Level 6 (lower score limit 698; 0.6% of students across the OECD countries can perform tasks at this level or above)
Tasks at this level typically require the reader to make multiple inferences, comparisons and contrasts that are both detailed and precise. They require demonstration of a full and detailed understanding of one or more texts and may involve integrating information from more than one text. Tasks require the reader to deal with unfamiliar ideas, in the presence of prominent competing information, and to generate abstract categories for interpretations. Reflect and evaluate tasks may require the reader to hypothesise about or critically evaluate a complex text on an unfamiliar topic, taking into account multiple criteria or perspectives, and applying sophisticated understandings from beyond the text. There are limited data about both access and retrieve tasks and digital tasks at this level, but in both cases it appears that a salient condition is precision of analysis and fine attention to detail that is inconspicuous in the texts.

Level 5 (lower score limit 626; 6.6% of students at this level or above)
Tasks at this level that involve retrieving information require the reader to locate and organise several pieces of deeply embedded information. In tasks requiring interpretation, the reader needs to draw on a full and detailed understanding of one or more texts whose content or form is unfamiliar. Reflect and evaluate tasks require critical evaluation or hypothesis, drawing on specialised knowledge. These tasks typically require the reader to generate the criteria on which a critical evaluation is based. In the digital medium, tasks may require the reader to navigate across several sites without guidance, negotiating information in different formats. For all aspects of reading, tasks at this level typically involve dealing with concepts that are contrary to expectations or ambiguous.

Level 4 (lower score limit 553; 29.8% of students at this level or above)
Tasks at this level that involve retrieving information require the reader to locate and organise several pieces of embedded information. Some tasks at this level require interpreting the meaning of nuances of language in a section of text by taking into account the text as a whole. Other tasks require understanding and applying categories in an unfamiliar context. Reflect and evaluate tasks at this level require readers to use formal or public knowledge to hypothesise about or critically evaluate a text. Digital reading tasks may require the reader to navigate across a number of sites with only limited guidance. Readers must demonstrate an accurate understanding of long or complex texts whose content or form may be unfamiliar, particularly texts that deal with scientific or technical content.

Level 3 (lower score limit 480; 60.5% of students at this level or above)
Tasks at this level require the reader to locate, and in some cases recognise the relationship between, several pieces of information that must meet multiple conditions. In tasks requiring interpretation, the reader may need to integrate several parts of a text in order to identify a main idea, understand a relationship or construe the meaning of a word or phrase. The reader needs to take into account many features in comparing, contrasting or categorising. Often the required information is not prominent or there is much competing information; or there are other text obstacles, such as ideas that are contrary to expectation or negatively worded. Reflect and evaluate tasks at this level may require connections, comparisons and explanations, or they may require the reader to evaluate a feature of the text. Some reflect and evaluate tasks require readers to demonstrate a fine understanding of the text in relation to familiar, everyday knowledge. Other tasks do not require detailed text comprehension but require the reader to draw on less common knowledge. In the digital medium, the task may require several steps of well-directed navigation. Where evaluation is required, the reader needs to generate simple categories, and apply them using the information that is most directly accessible, or only part of the available information.

Level 2 (lower score limit 407; 83.5% of students at this level or above)
Some tasks at this level require the reader to locate one or more pieces of information, which may need to be inferred and may need to meet several conditions. Others require recognising the main idea in a text, understanding relationships, or construing meaning within a limited part of the text when the information is not prominent and the reader must make low-level inferences. Tasks at this level may involve comparisons or contrasts based on a single feature in the text. In the print medium, typical reflect and evaluate tasks at this level require readers to make a comparison or several connections between the text and outside knowledge, by drawing on personal experience and attitudes. In the digital medium, tasks require locating and interpreting well-defined information, usually in familiar contexts. The task may require navigation across a limited number of sites and use of other web-based tools such as drop-down menus; if so, the reader is supplied with clear directions to the relevant links.

Level 1a (lower score limit 335; 95.1% of students at this level or above)
Tasks at this level require the reader to locate one or more independent pieces of explicitly stated information; to recognise the main theme or author's purpose in a text about a familiar topic; or to make a simple connection between information in the text and common, everyday knowledge. Typically the required information in the text is prominent and there is little, if any, competing information. The reader is explicitly directed to consider relevant factors in the task and in the text. There are limited data about digital reading at this level, but it appears that, if access to more than one page is required for a task, navigation directions are explicit and links are prominent.

Level 1b (lower score limit 262; 99.2% of students at this level or above)
Tasks at this level require the reader to locate a single piece of explicitly stated information in a prominent position in a short, syntactically simple text with a familiar context and text type, such as a narrative or a simple list. The text may provide support to the reader, such as repetition of information, pictures or familiar symbols. There is minimal competing information. In tasks requiring interpretation, the reader may need to make simple connections between adjacent pieces of information. (There are insufficient data about digital reading at this level.)

Source: OECD, PISA 2009 Database.
http://dx.doi.org/10.1787/888932435378
Students reaching the different levels of proficiency on the composite reading scale

Figure VI.2.25 describes the composite reading scale. Although there were few digital reading tasks designed to reflect the equivalent level of difficulty of tasks at Levels 1a, 1b and 6 in print reading, student performance can nevertheless be accurately measured across all seven levels of the composite reading scale. The distribution of student performance across these proficiency levels for each participating country and economy is shown in Figure VI.2.26. Table VI.2.1 provides figures for the percentage of students at each proficiency level on the composite reading scale, with standard errors.

• Figure VI.2.26 •
How proficient are students on the composite reading scale?
[Stacked bar chart showing, for each country, the percentage of students at each level from below Level 1b to Level 6. Countries shown: Korea, Hong Kong-China, Japan, Australia, New Zealand, Macao-China, Norway, Ireland, Iceland, Sweden, Denmark, OECD average-16, Belgium, France, Poland, Spain, Hungary, Austria, Chile, Colombia.]
Countries are ranked in descending order of the percentage of students at Levels 2, 3, 4, 5 and 6.
Source: OECD, PISA 2009 Database, Table VI.2.1.
http://dx.doi.org/10.1787/888932435378

Proficiency at Level 6 (scores higher than 698 points)

The description of what students proficient at Level 6 know and can do is drawn almost entirely from the print reading scale, since only one digital reading item was calibrated at this level of difficulty. On average across OECD countries, 0.6% of 15-year-old students perform at this level. Only two countries have a significantly higher percentage of students performing at Level 6: New Zealand (2.5%) and Australia (2.2%).
Korea (0.8%), which ranks first in mean performance, attains close to the OECD average for students performing at this very high level, reflecting the relative homogeneity of its student population's proficiency in both digital and print reading. In some countries and economies, notably Chile, Spain, the partner country Colombia and the partner economy Macao-China, fewer than one-tenth of 1% of students are proficient at this level. As noted in Chapter 2 of Volume I, What Students Know and Can Do, the very small percentage of students performing at Level 6 illustrates that the PISA scale is capable of distinguishing reading proficiency up to the highest level of excellence among 15-year-olds.

Proficiency at Level 5 (scores higher than 626 but lower than or equal to 698 points)

On average across the 16 participating OECD countries, 7.2% of students are proficient at this level or above, but the proportions range from over twice this percentage in Korea and New Zealand to less than half of it in Chile, Austria, Spain, Poland, Denmark, the partner country Colombia and the partner economy Macao-China.

Proficiency at Level 4 (scores higher than 553 but lower than or equal to 626 points)

Across the 16 participating OECD countries, 29.8% of students are proficient at Level 4 or above. In Australia, New Zealand, Belgium, Iceland, France, Ireland and Sweden, about one-quarter of students attain Level 4 as their
highest level of proficiency. In Japan and the partner economy Hong Kong-China, the proportion is closer to one-third, while in Korea it is almost 40%. About one-fifth of students in Norway, Denmark, Hungary and Poland attain Level 4 as their highest level, while about 15% of students in Spain, Austria and the partner economy Macao-China attain this level. Some 8% of Chilean students and just over 2% of Colombian students attain this level as their highest level.

Proficiency at Level 3 (scores higher than 480 but lower than or equal to 553 points)

Just over 60% of 15-year-olds across the 16 participating OECD countries are proficient at Level 3 or above. It can be inferred, then, that the majority of young people in these countries is capable of dealing with many everyday reading tasks, regardless of the medium. However, in Hungary, Poland, Spain and Austria, only around 50% are proficient at Level 3 or above, and in Chile only a third of students attain this level of proficiency. This means that in these countries half or fewer of 15-year-olds are able to perform the kinds of reading tasks commonly expected of young people and adults in their everyday lives.

In all but two of the participating OECD countries, Level 3 is the modal level of highest attainment. The exceptions are Korea, whose modal highest attainment level is Level 4, and Chile, whose modal highest attainment level is Level 2. Students in the partner economies Hong Kong-China and Macao-China also most commonly perform at Level 3, while the modal performance level of Colombian students is Level 1a.

Proficiency at Level 2 (scores higher than 407 but lower than or equal to 480 points)

Across the participating OECD countries, some 84% of students are proficient at the baseline proficiency Level 2 or above.
Only in Austria, Chile and the partner country Colombia does the proportion of 15-year-olds proficient at this level fall below three-quarters.

Proficiency at Level 1a (from 335 to 408 points)
Some 95% of 15-year-old students across participating OECD countries are proficient at Level 1a or higher. In most countries, the proportion is well over 90%, while in Chile and Austria it is just under 90% (89.2% and 89.8%, respectively). In Colombia, nearly 75% of 15-year-olds perform at or above Level 1a; but for nearly one-third of students in this partner country, Level 1a is their highest performance level.

Proficiency at Level 1b (from 262 to 335 points) and below Level 1b (below 262 points)
The description of what students proficient at Level 1b know and can do is drawn entirely from the print reading scale. On average across the 16 participating OECD countries, 4% of students reach Level 1b as their highest level of proficiency. In Japan and the partner economies Hong Kong-China and Macao-China, fewer than 2% of students perform no higher than Level 1b, while in Korea the proportion of those students is less than 0.5%.

A small percentage of students in OECD countries perform below the lowest level on the PISA composite digital and print reading scale, Level 1b. On average, only 0.8% of students have scores below 262 points on the PISA scale. In the partner country Colombia, the lowest performing of the countries that participated in the 2009 digital reading assessment, just over 5% of students perform below this level on the composite reading scale.

Students whose proficiency is estimated at below Level 1b on the composite reading scale do not necessarily lack reading skills completely, but there is insufficient information on which to base a description of their reading proficiency, given the small number of tasks at that level presented in PISA 2009.
The fact that fewer than one in one hundred students across OECD countries cannot perform tasks at Level 1b demonstrates that the PISA reading scale can measure and describe the performance of almost all students.

Average level of proficiency
Figure VI.2.27 shows each country's mean score for composite digital and print reading. For each country shown in the middle column, the list in the right column shows countries whose mean scores are not sufficiently different to be distinguished with at least 95% certainty. For all other cases, one country has higher performance than another if it is listed above the second country in the middle column, and lower performance if it is listed below. For example, Hong Kong-China's performance, which comes fourth in the list, cannot be distinguished from that of New Zealand or Australia, which come second and third respectively, and Japan, which comes fifth. The darkest band in the middle shows those participating countries whose performance is not statistically significantly different from the OECD average.
• Figure VI.2.27 •
Comparing countries' performance on the composite reading scale
(In the original figure, shading indicates whether each country performs statistically significantly above, not significantly differently from, or significantly below the OECD average.)

Mean  Comparison country  Countries whose mean score is not statistically significantly
                          different from that of the comparison country
553   Korea
529   New Zealand         Australia, Hong Kong-China
526   Australia           New Zealand, Hong Kong-China, Japan
524   Hong Kong-China     New Zealand, Australia, Japan
520   Japan               Australia, Hong Kong-China
507   Belgium             Iceland, Sweden, Ireland, Norway
506   Iceland             Belgium, Sweden, Ireland, Norway
504   Sweden              Belgium, Iceland, Ireland, Norway, France
502   Ireland             Belgium, Iceland, Sweden, Norway, France
502   Norway              Belgium, Iceland, Sweden, Ireland, France
495   France              Sweden, Ireland, Norway, Denmark, Macao-China
492   Denmark             France, Macao-China
489   Macao-China         France, Denmark
482   Poland              Hungary, Spain
481   Hungary             Poland, Spain
478   Spain               Poland, Hungary
464   Austria
442   Chile
390   Colombia

Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435378

Korea is the top-performing country by a significant margin, with a mean score of 553. This indicates that, on average, 15-year-olds in Korea perform at the border between Levels 3 and 4 on the composite reading scale. New Zealand, Australia, the partner economy Hong Kong-China and Japan follow. These countries, together with Belgium and Iceland, perform significantly above the OECD average. Students in Sweden, Ireland, Norway and France perform near the OECD average, while students in Denmark and the partner economy Macao-China perform significantly below the OECD average, but their performance cannot be distinguished from that of students in France. Poland, Hungary and Spain follow. All of the countries mentioned above, except for Korea and Spain, have a mean level of proficiency within the Level 3 band.
Spain, Austria and Chile have a mean proficiency within Level 2, while the partner country Colombia's mean is within Level 1a.

• Figure VI.2.28 •
Where countries rank on the composite reading scale
(In the original figure, shading indicates whether each country performs statistically significantly above, not significantly differently from, or significantly below the OECD average.)

                  Composite reading scale   Range of rank
Country           Mean score  S.E.          OECD countries   All countries/economies
Korea             553         (3.1)         1–1              1–1
New Zealand       529         (2.2)         2–3              2–3
Australia         526         (2.4)         2–3              2–4
Hong Kong-China   524         (2.0)                          3–5
Japan             520         (2.6)         4–4              4–5
Belgium           507         (2.1)         5–8              6–9
Iceland           506         (1.3)         5–7              6–8
Sweden            504         (2.9)         5–9              6–10
Ireland           502         (2.6)         6–10             7–11
Norway            502         (2.5)         6–10             7–11
France            495         (3.7)         9–11             10–13
Denmark           492         (2.1)         10–11            11–13
Macao-China       489         (0.7)                          12–13
Poland            482         (2.6)         12–14            14–16
Hungary           481         (3.4)         12–14            14–16
Spain             478         (3.2)         12–14            14–16
Austria           464         (3.1)         15–15            17–17
Chile             442         (3.1)         16–16            18–18
Colombia          390         (3.2)                          19–19

Note: See Annex A3 for a detailed description of how the range of ranks is computed.
Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435378
For many of the countries it is not possible to determine a precise rank; however, it is possible to determine, with 95% likelihood, a range of ranks within which the country's performance level lies. Figure VI.2.28 shows the range of possible ranks for each country.

Gender differences in performance on the composite reading scale
As noted earlier, girls consistently outperform boys in digital and print reading, both on average across the OECD area and in individual countries. However, in digital reading, the gender gap is narrower by an average of 15 score points, and in the partner country Colombia, it disappears entirely. Given that the composite reading scale is an amalgam of the digital and print scales, with equal weighting for each, it is not surprising that the gender gap in favour of girls lies between the gender gap in print reading (38 score points) and that in digital reading (24 score points).

Figure VI.2.29 shows gender differences in reading performance for each country. Tables VI.2.2, VI.2.3 and VI.2.4 provide further details.

• Figure VI.2.29 •
Gender differences on the composite reading scale
[Horizontal bar chart listing countries from New Zealand (widest gender gap) to Colombia (narrowest); left panel: mean score on the composite reading scale for boys, all students and girls; right panel: gender difference (girls – boys) in score points.]
Note: Gender differences that are statistically significant are marked in a darker tone.
Countries are ranked in order of the score point difference between girls and boys.
Source: OECD, PISA 2009 Database, Table VI.2.4.
StatLink: http://dx.doi.org/10.1787/888932435378

The mean difference between boys' and girls' performance on the composite reading scale is 31 score points in favour of girls.
The mean score for boys is 483, near the bottom of Level 3, while it is 515 for girls, still within Level 3, but towards the top of the level. The difference in performance between boys and girls is statistically significant in all OECD countries and partner countries and economies except Colombia. New Zealand shows the largest gap among the 16 OECD countries and three partner countries and economies that participated in the 2009 digital reading assessment, with a gender gap of 43 points. Norway shows the next widest gap (41 points), then Poland (39 points), Iceland (37 points), Sweden (36 points) and Ireland (35 points). All of these countries, except Poland, are at or above the OECD average in mean proficiency. Other chapters discuss the factors that are related to the smaller gender difference in digital reading performance.

Figure VI.2.30 shows the percentages of boys and girls performing at proficiency Levels 2, 3 and 4 on the composite reading scale. The three lowest levels are summarised as "Below Level 2" and the two highest levels as "Level 5 or above".
• Figure VI.2.30 •
How proficient are girls and boys on the composite reading scale?

Level             Boys (%)  Girls (%)
Level 5 or above  5.3       9.1
Level 4           19.2      26.2
Level 3           29.3      32.1
Level 2           25.0      21.0
Below Level 2     21.2      11.7

Source: OECD, PISA 2009 Database, Tables VI.2.2 and VI.2.3.
StatLink: http://dx.doi.org/10.1787/888932435378

Figure VI.2.30 shows that the modal highest proficiency level for both boys and girls on average across the participating OECD countries is Level 3, and the percentages of boys and girls who perform at this level are similar (29% and 32%, respectively). In most individual countries, the modal level of performance for both boys and girls is the same: usually Level 3, but Level 4 in Korea, Level 2 in Chile and Level 1a in Colombia. In a few countries, the modal levels for boys differ from those for girls. In New Zealand, Australia, Belgium and the partner economy Hong Kong-China, the modal performance for girls is Level 4, while for boys it is only Level 3 (Tables VI.2.2 and VI.2.3). In Austria and Poland, the modal level of performance for boys is Level 2, while the highest level of proficiency reached by most girls is Level 3. In these six countries in particular, a dual focus on developing strategies to improve both the digital and print reading proficiency of boys would be likely to yield overall improvements in reading at the national level.

Conclusions
This chapter has discussed the similarities and differences between digital and print reading and has shown that digital reading involves many of the skills required to process print texts, including awareness of language, and the capacity to form inferences from parts of a text and to construe connections between them. But digital reading also requires different skills, such as the deployment of new knowledge about the unique structures and features of digital texts.
It also requires heightened proficiency in prediction, integration and evaluation, skills that are called upon even more emphatically in digital than in print reading, because the amount of text visible at any one time is small, its origin often unverified and its extent often unknown.

Reporting digital reading as a separate scale highlights countries' proficiency in this medium. While countries vary in their performance in digital and print reading, one pattern emerges clearly: the gender gap is narrower in digital reading proficiency than it is in print reading proficiency. On average across the 16 participating OECD countries, the gap narrowed by 14 points, and it shrank to some degree in every participating country and economy. These results suggest that it might be possible to harness boys' relatively strong performance in digital reading and use it to improve their overall proficiency as readers.

The results of the digital reading assessment have also been reported in combination with print reading as a composite scale. Reporting reading performance on a composite scale reflects what it means to be a proficient reader in the 21st century. Given that there is mounting evidence of the economic and social benefits of developing human capital, countries should consider allocating resources to teaching students how to read in both digital and print media.

As the first large-scale international assessment of digital reading, PISA 2009 has provided initial insights into the proficiency of young people in accessing, interpreting and evaluating information on line, drawing on data from 16 OECD countries and three partner countries and economies. While this group represents only about one-third of the PISA participants, it is a significant proportion. The PISA 2009 digital reading assessment has laid the ground for further investigations, and for an expanded set of countries to build on in future cycles.
Notes

1. The mean and standard deviation for print reading were computed using the pooled samples of the 16 OECD countries and using transformed student final weights and replicates, so that their sum per country is a constant. These transformed weights are usually denoted as senate weights.

2. For further details, see Chapter 1 of PISA 2009 Framework: Key Competencies in Reading, Mathematics and Science (OECD, 2009b) and Chapter 2 of PISA 2009 Results: What Students Know and Can Do: Student Performance in Reading, Mathematics and Science (OECD, 2010b).
3. Navigation in the PISA 2009 Digital Reading Assessment

Navigation is a key feature of digital reading. Tracking and analysing the sequences of pages students visit to complete a task can help to identify which navigation behaviours are associated with greater digital reading proficiency. In addition to examining this relationship, the chapter presents a series of case studies showing how students respond to certain digital reading tasks.
As discussed in Chapter 2, navigation is considered to be part of the cognitive process of digital reading. In addition to locating clickable links within texts, students are required to predict what kind of information they will encounter once these links are opened, including its likely utility or relevance to the task in which they are engaged. These cognitive processes themselves are not directly observable; however, the navigation pathways that students follow contain traces of the results of at least some of this cognitive activity. Tracking and analysing the sequences of pages students visit provide insights into navigation behaviours that, in turn, can ultimately suggest which kinds of navigation behaviour are more or less likely to be effective in digital reading.

This chapter examines how general patterns of navigation behaviour across tasks, and navigation patterns in response to individual tasks, relate to overall proficiency in digital and print reading. It also presents a series of case studies illustrating student behaviour in response to a number of digital reading tasks.

General patterns in the relationship between navigation and performance in digital and print reading
One of the major distinctive features of digital text or, more specifically, hypertext (see OECD, 2009b, p. 22) is that it consists of several pieces of text, or "nodes", that are interconnected via hyperlinks (see Chapter 1). The reader is required to select pieces of text and put them into an appropriate order so that both the selection and the ordering fit both the reading goal and the learner's cognitive resources, such as their prior knowledge (Salmerón et al., 2006). This process of selecting and ordering pieces of textual information in hypertext is referred to as "navigation" (see Lawless and Schrader, 2008, for an in-depth discussion of the "navigation" metaphor).
A considerable number of studies have found that navigation is closely linked to understanding digital texts. This is because in digital reading, a reader "constructs" his or her text through navigation. Thus, his or her navigational choices directly influence what kind of text is eventually processed. This affects both the text's content and structure. Navigation choices will determine which pieces of information will be accessible to the reader, and whether that information is appropriate to the task at hand. They will also determine whether the pieces of information accessed will be in a semantically coherent order, and thus require more or less cognitive effort to be understood (Kintsch, 1998).

A wide variety of methods has been used in prior research to describe students' navigation behaviour (Naumann, 2008; Richter et al., 2003; Rouet and Passerault, 1999). Among these are graphical methods that fully describe a given reader's navigational path. To relate navigation statistically to measures of comprehension or learning outcomes, however, navigation behaviour has to be captured in some metric or scale. In the simplest case, this metric can be qualitative (or "nominal") and classify students in terms of whether their navigational behaviour falls into one or another category. An example of such a scale is classifying students according to whether they performed a specific navigation action, for example, whether or not they clicked on a particular link. Another example is the distinction between different "types" of navigators, who differ in more than one aspect of their navigational behaviour. Lawless and Kulikowich (1996), for instance, looked at seven different navigational indices, such as the proportion of relevant pages accessed, the proportion of special features accessed, such as movies or sound effects, or the number of deviations from an optimal path. These seven indices served as the basis for a cluster analysis.
This analysis resulted in grouping students into three clusters, identified as "knowledge seekers", "feature explorers" and "apathetic users". Within this classification, "knowledge seekers" were those who navigated in a very structured and task-oriented way, and were not easily distracted by task-irrelevant text content or devices. These users usually scored best on a reading-recall measure. "Feature explorers" tended to investigate each and every feature in the hypertext, especially its technical features. A student belonging to this class of user would probably click on a video or an animation that looked interesting or appealing, more or less regardless of its importance to completing the particular learning task. These users scored second best. "Apathetic users" were not easily distracted, but they did little navigating: their paths were usually short, and their information-seeking behaviour did not meet the requirements of the task. These users scored the worst.

A reader's navigational behaviour can also be described by one or more variables indicating the extent to which he or she performed pre-defined acts of navigation, leading not to a discrete classification, or a nominal scale, but to an ordinal or an interval scale. One variable of this kind that has been used widely in describing task-oriented navigation is the extent to which readers access task-relevant information within the digital text environment. A straightforward and frequently used way to measure task-oriented navigation is to count the number of task-relevant
page visits, or to take the ratio of task-relevant visits divided by the total number of page visits. This variable has proven to be highly predictive of learning outcomes in hypertext or hypermedia learning (Cress and Knabel, 2003; McDonald and Stevenson, 1998a, 1998b; Naumann et al., 2008).

In addition to accessing information, ordering information is crucial for proper comprehension of hypertext materials. Students who fail to organise the material they read in a semantically coherent order are at a disadvantage, especially if they have minimal prior knowledge and are thus not in a position to bridge gaps in understanding by appropriate inferences (Salmerón et al., 2005). Thus, one theoretically important aspect of navigation is captured by indices that look not only at individual page visits, such as visits to task-relevant pages, but at movement between pages, that is, semantically coherent vs. incoherent movement: movement between pages belonging to the same hypertext node vs. movement between pages belonging to different hypertext nodes.

Relevance of pages
The PISA 2009 digital reading assessment tasks were deliberately constructed so that navigation was required to obtain full credit. Thus, in some tasks, students were required to go through a number of pages to access the information they needed to complete the task, or to integrate information from at least two different pages. For example, in the unit "IWANTTOHELP", students engage with a blog entry written by a girl named "Maika", who discusses her intention to start a volunteer job. From the blog entry, a text-embedded link refers to the site of a non-profit organisation called "iwanttohelp", where volunteering opportunities are offered. In Question 3 of this unit, students are asked to define the purpose of the "iwanttohelp" website.
To answer the question, students first have to use the link to the "iwanttohelp.org" website, and then have to determine that this website's aim is "providing people with information about ways to volunteer" (as stated in one of the multiple-choice options). In this task, in addition to the two pages that students need to visit to receive a score on this item (unless they guess), there are a number of additional pages that might reasonably be assumed to be helpful in determining the purpose of the iwanttohelp website, such as an FAQ page or an "About" page, which can be accessed using a site map. In each task there are also a number of pages that will only be chosen by students as a result of poor comprehension, as those pages contain no relevant information. Thus, each unit contains three types of pages: those that must be visited to complete a given task (necessary pages), those that either are necessary or might be useful in completing the task (relevant pages), and those that are clearly irrelevant to the task (irrelevant pages). The necessary pages are therefore a subset of the relevant pages (each necessary page is also relevant, but not the reverse).

Indicators used to describe navigation
Three indicators are used to describe students' navigation behaviour. First, as a rough index of how intensely students use the environment overall, the number of page visits is examined. This comprises visits to any pages, regardless of their relevance to the task, and regardless of whether each is a first visit to the page or a revisit. Students with a very low score on this variable might be called "apathetic", following Lawless and Kulikowich (1996). Second, the number of visits to relevant pages is taken into account. This index describes how often students accessed a page that contains task-relevant information, has to be accessed to find task-relevant information, or can be assumed to contain task-relevant information.
This index describes the overall intensity of students' task-oriented navigation behaviour.

Box VI.3.1 Example of navigation indices
The following sample pathway illustrates how the navigation indices number of page visits, number of visits to relevant pages, and number of relevant pages visited are computed:

Step no.  Page accessed
1         Page 1
2         Page 2
3         Page 1
4         Page 3
5         Page 4
6         Page 5
7         Page 3

In this example, the pages that are considered relevant to the task are Pages 1, 4 and 5 (marked in bold in the original). Thus, a student displaying this path would be assigned seven as the number of page visits, corresponding to the total length of the path, or the number of steps taken. The number of visits to relevant pages would amount to four, since the student visited a page classified as relevant four times (in steps 1, 3, 5 and 6). Finally, the number of relevant pages visited would amount to three, since three different relevant pages were accessed (Pages 1, 4 and 5).
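The computation described in Box VI.3.1 can be sketched in code. The function name below is illustrative and not part of the PISA materials; the sample path and the set of relevant pages are taken directly from the box.

```python
def navigation_indices(path, relevant):
    """Compute the three navigation indices for one student's pathway.

    path     -- ordered list of pages accessed (including revisits)
    relevant -- set of pages classified as relevant to the task
    """
    number_of_page_visits = len(path)  # total length of the path (steps taken)
    visits_to_relevant = sum(1 for p in path if p in relevant)
    relevant_pages_visited = len(set(path) & relevant)  # distinct relevant pages
    return number_of_page_visits, visits_to_relevant, relevant_pages_visited

# Sample pathway from Box VI.3.1; Pages 1, 4 and 5 are the relevant ones.
path = ["Page 1", "Page 2", "Page 1", "Page 3", "Page 4", "Page 5", "Page 3"]
relevant = {"Page 1", "Page 4", "Page 5"}
print(navigation_indices(path, relevant))  # (7, 4, 3), as in the box
```

The distinction between the second and third index matters: revisits raise the number of visits to relevant pages but not the number of relevant pages visited.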
However, this index says nothing about how comprehensively a student covers the material that is potentially relevant to the task. In theory, a student could switch back and forth between two pages that are both relevant to a task, and thus obtain a large number of visits to relevant pages, despite seeing only a small part of the material, and without navigating in any straight or task-oriented way. Given this possibility, the number of relevant pages visited is calculated. This index describes how many of the pages judged to be relevant to a task were accessed while the student worked on that task.1

The tracking and analysis of the sequences of pages students visit to complete a task, in order to identify navigation behaviours associated with greater digital reading proficiency, is one of the major aspects of assessing student competencies that ICT enable.

The two indices relating to visits to task-relevant pages can be expected to be positively associated with digital reading performance. In the case of the number of relevant pages visited, the assumption is clear: the greater the proportion of relevant pages a student visits, the more likely the student is to succeed in that task, and the better he or she is likely to perform in the assessment as a whole. This is because before the content of a text that is necessary or relevant to a given task can be understood, the text itself must first be accessed. A positive association with performance can also be assumed for the number of visits to relevant pages, since students who include more visits to relevant pages in their navigational paths will have navigated more systematically and will have had access to more task-relevant information than students who had fewer visits to relevant pages. Revisits to task-relevant pages can be a sensible navigation strategy.
The relationship between the number of page visits, that is, the mere length of a student's navigational path, and performance is unclear. While some studies find path length to be positively associated with learning outcomes (Barab et al., 1996; Brunstein and Krems, 2005; Lin, 2003), others find no such association (McEneaney, 2001; Naumann et al., 2007). The different findings might be related to the origin of the path lengths. For example, a long path might be the result of a student getting lost and attempting to find his or her way back to a better path. This path length would have a negative association with performance. The same would be true if the path length were the result of idle and distracted navigation. On the other hand, if path length is a result of comprehensive coverage of available material, with many visits, and goal-oriented revisits, to task-related materials, or of the exploration of pages considered relevant, the number of page visits will have a positive association with performance.

Box VI.3.2 How the findings are organised
The findings relating to navigation in the PISA digital reading assessment that use indices aggregated across tasks are organised as follows: First, the distribution (mean, standard deviation, median, skewness) of the three indicators of navigation (number of page visits, number of visits to relevant pages, and number of relevant pages visited) is given for each country. Within countries, the mean and standard deviation of all three indicators are also plotted against each other, and against countries' mean digital reading scores. Then, correlations between the three indicators and between both digital and print reading scores are reported.a Correlations between print reading scores and navigation, and between digital reading scores and navigation, are also reported.

Regression analyses that introduce navigation as a predictor of digital and print reading performance are then reported. These analyses show whether students with similar levels of print reading proficiency differ in their digital reading performance, depending on their navigational behaviour. Finally, regression analyses that consider non-linear trends in the prediction of performance based on navigation are discussed. A moderate, as opposed to a low, number of visits to relevant pages can be expected to benefit performance, especially if revisits are included. However, when students go beyond a moderate number of visits to relevant pages, for example by moving frequently back and forth between two (relevant) pages, it might not improve their performance. Thus, the impact of increasing numbers of visits to relevant pages on digital reading performance might be expected to diminish. The same holds for the number of page visits. Figure VI.3.1 illustrates the non-linear relation between the number of relevant page visits and digital reading performance. A similar curve is expected for the number of page visits. Following these analyses, case studies of navigation behaviour in six individual tasks are analysed and related to performance.

a. Here, and in the rest of this chapter, Weighted Likelihood Estimates (WLEs) are used for both the digital and print reading proficiency scales because indices of navigation were not included in the background model for the computation of Plausible Values (PVs), and thus cannot be used as predictors in regression models using PVs as dependent variables.

Distribution of navigation indices at the country level
The distribution of navigation indices within countries is slightly skewed for each index, especially for the number of relevant pages visited and the number of page visits. The number of relevant pages visited is skewed to the left in every country except Colombia, where the skew is to the right (Table VI.3.1). The number of page visits is consistently skewed to the right in every country (Table VI.3.1).

This means that for the number of relevant pages visited, the median is larger than the mean because some, although comparatively few, students perform differently from the majority in that they visit relatively few relevant pages. For the number of page visits, the mean is larger than the median. Here, relatively few students access and revisit pages much more often than the majority. Overall, the skewness of the frequency distributions of navigation indicators is less pronounced than it often is in the small-scale studies reported in the literature (for example, the number of visits to relevant pages in Naumann et al., 2008).2 Figure VI.3.2 illustrates the distribution of navigational indicators aggregated across OECD countries; the overall shapes of the distributions within countries are the same as the shape of the distributions that result when the data are aggregated across OECD countries.

Across countries and economies, there is wide variation in the distribution of the navigation indices considered (Table VI.3.1). For instance, with respect to the mean number of relevant pages visited, students in Korea saw an average of 53 pages, while in Colombia they saw only 31 pages.
The same holds for the mean number of visits to relevant pages, which varies between 44 (Colombia) and 74 (Korea), and the mean number of page visits, which varies between 58 (Colombia) and 100 (Macao-China). These differences, especially those in the number of relevant pages visited, closely match country differences in digital reading performance (Figure VI.3.3).

• Figure VI.3.1 •
Illustration of the relationship between number of relevant pages visited and digital reading performance
[Conceptual curve: digital reading score (y-axis) rises with the number of relevant page visits (x-axis), but with diminishing returns. Moving from a low to a moderate number of relevant page visits yields a large increase in predicted digital reading score, while moving from a moderate to a high number yields only a small increase.]
StatLink: http://dx.doi.org/10.1787/888932435397
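The diminishing-returns pattern sketched in Figure VI.3.1 is the shape produced by a regression with a negative quadratic term. The coefficients below are invented purely for illustration; they are not estimates from the PISA data.

```python
# Hypothetical quadratic prediction: score = b0 + b1*x + b2*x**2, with b2 < 0,
# so each additional relevant page visit adds less than the previous one.
b0, b1, b2 = 300.0, 8.0, -0.0625  # illustrative coefficients only

def predicted_score(visits):
    # Predicted digital reading score for a given number of relevant page visits.
    return b0 + b1 * visits + b2 * visits ** 2

low, moderate, high = 10, 40, 70
gain_low_to_moderate = predicted_score(moderate) - predicted_score(low)
gain_moderate_to_high = predicted_score(high) - predicted_score(moderate)
print(gain_low_to_moderate, gain_moderate_to_high)  # prints 146.25 33.75
```

With these made-up coefficients, the same 30-visit increase yields a much larger predicted gain at the low end of the scale than at the high end, which is exactly the non-linear trend the regression analyses described in Box VI.3.2 are meant to capture.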
• Figure VI.3.2 •
Distribution of the number of pages and visits, aggregated across OECD countries
[Three frequency histograms: (a) number of relevant pages visited (0-80), (b) number of visits to relevant pages (0-250), and (c) number of page visits (0-300).]
Source: OECD, PISA 2009 Database.
1 2 http://dx.doi.org/10.1787/888932435397

At the country/economy level, both the Pearson and rank-order correlations between the mean number of relevant pages visited and the mean digital reading score amount to 0.98.

• Figure VI.3.3 •
Relationship between the number of relevant pages visited and digital reading performance
[Scatter plot of digital reading score (350-600) against the mean number of relevant pages visited (30-55) for participating countries and economies, ranging from Colombia at the lower left to Korea at the upper right.]
Source: OECD, PISA 2009 Database, Tables VI.2.4 and VI.3.1.
1 2 http://dx.doi.org/10.1787/888932435397

The relation is somewhat less clear concerning the number of visits to relevant pages and the number of page visits. The reason is that students in the participating Asian countries and economies were more likely to revisit relevant pages and to explore pages beyond those considered relevant (Figures VI.3.4 and VI.3.5).
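The Pearson and rank-order (Spearman) correlations quoted above can be computed directly from paired country means. A minimal sketch in Python; the data values below are invented for illustration, not the actual PISA country means:

```python
from statistics import mean

def pearson(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ranks(values):
    """1-based ranks, with tied values given their average rank."""
    ordered = sorted(values)
    return [ordered.index(v) + 1 + (ordered.count(v) - 1) / 2 for v in values]

def spearman(x, y):
    """Rank-order correlation: the Pearson correlation of the ranks."""
    return pearson(ranks(x), ranks(y))

# Illustrative country-level means: number of relevant pages visited and
# digital reading score (made-up values for demonstration only).
pages = [31, 40, 45, 48, 53]
scores = [370, 455, 490, 510, 565]

r = pearson(pages, scores)      # close to 1: strong linear association
rho = spearman(pages, scores)   # exactly 1 here: the two orderings agree perfectly
```

With real country means, a Pearson correlation of 0.98 alongside a rank-order correlation of 0.98 indicates that the association is both nearly linear and nearly monotone.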
• Figure VI.3.4 •
Relationship between the number of visits to relevant pages and digital reading performance
[Scatter plot of digital reading score (350-600) against the mean number of visits to relevant pages (40-75), ranging from Colombia at the lower left to Korea at the upper right.]
Source: OECD, PISA 2009 Database, Tables VI.2.4 and VI.3.1.
1 2 http://dx.doi.org/10.1787/888932435397

• Figure VI.3.5 •
Relationship between the number of page visits and digital reading performance
[Scatter plot of digital reading score (350-600) against the mean number of page visits (55-100), ranging from Colombia at the lower left to Korea and Macao-China at the right.]
Source: OECD, PISA 2009 Database, Tables VI.2.4 and VI.3.1.
1 2 http://dx.doi.org/10.1787/888932435397

While the difference between the mean number of page visits and the mean number of relevant pages visited is 29 for the OECD average, it is 46 for Japan, Korea and the partner economy Hong Kong-China, and as high as 53 for the partner economy Macao-China.

Not only across countries and economies, but also within countries, there is considerable variation in each navigational index, as indicated by within-country standard deviations (Table VI.3.1). Standard deviations in the number of relevant pages visited range from 7.3 (Korea) to 11.5 (Hungary). Standard deviations in the number of visits to relevant pages range between 14.3 (New Zealand) and 20.0 (Colombia). Standard deviations in the number of page visits range from 22.4 (Denmark) to 34.1 (Macao-China).
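The summary statistics used in this section (means, medians, standard deviations) also underlie the skewness diagnosis made earlier: for a right-skewed count such as the number of page visits, the mean sits above the median. A small illustration with invented student-level counts (not PISA data):

```python
from statistics import mean, median, pstdev

# Invented page-visit counts for a handful of students: most cluster around
# 55-80, while one student revisits pages far more often (a right skew).
visits = [55, 60, 62, 65, 68, 70, 72, 75, 80, 150]

m = mean(visits)      # pulled upward by the heavy right tail
md = median(visits)   # robust to the tail
sd = pstdev(visits)   # within-group spread, analogous to Table VI.3.1

# For right-skewed data the mean exceeds the median; for the left-skewed
# number of relevant pages visited, the inequality is reversed.
```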
All tasks were constructed so that students had to navigate (see Chapter 1). However, in a number of tasks, students were given guidance on how to navigate most efficiently, such as which link to click on first on the start page, and instructions on how to minimise the risk of getting “lost”. It is thus a significant finding that students differ to a large degree in the number of relevant pages visited, in the number of visits to relevant pages and in the number of page visits. If, in tasks where guidance was provided, a large majority of students had followed the instructions on how to navigate, and if they had found for themselves the shortest route, there would have been much less variation in all three indicators.

The amount of within-country variation that occurs in the number of relevant pages visited has a negative relation both with the number of relevant pages visited and with digital reading scores at the country/economy level (Figures VI.3.6 and VI.3.7). At that level, the Pearson correlation between the within-country standard deviation in relevant pages visited and the mean number of relevant pages visited is -0.79, while the rank-order correlation is -0.81. The Pearson correlation between the within-country standard deviation in relevant pages visited and the mean digital reading score is -0.79, and the rank-order correlation is -0.77. For instance, students in Korea, who scored highest in digital reading and also had the highest number of relevant pages visited, at the same time had the lowest standard deviation in the number of relevant pages visited (7.3). In contrast, students in the partner country Colombia, who scored lowest in digital reading, displayed large variations in the number of relevant pages visited (standard deviation 10.9).
Likewise, students in Chile, who had the second lowest performance in digital reading and visited the second lowest mean number of relevant pages, had the second highest standard deviation in the number of relevant pages visited (11.3).

• Figure VI.3.6 •
Relationship between standard deviation and mean of the number of relevant pages visited
[Scatter plot of the mean number of relevant pages visited (30-60) against its within-country standard deviation (7.0-12.0), ranging from Korea at the upper left to Colombia at the lower right.]
Source: OECD, PISA 2009 Database, Table VI.3.1.
1 2 http://dx.doi.org/10.1787/888932435397

In part, the negative correlation between standard deviation and mean across countries is due to the fact that in some countries there was a tendency for most students to visit all relevant pages, in which case the standard deviation was close to zero. It is not likely, however, that the negative relation between standard deviation and mean in the number of relevant pages visited is entirely due to a ceiling effect. Depending on the test version administered, the number of relevant pages available was 63, 73 or 76; but even in countries and economies where students visited high numbers of relevant pages (e.g. 53 in Korea or 48 in Sweden and Hong Kong-China), the mean number of relevant pages visited was well below the maximum number of relevant pages.
• Figure VI.3.7 •
Relationship between standard deviation of the number of relevant pages visited and digital reading performance
[Scatter plot of digital reading score (300-600) against the within-country standard deviation of the number of relevant pages visited (7.0-12.0), ranging from Korea at the upper left to Colombia at the lower right.]
Source: OECD, PISA 2009 Database, Tables VI.2.4 and VI.3.1.
1 2 http://dx.doi.org/10.1787/888932435397

Thus, in countries and economies where students succeeded, on average, in accessing a large proportion of the material relevant to the task, for example, Korea, Japan, New Zealand, Australia and Iceland, only a few students did not succeed in accessing a large amount of relevant material, resulting in comparatively little variation in the number of relevant pages visited (Figures VI.3.6 and VI.3.7). At the same time, these are the countries where students performed better in digital reading.

Relationships among navigation, print and digital reading
As outlined above, navigation can be assumed to be closely associated with proficiency in digital reading. However, correlations between navigation and print reading can also be assumed, for a number of reasons. First, in assessments such as the PISA 2009 digital reading assessment, the task is presented in written form. Second, most navigational devices, such as text-embedded links, menu items, or items in a drop-down menu, have textual labels that have to be deciphered. Thus, lower-level reading processes, such as word identification or syntactic parsing, are one prerequisite for navigation. In addition, these processes should be routine in order to leave available cognitive resources for making navigational choices (Naumann, et al., 2008).
third, to make appropriate predictions, forexample, about where a text-embedded link will lead, and thus whether it makes sense to use it, its textual contexthas to be considered and understood. thus, text-level reading skills are required in addition to routine lower-levelprocesses for efficient navigation.despite these considerations, navigation is a process that is specific to digital reading, even if it might be affected byproficiency in print reading. thus, while there may be associations between print reading and navigation, they arelikely to be stronger between digital reading and navigation. this is because navigation is considered to be a specificand integral part of digital reading, as outlined in the PISA 2009 Assessment Framework (oeCd, 2009).Correlations between navigation and performanceNavigation and digital reading performanceBivariate correlations between the three indicators of navigation and digital reading performance are all positive,and strong for the number of relevant pages visited (table VI.3.2). As expected, correlations are highest for thenumber of relevant pages visited, ranging from 0.68 (Korea) to 0.86 (Hungary), followed by the number of visits to PISA 2009 ReSultS: StudentS on lIne – Volume VI © OECD 2011 97
relevant pages, which range from 0.39 (Korea) to 0.75 (Hungary). Correlations with the number of page visits, which does not take task relevance into account, are still positive, but comparatively small, ranging from 0.15 (Macao-China) to 0.59 (Hungary). On average across OECD countries that participated in the digital reading assessment, correlations between navigation and digital reading performance are 0.81 (number of relevant pages visited), 0.62 (number of visits to relevant pages), and 0.42 (number of page visits).

Navigation and print reading performance
There are significant positive associations between print reading performance and navigation as well. These are, however, consistently weaker for print reading than for digital reading (Tables VI.3.2 and VI.3.3). Correlations of the number of relevant pages visited with print reading scores range from 0.43 (Macao-China) to 0.72 (Hungary); correlations of the number of visits to relevant pages with print reading scores range from 0.24 (Macao-China) to 0.63 (Hungary); and correlations of the number of page visits with print reading scores range from 0.06 (Macao-China) to 0.51 (Hungary). On average across OECD countries that participated in the digital reading assessment, the correlations of navigation indices with print reading scores are 0.62 (number of relevant pages visited), 0.48 (number of visits to relevant pages) and 0.33 (number of pages visited).

Thus, consistent with the need to employ reading skills in order to accomplish navigation tasks, navigation is related not only to digital reading, but to print reading as well. At the same time, the correlations of the navigation indices with performance are stronger for digital than for print reading (Tables VI.3.2 and VI.3.3).
Regression of digital reading performance on print reading and navigation
Multiple regression analyses were conducted to test whether navigation would be predictive of digital reading performance after accounting for print reading proficiency. These analyses provide a crucial test for the claim that navigation – as captured by the indices used here – is a specific and integral part of digital reading, especially given that navigation is correlated not only with digital reading but also with print reading scores. Theoretically, one model that could account for the data presented thus far would assume that good navigation is a by-product of good print reading proficiency, which also influences digital reading proficiency (Salmerón and García, forthcoming). In this case, correlations between navigation and digital reading achievement should be close to zero when print reading proficiency is accounted for. In other words, if good navigation were a by-product of good reading proficiency, and thus correlated with digital reading scores, in a multiple regression of digital reading scores on print reading and navigation, navigation should have no increment in variance explained over and above what is already explained by print reading. Although such a model is not considered seriously in the hypertext literature, rarely has it been put to the test: in most studies investigating the impact of navigation on comprehension in electronic environments, no independent measure of print reading proficiency has been included. Thus, there is little evidence of an association between navigation and digital reading comprehension after accounting for print-reading proficiency.

Number of relevant pages visited
In a regression of digital reading scores on print reading scores and the number of relevant pages visited, the regression coefficient for both predictor variables is significant for each country (Table VI.3.4).
This means that students with the same level of print reading proficiency will still differ in their predicted digital reading achievement, depending on how many relevant pages they visited. On the other hand, students accessing an equal number of task-relevant pages will still differ in their predicted digital reading score depending on their print reading proficiency. The magnitude of the effects of navigation conditional on print reading, and of print reading conditional on navigation, can be examined by inspecting both the regression coefficients and the amount of unique variance explained by each predictor.

Regression coefficients for the number of relevant pages visited range from 5.22, in the partner country Colombia and the partner economy Macao-China, to 6.93 in France, with an average of 6.40 across all participating OECD countries. This means that for students with similar print reading proficiency, their predicted digital reading score is increased by between about five and about seven score points for each relevant page visited. Regression coefficients for print reading proficiency vary between 0.23 in Japan and 0.39 in New Zealand, with an average of 0.31 across all participating OECD countries.
Thus, for students who visit an equal number of relevant pages, their predicted digital reading score is increased by between 2 and 4 score points with each additional 10 score points gained on the print reading scale.

The increase in variance explained in digital reading (ΔR²) that is obtained when the number of relevant pages visited is included in the model, in addition to print reading proficiency, ranges from 16% (Korea) to 31% (France), with an average increase of 23% across all participating OECD countries. Including print reading proficiency as a predictor of digital reading proficiency, in addition to the number of relevant pages visited, increases the variance explained in digital reading by between 4% (Poland and Spain) and 11% (Korea and Macao-China), with an average of 6% across all participating OECD countries. In terms of conventions for effect sizes (Cohen, 1988), the effect of navigation on digital reading performance after accounting for print reading proficiency is large, with an effect size f² that ranges from 0.38 in Korea to 1.32 in France, with a mean of 0.83 across all participating OECD countries.3 Effect sizes for print reading proficiency, while accounting for the number of relevant pages visited, are also large by convention, but smaller than those obtained for navigation.

This analysis suggests that navigation ability is an additional component of reading in the digital medium, beyond the other abilities that students have, and employ, in print reading.
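The effect size f² used here is computed from the change in explained variance when a predictor is added: f² = ΔR² / (1 − R²_full). A minimal sketch; the R² values below are not taken from the PISA tables but are chosen so that the arithmetic reproduces the quoted OECD averages (ΔR² ≈ 0.23, f² ≈ 0.83):

```python
def cohens_f2(r2_full, r2_reduced):
    """Cohen's (1988) f^2 for the unique contribution of a predictor:
    the added variance relative to the variance left unexplained."""
    return (r2_full - r2_reduced) / (1.0 - r2_full)

# Hypothetical variance explained: print reading alone vs. print reading
# plus the number of relevant pages visited (illustrative values only).
r2_print_only = 0.493
r2_full = 0.723

delta_r2 = r2_full - r2_print_only       # ~0.23, the navigation increment
f2 = cohens_f2(r2_full, r2_print_only)   # ~0.83, 'large' by Cohen's benchmark (> 0.35)
```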
Although there is overlap with print reading, performance is improved when students navigate effectively, that is, when they maximise visits to relevant pages.

Number of visits to relevant pages
In a regression of digital reading performance on print reading performance and the number of visits to relevant pages, regression coefficients are significant for the number of relevant page visits consistently across countries, ranging from 1 in Korea to 3 in Austria (Table VI.3.5), with an average of 2.4 across all participating OECD countries. This means that for students with equal proficiency in print reading, their digital reading score increases by between 1 and 3 score points for any visit to a relevant page, whether this page has already been visited or not. Regression coefficients for print reading proficiency vary between 0.38 (Japan) and 0.60 (New Zealand), with an average of 0.50 across all participating OECD countries. This means that, accounting for the number of visits to relevant pages, students’ digital reading score increases by between 0.38 score points and 0.60 score points for each additional score point on the print reading scale.

The increase in variance explained in digital reading proficiency that is obtained when the number of visits to relevant pages is included in the model, in addition to print reading proficiency, ranges between 3% (Korea) and 14% (Austria and Hong Kong-China), with an average of 11% across all participating OECD countries. Including print reading proficiency as a predictor, in addition to the number of visits to relevant pages, increases the variance explained by between 14% (Hungary) and 29% (Korea), with an average of 20% across all participating OECD countries. Effect sizes in these analyses range from medium to large for the number of visits to relevant pages and are large for print reading proficiency (Table VI.3.5).
thus, although once again both print reading proficiency and navigation can beproven to account for independent proportions of variance in digital reading performance, the pattern of results is inone way reversed in comparison to the analysis involving the number of relevant pages visited: taking the number ofrelevant pages visited as an indicator of navigation, and as a predictor of digital reading performance, in addition toprint reading proficiency, the number of relevant pages visited accounts for a larger proportion of unique variance thanprint reading proficiency does. taking the number of visits to relevant pages, rather than the number of relevant pagesvisited, as an indicator of performance reverses this pattern. Here, a larger proportion of unique variance is accountedfor by print reading proficiency than by the number of visits to relevant pages.Number of page visitsIn a regression of digital reading performance on the number of page visits and print reading proficiency, allregression coefficients are positive and significant (table VI.3.6). Regression coefficients for the number of pagevisits range from 0.26 (Korea) to 1.26 (Austria), with an average of 0.92 across all participating oeCd countries(table VI.3.6).this means that the predicted digital reading performance for students with the same print reading proficiency isincreased by between 0.26 and 1.26 score points per additional visit to any page, whether it is relevant to the taskor not. For print reading proficiency, in this analysis, regression coefficients varying between 0.43 (Japan) and 0.70(new Zealand) are obtained, with a mean of 0.61 across all participating oeCd countries. In terms of unique varianceaccounted for by each of the predictors, the effect for the number of page visits varies between 1% additional variance PISA 2009 ReSultS: StudentS on lIne – Volume VI © OECD 2011 99
explained (Korea and the partner economy Macao-China) and 8% additional variance explained (Norway), with an average of 5% across all participating OECD countries. In contrast, the unique variance accounted for by print reading proficiency in this analysis varies between 23% (Colombia) and 49% (New Zealand), with an average across all participating OECD countries of 34%. Effect sizes for each of the predictors range from small to medium for the number of page visits and are large for print reading proficiency (Table VI.3.6). Thus, similar to what has already been seen for the number of visits to relevant pages, and in contrast to what was found for the number of relevant pages visited, print reading proficiency accounts for considerably larger proportions of unique variance than does the number of page visits.

Taken together, the results presented in this section indicate clearly that navigation has positive associations with digital reading performance even when print reading proficiency is accounted for. In the case of the number of relevant pages visited, which provides an indication of the amount of potentially relevant information that students view, these effects turn out to be even stronger than the complementary effects of print reading proficiency, accounting for navigation. In the case of the other two indices, which focus more on how often students opened and re-opened pages, there were still effects on digital reading performance independent of print reading proficiency, but these were smaller; and in these analyses, print reading proficiency proved to be the comparatively stronger predictor.

This means that the more relevant pages students visit, the better they are likely to perform. This effect cannot be explained solely by the fact that students who display better navigational behaviour are also those with better print reading proficiency.
On the contrary, although students with better print reading skills display better navigational behaviour in terms of the number of relevant pages they visit (number of relevant pages visited) and the number of times they access relevant content (number of visits to relevant pages), navigation is associated with digital reading performance in ways that are independent of print reading proficiency. This supports the notion that proficiency in digital reading cannot fully be mapped onto traditional print reading proficiency.

Non-linear effects of navigation on digital reading performance
Indices capturing the extent of actions students take when performing digital reading tasks, such as the number of visits to relevant pages, or the number of page visits, have overall positive linear associations with performance (Tables VI.3.2, VI.3.5 and VI.3.6). However, a linear model might not be the best way to describe these aspects of the relationship between navigation and performance. Consider, for example, the number of visits to relevant pages. Clearly, a student who rarely visits relevant pages will most likely fail in a given task and achieve a low score on the entire test. In contrast, a student who has a moderate number of visits to relevant pages will probably fare better. However, visiting relevant pages more often than is needed, meaning that these pages are revisited frequently, might have an additional beneficial effect on comprehension if done thoughtfully, as a result of proper monitoring and regulation of the comprehension process (see also the case study of Item 2 in the unit JOB SEARCH below). In many cases, clicking back and forth between pages is a sign of disorientation, rather than of proper monitoring and regulation, as is indicated by negative associations of high numbers of backtrack-sequences of the type PageA – PageB – PageA with learning outcomes reported in the literature (Richter, et al., 2005; Savayene, et al., 1996).
to test for non-linear effects of navigation on digital reading performance, the previous section’s regression models, which predicted digital reading performance by print reading and navigation, were extended. In addition to the linear effect of navigation on digital reading performance, a non-linear (quadratic) effect of navigation on digital reading performance was estimated. Inspection of the regression coefficients revealed that non-linear effects were present for both the number of visits to relevant pages and the number of page visits consistently across countries (tables VI.3.7 and VI.3.8). For the number of visits to relevant pages and the number of page visits, the regression coefficient for the non-linear term was negative in each case. this indicates that, in each country, visiting yet another (relevant) page becomes less predictive of digital reading performance, the more visits to relevant pages students had already made. Averaged across all participating oeCd countries, the predicted digital reading score for a student with 20 fewer visits to relevant pages than the average is 64.6 score points below the score predicted for a student with an average number of visits to relevant pages. In contrast, for a student with 20 more visits to relevant pages than the average, the predicted increase in digital reading score is only 30.5 score points (Figure VI.3.8). overall, in conventional terms for effect size classification, the non-linear trends for both the number of visits to relevant pages and the number of page visits correspond to a medium-sized effect.100 © OECD 2011 PISA 2009 ReSultS: StudentS on lIne – Volume VI
• Figure VI.3.8 •
Relationship between the number of visits to relevant pages (centred) and digital reading performance, OECD average
[Curve of predicted digital reading score against the centred number of visits to relevant pages (-20 to +20): moving from -20 to 0 raises the predicted score by 64.6 points, while moving from 0 to +20 raises it by only 30.5 points.]
Source: OECD, PISA 2009 Database, Table VI.3.8.
1 2 http://dx.doi.org/10.1787/888932435397

In contrast to the number of visits to relevant pages and the number of page visits, for the number of relevant pages visited no consistent non-linear trend can be observed (Table VI.3.9).

Thus, for the indicators that load heavily on how often students visit any page, there is a point beyond which visiting more is not helpful. In contrast, for the number of relevant pages visited, the relation with performance is linear. Taken together, these results suggest that once students have adequately covered all the relevant material, either visiting relevant pages more often or visiting more pages in general (relevant as well as irrelevant) tends not to provide any additional benefit.

Navigation and gender
Analyses presented in this chapter thus far provide evidence that navigation is related to digital reading performance, both before and after accounting for print reading proficiency. At the same time, correlations exist between navigation and print reading, albeit smaller ones. Chapter 2 shows that the gender gap found in print reading is also found in digital reading; however, the difference is smaller here, and after accounting for their print reading skill, boys tend to have a slight advantage over girls in digital reading. A similar pattern holds for navigation: in general, girls navigate better than boys. Overall, they visit more relevant pages (number of relevant pages visited), and tend to visit relevant pages more frequently (number of visits to relevant pages).
For the number of relevant pages visited, girls’ advantages are significant in 14 OECD countries (Table VI.3.1). Non-significant differences are found in Chile and Japan, and in all three partner countries and economies. For the number of visits to relevant pages, significant advantages for girls are found in 10 OECD countries. Averaged across all participating OECD countries, girls visit more relevant pages (number of relevant pages visited), and visit them more frequently (number of visits to relevant pages). These differences are not too surprising, given that the number of relevant pages visited, but also the number of visits to relevant pages, are strongly correlated with digital reading performance, and girls do better
in digital reading than boys. However, when print reading proficiency is accounted for, these advantages for girls are diminished, or even reversed. After accounting for print reading skill, significant differences in favour of boys are found in Chile, Spain and Poland, as well as in the partner countries and economies Macao-China and Colombia. This means that in these countries and economies, of boys and girls with similar print reading proficiency, boys will visit more relevant pages. A similar result emerges for the number of visits to relevant pages. On this index of navigation, after accounting for print reading proficiency, significant differences in favour of boys are found in Spain, France, Korea and Poland, and in all four participating partner countries and economies. Also, averaged across all participating OECD countries, after controlling for print reading proficiency, boys display a significantly higher number of visits to relevant pages.

Taken together, these results are consistent with the assumption that the comparatively smaller advantage of girls over boys in digital reading, as compared to print reading, might be due to the fact that boys, who are on a par with girls in print reading, tend to navigate slightly better. One should, however, bear in mind that unconditionally, girls are not only better readers than boys, but also navigate more proficiently in the electronic environment.

The analyses provided so far in this chapter underscore the importance of navigation for the comprehension of digital text. In particular, strong correlations between digital reading performance and the number of relevant pages visited were found, indicating that careful and comprehensive selection of task-relevant materials within a hypertext is one variable closely tied to digital reading proficiency in general. These associations are largely independent of students’ print reading proficiency.
Although the data provided here cannot establish causality, the statistical dependency of digital reading performance on navigation appears not to be a mere by-product of students’ print reading proficiency. Rather, for two students with the same print reading proficiency, different digital reading scores are predicted, depending on how much of the material considered relevant to a given task they access, and depending on how often they access relevant pages. Some conclusions might also be drawn concerning different aspects of navigation, and their respective associations with digital text comprehension as assessed by the PISA digital reading assessment. Generally, it appears crucial that students systematically assess what they need to see in a hypertext and then access these materials. Doing more than that – visiting a lot more pages than required – apparently has no additional positive association with digital reading proficiency.

Case studies: navigation behaviour of students in selected digital reading tasks
The remainder of the chapter presents case studies of the navigation behaviour observed among students for six individual tasks from three units used in the PISA 2009 digital reading assessment: IWANTTOHELP, SMELL and JOB SEARCH. The case studies illustrate how some of the findings in this chapter operate at the task level. The units used in the PISA 2009 digital reading assessment were designed to vary considerably in the complexity of text processing and navigation demands. The six tasks analysed in these case studies were chosen to illustrate this variety. The analysis describes a range of strategies used by students in response to these different task demands. It identifies behaviours that are associated with students who show higher digital reading proficiency, and other behaviours that are associated with students who show lower proficiency. This analysis offers a sense of the range of strategies used by good readers and by less effective readers.
These strategies vary from task to task, as do the specific questions investigated. To date, empirical studies of readers' navigation behaviour in individual reading tasks have mostly been conducted on a small scale (Barab et al., 1996; Madrid and Cañas, 2008; McEneaney et al., 2009; Puerta Melguizo et al., 2008; Rouet, 2003). The PISA 2009 digital reading assessment allows for a large-scale examination of students' navigation behaviour in response to a variety of individual reading tasks by analysing the log files that capture every navigation step made by students as they respond to each task, as well as the time they spend on each page. Data of this kind allow for analysis of the different kinds of behaviour students exhibit when confronted with different tasks. It is possible to observe how much exploration stronger and weaker readers typically engage in when confronted with new reading stimuli, as well as the extent to which this level of exploration varies according to the demands of individual tasks. It is also possible to observe under what circumstances readers avoid visiting pages not obviously relevant to the task, and when, by contrast, they are more likely to explore the available material. The analysis allows for a consideration of the value of categorising students according to the behaviours referred to earlier in this chapter as "knowledge seekers", "feature explorers" and "apathetic users" (Lawless and Kulikowich, 1996).

102 © OECD 2011 PISA 2009 Results: Students On Line – Volume VI
The case studies provide evidence of specific navigation sequences, including when better readers decide to visit specific pages multiple times, and when they deem a single visit sufficient. They describe navigation behaviours typically employed by weaker readers and contrast these with the behaviours of better readers. Information is provided about the activity of students who fail to gain credit or to provide responses to particular questions: for example, how much navigation (if any) they engage in, and whether or not they locate all the relevant pages. The case studies also show the amount of time students spend on tasks with differing demands, and on pages containing information necessary to answering the question. The behaviours of students who answer questions successfully and unsuccessfully are compared. Differences in patterns of navigation behaviour between girls and boys are described. For example, the analysis provides evidence of how far it is true to say that boys are likely to engage in more navigation (that is, to click on more links) than girls. The analysis focuses mainly on digital reading performance, but when relevant, comparisons between performance by different sub-groups in digital and print reading are also examined.

The main aim of these case studies is to investigate patterns of behaviour observed when students perform individual reading tasks. The aim is not to report on navigation indices, as the first section of this chapter does, nor to relate these patterns to performance on the digital reading assessment as a whole. Rather, the case studies show the demands made by individual tasks, and the patterns of navigation behaviour used on these tasks by stronger and weaker readers. Therefore, the tables presented in the remainder of this chapter draw on somewhat different data
and use different analyses from the statistics used in the other chapters and PISA 2009 volumes.

In this section, all figures relating to the numbers of students refer to those for whom log-file data are available, from all countries and economies that participated in the digital reading assessment. They may differ slightly from the absolute numbers of students attempting each task. Group sizes are often too small for meaningful analysis at the country level; as a result, the analysis in this section is at the level of the whole sample of students to whom each task was administered. Scale scores are given to indicate the difficulty of each task; in addition, percentages of students in different score categories (full credit, partial credit, no credit, no response given) are provided to facilitate comparisons between different types of behaviours and the various sub-groups within each score category.

Although the tables in this section refer to measures similar to those used in the first section of this chapter, the number of relevant pages visited, number of visits to relevant pages and number of page visits are reported in absolute numbers in this section. For some tasks, additional counts are also presented: number of pages visited, number of irrelevant pages visited and number of visits to irrelevant pages. These are not analysed as indices, generalisable across the entire digital reading assessment, but are related to individual tasks. They are presented in absolute terms, not centred or standardised.
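To make these absolute counts concrete, the sketch below shows how such navigation measures could, in principle, be derived from a log of page visits. The record structure, page identifiers and set of relevant pages are invented for illustration; the actual PISA log-file format is not reproduced here.

```python
# One student's log for a task: ordered (page_id, seconds_on_page) events.
# All identifiers and values below are illustrative, not PISA microdata.
log = [("P24", 40), ("P01", 12), ("P02", 25), ("P24", 8), ("P04", 30)]

# Task-specific set of relevant pages (assumed for this example).
relevant = {"P24", "P01", "P02", "P04", "P07"}

number_of_page_visits = len(log)                   # every click counts, repeats included
pages_visited = {page for page, _ in log}          # distinct pages seen
relevant_pages_visited = len(pages_visited & relevant)              # distinct relevant pages
visits_to_relevant_pages = sum(1 for p, _ in log if p in relevant)  # repeat visits count
time_on_task = sum(t for _, t in log)              # total seconds across all pages

print(number_of_page_visits, relevant_pages_visited,
      visits_to_relevant_pages, time_on_task)      # → 5 4 5 115
```

The distinction between *pages visited* (a set) and *page visits* (a tally of events) mirrors the two families of counts reported in the tables that follow.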
Because the behaviours are identified according to issues relevant to individual reading tasks, rather than associated with framework variables or patterns of performance by country, the analyses present unweighted numbers (to illustrate the absolute frequency of particular behaviours), percentages of students, and unweighted mean scores.

Tasks analysed in the case studies

Figure VI.3.9 lists the six tasks analysed in Chapter 3. As described at the beginning of this chapter, the pages that students can view in the course of each task can be categorised as necessary (that is, the pages students need to visit to locate the information required to answer the question), relevant (pages that may or may not be essential, but contain useful information that may assist students), or irrelevant (pages that contain no information that will assist students in completing the task successfully). The sum of all pages that students can view, by using all links and tabs, represents the number of available pages. Figure VI.3.9 summarises features of the tasks related to navigation and text processing: the number of pages of each type, and an indication of the quantity and complexity of the text students need to process. It also shows the percentage of students who obtained credit, the mean time spent by all students on each task, and the average number of pages visited by students during each task.

Figure VI.3.9 shows that, for example, in IWANTTOHELP Question 1, students can locate the necessary information on a single page (that is, the starting page for the task) containing only a small amount of simple text. This is the only relevant page for this task, although there are 31 pages available to students during this task if they decide to explore all the possibilities. The task is relatively easy (digital reading scale score 362). The mean time spent on the task is 66 seconds, and the mean number of pages visited by each student is 1.6.
Other tasks require students to visit two, three or more pages, each containing text of varying lengths and complexity.

The section at the end of Chapter 2, comprising examples of the PISA 2009 digital reading units, provides a detailed description of all the tasks in each of these units. They can be viewed on line at www.pisa.oecd.org.
• Figure VI.3.9 •
Summary of characteristics of digital reading tasks analysed in this section

IWANTTOHELP Question 1 (E005Q01)
– Necessary pages: 1; relevant pages: 1; available pages: 31
– Quantity of text: short (one short text of less than 200 words)
– Complexity of text: low level (simple, informal language)
– Digital reading scale score: 362
– Time on task (seconds): mean 66, S.D. 41
– Number of pages visited: mean 1.6, S.D. 2.1

IWANTTOHELP Question 2 (E005Q02)
– Necessary pages: 2; relevant pages: 2; available pages: 31
– Quantity of text: short (two short texts; essential information is in a 50-word text)
– Complexity of text: low level (simple, informal language)
– Digital reading scale score: 417
– Time on task (seconds): mean 39, S.D. 29
– Number of pages visited: mean 2.4, S.D. 1.8

IWANTTOHELP Question 4 (E005Q08)
– Necessary pages: 5 or more; relevant pages: 13; available pages: 31
– Quantity of text: long (multiple texts, each with multiple sections)
– Complexity of text: high level (some formal text, some technical language, relatively unfamiliar situation)
– Digital reading scale score: full credit 567; partial credit 525
– Time on task (seconds): mean 183, S.D. 123
– Number of pages visited: mean 11.2, S.D. 8.8

SMELL Question 1 (E006Q02)
– Necessary pages: 2; relevant pages: 2; available pages: 13
– Quantity of text: medium (set of six search results, plus a relatively long text of 230+ words)
– Complexity of text: medium-high level (some dense text, popular scientific language, familiar topic)
– Digital reading scale score: 572
– Time on task (seconds): mean 88, S.D. 49
– Number of pages visited: mean 2.4, S.D. 2.4

SMELL Question 3 (E006Q06)
– Necessary pages: 3; relevant pages: 3; available pages: 13
– Quantity of text: long (multiple texts of varying lengths; the longest is 400+ words)
– Complexity of text: medium-high level (some dense text, popular scientific language, familiar topic)
– Digital reading scale score: 485
– Time on task (seconds): mean 85, S.D. 51
– Number of pages visited: mean 4.1, S.D. 3.9

JOB SEARCH Question 2 (E012Q03)
– Necessary pages: 3; relevant pages: 4; available pages: 8
– Quantity of text: medium (multiple short texts)
– Complexity of text: low (mainly informal language; personal, familiar topic)
– Digital reading scale score: full credit 624; partial credit 462
– Time on task (seconds): mean 153, S.D. 81
– Number of pages visited: mean 5.5, S.D. 4.4

Note: counts of necessary pages include the page where the task starts.
Source: OECD, PISA 2009 Database.
http://dx.doi.org/10.1787/888932435397

The number of available pages in each task is relatively modest.
This restriction was a deliberate decision made by those who developed the test: it was seen as critical that students were presented with a set of tasks they could reasonably be expected to complete in the available time, in order to allow an adequate assessment of their ability to respond to these tasks. Another consideration arose as the test was developed: students need guidance in locating the information necessary to answer the questions. There is no value in including tasks where large numbers of students become disoriented, confused and frustrated because they cannot locate the necessary pages. The result is that some of the tasks provide explicit directions about links to click on and pages to visit. Others are somewhat less explicit, since it was considered important to assess the extent to which students were able to locate necessary information by themselves. As indicated by the substantial amount of variation in the navigation indicators number of relevant pages visited and number of visits to relevant pages (Table VI.3.1), students did differ in the degree to which they visited pages containing necessary information. These issues, concerning available material and explicitness of guidance, play an important role in students' ability to navigate in the digital medium.

Each of the case studies that follow starts with the task that students see, followed by a set of questions to be explored, a description of the essential features of the task, and a list of the necessary pages (pages that students need to visit in order to locate the information required to respond successfully to the task). Since each task raises different issues, the discussions that follow vary.
IWANTTOHELP

IWANTTOHELP – QUESTION 1

Read Maika's blog entry for January 1. What does the entry say about Maika's experience of volunteering?

Questions for this task

This is the first task in this unit, and therefore lends itself to consideration of how much students explore when presented with new stimuli. Although the task requires no navigation and there is little text that students need to process, they have the opportunity to investigate numerous links, both within the website of the starting page (Maika's Blog) and within another website accessible from a hyperlink on the starting page.
• What proportion of students visited pages not necessary for answering the question?
• Is increased navigation associated with higher digital reading performance?
• Are there differences in the patterns of navigation used by boys and girls?
• Are any observed gender differences associated with differences in reading performance?

Essential features of the task

The context for this unit is a blog for Maika, who is interested in doing volunteer work. This is a relatively simple task, requiring students to read one short, simple text. The instructions direct them to look only at the text at the top of the open page, making a literal match between the date in the question and in the heading for Maika's blog entry, "Tuesday, January 1". No navigation is needed, as the task directs students to read only this page, and the information required to answer the question can be seen on this page without scrolling.

When students start this unit, two tabs are open: the active (visible) tab, Maika's Blog (P24), and a second one, IWANTTOHELP (P01). Students may click on the other available tab, "iwanttohelp.org" (P01), or on the link in Maika's Blog that goes to the same page.
There are several other links available on Maika's Blog, leading to additional pages, but none of them is relevant to this question.

This was a relatively easy question (digital reading scale score 362), with over 84% of students receiving credit (Table VI.3.10). The mean digital reading score for students who answered unsuccessfully was low (385 for girls, 317 for boys), and even lower for the small proportion who did not attempt the question (306 for girls, 287 for boys).

Necessary page
• P24: Maika's Blog Home page

Exploring

Overall, few students engaged in much exploration in this initial task in the unit: 83.5% of students did not go beyond the page that is open at the start of the task, the only page relevant to this task.

Boys (19.3%) were more likely than girls (13.7%) to visit one or more pages other than the starting page (Table VI.3.11). There was no difference in the mean proficiency of boys viewing only the starting page compared to those visiting multiple pages. However, girls who did no navigation beyond the starting page had a higher mean score (508) than those who visited two or more pages (493).

Exploration of the available links and pages for this task, where there is only one relevant page, is not generally indicative of the behaviour of good readers, consistent with what has been described above. The great majority of students who obtained credit found the answer by reading the starting page, with no further navigation (70.8% of all students). The pattern that emerges beyond this is that, as the number of pages visited increases, the mean ability of the students diminishes (Table VI.3.12).

In terms of number of page visits, when all students obtaining full credit are examined as a group, there is very little difference in digital reading proficiency among those who visited only the starting page, where the necessary information can be found (519), or made three page visits (520) or five page visits (523) (Table VI.3.13).
When girls
and boys are considered separately, however, a slightly different picture emerges. The girls with the highest reading proficiency in both digital (526) and print (530) reading were those who visited no pages other than the starting page. Those making three or five page visits had slightly lower scores, though the difference is not significant. The boys with the highest mean reading score in both digital (524) and print (508) reading are the small group who made five page visits (2.1% of all boys). Their score for digital reading is similar to that of boys with three page visits (521), but considerably higher than for those boys who visited no pages other than the starting page (511). It seems, then, that while most students, both boys and girls, do not engage in unnecessary navigation, small groups of good readers do choose to explore the available navigation space. This provides a qualification to the suggestion by Lawless and Kulikowich (1996) that so-called "feature explorers" tend to demonstrate lower performance in reading tasks.

The small proportion of students who visited two pages (that is, only one page other than the necessary page) had much lower reading ability in both digital and print reading, similar to those who visited 10 or more pages. In most cases the second page visited was the other available tab ("iwanttohelp.org"), which contains no information relevant to the task. It seems that these students were pursuing no clearly directed activity and did not actively explore the available content, since they went no further than the single extra page. Minimal, undirected exploration seems to be a behaviour characteristic of less proficient readers.
It may be that a single cursory click on an additional page is a mark of confusion or uncertainty, whereas students who explore further are taking the trouble to satisfy themselves that they have found all the relevant information, or at least determining that they do not need to continue with many additional page visits.

There is a suggestion from the navigation patterns for this task that, in general, the more proficient readers assess the task requirements and adapt their navigation behaviour accordingly. Where no navigation is required, the better readers tend not to engage in navigation that appears irrelevant. There are, however, small groups of good readers who do actively explore a number of pages; boys with good reading ability are slightly more likely than girls to do this. This exploration may result from the fact that this is the first time students have encountered this set of material, and their exploration is intended to give them a sense of the overall context and scope of the kind of material that is available.

SUMMARY
• Students most commonly acted strategically for this task, using the task directions and remaining on the starting page, where the target information is available.
• Few students engaged in extensive exploration ("feature explorers"), but those who did explore tended to perform better if they did so in a relatively thorough fashion.
• There is some difference in the navigation behaviour of girls and boys. The highest-performing (and largest) group of girls did not go beyond the starting page, while for the highest-performing group of boys (a very small group), the optimum number of page visits was five. This suggests that for those boys (and, to a slightly lesser extent, girls) who deem it important to explore the site, this is a useful strategy. This exploration may be more relevant in the first question in the unit (the first encounter with the stimulus) than in later questions.
• A single click on an irrelevant page, with no follow-up, is characteristic of lower-proficiency students, and does not seem as effective as either remaining on the single relevant page or more thorough exploration.

IWANTTOHELP – QUESTION 2

Go to Maika's "About" page. What kind of work does Maika want to do when she leaves school?

Questions for this task

The main issue for this question relates to the behaviour of students who did and did not visit the target page where the information can be found.
• What proportion of students visited the target page, Maika's "About" page (P25)?
• What proportion of students gained credit for the task without visiting the target page? What evidence is there that these students guessed?
• What behaviour is most commonly associated with students who received no credit?
• Is there evidence of students seeking but failing to find the target page?
• Are any gender differences associated with any of these patterns?

Essential features of the task

This task requires students to follow a link on the starting page to a second page (P25). Identifying the link relies on making a literal match with the task wording. Once students find the target page, the text is very short. The task is relatively easy (digital reading scale score 417), with just over 76% of students answering it correctly (Table VI.3.14).

Necessary pages
• P24: Maika's Blog Home page
• P25: Maika's "About" page

Guessing

This task requires students to click on a link from the starting page (the Home page for Maika's blog, P24) to Maika's "About" page (P25), in order to find the answer. The data show that the great majority of students (almost 80%) visited this page (Table VI.3.14). Those who did not visit the page would have had to guess an answer, unless they had already visited the page while answering Question 1 and remembered the answer. Using their memory in this way is likely to be a mark of a good and careful reader, but the data do not support the notion that these students were good readers relying on their memories: the small percentage (3.9%) who received full credit but did not visit P25 had much lower reading ability than those who followed the link to P25. This suggests that they did in fact guess. Boys (4.9%) were slightly more likely than girls (3.0%) to guess.

Of the 79.8% of all students who did visit P25, about 90% received credit (credit: 72.6% of all students; no credit or no response: 7.2% of all students). About two-thirds of those who received no credit (including students giving no response: 16.3% of all students) also failed to visit P25.
Boys were slightly more likely than girls not to visit P25, and this is reflected in their performance on the task as well as in their overall reading score.

Efficiency of navigation

For students receiving credit for this task, those who followed the most efficient navigation path, clicking directly and only on P25, Maika's "About" page, had a substantially higher mean digital reading score (532; see Table VI.3.14) than those who visited additional (irrelevant) pages (mean digital reading score of 512). The strategy of "knowledge seeking" appears most suitable here. This finding is in line with the negative quadratic trend found for the number of page visits in relation to digital reading proficiency, as described in the section "Non-linear effects of navigation on digital reading performance".

Of those requiring multiple clicks to locate P25, 358 students (1.6%) required five or more clicks to reach the page, and a further 189 students required four or more clicks to locate it, suggesting they had some level of difficulty with this access aspect of the task. A small number of students (86) seem to have become lost, visiting five or more pages but not finding P25. These students had low mean reading ability (448), similar to those who did find P25 but answered incorrectly.

SUMMARY
• The overall picture that emerges here is that most of the difficulty in this task consisted in following the task instructions and finding the correct page, using a literal match; the text-processing task, once students had found the page, was relatively simple.
• A small but significant minority of these students also visited one or more irrelevant pages. This irrelevant navigation was associated with students of lower proficiency, suggesting that it was counter-productive. In contrast to the first question in this unit, exploration seemed to be no longer of value.
• About 20% of students did not visit the critical page, and there is evidence that they guessed.
A very small proportion of students engaged in a lot of navigation but did not find the critical page. It seems that careful attention to the demands of the task might assist here.
IWANTTOHELP – QUESTION 4

Read Maika's blog for January 1. Go to the iwanttohelp site and find an opportunity for Maika. Use the e-mail button on the "Opportunity Details" page to tell Maika about this opportunity. Explain in the e-mail why you think the opportunity is suitable for her. Then send your e-mail by clicking on the "Send" button.

Questions for this task

This task allows for an investigation of how students deal with the demands of a complex task requiring a combination of multiple navigation steps and the integration of information from multiple texts. There are numerous pages available – necessary, relevant and irrelevant – as well as directions in the task that assist students in navigating efficiently. This task offers the best illustration, among the six tasks analysed, of the variations in navigation behaviours that students exhibit.
• How much time did students typically spend on this complex task, and how many pages did they typically visit? How wide was the variation in these behaviours?
• What evidence is there that exploration of the available space is typical of higher-proficiency students in this kind of task?
• What proportion of students followed the most efficient pathways? How did use of these pathways relate to overall proficiency? What evidence is there of inefficient navigation?
• What navigation behaviours were used by students who received no credit or gave no response for this task? Did they locate the critical pages? Did they engage in much irrelevant navigation?

Essential features of the task

This is the final question in this unit. It is a complex task that requires students to follow a series of links to locate one or more volunteering opportunities. They need to use information given on the page where the task starts, Maika's Blog, in selecting a suitable opportunity from the four possibilities.
They then need to write a short explanation for their selection and send it as a message. There are two suitable opportunities, and students gain credit for selecting and justifying the choice of either one. There are 31 pages available for them to navigate to in total, of which 13 are relevant; they need to visit a minimum of five pages to provide a valid response to the question.

Slightly over 42% of students (46.7% of girls; 37.9% of boys) obtained full credit (digital reading scale score 567) for this question (Table VI.3.15). Some 14% obtained partial credit (digital reading scale score 525), while fewer than 5% answered the question but obtained no credit. The number of students giving no response was especially high for this item (around 40%). The high non-response rate may be attributable in part to the multiple demands, including navigation, of this complex task.

Necessary pages

This task offers a range of necessary and relevant pages, depending on the evaluations students make. There are two equally short possible navigation paths that students can follow in order to obtain credit, described below as Pathway A and Pathway B. Each of these pathways involves visits to five pages.

Pathway A
1. P24: Maika's Blog Home page
2. P01: iwanttohelp Home page
3. P02: Latest Opportunities
4. P04: Graphic Artist Opportunity Details
5. P08: E-mail this Opportunity to a Friend (Graphic Artist)

Pathway B
1. P24: Maika's Blog Home page
2. P01: iwanttohelp Home page
3. P02: Latest Opportunities
4. P07: Upway Primary School – Work with kids Opportunity Details
5. P11: E-mail this Opportunity to a Friend (Upway Primary School – Work with kids)
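As an illustration of how a recorded click sequence could be checked against pathways like these, the sketch below tests whether a pathway's pages occur, in order, within a student's click stream (allowing detours in between). The helper function and the click-stream format are assumptions for illustration; only the page identifiers follow the lists above.

```python
# Pathways as ordered lists of page IDs (from the lists above).
PATHWAY_A = ["P24", "P01", "P02", "P04", "P08"]   # Graphic Artist route
PATHWAY_B = ["P24", "P01", "P02", "P07", "P11"]   # Upway Primary School route

def follows_pathway(clicks, pathway):
    """True if the pathway's pages appear in order within the click stream;
    extra visits in between are allowed (subsequence check)."""
    it = iter(clicks)
    # 'page in it' advances the iterator, so order is enforced.
    return all(page in it for page in pathway)

# A hypothetical student who detours to Maika's "About" page (P25)
# but still completes Pathway A:
clicks = ["P24", "P25", "P24", "P01", "P02", "P04", "P08"]
print(follows_pathway(clicks, PATHWAY_A))  # True
print(follows_pathway(clicks, PATHWAY_B))  # False
```

A check of this kind distinguishes students whose navigation contains one of the credited routes from those who wander without ever completing either sequence.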
Students may complete the task successfully, and obtain credit, by using either Pathway A or Pathway B, but they more often visit at least seven pages, including the two expected additional pages described below:

Expected additional pages
6. P35: Edit or Send Your Message (Graphic Artist)
7. P12: E-mail confirmation: "Your message has been sent successfully."
OR
6. P36: Edit or Send Your Message (Upway Primary School – Work with kids)
7. P12: E-mail confirmation: "Your message has been sent successfully."

Students may obtain credit for having completed the reading task even if they omit these two final steps; that is, they receive credit for finding a suitable opportunity and giving an explanation relating to its suitability even if they do not send the information in an e-mail message as directed by the task.

The full list of 13 relevant pages is shown in Figure VI.3.10.

• Figure VI.3.10 •
Relevant pages for IWANTTOHELP – Question 4

P01: IWANTTOHELP Home page
P02: Latest Opportunities
P03: FAQ
P04: Graphic Artist Opportunity Details page
P07: Upway Primary School – Work with kids Opportunity Details
P08: E-mail this Opportunity to a Friend (Graphic Artist) page
P11: E-mail this Opportunity to a Friend (Upway Primary School – Work with kids)
P12: E-mail confirmation: "Your message has been sent successfully."
P24: Maika's Home page
P25: Maika's About page
P26: Maika's Contact Details
P35: Edit or Send Your Message (Graphic Artist)
P36: Edit or Send Your Message (Upway Primary School – Work with kids)

Source: OECD, PISA 2009 Database.
http://dx.doi.org/10.1787/888932435397

Time spent on this task

This complex task required a lot of time (Table VI.3.16). The mean time spent on this task, for all students, was slightly over three minutes, the longest of any of the tasks presented in this chapter, although some other tasks in the PISA 2009 digital reading assessment required a longer average time.
Students gaining full credit spent, on average, closer to four minutes; even those giving no response to the question spent, on average, around two minutes on the task. There is a correlation of 0.33 between time on task and score (Table VI.3.17).

Number and relevance of page visits

The mean number of pages visited by students obtaining full credit was 8.2, although students who gave a response made, on average, slightly over 13 visits to pages in total (Table VI.3.17). Some students made many more page visits than this, however: the maximum was 125 (Figure VI.3.11). There is a correlation of 0.32 between number of visits to pages and score (Table VI.3.17). The relatively high correlation of ability with pages visited (0.52) and with number of relevant pages visited (0.63) is consistent with what has been described in the first part of this chapter: students who visit only relevant pages tend to be better readers than those who explore all available material, including multiple irrelevant pages.

Table VI.3.17 shows that students receiving full credit, although they visited a similar number of pages (both number of pages visited and number of page visits) to those receiving partial credit and no credit, tended to visit fewer irrelevant pages than either of those groups: an average of 0.8 irrelevant pages visited and 1.2 visits to irrelevant pages. As students performed better on this task, they tended to make more relevant page visits, and fewer irrelevant page visits.
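The correlations quoted above are ordinary product-moment (Pearson) correlations between a navigation or timing measure and task score. As a reminder of the computation, the following minimal sketch calculates one on invented toy data (not PISA microdata):

```python
import math

def pearson(xs, ys):
    """Plain Pearson product-moment correlation, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data (illustrative only): seconds on task vs. task score (0/1/2 credit).
time_on_task = [60, 120, 180, 240, 300]
score        = [0, 1, 1, 2, 2]
print(round(pearson(time_on_task, score), 2))  # → 0.94
```

With real data of this kind, the correlation summarises the linear association only; the non-linear (quadratic) trends discussed earlier in the chapter would require a richer model.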
In contrast, students who obtained no credit visited, on average, 3.7 irrelevant pages and made 5.1 visits to irrelevant pages. This means that these students were wasting a substantial proportion of their extensive navigation activity on irrelevant pages that would not provide information useful for completing the task. Students who gave no response to the task still engaged in a significant amount of navigation, visiting, on average, 6.6 pages in total, most of which were relevant.

Variation by country

Since this task is the most complex, in terms of navigation, of those analysed in this chapter, it is worth considering variations among countries in the time spent, as well as in the number and relevance of pages visited (Table VI.3.18). Countries' mean scores on this task were generally closely aligned with their overall means on the digital reading assessment, with only Denmark (performing considerably more weakly on this task) and France (performing considerably more strongly) showing much variation between their mean score on this task and their mean digital reading score overall.

Students in Northeast Asian countries spent the most time on this task: Japan (mean of 254 seconds, a little over 4 minutes), followed by Macao-China, Hong Kong-China and Korea (241, 238 and 223 seconds, respectively). In contrast, students in several European countries spent the least time: Austria, Hungary and Iceland (139, 151 and 155 seconds, respectively). For most categories of potentially useful navigation (number of pages visited, number of relevant pages visited, number of visits to relevant pages and number of page visits), East Asian countries tend to have the highest means, consistent with time spent on task, although their rank order varied somewhat.
Students from the partner economy Hong Kong-China tended to visit the most pages in total (17.6), followed by the partner economy Macao-China (16.8), Korea (16.2) and Japan (15.0). For total relevant page visits, the rank order was Hong Kong-China (14.4), Korea (13.5), Macao-China (13.0) and Japan (12.5). The number of relevant pages visited showed some variation from this pattern, however, with Korea, the highest-performing country, having the highest mean (7.1), followed by New Zealand and Japan (6.3), then Hong Kong-China (6.2) and Australia (6.1); these were the five countries with the highest overall means for digital reading, as well as the highest average numbers of relevant pages visited overall (Table VI.3.1).

In contrast, students in Colombia, Chile and Austria visited far fewer pages: 4.1 pages in Colombia and 5.1 pages in Austria and Chile. Similarly, relevant pages visited and relevant page visits were also significantly lower in these countries: Colombia (3.3 relevant pages visited, 5.4 relevant page visits), Chile (4.2 and 6.9, respectively) and Austria (4.3 and 6.9, respectively).

Students in the partner economies Macao-China and Hong Kong-China had the highest number of visits to irrelevant pages (irrelevant pages visited: 2.0 and 1.8, respectively; total irrelevant page visits: 3.7 and 3.2, respectively), followed by Korea and Japan (2.7 and 2.5, respectively). The country with the fewest irrelevant pages visited was Australia (0.6), followed by Norway, Iceland, New Zealand and Ireland (all 0.7). Students in two of these countries also made, on average, less than one visit in total to irrelevant pages: Norway 0.8, and Australia 0.9. Means of students in Iceland and Ireland (1.0) and New Zealand (1.1) were only marginally higher.

Initial navigation sequences

Students had four options to choose from. Pathways A and B, described above, led directly to the two opportunities suitable for Maika.
Parallel pathways for the other two opportunities may be described as Pathways C (for "Vegfest") and D (for "Help fix up Twin Falls track!"). These seem efficient, but both could be eliminated on the basis of information provided in Maika's Blog, which states that she is looking for a longer-term opportunity.

Substantial numbers of students who were awarded credit followed Pathway A or B as their initial navigation sequence (Table VI.3.19). Students gaining full credit had somewhat different overall proficiency scores according to which of these pathways they chose. Those who began with Pathway A (13.9%) had slightly higher reading proficiency (577) than the mean of all students at each score level; those who began with Pathway B (only 1.3%) had significantly lower mean scores (535). For students awarded partial or no credit, the mean score of those starting with Pathway A was significantly higher than for Pathway B (and for Pathway C or D).

There are several possible reasons for choosing Pathway A: Maika's Blog notes that she wants a longer-term position, and the "Graphic Artist" opportunity is "ongoing"; Maika's "About" page refers to her interest in web design, which allows the inference that a "Graphic Artist" opportunity is likely to be relevant to her; and this is the first opportunity in the list.
Students who received no credit rarely started with either of the most efficient navigation sequences. Only nine students began with Pathway A, and a single student began with Pathway B. This contrasts with 3 333 students (15.2%) who were awarded full credit, and a further 902 (4.4%) who were awarded partial credit, who began with Pathway A or B.

Few students who were awarded credit (15 students in total: 8 with full credit, 7 with partial credit) began with Pathway C or D. In contrast, only 18 students who were awarded no credit began with Pathway A or B, whereas 260 (1.2%) of those students began with Pathway C or D (mean scale score = 462). In all, 4 263 students (19.3%) embarked on and followed Pathway A or B within four steps of starting their navigations for the task, and the great majority were awarded credit. In contrast, 275 chose Pathway C or D, almost all of whom were awarded no credit.

This suggests either that better readers begin with more efficient navigation pathways, or that students benefit from starting their navigation pathways in the right direction. The data do not allow any clear view on which of these is more likely; but a close reading of the information presented on the first pages students are likely to see, the table summarising opportunities, and the information given in Maika's Blog identifying information that is relevant to the reading task, would seem to improve the likelihood that students set off on a suitable path, and may reduce the likelihood that they become confused or frustrated as a result of lengthy and unhelpful navigation.

Inefficient navigation

Although many students began their navigation with the most efficient pathways (A or B), the majority did not, whether or not they were awarded credit (Table VI.3.19). This is perhaps surprising, since the task directions, which state the purpose of the task, would tend to direct students to one of these efficient pathways.
Nevertheless, there is no significant difference between all students who obtain full or partial credit and those with the same credit level who began with Pathway A. It seems that students will choose a variety of pathways, not necessarily the most efficient, to successfully reach the same end.

The concern, however, is less with those who did obtain credit than with those who performed poorly. Many students who gave no response failed to locate necessary pages (Table VI.3.20). Some 4 475 students (about 20%) who gave no response visited four or fewer pages, whereas the minimum sequence needed to obtain credit is five pages. The table shows a clear link between the number of pages visited and mean ability, in both digital and print reading. Those who did not move beyond the starting page had a (low) mean score of 350 for digital reading and 396 for print reading. This may be a sign of disengagement in the task. At the other end of the spectrum, those who visited 11 or more pages (2.3% of students with no credit; 8.1% of students who gave no response) had much higher mean scores for digital reading, even though they received no credit. Their mean scores were similar among students who received no credit (467) and among students who gave no response (463). It seems that many students navigate a great deal to no effect.

Variations in individual student behaviour

Figure VI.3.11 gives a sense of the range of time taken and pages visited by individual students. The time spent for an answer receiving full credit varied from as little as 46 seconds to 1 511 seconds (over 25 minutes), with visits to pages varying between 5 and 125. The persistence of this student paid off, as he received full credit and also managed to complete all items in the assessment.5 One girl who obtained full credit spent 1 000 seconds (nearly 17 minutes) on the task, visiting 24 pages in the process.
This was clearly an ineffective strategy, as she failed to complete 6 of the 19 items in the test, a factor that would have contributed to her relatively low digital reading score (360) compared to print (407). Some students who received no credit, or gave no response, spent similar or even longer times on the task.

A few students (four girls and five boys) who were awarded full credit visited only the minimum number of necessary pages (five). Others, regardless of their score on this task, visited many more than this. In contrast, another student, despite making 85 visits to pages, ultimately gave no response to the question. His digital reading score (220) was much lower than his print score (429). In this case, facility in clicking on links was not associated with reading effectiveness in this medium.

This wide variation offers a good illustration of the highly disparate ways in which students construct their own texts as part of the process of responding to the task (see the discussion at the beginning of this chapter). Figure VI.3.11 provides a powerful indication of the extent to which students also vary in their ability to know what to do in the digital medium. This task offers a maximum of 31 available pages. Every page received at least 100 visits from students, while the average number of visits to each irrelevant page was 1 962 (data from 22 036 students were collected for this task).
• Figure VI.3.11 •
Extremes of student behaviour for IWANTTOHELP – Question 4

Columns: time on task (seconds) | gender | country | digital reading score | print reading score | number of pages visited | number of relevant pages visited | number of visits to relevant pages | number of irrelevant pages visited | number of visits to irrelevant pages | number of page visits | number of items not reached | number of items reached, with no response | comment

Full credit
1 511 | boy  | New Zealand     | 469 | 458 | 24 | 11 | 91 | 13 | 34 | 125 | 0 | 0 | Most page visits (boy)
  959 | girl | Norway          | 427 | 374 | 16 | 10 | 62 |  6 | 18 |  80 | 0 | 6 | Most page visits (girl)
  697 | girl | Hong Kong-China | 582 | 587 |  9 |  9 | 58 |  0 |  0 |  58 | 0 | 2 | Most pages visited, all relevant (girl)
  565 | boy  | Ireland         | 484 | 437 |  9 |  9 | 51 |  0 |  0 |  51 | 0 | 2 | Most pages visited, all relevant (boy)
  548 | girl | Colombia        | 502 | 506 | 15 | 12 | 43 |  3 |  5 |  48 | 3 | 0 | Equal highest number of unique relevant pages visited (all)
1 000 | girl | Colombia        | 360 | 407 |  8 |  7 | 18 |  1 |  1 |  19 | 6 | 2 | Longest time for full credit (girl)
   46 | girl | Korea           | 473 | 505 |  7 |  7 |  7 |  0 |  0 |   7 | 0 | 0 | Shortest time for full credit (girl)
   47 | boy  | New Zealand     | 305 | 403 |  7 |  5 |  5 |  2 |  2 |   7 | 5 | 8 | Shortest time for full credit (boy)
  121 | girl | Iceland         | 688 | 694 |  5 |  5 |  5 |  0 |  0 |   5 | 0 | 0 | Equal fewest page visits for full credit (girl)
  160 | girl | Poland          | 620 | 589 |  5 |  5 |  5 |  0 |  0 |   5 | 0 | 0 | Equal fewest page visits for full credit (girl)
  254 | boy  | Belgium         | 601 | 547 |  5 |  5 |  5 |  0 |  0 |   5 | 0 | 0 | Equal fewest page visits for full credit (boy)
  222 | boy  | Japan           | 517 | 494 |  5 |  5 |  5 |  0 |  0 |   5 | 0 | 0 | Equal fewest page visits for full credit (boy)

Partial credit
  722 | boy  | Macao-China     | 498 | 513 | 18 | 10 | 45 |  8 | 55 | 100 | 0 | 0 | Most pages, most irrelevant pages, partial credit (boy)
  939 | boy  | Macao-China     | 394 | 270 | 21 | 12 | 49 |  9 | 44 |  93 | 0 | 1 | Longest time for partial credit (boy)
  638 | girl | Austria         | 502 | 568 | 10 |  9 | 64 |  1 |  3 |  67 | 0 | 0 | Highest total relevant page visits, partial credit (all)
  573 | boy  | Hong Kong-China | 422 | 536 | 26 | 12 | 41 | 14 | 23 |  64 | 0 | 1 | Most unique pages visited (all)
  973 | girl | Macao-China     | 394 | 446 | 17 |  9 | 30 |  7 | 22 |  52 | 0 | 3 | Longest time for partial credit (girl)
   29 | boy  | Ireland         | 455 | 424 |  7 |  7 |  7 |  0 |  0 |   7 | 0 | 0 | Shortest time for partial credit (boy)
   38 | girl | Australia       | 532 | 512 |  7 |  7 |  7 |  0 |  0 |   7 | 0 | 0 | Shortest time for partial credit (girl)
  313 | girl | France          | 548 | 502 |  5 |  5 |  5 |  0 |  0 |   5 | 0 | 1 | Equal fewest pages visited for partial credit (all)

No credit
  639 | boy  | Korea           | 394 | 324 | 20 |  7 | 40 | 13 | 45 |  85 | 0 | 2 | Most page visits for no credit (all)
  868 | girl | Austria         | 383 | 385 | 11 |  4 | 10 |  7 | 26 |  36 | 0 | 4 | Longest time for no credit (girl)
1 192 | boy  | Hungary         | 302 | 509 | 11 |  7 | 16 |  4 |  5 |  21 | 0 | 5 | Longest time for no credit (boy)

No response
1 058 | boy  | Sweden          | 220 | 429 | 20 |  8 | 43 | 12 | 42 |  85 | 0 | 4 | Most page visits, no response (boy)
  840 | girl | Macao-China     | 334 | 366 | 12 |  5 | 21 |  7 | 39 |  60 | 0 | 2 | Most page visits, no response (girl)

Source: OECD, PISA 2009 Database.
StatLink: http://dx.doi.org/10.1787/888932435397

The 31 pages in this task represent a minuscule proportion of what is available in the real digital world. In that sense, the navigational demands of this task are far less than what readers may face as digital readers in their daily lives. Some students are capable of operating with great speed and effectiveness when presented with this kind of material, suggesting that they would easily cope with far greater demands. However, many other students appear to become disoriented, and to spend a great deal of time to little or no effect, when presented with a reading task requiring them to synthesise information on one website in order to locate and evaluate information on a second website. This emphasises the need for clear guidance by teachers in how to approach reading tasks when students are required to use the Internet for seeking information, and when they are required to evaluate the available information. Simply sending students to the Internet without clear guidance is likely to be a waste of time and lead to frustration and poor learning.
SUMMARY

• This kind of task does not lend itself to a superficial approach. Good readers tend to visit as many pages as they deem necessary, with repeated visits, until they are satisfied with their answer.
• Patience with the complexity of the task is important. It is not generally possible to complete this kind of task adequately without devoting sufficient time to it.
• Many students appear to abandon early on any attempt to complete the task – among those who receive no credit, the fewer pages they visit, the lower their proficiency tends to be. This may be a sign of disengagement or frustration with the task, or of confusion about how to proceed.
• Careful reading of the information presented on the first pages is more strongly associated with students who receive credit. Simply continuing to navigate, without direction, does not appear likely to get students back on track.
• Many students do not navigate efficiently. The number of visits to irrelevant pages is high.
• Students need guidance in clarifying the task they face, in selecting relevant links and pages, and in avoiding irrelevant ones. This will improve the efficiency of their navigation efforts, reducing both the time and the effort they spend unproductively.

SMELL

SMELL – Question 1
Go to the "Smell: A Guide" web page. Which of these statements best expresses the main idea on this page?

Questions for this task

This task allows for an investigation of how students react when presented with that commonest of digital texts, a set of search results. While the task directions are explicit, the possibility remains that students will explore, visiting pages that are irrelevant to completing the task.
The text-processing demands of the task are considerably higher than the navigation demands.

• What proportion of students follows the most efficient (minimal) navigation pathway required for answering this question?
• To what extent do students explore the available pages?
• What differences are there between students who visit the target page, where the information necessary to answer the question can be found, and those who do not?
• What proportion of students guesses the answer to this question?
• How is time spent on the target page related to performance?

Essential features of the task

This question is the first in the unit. It explicitly directs students to navigate to the page "Smell: A Guide" (P02), and identify the main idea of the text on this page. The question requires limited navigation. The starting page presents a list of six search results for the term "smell". Students need to select one link from the list (the first in the list) by making a literal match between the question wording and the search result. They then need to read the text on the page that opens in a new tab, scrolling down to read the entire text. Links from the search-results page to other pages allow a maximum of four tabs to open in this task: the "Global Search" (P01) page, plus the pages "Smell: A Guide", "Food in the News" and "Psychology Now". The links to the remaining three results lead to a page that states, "This page has no content available", and has a link back to the search-results page.

The text containing the necessary information is not short (over 230 words), is relatively dense, and contains some terms commonly found in texts dealing with popular science. Students will typically need to spend a significant amount of time on this page; those who spend very little time on it are less likely to answer correctly. The task is relatively hard (digital reading scale score 572), with only 42.4% of students awarded credit (Table VI.3.21).
The difficulty most likely stems from the need to read the text carefully, distinguishing between pieces of strongly distracting information (see Chapter 2, Examples of the PISA 2009 digital reading units), rather than from navigation demands.
Necessary pages
• P01: Global Search results
• P02: Smell: A Guide

Navigation to the target page

Students who visited P02 had much higher overall reading proficiency than those who did not, whatever their score for this question (Table VI.3.22). Almost all students who visited this page also responded to the question: only 0.8% gave no response.

Guessing

Those who did not visit P02 would have had to guess. In all, 18.6% of students guessed their response (Table VI.3.22). This type of information is not available in print-administered assessments. If a large proportion of students guessed correctly, this would undermine confidence in the assessment, but few of these students (fewer than 5% overall) were awarded credit.

Among students receiving credit, there is a large difference in digital reading ability between those who did visit P02 (552) and those who did not (456). In all score categories, students who visited the necessary page show higher proficiency than those who did not. For those receiving credit, girls (39.8%) were more likely than boys (34.3%) to visit the page rather than guess, and a similar pattern was observed among those who attempted the question but received no credit. This again underscores the importance of learning how to search for relevant information.

Time spent on the relevant page

Students who answered the question successfully spent noticeably more time on P02 (12 or 13 seconds longer, on average) than those who answered incorrectly (Table VI.3.22). Girls spent slightly longer on the page than boys. The very small proportion (fewer than 1%) who visited P02 but gave no answer to the question spent much less time on the page. Good readers tend to spend sufficient time on the relevant page to read and locate essential information.

Exploring

This task does not invite exploring as some other tasks might.
It starts with a list of search results, but the question explicitly directs students to visit a single page. Table VI.3.23 shows figures for girls and boys who received credit for this question, according to the number of pages they visited, their digital reading score, and the time they spent on P02. The students with the highest mean ability are those who visited only two pages: the starting page and (in the great majority of cases) the target page, P02. Consistent with the demands of this task, the students who were awarded credit who visited only the necessary page were better readers than students who explored. This group accounted for one-third of all students, with girls (35.7%) more likely to follow this straightforward path than boys (29.1%). The lowest-performing group is composed of students who did not visit P02, but guessed correctly.

Students who visited between four and seven pages showed a higher level of proficiency than those who visited only three pages, or who visited eight or more pages. This suggests that many good readers make a deliberate decision to do a certain amount of exploring of the available material, but not too much. There is more evidence here of strategic behaviour by the better readers: a single click on one additional page will be insufficient for the good readers, among those who decide to explore the available pages, to be sure they have a good idea of the information that can be viewed; but they tend to be careful to limit their exploration and not waste time looking at a large number of pages. This finding corroborates the general trend showing that large numbers of page visits are not helpful, as indicated by the non-linear trends for the test as a whole. In addition, students need to spend adequate time on the page where the target information is found, where there is a relatively long, fairly complex text to read, rather than click on other pages to see if they might provide useful information.
More able readers act strategically, ensuring that they spend sufficient time on the target page, P02: around 80 seconds or slightly more, for most groups of both girls and boys. In contrast, those who visited three pages also spent the shortest amount of time on the target page of any sub-group (68 seconds on average, both girls and boys), and this is reflected in their (low) mean reading score.
Efficient reading

Most students visited P02 only once, suggesting that they did not feel the need or desire to explore the information available on additional pages that would most likely be irrelevant to the task. Behaviour seems to be influenced by the nature of the task, which is highly constrained, with explicit guidance on navigation.

Among students who were awarded credit for this question, any additional visits to P02 were associated with substantially lower proficiency for girls (Table VI.3.24). Among boys, there was not a large difference between those who visited P02 once and those who visited the page twice; lower proficiency was more marked for those who visited the page more than twice. The numbers making multiple visits to P02 were small, so caution is needed in drawing conclusions.

Among students who gained no credit for this question, relatively few visited the target page, P02, more than once. Here, boys who made two visits had higher reading scores than those who visited the page either only once or more than twice. This suggests that these students are engaged in the task, since they take multiple navigation steps, but they are unable to complete the reading task successfully. For girls, it seems that an increase in the number of visits to the page is equated with a reduction in reading proficiency. It is possible that students who make multiple visits to the page find the text-processing demands too great, and eventually decide to guess.

SUMMARY

• Navigating to the correct page more strongly suggests a good reader than guessing without reference to the critical material.
There are no surprises here, but this analysis allows us to demonstrate that this is true.
• Those who find and view the material on offer, even if they don't read it carefully, tend to be better readers than those who do not visit the necessary page.
• The more proficient readers spend a substantial amount of time processing the necessary page, and do not waste too much time investigating irrelevant links or revisiting the necessary page. There is a suggestion that girls may be somewhat more likely to be "knowledge seekers" (Lawless and Kulikowich, 1996) than boys.
• Where the task is constrained, a focus on locating the relevant page and spending adequate time on careful reading, rather than exploring the available material, is typical of better readers. The most able readers are most likely to make a single, careful visit to the target page, rather than repeated visits interspersed with other exploration.

SMELL – Question 3
There is information about the smell of lemon on the pages "Food in the News" and "Psychology Now". Which statement summarises the conclusions of the two studies about the smell of lemon?

Questions for this task

This task allows for an investigation of how students' ability to locate the necessary pages relates to their proficiency.
• What proportion of students visited the two necessary pages, P03 and P07?
• How do students visiting only the relevant pages compare with other students?
• What evidence is there that visiting additional pages is a sign of high or low ability?
• Is there evidence that some students engage in navigation but do not find the necessary pages?
• Is there evidence that very good readers might remember essential information from earlier visits to one of the necessary pages, thus obviating the need for them to visit that page again?

Essential features of the task

Students need to compare information on two pages, P03 (Food in the News) and P07 (Psychology Now), in order to identify a conclusion common to the information presented on
both pages. The Food in the News and Psychology Now pages represent the kind of texts found in popular scientific online publications, with a strong commercial element. Students are likely to have already viewed and read P03 in the process of responding to the previous question (Question 2). Nevertheless, since the reference to the smell of lemon is not in a prominent place on P03, it seems unlikely that students would have remembered this detail sufficiently closely to answer this task with confidence.
As a result, they are likely to need to engage in scanning of both texts for information relating specifically to the smell of lemon. This task (digital reading scale score 485) was easier than Question 1 in this unit (Table VI.3.25), likely because of this need to focus only on specific information, and possibly because some of the material may by now have been more familiar to students.

Necessary pages
• P01: Global Search results
• P03: Food in the News
• P07: Psychology Now

Visits to necessary pages

The data show that when answering this question, the great majority of students (70.3%) visit P07, to which they had not previously been directed, and which was irrelevant to the previous tasks in this unit (Table VI.3.26). A smaller majority (56.9%) visited P03, to which they were directed in the previous question, while a substantial minority (28.1%) visited P02, the page required for the first question in the unit, but irrelevant to this question. It may be assumed that students who did not visit P07 would have guessed, since it is unlikely that they had both visited P07 on one of the two previous questions in this unit and recalled accurately the information necessary to answer a question they had not seen.

Of the students who were granted credit for this question, those who followed the pathway as directed in the question, visiting only the two relevant pages, P03 and P07, had a digital reading ability (563) substantially higher than the mean for all navigation pathways (534) (Table VI.3.27). Their digital reading ability was also much higher than that of students who visited only P07 (526) or P03 (495). Not surprisingly, those who were granted credit who visited neither P03 nor P07, and who therefore would have had to guess their answer, had much lower mean digital reading ability (439). Mean ability was similar in print and digital reading for groups with these navigation patterns.
These results provide no evidence that good readers rely on their memory for information viewed during previous tasks in this unit: the highest reading ability among those awarded credit is shown by those who visit both P03 and P07.

Of students who answered the question unsuccessfully, the largest group (10.5%) either guessed (most likely) or relied on their memory of visits during previous questions, although this is unlikely since they would only have viewed P07 as part of an exploration irrelevant to those questions. That is, they clicked on no links, and did not visit either of the two necessary pages while completing this question. These observations suggest that these students made no real effort to answer the question; they were "apathetic users", in the terms of Lawless and Kulikowich (1996). A slightly smaller proportion (8%) visited P07 but not P03.

It is clear from Table VI.3.27 that there is a relationship between the proficiency of students and the amount of relevant navigation they engage in, regardless of the level of credit given. Those with higher proficiency tended to visit both the relevant necessary pages; the next most proficient are those who visited only P07, the page not needed in previous questions in this unit. Below them are those who visited P03, but not P07; and the weakest are those who did not navigate beyond the search-results page displayed at the start of the question.

SUMMARY

• The majority of students visited the necessary pages, but a significant number did not, which required them to guess. Those who guessed were unlikely to receive credit.
• Students who restricted themselves to visiting only the two pages containing the necessary information tended to have higher reading proficiency.
• A significant minority of students visited a page relevant to an earlier question in the unit, but irrelevant to this question.
• It is clear that significant numbers of students are not able to navigate efficiently in a task of this kind, with specific and restricted navigation demands.
JOB SEARCH

JOB SEARCH – Question 2
You have decided to apply for the Juice Bar job. Click on the link and read the requirements for this job. Click on "Apply now" at the bottom of the Juice Bar job details to open your résumé page. Complete the "Relevant Skills and Experience" section of the "My Résumé" page by choosing four experiences from the drop-down lists that match the requirements of the Juice Bar job.

Questions for this task

This task allows for the examination of how the number of visits to relevant pages relates to proficiency.
• Is a single visit to the page containing the necessary information (the job advertisement, P03) indicative of a good reader, or are multiple visits more likely to be a sign of good readers?
• Is there a single efficient pathway commonly used by better readers?
• Do students become distracted by irrelevant pages? What does this tell us about their reading ability?
• What behaviours are demonstrated in this task by weaker readers?

Essential features of the task

This question is an example of a task that requires several navigation steps, which are explicitly described in the task instructions. Students need to locate and use information from one web page to make four decisions on another page, by selecting from drop-down menus. It is therefore to be expected that many students will need to switch between these two pages, but there are numerous possibilities for variation in the navigation pathways chosen.

The task instructions are explicit in directing students to the pages to be visited, and are intended to prevent students from getting lost.
There are two necessary pages for this task: P03 (Juice Bar job advertisement) and P13 (Relevant Skills and Experience drop-down menus).

Students are directed to refer first to P03 for the job specifications, to inform their choices when completing the drop-down lists.

For JOB SEARCH Question 2, approximately 30% of all students received full credit (digital reading scale score 624); 40% partial credit (digital reading scale score 462); and 30% no credit, with approximately equal proportions producing a no-credit answer and giving no response (Table VI.3.28).

Necessary pages
• P02: Job Search: Current Job
• P03: Juice Bar advertisement
• P13: My Résumé

In addition to the necessary pages P02, P03 and P13, there is one additional page that is highly relevant but not, strictly speaking, necessary, as students may already be familiar with the term and concept of a résumé.
• P04: What is a Résumé?

Digital versus print reading

Those who were awarded full credit on this item have a higher mean score (by about 17 points) for digital reading (570) than for print reading (553) (Table VI.3.29). There is no substantial difference between the mean digital reading (506) and print reading (508) scores of those with partial credit. Those who received no credit for this task tended to score about 20 scale points better for print than digital reading. Students who made no attempt to answer the question had an even larger difference (over 40 scale points) between mean digital (363) and print (409) reading scores. The patterns are similar for boys and girls.
This task requires students to locate two different pages and compare information on these pages. Since it was probably necessary to switch between the pages more than once, the navigation demand may be considered fairly high. It may also be considered representative of many real-life digital reading tasks, where multiple comparisons of information on multiple pages are required. The results here suggest that these kinds of navigation requirements allow good readers to perform better (that is, where the navigation demand is relatively high, the students who complete the tasks successfully tend to demonstrate higher reading proficiency in this medium), while adding to the difficulty of the item for weaker digital readers: the reading ability they demonstrate in print may not help them to achieve similar proficiency in digital reading.

Efficient reading
Students may read P03 once, or may switch between P03 and P13 a number of times. Many students (42.7%) followed the straightforward path, as directed in the task instructions (Table VI.3.30). Of these students, 13% received full credit, almost 20% received partial credit, and almost 10% were awarded no credit or gave no response. Girls were more likely (44.8%) than boys (40.7%) to follow exactly this sequence.

However, the 13% who received full credit using this navigation path are not the most proficient readers. Students following this navigation sequence and visiting no other pages have a mean overall test score no better, and in fact slightly lower (564), than the average (570), although this difference is only 6 points. A similar difference between overall and average digital reading scores was observed among students who were awarded partial credit.
Students who visited only the necessary pages (the home page, P02; the job ad page, P03; and the résumé page where the drop-down menus are completed, P13), but made more than one visit to the page with the job advertisement (P03), showed higher overall proficiency than other students, as measured by total test score, regardless of their success on this item (Table VI.3.30). There is no evidence to support the idea that students who can remember what they have read on a single reading of a text are better readers than those who refer to the relevant pages enough times to make the numerous comparisons necessary. It seems that better readers tend to make more than a single visit, and do not rely on memory following a single reading.

The navigation data show that, of the students who scored full credit on this task, the higher their reading proficiency, the more likely they were to switch between the job advertisement page and the page where they completed the task of selecting relevant résumé experiences. As Table VI.3.31 shows, the girls with the highest mean proficiency were those who visited the page four times (2.5% of girls; mean of 598). For boys, the highest-performing were those who visited P03 four times or more (6.8% of boys; mean scores ranging from 580 to 588). This number of visits makes sense, given that there are four drop-down menus to complete.

The résumé-completion task requires explicit comparisons of requirements in one text with a list of qualifications and experience in another. This sort of task lends itself to careful checking, so it is not surprising that repeated visits to the necessary pages were typical of the more proficient readers. This is in keeping with the notion that, in some tasks, the deliberate re-visiting of pages can be a good navigation strategy, as already outlined in the section "Indicators used to describe navigation".
Here, revisits can be assumed to be helpful because not all the information required from a page can be memorised at once. Thus, while revisiting pages is often regarded as a sign of disorientation and has negative associations with comprehension, there are examples where revisits are fruitful. This also means that task demands must be taken into account when analysing revisits as an indicator of navigation across different tasks.

Minimal reading
It is possible to get to P13 without consulting the Juice Bar job advertisement (P03), by ignoring the task instructions and the prominent hyperlink on the open Job Search page ("View details of job: Juice Bar team members"), and clicking instead on the link "My Résumé". Some students by-pass instructions they may regard as intermediate, and navigate directly and swiftly to the final page, where the task is completed. They may not refer to critical relevant pages, but complete the task anyway. These students may miss crucial information, and therefore not gain maximum credit. Alternatively, they may be more interested in simply finishing the task than in checking whether they have found and used (all) the available information. Full credit can be received without referring to the job ad (P03), by inferring and guessing.

Only 11.2% of students failed to visit P03 (Table VI.3.32). Boys (12.1%) were slightly more likely than girls (10.3%) not to visit this page.
The 150 students (0.7%) awarded full credit who did not visit P03 had a substantially lower level of reading proficiency (532) than those who did consult the advertisement (571), which suggests that guessing played a part in their responses. Similar differences in the overall level of reading proficiency between those who did and did not visit P03 were observed among students who received partial credit (509 v. 465) or no credit (434 v. 393).

Ineffective navigation
A number of students visited multiple pages, but did not find the critical page or pages required to complete the task. Those who engage in apparently undirected exploration are likely to be poor readers, and this notion receives some support in the data (Table VI.3.32). A small number of students (1.5%) who gave no response to the question visited at least three different pages, but failed to find P03.

Among students who answered the question, whatever level of credit they received, those who visited irrelevant pages showed lower reading proficiency than those who did not (Table VI.3.33), consistent with what was described earlier in this chapter. There is little difference in proficiency between those who visited only one irrelevant page and those who visited multiple irrelevant pages. The issue appears to be whether or not students visit any irrelevant pages: more able readers tend not to visit irrelevant pages.

A small proportion of students (2.1%) followed the minimum described sequence, visiting no other pages, but did not answer the question (Table VI.3.30); their mean digital reading score was 380, substantially higher than the mean of all students who gave no response (mean score of 363).
They appear able to manage the navigation component of the reading task (locating the target pages), but unable to synthesise information from the two pages.

SUMMARY
The behaviour of students overall suggests various strategies used for this task:
• Students visiting the Juice Bar advertisement page multiple times tended to demonstrate the highest overall proficiency in the assessment.
• Students who did not visit the Juice Bar advertisement page tended to have the lowest overall proficiency. Better readers locate and use the information provided on this page.
• Students who visited no irrelevant pages tended to demonstrate higher reading proficiency than those who visited irrelevant pages.

The implication is that good readers are selective in the links they choose and do not waste time on irrelevant pages. This approach minimises the number of pages and the amount of text they expose themselves to. They also take as much care as is needed in visiting and revisiting the pages with the information critical to the task, to verify that they have used it correctly. This task, which requires students to select only the most relevant information from a fairly long list of similar possibilities, demands careful integration of information across two texts. It is not surprising, then, that better readers tend to recognise the need to check that they have interpreted correctly all the demands of the task, and make the most suitable selection of résumé features.
CONCLUSIONS
This chapter has shown that successful reading in the digital medium requires effective navigation, and that it cannot be assumed that students can simply transfer reading skills learned in print to this medium. Effective navigation requires students to construct pathways to pages with information relevant to the task.

The overall picture that emerges from the case studies is that stronger readers tend to choose strategies suited to the demands of the individual tasks. Where no navigation is required (see IWANTTOHELP Task 1), better readers tend not to become distracted by the availability of irrelevant pages. Where the task requires them to compare information on different pages (see SMELL Task 3), the better students will locate these pages and navigate between them as many times as they feel necessary. When the navigation demands are complex (see IWANTTOHELP Task 4), better readers will spend more time on the task and visit more of the relevant pages than they do for simpler tasks. Better readers tend to minimise their visits to irrelevant pages and locate necessary pages efficiently. They also monitor their time, so that they are able to complete all the reading tasks in the time allocated.

There is evidence that when a set of stimuli is first presented to students (for the first question in a unit), a small percentage of stronger readers (boys slightly more often than girls) will explore the available navigation space (see IWANTTOHELP Task 1). This is not common behaviour; but for those who select and engage in it as a deliberate strategy, exploring to discover the range of available information, there is no indication that it impedes their likelihood of performing well. Good readers are expected to use a variety of strategies.
In contrast to careful, deliberate exploration, there is evidence that minimal exploration, such as clicking on a single additional page without follow-up, is an ineffective digital reading strategy (see IWANTTOHELP Task 1, JOB SEARCH Task 2). Navigation needs to be carefully directed. Students who make many visits to irrelevant pages tend to be poorer readers, as do students who fail to locate necessary pages. There is some evidence to suggest that good readers are those who start the reading task with an efficient navigation path (see IWANTTOHELP Task 4).

The digital reading assessment necessarily presents extremely constrained options for navigation – far fewer than the almost infinite range of navigation possibilities readers face when they use the Internet, whether for personal, educational or occupational purposes (see discussion in IWANTTOHELP Task 4). Nevertheless, what emerges from this analysis is that the tasks included in the assessment offer enough navigation and text-processing challenges to measure and describe the digital reading proficiency of 15-year-olds from the 19 participating countries. Indeed, the tasks, as a whole, allow analysts to discriminate successfully among students at all proficiency levels.

Although the navigation demands of the digital reading assessment are modest, many students find it hard to cope with them. Even when the guidance is quite explicit, significant numbers of students still fail to locate crucial pages. Thus teachers and policy makers should not assume that students can navigate successfully or methodically in the vast realm of possibilities that the Internet offers them. The digital reading assessment offers powerful evidence that today's 15-year-olds, the "digital natives", do not automatically know how to operate effectively in the digital environment, as has sometimes been claimed.
Simply turning students loose in the digital medium, without clear direction, is likely to increase the risk that they will waste time, become frustrated, and fail to engage productively as readers. Students should be encouraged to define their reading task before they start to navigate. They need clear purposes for reading, encouragement to clarify these purposes before embarking on navigation, and practice in evaluating and selecting both the links they choose to follow and the material they will then be able to read. They should learn to recognise and use whatever guidance is available to help them locate relevant or critical pages. Before embarking on a navigation path, students should determine why they are reading and what information they are looking for, to reduce the likelihood that they will become disoriented or waste time by visiting irrelevant pages.

To use navigational tools and features effectively, readers need to exercise discrimination and critical reasoning. Once they have navigated to necessary pages, they should ensure that they spend sufficient time on these pages to process the critical information. When information is to be compared across pages, students should be encouraged to understand that more than a single visit to each page is necessary. Students should be encouraged to avoid undirected navigation – clicking on numerous pages in the hope that one of them might yield useful information. Given that digital texts are not limited in size and scope the way print texts are, students need guidance in judging how much time is enough to spend on a task and how much navigation is necessary. The Internet is an almost infinite space, and if students are to use it productively, they need strategies to direct their navigation choices.
Notes
1. As a result of a technical problem, data for page visits could not be collected with complete accuracy in all cases. This means that there are some minor inaccuracies in some of the figures provided for the numbers of page visits, or numbers of visits to relevant pages, which do not influence the overall picture of the results presented here. For the same reason, the figures are not always exactly aligned between the aggregated data and the case-study data presented in this chapter.
2. In the case of a heavily right-skewed frequency distribution, a logarithmic transformation is sometimes applied to the data to normalise the distribution before using the data in further statistical analyses, such as regression. As the skew was only moderate in the present case, this was not done. In the regression models reported in the next section, however, residuals were distributed normally (see e.g. Cohen, et al., 2003).
3. The fact that the effect size f² for Korea is to some degree an outlier is partly due to the comparatively low overall proportion of variance explained by print reading and navigation together in this country, as f² for a predictor is given as:

   f² = Δr²A / (1 – r²Tot)

where Δr²A is the variance uniquely explained by predictor A, and r²Tot is the total variance explained in the model. Thus, f² will not only increase as Δr²A increases, but also as r²Tot increases.
4. Each page within a unit is identified using the convention P plus a two-digit number (so, P01, etc.).
5. Each page within a unit is identified using the convention P plus a two-digit number (so, P01, etc.).
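The dependence of f² on the total explained variance, described in note 3, can be checked with a few lines of code. The sketch below is illustrative only: the function name and the input values are invented, not taken from the PISA analyses.

```python
# Cohen's f-squared for the unique contribution of a predictor,
# f2 = delta_r2_a / (1 - r2_tot), as given in note 3.
def cohens_f2(delta_r2_a, r2_tot):
    # delta_r2_a: variance uniquely explained by predictor A
    # r2_tot: total variance explained by the full model
    if not 0.0 <= r2_tot < 1.0:
        raise ValueError("r2_tot must lie in [0, 1)")
    return delta_r2_a / (1.0 - r2_tot)

# The same unique contribution (here 0.05) yields a larger effect size
# when the model's total explained variance is higher, which is why the
# overall r2_tot matters when comparing f2 across countries.
print(round(cohens_f2(0.05, 0.30), 4))  # 0.0714
print(round(cohens_f2(0.05, 0.60), 4))  # 0.125
```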
4 Relationships between Digital Reading Performance and Student Background, Engagement and Reading Strategies

This chapter examines the extent to which proficiency in both print and digital reading is associated with certain variables, including students' socio-economic background, immigrant status, the degree of students' engagement in reading, and students' awareness of effective learning strategies.
This chapter examines how a number of variables relate to print and digital reading proficiency. The first part of the chapter investigates student background variables, such as economic, social and cultural status, and immigrant background. The second part examines engagement in reading activities and awareness of effective reading strategies. The chapter focuses on how these aspects are related to print and digital reading proficiency. An explanatory model, based on students' background characteristics, engagement and reading strategies, is presented at the end of the chapter. This model shows the strength of the relationship between each of the variables and digital reading performance.

Unless otherwise noted, the countries described in this chapter are the 19 countries that conducted the digital reading assessment. OECD averages mentioned here are for the 16 OECD countries that participated in both the print and digital reading assessments.

FAMILY BACKGROUND
The aim of education systems around the globe is to encourage students to achieve at the highest possible levels and to provide equitable opportunities for all students. As discussed in Volume II of this report, inequities may arise as a result of gender, socio-economic status, ethnicity or even geographic location. A weak relationship between a student's family background and his or her performance at school is an indication of an equitable distribution of educational opportunities. The variables discussed in this section are described in greater detail in Annex A1a.

Socio-economic background
Most schools are populated by students from a range of socio-economic backgrounds; and teachers and parents appreciate that the interaction of family background and the educational setting can enhance learning.
As was true in the case of print reading proficiency, PISA results show that there is a positive association between socio-economic background and digital reading proficiency. In PISA, a student's socio-economic background is indicated by the PISA index of economic, social and cultural status (ESCS). This index captures several aspects of a student's family background, including information on parents' education and occupations, and home possessions.1 The index is standardised to have an average value of 0 across all the participating OECD countries in the print reading assessment and a standard deviation of 1.

An examination of the average value of the index for each of four student performance categories gives an indication of the impact of socio-economic background (Table VI.4.1). In digital reading, students who are top performers (i.e. those who perform at PISA proficiency Level 5 or above) have an average socio-economic index score of 0.65 – well above the overall average of 0.06 (Table VI.4.2) – while students who are the lowest performers (i.e. those who perform at PISA proficiency Level 1 or below) have an average socio-economic index score of -0.45 – well below the average. The average difference in socio-economic index scores between the top performers and the lowest performers across OECD countries was 1.10 index points. For print reading, the results are similar, with the top performers having an average socio-economic index score of 0.66 and the lowest performers -0.43: a difference of 1.09. The largest difference observed in both digital and print reading is in Chile, which has a difference of 1.84 index points between the top performers and the lowest performers in digital reading, and a larger difference between these two groups of 1.96 index points in print reading.
The smallest variation between top performers and lowest performers is found in the partner economy Macao-China, with a 0.61 index point difference in digital reading and a 0.56 index point difference in print reading. Both across and within countries, then, differences between the top and the lowest performers tend to be similar in the digital and print reading assessments.

Another way of looking at the association between socio-economic background and student performance is to see if there are measurable differences in performance scores between students from socio-economically advantaged and disadvantaged backgrounds (the top and bottom quarters of the PISA index of economic, social and cultural status). In the digital reading assessment, the difference, on average across the relevant OECD countries, is 85 score points, compared to a difference of 89 score points for print reading (Table VI.4.2). In both cases, this would be regarded as equivalent to over two years of schooling (one school year is estimated to be equivalent to 39 score points in PISA; see Table A1.2 in PISA 2009 Results: What Students Know and Can Do: Student Performance in Reading, Mathematics and Science for an explanation of this calculation). The smallest difference in performance between socio-economically advantaged and disadvantaged students is seen in the partner economy Macao-China, with a 23 score point difference in digital reading and a 25 score point difference in print reading.
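The conversion of score-point gaps into years of schooling mentioned above is simple arithmetic, sketched below. The constant of 39 score points per school year comes from the text; the function name is ours.

```python
# Convert a PISA score-point gap into approximate years of schooling,
# using the report's estimate of 39 score points per school year.
POINTS_PER_SCHOOL_YEAR = 39

def gap_in_school_years(score_point_gap):
    return score_point_gap / POINTS_PER_SCHOOL_YEAR

# OECD-average gaps between advantaged and disadvantaged students:
print(round(gap_in_school_years(85), 1))  # digital reading -> 2.2
print(round(gap_in_school_years(89), 1))  # print reading   -> 2.3
```

Both gaps correspond to slightly more than two years of schooling, which is why the report describes them as "equivalent to over two years".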
The largest performance difference between socio-economically advantaged and disadvantaged students occurs in Hungary, with 135 and 118 score points difference, respectively, in digital and print reading. While 12 of the 19 countries have smaller differences between advantaged and disadvantaged students in print reading, in Poland and Chile the differences are larger by 19 and 18 score points, respectively, suggesting that in these two countries the impact of socio-economic background is greater on digital reading than on print reading.

The method used above for comparing the scores of students from different socio-economic backgrounds can be extended to look at a range of student backgrounds. The change in student performance associated with each single unit change of the PISA index of economic, social and cultural status is known as the socio-economic gradient (a unit is defined as one standard deviation). The slope of the socio-economic gradient line is an indication of the extent of inequity. Steeper gradients indicate a greater impact of socio-economic background on student performance; gentler gradients indicate less of an impact.

On average across the 16 OECD countries that participated in the digital reading assessment, the slope of the gradient line is 38 score points, which is similar to what is observed for print reading (Table VI.4.3). The OECD countries with the steepest slopes for digital reading are Hungary, Austria, New Zealand, Poland, Belgium and Australia. In these countries, a one-unit change of the index is associated with a performance difference of between 54 (Hungary) and 43 score points (Australia) on the digital reading scale.
Countries and economies with slopes of less than 30 score points are Japan, Korea, Norway, Iceland, and the partner economies Macao-China and Hong Kong-China.

For print reading in PISA 2009, the average slope across the 16 OECD countries that participated in the digital reading assessment is 40 score points. The countries with steep slopes in digital reading also tend to have the steepest slopes in print reading. For example, Hungary has a slope of 54 score points for digital reading and 48 score points for print reading, and Austria has slopes of 49 and 48 score points, respectively – all significantly above the OECD averages. At the same time, the countries with gentle slopes in digital reading also tend to have the gentlest slopes in print reading. For example, the partner economy Macao-China has a slope of 11 score points for digital reading and 12 score points for print reading, and the partner economy Hong Kong-China has slopes of 19 and 17 score points, respectively – all significantly below the OECD averages. The largest discrepancy between the gradients for digital and print reading occurs in Japan, with a 14 score point difference: the slope of 26 score points for digital reading is much less than the 40 score points for print reading. Thus, in Japan, there appears to be greater equity in the digital reading results than in the print reading results.

While the steepness of the gradient is an indicator of how many score points are associated with a one-unit change in the PISA index of economic, social and cultural status, it does not necessarily show the strength of the relationship. As explained in Volume II, this is better revealed by examining the amount of variance in student performance that is explained by a variable. If this number is low, relatively little of the variance in student performance is explained by students' socio-economic background; if it is high, a large part of the performance variation is explained by socio-economic background.
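To make the two quantities concrete, the sketch below fits a simple least-squares line of reading score on the ESCS index and reports its slope (the gradient: score-point change per one standard deviation of ESCS) and its r-squared (the share of variance explained). The data are invented for illustration; the actual PISA estimates are computed from weighted student data and plausible values, not a plain unweighted regression.

```python
# Simple (unweighted) ordinary least squares of reading score on ESCS.
# The slope is the socio-economic gradient; r2 is the variance explained.
def ols_slope_r2(x, y):
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sxx = sum((a - mean_x) ** 2 for a in x)
    syy = sum((b - mean_y) ** 2 for b in y)
    return sxy / sxx, (sxy * sxy) / (sxx * syy)

# Invented ESCS values (in standard deviations) and digital reading scores:
escs = [-1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5]
scores = [440, 470, 480, 500, 515, 545, 555]
slope, r2 = ols_slope_r2(escs, scores)
print(round(slope, 1), round(r2, 3))  # 37.9 0.986
```

In this toy data, a one-unit increase in ESCS is associated with roughly 38 more score points, close to the OECD-average digital reading slope reported above; the r-squared in real PISA data is far lower (around 0.14 on average), since many factors besides socio-economic background affect performance.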
On average across OECD countries, 14.1% of the variation in student performance in digital reading within each country is associated with the PISA index of economic, social and cultural status (Table VI.4.3). For print reading, across the 16 OECD countries that participated in the digital reading assessment, the average variance explained by socio-economic background was 14.4%. In Poland, both the slope and the variance explained were noticeably greater for digital reading than for print reading, indicating that socio-economic background in that country has a greater impact on digital reading proficiency than it does on print reading proficiency.

Countries with a lower-than-average impact of socio-economic background are regarded as high-equity countries. Using the information in Table VI.4.3, countries are categorised into four groups: i) high performance/low socio-economic impact; ii) high performance/high socio-economic impact; iii) low performance/high socio-economic impact; and iv) low performance/low socio-economic impact (see Figure VI.4.1). Among the countries and economies that participated in the digital reading assessment, Japan, Iceland and the partner economy Hong Kong-China constitute the group of high performance/low socio-economic impact countries; Belgium is the high performance/high socio-economic impact country; and Hungary, Poland, Chile and the partner country Colombia are the low performance/high socio-economic impact countries. Other countries and economies show around-average performance and/or around-average impact of socio-economic background.
• Figure VI.4.1 • Strength of socio-economic gradient and reading performance
[Two quadrant charts, one for digital reading and one for print reading, plot each country's mean reading score (vertical axis) against the percentage of variance in reading performance explained by the PISA index of economic, social and cultural status (r-squared x 100, horizontal axis). The quadrants distinguish above/below OECD-average performance from above/below OECD-average impact of socio-economic background; countries where both values differ significantly from the OECD average are highlighted.]
Source: OECD, PISA 2009 Database, Table VI.4.3.
http://dx.doi.org/10.1787/888932435416

A comparison of the two graphs shows that there is greater diversity in the equity of results for digital reading than for print reading.

The average socio-economic background of the countries considered varies widely. Table VI.4.3 shows the mean score obtained by each country in the digital reading assessment and also a score that is adjusted for each country's average socio-economic background. In this hypothetical analysis, the South American countries, Chile and Colombia, have adjustments of 22 and 37 score points, taking their scores from 435 to 456 and from 368 to 405 score points, respectively. Countries with higher socio-economic status, such as Iceland and Norway, have their scores adjusted downwards from 512 to 493 and from 500 to 487 score points, respectively. These differences are similar to those observed in print reading, where Chile and Colombia have upward adjustments of 19 and 32 score points, respectively, while Iceland and Norway have downward adjustments of 18 and 16 score points, respectively.
Immigrant status
As a result of increased global migration and population mobility, governments are often called upon to provide integration programmes at schools and in the community at large. PISA uses three categories to define the immigrant status of students: i) native students, ii) second-generation students, and iii) first-generation students (see Annex A1a for a detailed description). Generally, students with an immigrant background are defined as first- or second-generation immigrants.2

Across OECD countries, the pattern of results indicates that native students perform at a higher level than their immigrant counterparts. Table VI.4.4 shows that, on average, native students score 504 points, compared to 475 for second-generation students and 450 for first-generation students. In print reading, the averages for the same groups are 504, 474 and 449, respectively.

As can be seen in Figure VI.4.2, this pattern is not repeated in all countries. In Australia, for example, second-generation students score at the highest level in digital reading, with 554 score points, followed by native students (539 score points) and then first-generation students (525 score points).
• Figure VI.4.2 • Student performance in digital reading and immigrant status
[A bar chart shows mean digital reading scores of native, second-generation and first-generation students for each participating country and economy, ranked in descending order of the mean score of native students.]
Source: OECD, PISA 2009 Database, Table VI.4.4.
http://dx.doi.org/10.1787/888932435416

Languages spoken at home
In print reading, students who speak a language at home that is different from the language of the assessment generally perform at a lower level than those whose home language is the same. In PISA 2009, the average score in print reading among students whose home language differs from the assessment language was 455 points, compared to 506 points for students whose home language is the same as the assessment language (see Table VI.4.5). In digital reading, the pattern is similar: the average score for students whose home language differs from the assessment language was 452 points, compared to 504 points for students whose home language is the same as the assessment language.

The two largest gaps between print and digital reading are in Norway, where the difference between the language groups is 63 score points for print reading and 40 score points for digital reading, and in the partner economy Hong Kong-China, where these differences are 70 and 35 score points, respectively.
Performance differences within and between schools

Figure VI.4.3 shows the proportion of the between- and within-school variance in performance in digital and print reading that can be attributed to differences in socio-economic background within and between schools. Digital reading is shown on the left, while print reading is shown on the right. The grey part of the bar represents the between-school variation that is explained by schools' socio-economic background; the blue part of the bar represents the within-school variation that is explained by the socio-economic background of students within schools (see Table VI.4.6).

On average, between schools, the percentage of the variance in student performance explained by a school's socio-economic background is smaller in digital reading (48.4%) than in print reading (56.8%). In contrast, within schools, the percentage of the variance in student performance explained by students' socio-economic background is larger in digital reading (7.4%) than in print reading (5.5%).
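The distinction between the two components can be made concrete with a small sketch. This is an illustrative toy example only, not the OECD's estimation method: PISA's published figures come from multilevel models estimated with student weights, replicate weights and plausible values, all of which the sketch below omits. The data, school identifiers and the simple one-predictor regressions are invented for illustration.

```python
import statistics

def ols_r2(x, y):
    """R-squared of a simple one-predictor OLS regression."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

# Toy records: (school_id, socio-economic index, reading score) -- invented.
students = [
    ("A", -0.5, 430), ("A", -0.2, 455), ("A", 0.1, 470),
    ("B",  0.3, 500), ("B",  0.6, 520), ("B", 0.9, 540),
    ("C",  1.0, 560), ("C",  1.2, 575), ("C", 1.5, 590),
]

schools = sorted({sid for sid, _, _ in students})
school_scores = {s: [y for sid, _, y in students if sid == s] for s in schools}
school_escs = {s: [e for sid, e, _ in students if sid == s] for s in schools}

# Between-school component: does a school's mean socio-economic index
# predict its mean score?
mean_scores = [statistics.fmean(school_scores[s]) for s in schools]
mean_escs = [statistics.fmean(school_escs[s]) for s in schools]
r2_between = ols_r2(mean_escs, mean_scores)

# Within-school component: do student deviations from the school mean in
# the index predict deviations from the school mean in score?
dev_escs, dev_scores = [], []
for sid, e, y in students:
    dev_escs.append(e - statistics.fmean(school_escs[sid]))
    dev_scores.append(y - statistics.fmean(school_scores[sid]))
r2_within = ols_r2(dev_escs, dev_scores)

print(f"between-school variance explained: {r2_between:.1%}")
print(f"within-school variance explained:  {r2_within:.1%}")
```

The point of separating the two regressions is the one the figure makes: a country can show a strong link between a school's social intake and its average result while the link between an individual student's background and his or her result within any given school is much weaker, or vice versa.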
• Figure VI.4.3 • Variation in performance in digital and print reading explained by students' and schools' socio-economic backgrounds

[Figure: for each country, bars show the within-school variance in performance explained by students' socio-economic background and the between-school variance explained by schools' mean socio-economic background, for digital reading (left panel) and print reading (right panel). Countries are ranked in ascending order of the between-school variance in performance explained by schools' socio-economic background.]
Source: OECD, PISA 2009 Database, Table VI.4.6.
http://dx.doi.org/10.1787/888932435416

Student engagement and attitudes

Do engagement in reading and awareness of reading strategies have the same kind of relationship with digital reading proficiency as they do with print reading proficiency? As shown in Chapters 2 and 3, the skills required to succeed in the digital reading tasks are both general, that is, applicable to print reading as well, and specific, usually associated with navigating through online texts. Since engagement in online reading could be expected to have a closer link with proficiency in digital reading than with print reading, online reading practices are closely examined below.
Box VI.4.1 A cycle of engagement in reading activities, reading strategies and reading performance

Students who are highly engaged in diverse reading activities and who are aware of what strategies work best for reading and understanding texts perform better in the PISA reading assessment. However, this finding cannot be interpreted as direct evidence of a causal relationship between being engaged in reading, adopting effective reading strategies and achieving high levels of reading proficiency. Evidence presented in PISA 2009 Results: Learning to Learn (Volume III) for print reading, and in this chapter for digital reading, reflects the cumulative observed association between how engaged students are, the reading strategies they adopt and how well they do.

What does cumulative association mean? Studies in education and applied psychology suggest that reading proficiency is the result of multiple cumulative developmental cycles (see Aunola, et al., 2002 for a review). Attitudes towards reading and learning, motivation, engagement in reading activities and reading proficiency are mutually reinforcing.

Positive reinforcement operates at two levels. The first reflects the fact that the future depends on the past. Past engagement matters for current and future engagement, and past reading performance is also a very good predictor of future reading performance (Fredericks, Blumenfeld and Paris, 2004; Stanovich, 2004). This suggests that a student's past reading activities will influence his or her future reading activities. Similarly, how effectively the student applied learning strategies in the past is one of the aspects that determine how well he or she will apply reading strategies in the future. The second level indicates that associations among engagement, reading strategies and performance are circular.
Engaging in reading activities, adopting effective reading strategies and being a proficient reader are mutually dependent: as students read more they become better readers; and when they read well and expect good performance in reading, they tend to read more and enjoy reading (Nurmi, et al., 2003). The graph below illustrates how results of associations between how engaged in reading activities students are, the reading strategies they adopt, and how well they read should be interpreted in the context of the two levels of reinforcement.

[Diagram: engagement and performance reinforcing each other in successive cycles over time.]

The evidence that emerges from PISA on the positive interplay between engagement in reading activities, the adoption of particular reading strategies and reading performance suggests that preparing students to read well and promoting a passion for reading and effective reading is very important. Students who are highly engaged and are effective learners are most likely to be proficient readers; proficient readers are also the students most engaged and interested in reading.

Engagement in reading and digital reading proficiency

This section focuses on three different aspects of how students engage in reading activities:
• how much students enjoy reading (positive or negative attitudes towards reading);
• which kinds of print material they read and how often; and
• which kinds of online reading activities they engage in and how often.
Box VI.4.2 The association between reading engagement, awareness of reading strategies and reading performance

Results presented in this chapter can be used to answer two main policy questions:

1. How strong is the association between digital reading performance, reading engagement and reading strategies?
• One indicator used to answer this question is the inter-quartile range, which represents the difference between the top and bottom quarters of different indicators, such as reading enjoyment, diversity of print reading material, online reading practices, and awareness of reading strategies. This indicator can reveal the extent of the differences in reading performance between, for example, enthusiastic and unenthusiastic readers.

2. Are reading engagement and reading strategies good predictors of performance?
• The proportion of the variation in digital reading performance that is accounted for by engagement in reading and reading strategies, or explained variance, helps to answer this question by identifying the proportion of the observed variation in student performance that can be attributed to reading engagement and reading strategies.
• If this number is low, knowing students' reading engagement and level of awareness of reading strategies says very little about their digital reading performance. If this number is high, one can associate students' performance in digital reading reasonably well with their engagement in reading and awareness of reading strategies.
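The first indicator amounts to a simple computation: rank students on an index, take the top and bottom quarters, and compare their mean scores. The sketch below illustrates this; the function name and all data values are invented, and PISA's own figures are computed with student weights and plausible values, which this toy omits.

```python
import statistics

def quarter_gap(index_values, scores):
    """Mean score of students in the top quarter of an index
    minus the mean score of those in the bottom quarter."""
    paired = sorted(zip(index_values, scores))  # rank students by index
    q = len(paired) // 4
    bottom = [score for _, score in paired[:q]]
    top = [score for _, score in paired[-q:]]
    return statistics.fmean(top) - statistics.fmean(bottom)

# Toy data: an enjoyment-of-reading index and digital reading scores.
enjoyment = [-1.8, -1.1, -0.6, -0.2, 0.0, 0.3, 0.7, 1.0, 1.4, 1.9]
scores =    [415,  440,  455,  470, 480, 490, 505, 520, 540, 565]

gap = quarter_gap(enjoyment, scores)
print(f"top-vs-bottom quarter gap: {gap:.0f} score points")
# -> top-vs-bottom quarter gap: 125 score points
```

The second indicator, explained variance, is the R-squared of a regression of performance on the engagement or strategy measures; a high value means knowing a student's engagement narrows the uncertainty about his or her score considerably, a low value means it barely helps.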
Box VI.4.3 Interpreting PISA indices

• Indices allow for comparisons of countries that are above or below the OECD average in certain variables: indices used to characterise students' engagement in reading activities (either print or online) and awareness of reading strategies were constructed so that the average OECD student would have an index value of zero and about two-thirds of the OECD student population would be between the values of -1 and 1 (i.e. the index has a standard deviation of 1). Negative values on the index do not imply that students responded negatively to the underlying question. Rather, students with negative scores are those who responded less positively than the average response across OECD countries. Likewise, students with positive scores are those who responded more positively than the average student in the OECD area (see Annex A1a for a detailed description of how indices were constructed).

• Most of the indicators of engagement-in-reading activities are based on students' self-reports. They can thus suffer from a degree of measurement error because students are asked to assess their level of engagement in reading activities retrospectively. Apart from potential measurement error, cultural differences in attitudes towards self-enhancement can influence country-level results in engagement-in-reading activities and the use of learning strategies (Bempechat, et al., 2002). The literature consistently shows that response biases, such as social desirability, acquiescence and extreme response choice, are more common in countries with low GDP than in more affluent countries, as they are, within countries, among individuals from more disadvantaged socio-economic backgrounds and with less education.
• As in the first PISA cycle and as for print reading performance (Volume III, Learning to Learn), many of the self-reported indicators of engagement in reading are strongly and positively associated with digital reading performance within countries, but show a weak or negative association with performance at the country level. this may be due to different response biases across countries or the fact that country-level differences in reading performance are due to many factors that go beyond levels of engagement in reading activities and that are negatively associated with reading performance and positively ass