Barry McGaw has worked half-time at the University of Melbourne since January 2006. From 2006 to 2008 he was Director of the Melbourne Education Research Institute. Since 2009 he has been Executive Director of the international Assessment and Teaching of 21st Century Skills project established and funded by Cisco, Intel and Microsoft. Among his other activities, he is Chair of the Australian Curriculum, Assessment and Reporting Authority.

He returned to Australia at the end of 2005 from Paris, where he had been Director for Education at the Organisation for Economic Co-operation and Development (OECD). He had previously been Executive Director of the Australian Council for Educational Research (ACER) from 1985 to 1998.

Dr McGaw is currently President of the Academy of the Social Sciences in Australia.
Australia’s rank dropped in PISA 2006 because its mean performance declined to 513, from 528 in 2000 and 525 in 2003. This decline was statistically significant. Possible reasons for the decline are discussed in the next slide.

Korea, on the other hand, significantly improved its mean performance, and did so by raising performance at the highest levels. The sources of this improvement appear to be a new curriculum with more emphasis on essay tests and expanded use of essays in assessments for university entrance.

Hong Kong raised its mean performance by raising the performance of its poorer-performing students and attributes this primarily to teacher development.

There were no significant changes for Finland, Canada and New Zealand.

Poland has been added to the figure to show the extent to which this initially poorly performing country has improved. It abandoned the streaming of students around 12 years of age into schools of different types and, setting higher expectations for lower performers, initially raised its mean score by pulling up the tail and then by achieving improvements throughout the distribution.
The decline in Australia’s mean performance is shown again in the graph above, together with the trends in performance of Australia’s 15-year-olds at the 95th, 90th, 75th, 25th, 10th and 5th percentiles. (The 5th percentile is the score below which the performances of 5 per cent of the Australian students lay, and so on for the other percentiles.)

Performance levels at the lower percentiles did not drop, while those at the higher percentiles did. This shows that the significant drop in Australia’s mean performance was due to a decline among high performers.

The reasons for this are not immediately evident from the data, but one plausible explanation is that schools have focused more on basic achievement levels and less on the development of sophisticated reading of complex text.
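For readers unfamiliar with percentile trends, the idea can be sketched with simulated data. The means and spreads below are invented for illustration, not the actual PISA scores; the second cohort is given a thinner upper tail to mimic the pattern described:

```python
import numpy as np

# Simulated reading scores for two hypothetical cohorts (NOT the PISA data):
# the 2006-style cohort has a lower mean and slightly smaller spread.
rng = np.random.default_rng(0)
scores_2000 = rng.normal(loc=528, scale=100, size=5000)
scores_2006 = rng.normal(loc=513, scale=95, size=5000)

# The 5th percentile is the score below which 5% of students lie, and so on.
percentiles = [5, 10, 25, 75, 90, 95]
p_2000 = np.percentile(scores_2000, percentiles)
p_2006 = np.percentile(scores_2006, percentiles)
print(np.round(p_2000))
print(np.round(p_2006))
```

Comparing the two printed rows percentile by percentile shows the gap between cohorts widening towards the top of the distribution, which is exactly the kind of change a single below-benchmark percentage would miss.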
The figure above divides the variation in student performance in science in PISA 2006 for each country into a component due to differences among students within schools, shown above the zero line, and a component due to differences between schools, shown below that line.

In the Scandinavian countries there is very little variation in scores between schools. Choice of school matters little because there is so little difference among schools.

Among the countries with a large between-school component, there are some in which this occurs by design. In Belgium, Germany and Hungary, for example, students are sorted into schools of different types according to their school performance as early as age 12. The intention is to group similar students within schools differentiated by the extent of academic or vocational emphasis in their curriculum. This minimises variation within schools in order then to provide the curricula considered most appropriate for the differentiated student groups. It has the consequence of maximising the variation between schools.

In PISA 2000, Poland was at the left of this graph. In PISA 2006, as a consequence of its abandonment of early streaming into schools of different types, it is at the right and, as seen earlier, has improved its performance as well.

A further way to examine equity is to determine the extent to which the variation between schools can be accounted for by differences in students’ social backgrounds. In Australia, 68% of the variation among schools’ mean performances can be explained in this way. The performance of Australian schools therefore has more to do with whom they enrol than with what they do.
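The within/between decomposition used in the figure can be illustrated with a small sketch. The scores below are made up for three hypothetical schools of equal size; with equal school sizes the two components sum exactly to the total variance:

```python
import numpy as np

# Made-up scores grouped by hypothetical school (NOT the PISA data).
schools = {
    "A": [520, 540, 510, 530],
    "B": [480, 470, 500, 490],
    "C": [560, 580, 550, 570],
}
all_scores = np.concatenate(list(schools.values()))
grand_mean = all_scores.mean()

# Between-school component: variance of school means around the grand mean.
between = np.mean([(np.mean(s) - grand_mean) ** 2 for s in schools.values()])
# Within-school component: average of each school's internal variance.
within = np.mean([np.var(s) for s in schools.values()])

total = np.var(all_scores)
print(between, within, total)
```

In this toy example most of the variation lies between schools, the situation the notes describe for streamed systems such as Belgium, Germany and Hungary; a Scandinavian-style system would show the opposite balance.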
This figure is from the front page of the My School report for a school enrolling students from a relatively disadvantaged community. The extent of its disadvantage can be seen in its Index of Community Socio-Educational Advantage (ICSEA) of 907, almost a full standard deviation below the national average of 1000. It can also be seen in the distribution of its students on that index, with 85% of them in the lowest quarter across the Australian population.

This is, however, a high-performing school. Compared with 60 other schools enrolling students with similar socio-educational disadvantage, it performs very well. A dark green colour indicates a school mean more than half a standard deviation above the average performance in these ‘statistically similar’ schools.

Even more impressively, this school performs above the national average on some of the measures. In numeracy at Year 5, for example, its mean of 535 is not only more than half a standard deviation above the mean of 454 in statistically similar schools (hence the dark green); it is also more than half a standard deviation above the national mean of 487 (hence the dark green for that comparison too).

This is clearly a school from which others could learn.
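The effect-size arithmetic behind these statements is straightforward. ICSEA is scaled to a national mean of 1000 with a standard deviation of 100, so scores can be expressed as distances from a mean in standard-deviation units. The function name below is ours, for illustration only:

```python
def effect_size(score: float, mean: float, sd: float) -> float:
    """Distance of a score from a mean, in standard-deviation units."""
    return (score - mean) / sd

# The school's ICSEA of 907 against the national scaling (mean 1000, SD 100):
icsea_z = effect_size(907, 1000, 100)   # about -0.93: almost a full SD below average

# Year 5 numeracy gaps from the slide, in raw NAPLAN points:
gap_vs_similar = 535 - 454              # above the statistically similar schools' mean
gap_vs_national = 535 - 487             # above the national mean
print(icsea_z, gap_vs_similar, gap_vs_national)
```

Both gaps (81 and 48 points) exceed half a NAPLAN standard deviation on the My School scaling, which is what triggers the dark green colouring in both comparisons.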
When the NAPLAN results are published each year, much of the public discussion is about the percentage of students who perform below the level set as a minimally acceptable benchmark. That draws attention to a low point in the distribution and can divert attention from important changes elsewhere in it. (Remember the decline at the top end of Australia’s distribution of performances on the OECD PISA reading scale shown in Slide 6. Paying attention only to the percentage of Australian students failing to perform above a low threshold would have resulted in this important result being missed.)

The national NAPLAN report does provide distributions, but they are not sufficiently attended to in the public discussion. The My School website deliberately features them with a comparison, as shown above, of the distributions across the NAPLAN bands of the scores in the individual school, the schools in its statistically similar group and all schools across the nation.
The school in question is Broadmeadows Primary School in the northern suburbs of Melbourne. Comparisons with the 60 individual schools in its statistically similar group show that it performs substantially above most of them, a dark tan colour indicating a performance more than half a standard deviation above the comparison school.

Purple indicates a comparison school that outperforms Broadmeadows: dark purple indicating that the difference is more than half a standard deviation, and light purple that it is between 0.2 and 0.5 of a standard deviation.

There is very little purple in the report for Broadmeadows, but one instance is shown in the part of the report reproduced in this slide. Cudgen Primary School in NSW outperformed Broadmeadows in reading in Year 5, though not in Year 3. The Principal of Broadmeadows has said publicly that he will contact, and even visit, the few schools that outperform Broadmeadows while working with similar students, to see what he and his school might learn from them.

While Broadmeadows is looking elsewhere for clues for its own improvement, many other schools should be looking to Broadmeadows as a source of inspiration and advice. Broadmeadows shows that social background does not determine destiny. If others follow its lead, Australia could hope to reduce the impact of differences in social background on educational achievement to the low level already achieved in Canada.
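The colour rules described here can be written down as a small function. This is a sketch of the rule as stated in the notes, not ACARA's implementation, and the handling of values exactly at the 0.2 and 0.5 boundaries is an assumption:

```python
def comparison_colour(diff_sd: float) -> str:
    """Map a school-vs-comparison difference (in SD units) to a My School-style colour.

    Positive diff_sd means this school outperforms the comparison school.
    Boundary handling at exactly 0.2 / 0.5 SD is assumed, not documented here.
    """
    if diff_sd > 0.5:
        return "dark tan"
    if diff_sd > 0.2:
        return "light tan"
    if diff_sd < -0.5:
        return "dark purple"
    if diff_sd < -0.2:
        return "light purple"
    return "no significant difference"

print(comparison_colour(0.6))   # well above the comparison school
print(comparison_colour(-0.3))  # modestly below the comparison school
```

A report dominated by tan, as Broadmeadows' is, therefore indicates a school outperforming most of its statistically similar group.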
Barry McGaw, Australia
The key drivers of high performance systems – Australia
Barry McGaw
Chair, Australian Curriculum, Assessment and Reporting Authority
International perspectives on U.S. education policy and practice: what can we learn from high performing nations?
Asia Society/Council of Chief State School Officers Symposium
Washington, DC, 27-28 April 2010
Mean reading results (PISA 2000)
Australia tied for 2nd with 8 others among 42 countries.
Source: OECD (2003), Literacy skills for the world of tomorrow: Further results from PISA 2000, Fig. 2.5, p.76.

Trends in PISA reading performance
(Figure: reading trend lines for Korea, Finland, Hong Kong-China, Canada, New Zealand, Australia and Poland.)
Countries ahead of Australia in PISA 2006
- Mathematics: Canada, Finland, Hong Kong, Korea, Macao, Netherlands, Switzerland, Taiwan
- Reading: Canada, Finland, Hong Kong, Korea, New Zealand
- Science: Canada, Finland, Hong Kong
Story line
- Quality
  - Australia ranks high among OECD and other countries
  - The competition is not standing still
Social background & reading literacy
Two indices of relationship:
- Social gradient: magnitude of the increment in achievement associated with an increment in social background (on average)
- Correlation (or variance accounted for): how well the regression line summarises the relationship
(Figure: reading literacy plotted against the PISA index of social background, from low to high social advantage, with the regression line showing the social gradient.)
Source: OECD (2001) Knowledge and skills for life, Appendix B1, Table 8.1, p.308
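The two indices on this slide, the social gradient and the correlation, can be illustrated on simulated data. All numbers below are invented; `ses` stands in for the PISA index of social background:

```python
import numpy as np

# Invented data: achievement rises with social background (true slope 40)
# but with plenty of noise, so the gradient and the correlation tell
# different parts of the story.
rng = np.random.default_rng(1)
ses = rng.normal(0.0, 1.0, 500)                     # social background index
reading = 500 + 40 * ses + rng.normal(0.0, 80.0, 500)

# Social gradient: slope of the regression of achievement on background.
slope, intercept = np.polyfit(ses, reading, 1)
# Correlation: how tightly the points cluster around that line.
r = np.corrcoef(ses, reading)[0, 1]
print(slope, r)
```

A system can have a steep gradient with a weak correlation (background matters on average but predicts individuals poorly) or a shallow gradient with a strong correlation; equity comparisons between countries need both indices.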
Variation in science performance (PISA 2006)
(Figure: for each country, variation of performance within schools is shown above the zero line and variation between schools below it. For Australia, 68% of the between-school variation is explained by SES and 32% is not.)
Story line
- Quality
  - Australia ranks high among OECD and other countries
  - The competition is not standing still
- Equity
  - Social background & educational differences related more strongly than in some other comparable countries
The past and the future
- Reasons for past success
  - Diffused among 6 states, 2 territories
  - Diffused among govt (67%) and non-govt (33%) schools
  - Systematic initiatives in govt school systems
- Current reforms to build upon best practices
  - Development of a national curriculum
  - Monitoring of system and schools
  - Reporting on school performance
  - Improving resource levels
Development of a national curriculum
- History
  - Federation with State but not local curricula
  - Range from syllabuses to frameworks
  - Moves towards a national approach since 1989
- Rationale
  - Common needs of young Australians in C21, including those who cross state borders
  - We could do better working together to improve quality and equity
  - Globalisation and international competition
Curriculum development stages
- Design paper: Shape of the Australian Curriculum (15pp)
- Shape papers: Shape of the Australian Curriculum: English (16pp), Math (14pp), Science (13pp), History (16pp)
- Draft, then final: Australian Curriculum: English, Math, Science, History
- Implementation, with other subjects following
www.acara.edu.au
Structure of curriculum
- Subjects/learning areas
- General capabilities: literacy, numeracy, information and communication technology literacy, thinking skills, creativity, self-management, teamwork, intercultural understanding, ethical behaviour and social competence (likely to be restructured as developmental continua are developed)
- Cross-curriculum dimensions: Indigenous history and culture; sustainability; Asia and Australia’s engagement with Asia
www.australiancurriculum.edu.au
(Screenshot of the online curriculum, annotated:)
- General capabilities and other dimensions
- Facility to restrict grade range
- Content descriptions
- Content elaborations
- Achievement standards, with links to annotated samples of students’ work further down the page
(Section deleted here.)
A concept, not yet a reality
- Teacher forum
- Teacher’s selected curriculum
- Potential resource material identified via metadata in the curriculum website
Story line
- Quality
  - Australia ranks high among OECD and other countries
  - The competition is not standing still
- Equity
  - Social background & educational differences related more strongly than in some other comparable countries
- Curriculum
  - Clear, brief, explicit, setting high expectations for all
Primary school with disadvantaged students
(My School report annotations:)
- Index of socio-educational advantage (ICSEA) almost 1 std dev below the mean
- Dark green indicating a school mean more than 0.5 std dev above the mean in 60 schools with students from similar social backgrounds (SIM)
- Dark green indicating a school mean more than 0.5 std dev above the national mean (ALL)
www.myschool.edu.au
Focusing on the distribution, not a minimum acceptable level
Graphs show percentages of students in bands on the test (in this case reading) for the school (dark brown), its 60 similar schools (mid-brown) and all schools (light brown).
Disadvantaged school outperforming similar schools
School better than comparison school:
- more than 0.5 SD: dark tan
- between 0.2 and 0.5 SD: light tan
School worse than comparison school:
- more than 0.5 SD: dark purple
- between 0.2 and 0.5 SD: light purple
Improving resource levels
- A pre-election Dec ’07 commitment to computers in schools
- Post-global-financial-crisis stimulus spending on school buildings
- Increased recurrent funding
  - Federal government near doubling of school funding
  - Funding targeted to disadvantaged schools, identified empirically
- New Australian Institute for Teaching and School Leadership