We hear a lot about the “shock,” but that’s only one kind of response. Some countries respond more calmly or ignore their PISA (or TIMSS) results altogether. I’m interested primarily in two response phenomena: (1) nationally-centralized versus focused or targeted reforms, and (2) globally-convergent versus divergent policy responses. Policy responses matter because the whole point of PISA is to create an internationally comparative evidence base for educational policy development and implementation.
So, it’s not shocking that developing educational (or political, social, or economic) policy is an outcome of PISA (or TIMSS or any other large-scale international educational assessment). This is one of the key reasons the OECD developed cross-nationally comparative education indicators, and why the OECD’s Education at a Glance is a widely recognized and important source of educational indicators for researchers, policymakers, the media, and the general public. One of the most visible responses to PISA (i.e., PISA shock) has become institutionalized as the flagship or stereotypical response to PISA and other large-scale international educational assessments – but it really is only ONE response of many. Still, since PISA shock is so public and popular in discussions about policy responses, I’m going to start with it here, too.
Shock happens when there is a deviation from the norm, usually involving mediocre or low performance (i.e., below expectations). The media can play a large role both in creating the normative structures (for what is expected) and in highlighting the gaps when expectations aren’t met. Germany is best known for its PISA “Schock” following its first participation; the flurry of media discussion and publicly displayed hand-wringing about Germany’s performance was shocking in itself. Japan had its own version of PISA shock when it dropped in reading literacy between PISA 2000 and 2003 and then again in math literacy between PISA 2003 and 2006. The drops were comparatively small given the situation of other participating countries, but the Japanese response was large. The US has never had PISA shock because it spent so much time on A Nation at Risk that the “shock” level has never gone down. Why is the “shock” related to PISA and other large-scale international assessments so public and pronounced? Initial reports describing international tests make mainstream media outlets such as newspapers, television news and talk shows, and internet news and blogs (Koretz, 2009), but they are often not nuanced in terms of comparability, mediating factors, and specific organizational cultures the way that secondary analyses are. There is rarely fanfare or widespread coverage of secondary analyses using international assessment data, especially in countries where students’ average performance is below expectations.
In the research community, we often overlook the importance of media coverage when thinking about policy responses to PISA. But a few in our community have noticed the impact that the media has – and there is certainly an acute awareness among those who organize, plan, and disseminate PISA, TIMSS, or any other large-scale assessment. This is evidenced by the media kits, press releases, initial results announcement events, etc. that have become a staple of the PISA dissemination cycle. In particular, Michelle Stack at the University of British Columbia and Kerstin Martens at the University of Bremen have examined the media’s role. I’m sure there are others as well, but these two have been on my radar lately. For example, Martens and a colleague developed the table above showing the frequency of PISA-related newspaper articles, using one high-quality national daily newspaper with high circulation from each of the countries listed, during the period December 2001 – November 2008. The results are interesting. Germany, as you can see, led the pack with about 250 PISA-related articles, while the USA had virtually none that directly referenced PISA during that same time frame. The immediate tendency is to jump to the conclusion that the more media coverage, the more “shock,” but it’s not quite that clear cut.
For example, Martens and Niemann also looked at the association between the number of PISA-related newspaper articles and a country’s PISA ranking. It is easy to see that there is not a consistent trend here. Finland ranks very well and has about 110 newspaper articles in the seven-year period covered. The USA ranks near the bottom but has no PISA-related articles in that same period. Germany, which ranks above the USA but is firmly in the middle of the pack, has an explosion of PISA-related media coverage. What we can notice as an overall trend, perhaps, is that the worse a country’s ranking, the more differentiated the reactions are. We can see this in the way the countries fan out across the article-frequency spectrum as the rank drops (moving to the right of the figure). But we can’t necessarily tie a country’s policy response to PISA to the impact of the media directly or exclusively. So, then, how can we explain the differences in policy response to PISA in different countries? And what are the trends in policy response? To examine this, let’s start with understanding why the “shock” occurs and how that influences policy responses.
In an online article a few years ago that went viral (at least as it relates to Germany and PISA), Kerstin Martens and a colleague from the University of Bremen point out that expectations have a lot to do with why Germany responded the way it did. They suggest that, due to normative structures about education and the status quo in Germany and among the German public, “Germans had always tacitly assumed that they led the world in education.” So, when there is evidence that a nation’s educational system and performance is NOT (or specifically BELOW) what has been assumed, expected, or taken for granted, some “shock” occurs. And when this evidence is displayed publicly, internationally, and in comparison to other nations’ educational systems and performance, the shock is exacerbated. But what specifically is so shocking? What was assumed that the PISA results exposed as not true or remarkably different from expectations?
The expectations that national educational systems worldwide deal with largely revolve around the legitimating myths of EXCELLENCE and EQUITY. Excellence generally means that students are expected to receive the highest quality education and perform at the top of whatever scale their performance is measured on. Equity generally means that educational systems are expected to provide equal, or at least equitable, learning opportunities for all students regardless of their race, ethnicity, socioeconomic class, or gender. The reason I call these legitimating myths is not because they aren’t true or because national education systems don’t strive to attain them; I call them legitimating myths because no educational system has attained, or can attain, complete EXCELLENCE or EQUITY, yet these constructs guide, inform, and are used to assess educational systems, educators, and students in every educational system worldwide. Not surprisingly, then, the shock factors result from deviations from those globally-normed, legitimating expectations. Excellence is found to be lacking in many PISA countries because average literacy levels (in reading, math, or science) are often low or mediocre compared to other countries, and a high percentage of students are shown to lack basic competency in core subjects. Equity is found to be lacking in many PISA countries as well because of demonstrated large differences between low and high performers and between racial/ethnic and socioeconomic groups. In other words, nations tend to react strongly when PISA results show that they are not achieving excellence or equity in their educational systems, especially when they expect to be, or when the results are compared to other higher-achieving or more equitable countries’ results.
In June of this year, the 14th OECD Japan Seminar, titled “Strong Performers, Successful Reformers in Education,” was held in Tokyo, Japan. At this seminar, several high-ranking ministry of education officials presented on why their countries were either strong performers or successful reformers, and the seminar organizers were kind enough to post their slides online. I’m going to use examples from some of these presentations as we move forward. For example, Mr. Ken Suzuki, Senior Vice Minister of MEXT in Japan, talked about how Japan constructed an evidence-based improvement cycle using PISA results as its platform. In so doing, he pointed out the reasons for the PISA “crisis” in Japan, using this visual aid to show how Japan’s performance dropped in reading literacy between PISA 2000 and 2003 and then again in math literacy between PISA 2003 and 2006. While Japan’s performance on PISA is enviable from the position of many other participating countries, the drop from the top group to the group above the mean – or, more chillingly for the Japanese, to the international-average group – was NOT what they expected in terms of EXCELLENCE.
In contrast to the excellence woes of the Japanese are the improvements made by the Polish. The Polish example shows that often the key to solving the excellence problem is to solve the equity problem first. By developing the competencies of the lowest-performing students (i.e., narrowing the gap between high and low performers) and by developing the competencies of girls (a group that is frequently marginalized in educational systems worldwide), Poland became one of the few PISA countries to actually gain in average reading performance since PISA 2000.
If we take a closer look at gender differences in international student assessment scores on PISA, TIMSS, and PIRLS (as this table from Hamano shows), we can see that gender equity is a long way away. In a new study that I’m working on with David Greger, a colleague from Charles University in the Czech Republic, we have been able to show that, in countries where national average performance on international assessments rises, it is often the result of significant improvement on the part of marginalized or minority groups (specifically girls/women), alongside stable or modest improvement on the part of majority or dominant groups (specifically boys/men). In other words, excellence improves significantly when equity rises (where equity is defined as raising the learning and performance of marginalized or minority groups rather than raising the performance of the dominant majority). So, countries that focus policy reforms on equity tend to improve overall.
Yet PISA and other large-scale assessments are more often tools of “soft power” than they are evidence for newly recognized needs or agendas. Take, for example, the use of education as a policy platform by Reagan at the end of the Cold War and the resulting report, A Nation at Risk. It suggested the development of internationally comparative education indicators as a way both to export the American debate about educational quality and to identify how and where US students needed to improve in order to be “first in the world” – a mantra the US still trots out every election season. But there is also the contrasting example of the French, who in the mid-1980s and early 1990s supported the development of the OECD’s work on education indicators that would provide evidence that an inequitable French system “fostered poor performance.” Participation in PISA (and TIMSS/PIRLS) is also a version of soft power. By participating in the international assessments, countries legitimize their educational systems to a degree by belonging to the group of nations that visibly places emphasis on (i.e., values) education, and creates a “culture” of education that sees it as a foundation for economic, political, and social development. The policy response to PISA among participating countries is therefore also a form of soft power.
The policy responses to PISA come in many forms, but I’d like to suggest that they follow two patterns as a result of attraction, borrowing, mimicry, coercion, or whatever process or terms you find most convenient. I mentioned earlier in my talk that I was interested in national centralization versus “focused” reforms as a policy response, on the one hand, and globally convergent versus divergent policy responses, on the other. But it is difficult to find truly divergent responses. Instead, it is much more likely to find policy responses that appear to converge; most of them more closely resemble “isomorphism.” Policy convergence as a response to PISA is when different countries adopt the same policy response with the same or similar implementation of the reformed policies. Policy isomorphism, however, is when countries’ policy responses to PISA correspond or are similar but don’t converge. And, in my opinion, the evidence suggests that policy isomorphism as a response to PISA is much more likely than divergence or convergence.
The process of policy isomorphism, while not identical from country to country, does follow some general paths, which include:
(1) immediate, large-scale policy-driven reforms as a response to low performance on summary reports (Germany);
(2) research-driven, national model development to understand the link between student learning and performance (Japan); and
(3) a regional comparison approach looking at contextual factors that impact school effectiveness and academic achievement (USA).
Each of these response patterns follows the logic of “evidence-based” policymaking, which looks at the crisis or situation and determines “what works” best in order to fix poor performance, develop national models for improvement, or identify contextual factors that drive school effectiveness. In some instances, these policy responses are driven by the rankings – perhaps more often than not, because rankings are the initial evidence that normed expectations are not being met. But whichever approach policymakers take, the fact that PISA data are quantifiable and standardized across countries makes them a legitimized evidence base for educational policy reform or agenda-setting.
While there are certainly many different policy responses to PISA, the ones that seem to receive the most attention from policymakers, the public, and the media are related to:
(1) improving teacher quality,
(2) developing accountability systems around standards, and
(3) creating opportunities for equitable education.
As mentioned already, these policy responses do not “converge” across countries as much as they isomorphically correspond among countries. Here are some examples, again taken from the 14th OECD Japan Seminar on “Strong Performers and Successful Reformers” held last June.
Given that Finnish students performed very well on PISA, it is perhaps not surprising that there is no doubt or constructive criticism of Finnish teachers – just one more example of how policy responses vary. As an example of strong performers, both Finland and Singapore are pored over by policymakers interested in reforming teacher preparation or raising teacher quality in response to PISA results. As Sotiria Grek pointed out in an analysis of the impact of PISA in the Journal of Education Policy (2009), Finnish educators and policymakers were surprised by the way PISA launched them into the spotlight. For example, she quotes some Finnish educational researchers as saying, “At a single stroke, PISA has transformed our conceptions of the quality of the work done at our comprehensive school and of the foundations it has laid for Finland’s future civilization and development of knowledge” (Välijärvi et al., 2002, p. 3). Specifically, the importance of Finland’s teachers in the comprehensive school system has become a model for other countries in terms of the individualization teachers provide students, teachers’ autonomy in evaluating student performance and selecting textbooks, and the high status of teachers in Finnish society in general.
Likewise, Singapore’s teachers are promoted as a key model for countries whose educational expectations for excellence, in particular, were not met. While the training and preparation of teachers is a big factor, the professional respect and status of teachers necessarily follows and is a frequently discussed topic in teacher education and teaching policy reform. For example, Andreas Schleicher has publicly pointed out in the media recently that the highest-performing countries like Finland and Singapore “recruit only high-performing college graduates for teaching positions, support them with mentoring and other help in the classroom, and take steps to raise respect for the profession” (NYT, March 2011).
As the Finnish director general of the Centre for International Mobility, Dr. Pasi Sahlberg, pointed out at the recent OECD Japan Seminar on Strong Performers and Successful Reformers, PISA results show that the strongest performers have the least variation between schools but relatively high levels of student variation within schools. This is due to the policy practices of inclusion and attempts to ensure systemwide equity in education.
The ways that countries respond to issues of educational equity in response to PISA can vary, however, even though they correspond in their attempts to address the same equity issues. For example, Mr. Burhanuddin Tola, from Indonesia’s Ministry of National Education, has provided evidence that Indonesia is approaching educational equity issues using vocational and polytechnic education as its policy response. From the Indonesian perspective, this response is intended to create an educated and prepared future workforce to support Indonesia’s economic development. This is an example of research-driven, national model development to understand the link between student learning and performance – in contrast to the efforts of the Japanese educational system and educational policymakers…
…which used a survey in addition to the PISA results to identify problems in their educational sector that they thought they should address with new educational policies and reforms. Policy responses came in two areas: (1) improvement of learning content, and (2) improvement of the educational system.
Improvement of learning content included: revised courses of study through national standards, systematic content revision, and improved government screening of textbooks.
Improvement of the educational system included: reduced class sizes, increases in the number of new teachers, enhancement of continuing education and professional development for teachers, revision of the school management model to create more involvement and empowerment of the community, and development of an accountability system from the national to the local level.
So, if creating education indicators and evidence to encourage and support the development of policy – or at least policy lessons – is a key purpose of PISA, I’d say it’s been a success. Policy responses to PISA may be incorporated into preexisting “soft power” agendas or may be a response to evidence-based policymaking. These responses are more likely to correspond broadly across nations than to converge, and within nations the emphasis is more on centralization of reform efforts, even if the centralized policy is to push responsibility to local schools and districts. And there are trends in the ways that educational policy responses form using PISA as the base, which consistently (although in various formats) include the three areas of (1) teacher quality, (2) accountability and standards, and (3) educational equity.
Countries whose normative expectations for education are most egregiously contradicted by PISA results, and publicly documented in the media, are more likely to respond with “shock” and immediate, large-scale policy-driven reforms (Germany). Countries whose normative expectations are contradicted but whose policy response is more research- or evidence-based tend to engage in research-driven, national model development to understand the link between student learning and performance (Japan). And countries whose expectations are not necessarily challenged by PISA results (even when those results show high levels of poor performance) tend to engage in regional comparison, looking at contextual factors that impact school effectiveness and academic achievement (USA).
It is my hope that these observations will help guide our discussion of the other papers being presented today, as well as the overall approach we take to understanding and analyzing the impact that PISA has on educational policy and its implementation worldwide. Thank you very much.
Policy Responses to PISA in Comparative Perspective
Alexander W. Wiseman, Lehigh University, aww207@lehigh.edu
PISA Conference, SUNY-Albany, December 2, 2011
“Key features driving the development of PISA have been: its policy orientation, with design and reporting methods determined by the need of governments to draw policy lessons…” OECD, 2004, p. 12
___
Martens, K. (2007). How to Become an Influential Actor: The Comparative Turn in OECD Education Policy. In K. Martens, A. Rusconi, & K. Leuze (Eds.), New Arenas of Education Governance: The Impact of International Organizations and Markets on Educational Policy Making (pp. 40-56). New York: Palgrave Macmillan.
OECD. (2004). Problem Solving for Tomorrow’s World: First Measures of Cross-Curricular Competencies from PISA 2003. Paris: Author.
Level of shock defined by “the industrialized world’s educational status quo.” Role of the media & “normative structures”: Germany, Japan, USA?
___
Koretz, D. (2009). How Do American Students Measure Up? Making Sense of International Comparisons. The Future of Children, 19(1), 37-51.
Martens, K., & Leibfried, S. (2008). How Educational Policy Went International: A Lesson in Politics Beyond the Nation-State [Electronic Version]. The German Times Online, January. Retrieved 28 November 2011.
Stack, M. (2007). Representing School Success and Failure: Media Coverage of International Tests. Policy Futures in Education, 5(1), 100-110.
___
Martens, K., & Niemann, D. (2010). Governance by Comparison: How Ratings & Rankings Impact National Policy-Making in Education (TranState Working Papers, 139). Bremen: Sfb 597, Staatlichkeit im Wandel.
___
Martens, K., & Niemann, D. (2010). Governance by Comparison: How Ratings & Rankings Impact National Policy-Making in Education (TranState Working Papers, 139). Bremen: Sfb 597, Staatlichkeit im Wandel.
“The country that prided itself on its education system, on its contributions to Western science and philosophy – that had produced Einstein, Goethe and Marx – ranked at the lower end of the comparative spectrum…Germans had always tacitly assumed that they led the world in education…” Martens & Leibfried, 2008, p. 2
___
Martens, K., & Leibfried, S. (2008). How Educational Policy Went International: A Lesson in Politics Beyond the Nation-State [Electronic Version]. The German Times Online, January. Retrieved 28 November 2011.
EXCELLENCE: 1. Average literacy levels mediocre compared to other countries. 2. High proportion of students lack basic competencies in core subjects.
EQUITY: 3. Large differences between lowest and highest performers. 4. Large differences between racial/ethnic and socioeconomic groups.
___
Ertl, H. (2006). Educational Standards and the Changing Discourse on Education: The Reception and Consequences of the PISA Study in Germany. Oxford Review of Education, 32(5), 619-634.
Waldow, F. (2009). What PISA Did and Did Not Do: Germany after the PISA-Shock. European Educational Research Journal, 8(3), 476-483.
___
Suzuki, K. (2011). PISA Survey and Education Reform in Japan: Construction of an Evidence-Based Improvement Cycle. Paper presented at the 14th OECD Japan Seminar "Strong Performers, Successful Reformers in Education". Retrieved 30 November 2011, from http://www.oecd.org/dataoecd/45/41/48358140.ppt.
___
Sielatycki, M. (2011). Poland: Successes and Challenged Educational Reforms. Paper presented at the 14th OECD Japan Seminar "Strong Performers, Successful Reformers in Education". Retrieved 30 November 2011, from http://www.oecd.org/dataoecd/45/39/48357781.ppt.
___
Hamano, T. (2011). The Globalization of Student Assessments and Its Impact on Educational Policy. Proceedings, 13, 1-11.
SOFT POWER (Attraction, Persuasion): • Values • Culture • Policies • Institutions
___
Bieber, T., & Martens, K. (2011). The OECD PISA Study as a Soft Power in Education? Lessons from Switzerland and the US. European Journal of Education, 46(1), 101-116.
Convergence vs. Isomorphism
___
DiMaggio, P., & Powell, W. W. (1983). The Iron Cage Revisited: Institutional Isomorphism and Collective Rationality in Organizational Fields. American Sociological Review, 48(April), 147-160.
Holzinger, K., & Knill, C. (2005). Causes and Conditions of Cross-National Policy Convergence. Journal of European Public Policy, 12(October), 775-796.
Wiseman, A. W., Pilton, J., & Lowe, J. C. (2010). International Educational Governance Models and National Policy Convergence. In S. K. Amos (Ed.), International Educational Governance (Vol. 12, pp. 3-19). Bingley, UK: Emerald Publishing.
Legitimacy-seeking • What works • Standardized • Quantitative
___
Anderson, J. O., Chiu, M.-H., & Yore, L. D. (2010). First Cycle of PISA (2000-2006)--International Perspectives on Successes and Challenges: Research and Policy Direction. International Journal of Science and Mathematics Education, 8, 373-388.
LeTendre, G. K., Baker, D. P., Akiba, M., & Wiseman, A. W. (2001). The Policy Trap: National Educational Policy and the Third International Math and Science Study. International Journal of Educational Policy, Research and Practice, 2(1), 45-64.
Wiseman, A. W. (2010). The Uses of Evidence for Educational Policymaking: Global Contexts and International Trends. In A. Luke, G. J. Kelly & J. Green (Eds.), Review of Research in Education (Vol. 34, pp. 1-24). Washington, DC: American Educational Research Association.
___
Sahlberg, P. (2011). The Finnish Advantage: Good Teachers. Paper presented at the 14th OECD Japan Seminar "Strong Performers, Successful Reformers in Education". Retrieved 30 November 2011, from http://www.oecd.org/dataoecd/33/43/48358184.ppt.
___
Keat, H. S. (2011). Singapore Education Reform Journey. Paper presented at the 14th OECD Japan Seminar "Strong Performers, Successful Reformers in Education". Retrieved 30 November 2011, from http://www.oecd.org/dataoecd/45/37/48357652.ppt.
___
Sahlberg, P. (2011). The Finnish Advantage: Good Teachers. Paper presented at the 14th OECD Japan Seminar "Strong Performers, Successful Reformers in Education". Retrieved 30 November 2011, from http://www.oecd.org/dataoecd/33/43/48358184.ppt.
___
Tola, B. (2011). Indonesia Reading Proficiency and Influencing Factors. Paper presented at the 14th OECD Japan Seminar "Strong Performers, Successful Reformers in Education". Retrieved 30 November 2011, from http://www.oecd.org/dataoecd/33/49/48360434.ppt.
___
Suzuki, K. (2011). PISA Survey and Education Reform in Japan: Construction of an Evidence-Based Improvement Cycle. Paper presented at the 14th OECD Japan Seminar "Strong Performers, Successful Reformers in Education". Retrieved 30 November 2011, from http://www.oecd.org/dataoecd/45/41/48358140.ppt.
Soft power of international achievement studies
Policy isomorphism more likely than convergence/divergence
PISA as an “evidence base” for national education policy agendas & development
Policy response trends: teacher quality, accountability & standards, educational equity