
Yong Zhao: The PISA Illusion

PISA is a masterful magician. It has successfully created an illusion of education quality and marketed it to the world. In 2018, 79 countries took part in this magic show out of the belief that this triennial test accurately measures the quality of their education systems, the effectiveness of their teachers, the ability of their students, and the future prosperity of their society.

PISA’s magical power in the education universe stems from its bold claims and successful marketing. It starts by tapping into a universal anxiety about the future. Humans are naturally concerned about the future and have a strong desire to know whether tomorrow will be better than, or at least as good as, today. Parents want to know if their children will have a good life; politicians want to know if their nations have the people to build a more prosperous economy; the public wants to know if the young will become successful and contributing members of society.

PISA brilliantly exploits the anxiety and desire of parents, politicians, and the public with three questions:

How well are young adults prepared to meet the challenges of the future? Are they able to analyse, reason and communicate their ideas effectively? Do they have the capacity to continue learning throughout life? (OECD, 1999, p. 7).

These words begin the document that introduced PISA to the world in 1999 and have been repeated in virtually all PISA reports ever since. The document then states the obvious: “Parents, students, the public and those who run education systems need to know” (OECD, 1999, p. 7). And as can be expected, PISA offers itself as the fortuneteller by claiming that:

PISA assesses the extent to which 15-year-old students, near the end of their compulsory education, have acquired key knowledge and skills that are essential for full participation in modern societies. … The assessment does not just ascertain whether students can reproduce knowledge; it also examines how well students can extrapolate from what they have learned and can apply that knowledge in unfamiliar settings, both in and outside of school. This approach reflects the fact that modern economies reward individuals not for what they know, but for what they can do with what they know. (OECD, 2016, p. 25).

This claim not only offers PISA as a tool to soothe that anxiety but also, perhaps more importantly, positions it as the tool for that purpose, because it helps knock out its competitors. As an international education assessment, PISA came late. Prior to PISA, the International Association for the Evaluation of Educational Achievement (IEA) had been operating international assessments since the 1960s, offering influential programs such as TIMSS and PIRLS. For a start-up to beat the establishment, it must offer something different and better. That’s exactly what PISA promised: a different and better assessment.

The IEA “surveys have concentrated on outcomes linked directly to the curriculum and then only to those parts of the curriculum that are essentially common across the participating countries” (OECD, 1999, p. 10), and that, according to PISA, is a problem because:

School curricula are traditionally constructed largely in terms of bodies of information and techniques to be mastered. They traditionally focus less, within curriculum areas, on the skills to be developed in each domain for use generally in adult life. They focus even less on more general competencies, developed across the curriculum, to solve problems and apply one’s ideas and understanding to situations encountered in life. (OECD, 1999, p. 10).

PISA overcomes these limitations by assessing “what skills are deemed to be essential for future life,” which may or may not be covered by the school curriculum. Or so it claims. In other words, PISA asserts that other international surveys measure how well students have mastered the intended school curriculum of their education systems, but that the school curriculum could be misaligned with what is needed for future life.

To make the offer even better, PISA makes another seductive claim to education policymakers: “By directly testing for knowledge and skills close to the end of basic schooling, OECD/PISA examines the degree of preparedness of young people for adult life and, to some extent, the effectiveness of education systems” (OECD, 1999, p. 11). To paraphrase, PISA not only tells you whether your children are prepared for future life but also tells you that you have control over it by improving “the effectiveness of education.” Thus, “if schools and education systems are to be encouraged to focus on modern challenges,” PISA is needed.

However, the claim, the foundation upon which PISA has built its success, has been seriously challenged. First, there is no evidence to justify, let alone prove, the claim that PISA indeed measures skills that are essential for life in modern economies. Second, the claim is an imposition of a monolithic and West-centric view of societies on the rest of the world. Third, the claim distorts the purpose of education.

Made-up Claim

The claim that PISA measures knowledge and skills essential for modern society or the future world is not based on any empirical evidence. Professor Stefan Hopmann of the University of Vienna writes:

There is no research available that proves this assertion beyond the point that knowing something is always good and knowing more is better. There is not even research showing that PISA covers enough to be representative of the school subjects involved or the general knowledge-base. PISA items are based on the practical reasoning of its researchers and on pre-tests of what works in most or all settings – and not on systematic research on current or future knowledge structures and needs. (Hopmann, 2008, p. 438).

In other words, the claim was just a fantasy, an illusion, entirely made up by the PISA team. But PISA keeps repeating its assertion that it measures the skills needed for the future. The strategy has worked: PISA has successfully convinced people through repetition.

Furthermore, there is empirical evidence suggesting that what PISA measures is not significantly different from what other international assessments or intelligence tests measure. For example, despite its claim to measure something different from studies such as TIMSS, performance on PISA is significantly correlated with performance on TIMSS.

Ironically, the PISA project has used results from other studies to support its case. PISA published an influential report aimed at demonstrating the importance of what it measures for economic development (Hanushek & Woessmann, 2010). The report made a number of stunning claims about the long-term economic impact of improving PISA outcomes, including, for example, that “having all OECD countries boost their average PISA scores by 25 points over the next 20 years … implies an aggregate gain of OECD GDP of USD 115 trillion over the lifetime of the generation born in 2010” (Hanushek & Woessmann, 2010, p. 6).

The report has been challenged by a number of scholars (Kamens, 2015; Klees, 2016; Komatsu & Rappleye, 2017; Stromquist, 2016). One of the most devastating problems with its conclusion that test scores have a significant relationship with economic growth lies in the logic of the analysis used to reach it. The report compared test scores in a given period (1964-2003) with economic growth during roughly the same period (1960-2000), which is logically flawed because the students who took the tests were not yet in the workforce at the time. It takes time for students to enter the workforce and come to make up a significant portion of it. Thus “test scores of students in any given period should be compared with economic growth in a subsequent period” (Komatsu & Rappleye, 2017, p. 170). Studies that compared test scores with economic growth in subsequent periods, using the same dataset and method, found no “consistently strong nor strongly consistent” relationship between test scores and economic growth and concluded “that the relationship between changes in test scores in one period and changes in economic growth for subsequent periods were unclear at best, doubtful at worst” (Komatsu & Rappleye, 2017, p. 183), essentially invalidating the claims made in the report.

Even if the claims were valid, they relied primarily on results of international assessments other than PISA. While the report states that it “uses recent economic modeling to relate cognitive skills – as measured by PISA and other international instruments – to economic growth” (Hanushek & Woessmann, 2010, p. 6), the fact is that results from PISA constituted a very small portion of the data used in the modeling. Only three rounds of PISA had been offered by the time the report was released. Moreover, the economic data covered the period from 1960 to 2000, the year PISA was first administered. Only one round of PISA data was included, yet the report relied on “data from international tests given over the past 45 years in order to develop a single comparable measure of skills for each country that can be used to index skills of individuals in the labour force” (Hanushek & Woessmann, 2010, p. 14).

Hanushek and others (Hanushek, 2013; Hanushek & Woessmann, 2008; Hanushek & Woessmann, 2012) have repeated similar claims about the economic impact of improving PISA scores. Whether the conclusions are correct is a different matter. The point is that PISA’s claim to measure something different from other international assessments is a lie: it measures essentially the same construct as the others. The claim that it better measures what matters in the modern economy or the future world than the tests that existed before PISA was invented is but a made-up illusion.

A Monolithic View of Education

Underlying PISA’s claim is the assumption that there is a set of skills and knowledge that are universally valuable in all societies, regardless of their history and future. “A fundamental premise for the PISA project is that it is indeed possible to ‘measure the quality of a country’s education by indicators that are common, i.e. universal, independent of school systems, social structure, traditions, culture, natural conditions, ways of living, modes of production etc.’” (Sjøberg, 2015, p. 116). But this assumption is problematic.

The first problem is that there is more than one society in the world, and societies differ from one another. For all sorts of reasons—cultural, political, religious, and economic—different societies operate differently and present different challenges. Meeting different challenges requires different knowledge and skills. As a result, “one can hardly assume that the 15-year olds in e.g. USA, Japan, Turkey, Mexico and Norway are preparing for the same challenges and that they need identical life skills and competencies” (Sjøberg, 2015, p. 116).

The second and bigger problem with PISA’s assumption of a universal set of valuable skills and knowledge for all countries is its imposition of a monolithic, primarily Western view of societies. PISA was first and foremost developed to serve the member states of the OECD, most of which are among the world’s most advanced economies, with only a few exceptions such as Mexico, Chile, and Turkey. The 35 OECD members in no way represent the full spectrum of diversity across the nearly 200 countries in the world today. The assumptions underpinning PISA are based primarily on the economic and educational realities of OECD members. Not surprisingly, “the PISA framework and its test are meant for the relatively rich and modernized OECD-countries. When this instrument is used as a ‘benchmark’ standard in the 30+ non-OECD countries that take part in PISA, the mismatch of the PISA test with the needs of the nation and its youth may become even more obvious” (Sjøberg, 2015, p. 116).

Distorted View of Education

Although PISA claims that it does not assess according to national curricula or school knowledge, its results have been interpreted as a valid measure of the quality of educational systems. But the view of education promoted by PISA is a distorted and extremely narrow one (Berliner, 2011; Sjøberg, 2015; Uljens, 2007). PISA treats economic growth and competitiveness as the sole purpose of education. Thus it assesses only subjects—reading, math, science, financial literacy, and problem solving—that are generally viewed as important for boosting competitiveness in a global economy driven by science and technology. PISA shows little interest in other subjects that have occupied the curricula of many countries, such as the humanities, arts and music, physical education, social sciences, world languages, history, and geography (Sjøberg, 2015).

While preparing children for economic participation is certainly part of the responsibility of educational institutions, it cannot and should not be the only responsibility (Labaree, 1997; Sjøberg, 2015; Zhao, 2014, 2016). The purpose of education in many countries includes much more than preparing economic beings. Citizenship, solidarity, equity, curiosity and engagement, compassion, empathy, cultural values, physical and mental health, and many others are among the frequently mentioned purposes in national education goal statements. But these aspects of the purpose of education “are often forgotten or ignored when discussions about the quality of the school is based on PISA scores and rankings” (Sjøberg, 2015, p. 113).

The distorted and narrow definition of the purpose of education is one of the major reasons for some of the peculiar and seemingly surprising findings associated with PISA. There is a persistent pattern of negative correlation between PISA scores and students’ interest and attitudes. Many researchers have found that countries with higher PISA scores tend to have students with less interest in and less positive attitudes toward the tested subject (Bybee & McCrae, 2011; Zhao, 2012, 2014, 2016). For example, PISA science scores have a significant negative correlation with students’ orientation toward future science study and science careers (Kjærnsli & Lie, 2011). High PISA scores have also been found to be associated with lower entrepreneurial confidence and capabilities (Campbell, 2013; Zhao, 2012). Moreover, high-scoring PISA education systems seem to have a more authoritarian orientation (Shirley, 2017; Zhao, 2014, 2016). Additionally, PISA scores have been found to correlate negatively with student wellbeing (Shirley, 2017; Zhao, 2014, 2016), a finding that PISA itself finally acknowledged in a 2017 report (OECD, 2017). These findings suggest that PISA measures only a very narrow aspect of education and neglects the broader responsibilities of educational systems. Furthermore, pursuing this narrowly defined purpose may come at the cost of the broader purposes of education (Zhao, 2017, 2018). “There are very few things you can summarise with a number and yet Pisa claims to be able to capture a country’s entire education system in just three of them. It can’t be possible. It is madness” (Morrison, 2013).

In summary, PISA has successfully marketed itself as a measure of educational quality by claiming to measure the skills and knowledge that matter in modern economies and the future world. Upon closer examination, the excellence defined by PISA is but an illusion, a manufactured claim without empirical evidence. Furthermore, PISA imposes a monolithic view of society and espouses a distorted and narrow view of the purpose of education for all education systems in the world. The consequence is a trend toward global homogenization of education and the celebration of authoritarian education systems for their high PISA scores, while the negative consequences of such systems for important human attributes and local cultures are ignored.

References

Berliner, D. C. (2011). The context for interpreting PISA results in the USA: Negativism, chauvinism, misunderstanding, and the potential to distort the educational systems of nations. In M. A. Pereyra, H.-G. Kotthoff, & R. Cowen (Eds.), Pisa Under Examination (pp. 77-96). New York: Springer.

Bybee, R., & McCrae, B. (2011). Scientific literacy and student attitudes: Perspectives from PISA 2006 science. International Journal of Science Education, 33(1), 7-26.

Campbell, M. (2013, Jan 5). West vs Asia education rankings are misleading: Western schoolchildren are routinely outperformed by their Asian peers, but worrying about it is pointless. New Scientist, (2898). Retrieved from https://www.newscientist.com/article/mg21728985-800-west-vs-asia-education-rankings-are-misleading/

Hanushek, E. A. (2013). Economic growth in developing countries: The role of human capital. Economics of Education Review, 37, 204-212.

Hanushek, E. A., & Woessmann, L. (2008). The role of cognitive skills in economic development. Journal of Economic Literature, 46(3), 607–668.

Hanushek, E. A., & Woessmann, L. (2010). The High Cost of Low Educational Performance: The Long-run Economic Impact of Improving PISA Outcomes. Paris: OECD. Retrieved from http://books.google.com/books?id=k7AGPo0NvfYC&pg=PA33&lpg=PA33&dq=hanushek+pisa+gdp&source=bl&ots=2gCfzF-f1_&sig=wwe0XLL5EblVWK9e7RJfb5MyhIU&hl=en&sa=X&ei=MLPCUqaOD8-JogS6v4C4Bw&ved=0CGcQ6AEwBjgK#v=onepage&q=hanushek%20pisa%20gdp&f=false

Hanushek, E. A., & Woessmann, L. (2012). Do better schools lead to more growth? Cognitive skills, economic outcomes, and causation. Journal of economic growth, 17(4), 267-321.

Hopmann, S. T. (2008). No child, no school, no state left behind: Schooling in the age of accountability. Journal of Curriculum Studies, 40(4), 417-456.

Kamens, D. H. (2015). A maturing global testing regime meets the world economy: Test scores and economic growth, 1960–2012. Comparative Education Review, 59(3), 420-446.

Kjærnsli, M., & Lie, S. (2011). Students’ preference for science careers: International comparisons based on PISA 2006. International Journal of Science Education, 33(1), 121-144.

Klees, S. J. (2016). Human capital and rates of return: brilliant ideas or ideological dead ends? Comparative Education Review, 60(4), 644-672.

Komatsu, H., & Rappleye, J. (2017). A new global policy regime founded on invalid statistics? Hanushek, Woessmann, PISA, and economic growth. Comparative Education, 53(2), 166-191.

Labaree, D. (1997). Public Goods, Private Goods: The American Struggle over Educational Goals. American Educational Research Journal, 34(1), 39-81.

Morrison, H. (2013, December 1). Pisa 2012 major flaw exposed. Retrieved from https://paceni.wordpress.com/2013/12/01/pisa-2012-major-flaw-exposed/

OECD. (1999). Measuring Student Knowledge and Skills: A New Framework for Assessment. Paris: OECD. Retrieved from http://www.oecd.org/education/school/programmeforinternationalstudentassessmentpisa/33693997.pdf

OECD. (2016). PISA 2015 Results (Volume I): Excellence and Equity in Education. Paris: OECD. Retrieved from http://dx.doi.org/10.1787/9789264266490-en

OECD. (2017). PISA 2015 Results: Students’ Well-being. Paris: OECD. Retrieved from http://www.keepeek.com/Digital-Asset-Management/oecd/education/pisa-2015-results-volume-iii_9789264273856-en

Shirley, D. (2017). The New Imperatives of Educational Change: Achievement with Integrity. New York: Routledge.

Sjøberg, S. (2015). PISA and Global Educational Governance-A Critique of the Project, its Uses and Implications. Eurasia Journal of Mathematics, Science & Technology Education, 11(1), 111-127.

Stromquist, N. P. (2016). Using regression analysis to predict countries’ economic growth: Illusion and fact in education policy. Real-World Economics Review, 76, 65-74.

Uljens, M. (2007). The Hidden Curriculum of PISA: The Promotion of Neo-Liberal Policy By Educational Assessment. In S. T. Hopmann, Gertrude Brinek, & M. Retzl (Eds.), PISA zufolge PISA – PISA According to PISA (pp. 295-303). Berlin: Lit Verlag.

Zhao, Y. (2012). World Class Learners: Educating Creative and Entrepreneurial Students. Thousand Oaks, CA: Corwin.

Zhao, Y. (2014). Who’s Afraid of the Big Bad Dragon: Why China has the Best (and Worst) Education System in the World. San Francisco: Jossey-Bass.

Zhao, Y. (2016). Who’s Afraid of PISA: The Fallacy of International Assessments of System Performance. In A. Harris & M. S. Jones (Eds.), Leading Futures (pp. 7-21). Thousand Oaks, CA: Sage.

Zhao, Y. (2017). What Works Can Hurt: Side Effects in Education. Journal of Educational Change, 18(1), 1-19.

Zhao, Y. (2018). What Works May Hurt: Side Effects in Education. New York: Teachers College Press.

 

[1] Adapted from part 1 of my article “Two Decades of Havoc: A Synthesis of Criticism against PISA,” to appear in the Journal of Educational Change: https://link.springer.com/journal/10833


Yong Zhao

Dr. Yong Zhao is an internationally known scholar, author, and speaker. His work focuses on the implications of globalization and technology for education.