The Programme for International Student Assessment (PISA) is a triennial worldwide test of 15-year-old schoolchildren's scholastic performance, coordinated by the Organisation for Economic Co-operation and Development (OECD).
The aim of the PISA study is to test and compare schoolchildren's performance across the world, with a view to improving educational methods and outcomes.
In 2000, 265 000 students from 32 countries took part in PISA; 28 of them were OECD member countries. In 2002 the same tests were taken by 11 more "partner" countries (i.e. non-OECD members). The main focus of the 2000 tests was reading literacy, with two-thirds of the questions devoted to that subject.
PISA's debut round in 2000 was delivered on the OECD's behalf by an international consortium of research and educational institutions led by the Australian Council for Educational Research (ACER). ACER continued to lead the design and implementation of subsequent PISA rounds for the OECD.
Over 275 000 students took part in PISA 2003, which was conducted in 41 countries, including all 30 OECD countries. (The United Kingdom's data collection, however, failed to meet PISA's quality standards, so the UK was not included in the international comparisons.) The focus was mathematics literacy, testing students on real-life situations in which mathematics is useful. Problem solving was also tested for the first time.
In 2006, 57 countries participated, with science literacy as the main focus. Results are due out in late 2007. Researchers have begun preparing for 2009, when reading literacy will again be the main focus, giving the first opportunity to measure improvement in that domain. At last count (end of March 2007), about 63 countries were set to participate in PISA 2009, and more are anticipated to join before then.
ACER leads the development of the methodology and procedures required to implement the PISA survey in all participating countries, including designing the sampling procedures and helping to monitor sampling outcomes across those countries. ACER also constructs and refines the assessment instruments at the heart of PISA: the reading, mathematics, science, problem-solving and computer-based tests, along with the background and contextual questionnaires. In addition, ACER develops purpose-built software to assist with sampling and data capture, and analyses all the data.
Seeing a single PISA cycle through from start to finish takes over four years.
In reading literacy, the counterpart to TIMSS is the Progress in International Reading Literacy Study (PIRLS). According to the OECD, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling"; instead, students should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts" (PISA 2003 Assessment Framework, Chapter 2). PIRLS, on the other hand, describes reading literacy as "the ability to understand and use those written language forms required by society and/or valued by the individual" (PIRLS 2006 Assessment Framework, Chapter 1), thereby including the use of written language forms in its definition. However, according to the IEA, in scoring the PIRLS tests "the focus is solely on students' understanding of the text, not on their ability to write well" (PIRLS 2006 Assessment Framework, Chapter 4).
Each student takes a two-hour pen-and-paper test. Part of the test is multiple-choice and part requires fuller written answers. There are six and a half hours of assessment material in total, but each student is tested on only a portion of it. Participating students also answer a questionnaire about their background, including their learning habits, motivation and family. School principals also fill in a questionnaire describing school demographics, funding and so on.
In Luxembourg, which scored quite low, the testing method has drawn criticism: although the country is trilingual, the test could not be taken in Luxembourgish, the mother tongue of the majority of students.
In 2003, the top six scores were reported in four domains: mathematics, reading literacy, science and problem solving.
Professor Jouni Välijärvi, who was in charge of the Finnish PISA study, attributed the high Finnish score both to the excellence of Finnish teachers and to Finland's 1990s LUMA programme, which was developed to improve children's skills in mathematics and the natural sciences. He also drew attention to the Finnish school system, which teaches the same curriculum to all pupils. Indeed, individual Finnish students' results did not vary a great deal, and all schools had similar scores.
An evaluation of the 2003 results showed that the countries which spent more on education did not necessarily do better than those which spent less. Australia, Belgium, Canada, the Czech Republic, Finland, Japan, Korea and the Netherlands spent less but did relatively well, whereas the United States spent much more yet fell below the OECD average, coming 24th out of the 29 countries compared. The Czech Republic, for example, ranked in the top ten while spending only one-third as much per student as the United States.
Compared with 2000, Poland, Belgium, the Czech Republic and Germany all improved their results. Polish students, apparently thanks to the changes introduced in the educational reform of 1999, showed above-average reading skills in PISA 2003, whereas in PISA 2000 they had been near the bottom of the list.
Another point made in the evaluation was that students with higher-earning parents tend to be better educated and to achieve higher results. This was true in all the countries tested, although it was more pronounced in some, such as Germany.