Programme for International Student Assessment (PISA)
The Programme for International Student Assessment (PISA) is governed and administered by the Organisation for Economic Co-operation and Development (OECD), an organization of thirty member countries. The PISA exams cover scientific, reading, and mathematical literacy, and have been administered every three years since 2000. This review covers the framework for the math assessment administered in 2006, in which fifty-six OECD and non-OECD countries participated.
The PISA test is administered to fifteen-year-olds because this is the age "in most OECD countries [where] students are approaching the end of compulsory schooling." PISA's objective is to test "not so much in terms of mastery of the school curriculum, but in terms of important knowledge and skills needed in adult life." The PISA framework "defines the contents [sic] that students need to acquire, the processes that need to be performed and the contexts in which knowledge and skills are applied." Though not intended exclusively for school-based education, this framework thus has the same intention and performs essentially the same function as a set of academic standards, and can reasonably be evaluated by the same standards-based criteria. Further, a number of American educators, policymakers, and standards setters look to PISA as a benchmark for what should be required and/or expected of American schools. Since the PISA standards are used for these purposes, it is important to appraise their content.
The framework devotes over forty pages to summarizing what mathematical literacy should mean for fifteen-year-olds. PISA breaks content into four categories: "space and shape; change and relationships; quantity; and uncertainty." (In addition to those four categories, PISA discusses four "situation types," eight "competencies," and three "competency clusters," but these are vague and do nothing to clarify the content that students should have mastered.) For each of the four content categories, there is a general discussion of the mathematics in the category, but again, these discussions do not describe content to be covered. Each category features a short list of "key aspects" that are as close to standards as the framework gets, but these, too, are vague and non-specific. They total only twenty-three and do not supply clear guidance for readers or users (e.g., teachers, curriculum developers, test developers, mathematicians). Sample problems are also supplied in each category, and the real content to be covered must often be inferred from these examples (as well as from released items from actual PISA exams, discussed below). The sample problems illustrate the very low level of mathematics content that is required. The exam is primarily a problem-solving exam, and seldom requires the highest level of mathematics that a fifteen-year-old would be expected to know.
The second document reviewed here, the released items, is 106 pages long and contains fifty problems with descriptive names (such as Walking, The Best Car, and Postal Charges). Many of these problems consist of several distinct test items, all related to the same theme. All PISA problems are "in context" (i.e., they are intended as real-world problems).
The PISA framework is evaluated on two dimensions, "content and rigor" and "clarity and specificity," and this review addresses these dimensions in four sections. First, the content addressed by the PISA materials is compared with the content that should be covered by a mathematics test for fifteen-year-olds (see "Math Content-Specific Criteria," page 11), and the results are presented under "content covered" and "content missing." Specific problems with the content are discussed in the third section, followed by a discussion of the released items in the fourth section. The remainder of the review sums up the content and rigor of the PISA framework and released items and considers the clarity and specificity of the framework. The last section provides an overall summary and final grade.
Any mathematics assessment for fifteen-year-olds should thoroughly cover the arithmetic of rational numbers, including decimals, and should also cover rates, ratios, proportions, and percentages. Neither fractions nor rational numbers are mentioned anywhere, but we do find that understanding the meaning of operations "includes the ability to perform operations involving comparisons, ratios and percentages." The standard covering "number sense" does explicitly include proportional reasoning.
Under the content category "quantity," we find some standards that are potentially relevant: "understanding the meaning of operations" and "elegant computations." These are much too vague, however, to assure coverage of the arithmetic of rational numbers. Typically, standards are self-explanatory; PISA's standards, however, require one to read the lengthy prose surrounding them to gain some insight into their meaning (more on this below).
Volume, area, and perimeter are mentioned under the category "quantity," and "triangle" appears in the preliminary discussion and in an example problem. "Similar" occurs in the discussion, but not in the technical sense of geometric similarity. Coordinates are also mentioned in the discussion. The most guidance we get in the "space and shape" content category comes from the rather vague standard "recognizing shapes and patterns." This could, in principle, cover much of the material of geometry, but the standard is far too general to be useful.
Students should have covered a full year of algebra by age fifteen. Under the content category "change and relationships," we have two potentially relevant standards: "representing changes in a comprehensible form" and "understanding the fundamental types of change." These, though, do not give clear guidance to readers. In the accompanying discussion, we find absolute value, linear equations and inequalities, and linear functions.
Note that the discussion material included in the PISA framework is not an obvious attempt to clarify individual standards. Rather, it is a very general discussion about the uses of mathematics in the real world. Here is an example:
Every natural phenomenon is a manifestation of change and the world around us displays a multitude of temporary and permanent relationships among phenomena. Examples are organisms changing as they grow, the cycle of seasons, the ebb and flow of tides, cycles of unemployment, weather changes and stock exchange indices. Some of these change processes involve and can be described or modeled by straightforward mathematical functions: linear, exponential, periodic or logistic, either discrete or continuous. But many relationships fall into different categories and data analysis is often essential to determine the kind of relationship that is present. Mathematical relationships often take the shape of equations or inequalities, but relations of a more general nature (e.g. equivalence, divisibility, inclusion, to mention but a few) may appear as well.
Though PISA is given credit here for covering linear equations and inequalities, that credit is quite generous: the framework offers not explicit guidance about content so much as suggestions that certain content, such as linear equations, should be covered.
The core content of the arithmetic of rational numbers is missing, as is any coverage of rates. Most of the expected geometry is missing: there is no coverage of similar and congruent triangles; circles; angles associated with triangles and parallel lines; or computation of areas and perimeters of rectangles. The Pythagorean Theorem is also missing.
Roots, reciprocals, and powers are absent, as is the arithmetic of polynomials and rational expressions. Factoring is also missing. Quadratics are not mentioned.
PISA's content category "uncertainty" covers "collecting data, data analysis and display/visualisation, probability and inference." It states:
Relatively recent recommendations concerning school curricula are unanimous in suggesting that statistics and probability should occupy a much more prominent place than has been the case in the past.
Although publications from 1982 to 2000 are cited to support this sentiment, "unanimous" is a dramatic overstatement. In the sidebar on data analysis, statistics, and probability ("How much DASP do students need?" on page 12 of this report), the case is made that these content areas should receive only limited attention. Simply put, this content category is overemphasized in the PISA standards.
In addition to the sample problems included in the PISA framework, PISA also makes available a set of released items (test problems used in previous years). Because of the imprecise nature of the standards and the discussion surrounding them, it is important to review the released items to see if they give further guidance about content.
The most striking feature of the released items is that thirty-five of the fifty involve the use of a picture, table, or graph. "Data display" falls under the content category "uncertainty," which means that the "uncertainty" content category is highly overrepresented, even when the problems go on to test content covered by other categories.
There are only a few real formulas and equations, and only about 10 percent of the released items use any algebra. Only one question expects students to produce a formula of their own from the information given in a problem, and it requires students simply to multiply a given formula by 0.8 to arrive at the new formula. The lack of algebra illustrates the low level of mathematical content knowledge expected by the PISA assessment. Nor are arithmetic skills in much demand in the problems; even where such skills might be useful, calculators are allowed.
The level of geometry used in a few of the released items, which call upon similarity and congruence, is significantly higher than that suggested by the standards and the discussion surrounding them. However, as with algebra, only a few test items demand this level of geometry.
Most of the released items are focused on problem solving and use fairly low-level mathematics content. However, the problems can be quite complex; about 20 percent of them are multi-step problems. So, even though the content is undemanding, the sample problems can be quite difficult due to the number of steps required to solve them.
A significant number of problems contain errors, misleading statements, or imprecise questions. Enumerating all of these issues is beyond the scope of this review.
Content and Rigor Conclusion
Arithmetic, most of the geometry, and much of the algebra listed in the "Math Content-Specific Criteria" (page 11) are missing from the PISA framework. More than half of the important content is never mentioned, which would normally result in a score of three for content and rigor, at least if the framework were all that was being evaluated. However, the inclusion of the released items affects PISA's grade for content and rigor. Even though there are very few geometry problems among the released items, those that do appear require more geometry content than the standards suggest. This raises PISA's content and rigor score from three to four, as it now appears that at least 50 percent of the content is covered, but certainly more than 35 percent is missing (see the "Common Grading Metric," page 16).
Clarity and Specificity
The actual standards included in the PISA framework are non-specific and not testable. They give almost no guidance to readers and users, i.e., teachers, students, parents, curriculum designers, test makers, textbook developers, standards writers, policymakers, or others. The discussions around the standards are rambling accounts of mathematics in the real world, and although they mention various bits of content (for which coverage has been generously credited in this review), these discussions, too, give scant guidance to readers at any level.
According to the "Common Grading Metric" (page 16), a score of zero for clarity and specificity means:
The standards are incoherent and/or disorganized. The standards are not helpful to users. The standards are completely lacking in detail.
It is as if the PISA framework were written to illustrate the kind of standards that should merit a score of zero. Accordingly, its clarity and specificity score is zero.
Summary and Grade
The PISA assessment tests mathematical literacy, not knowledge and understanding of grade-level mathematical content. The standards and their explanations do not cover the appropriate grade-level material, and the released items indicate that the exam is quite weak in mathematical content. It is a problem-solving test, and although mathematics is used, the mathematics is somewhat incidental. Many problems have no apparent mathematical content at all and are, at best, small logic puzzles. Because of this low level of required content knowledge, the claim that PISA tests "preparedness for further study" is in doubt.
The test itself is unbalanced, with a glaring overemphasis on data display. Most of the content that PISA expects of a fifteen-year-old is what younger students should already have mastered.
As a serious problem-solving test using elementary mathematics, the PISA assessment might function nicely, although its unbalanced nature would limit its usefulness even for that purpose. Certainly schools should teach problem solving in their mathematics curricula, and it should be a major embarrassment for a country to perform poorly on this test. Still, results from PISA ought not to be used to judge how successful a school system is at imparting grade-level mathematical knowledge and understanding, nor are the PISA framework and released items a suitable model for U.S. standards setters at any level.