Programme for International Student Assessment (PISA)
PISA, the Programme for International Student Assessment, was created in 1997. It is operated on behalf of the member governments of the OECD (Organisation for Economic Co-operation and Development). Using a common international framework, PISA administers assessments every three years to fifteen-year-olds, measuring the “literacy” of those young people in reading, mathematics, and science.
For each administration, PISA designates one of the three subjects as the “major” domain. In 2006, for the first time, the major domain was science. It was therefore a “minor” domain for the 2009 administration, reviewed here. According to PISA, this means that, while the “definition of the domain [was] unchanged since PISA 2006…the student questionnaire [did] not include items asking about students’ general attitudes towards science…”
Because the assessment is administered to fifteen-year-olds, we judge its underlying science standards here against our own content criteria for fifteen-year-olds. (See Appendix A: Methods, Criteria, and Grading Metric; State of State Science Standards 2012.) The review necessarily includes considerable discussion of the PISA Framework’s organization and of the academic standards it implies. We evaluate both the content and rigor of those standards and the clarity and specificity of the documentation.
Organization of the Framework
According to its assessment framework, PISA is designed to evaluate the literacy of test-takers in each domain. Literacy, it is argued, differs from ordinary knowledge. In the case of science, assessing literacy can provide “an early indication of how [the students] are likely to respond later in life to the diverse array of situations that involve science and technology” (emphasis added). Thus PISA attempts, at considerable length, to draw a strong distinction between scientific literacy and knowledge of science (that is, knowledge of science subject matter). Literacy, in short, is taken to be notably more than ordinary “content” knowledge. The implication, not entirely convincing, is that content knowledge, however well established at age fifteen, cannot by itself provide the desired indication of future science competence.
PISA asserts that its science-literacy assessments are a product of the question: “What is it important for citizens to know, value, and be able to do in situations involving science and technology?” Thus, while PISA does not undertake to provide traditional science-content standards, the answer its documentation provides for that question is, in a sense, an articulation of (implicit) standards: those upon which the assessment designers base their work.
There are three components of these PISA standards. The first is a set of purely cognitive competencies, or skills, listed in Figure 1 below.
Figure 1: PISA scientific competencies
- Identifying scientific issues
- Explaining phenomena scientifically
- Using scientific evidence
Of course, these skills can be demonstrated only in the context of a body of science content (facts, principles, laws, and their effective manipulation), which PISA attempts to acknowledge with its statements about the science content that students must master in order to succeed in the assessment. For PISA, science content has, in turn, two parts: “knowledge of science” and “knowledge about science.”
Unfortunately, neither of those two, insofar as they are covered explicitly in the document, sets forth adequately the depth of content knowledge that students need in order to prove possession of the cognitive accomplishments (Figure 1), the “literacy,” to be tested. Instead, PISA simply issues a table-of-contents-style list of science subjects that students should have studied or dealt with in advance of the test-taking. The lists appear below in Figures 2 and 3.
Content and Rigor
The implication that scientific enquiry and scientific explanations can be taught and learned separately, at least to some degree, from science content is certainly a popular notion among K-12 science educators. Unfortunately, there is not much hard evidence in its favor. If serious teaching of science fails also to teach about science, it is simply bad teaching. And any learning of science that really excluded learning about hypothesis-making, evidence, consistency, creativity, and the like (science learning that excluded emphasis upon sound epistemology) would not be acceptable “learning.” It would fail as science learning even in very traditional classrooms. Nor is it possible to learn enough about science without significant immersion in the facts (the observations, arguments, independent tests) of science.
Figure 2: PISA Categories: Knowledge of Science
- Physical systems
- Living systems
- Earth and space systems
- Technology systems
Figure 3: PISA Categories of Knowledge about Science
- Scientific enquiry
- Scientific explanations
While the painstaking subdivision of all science competences in the PISA Framework is surely helpful to the design and understanding of the assessment instrument, not least because it guides test-writers and sets the range of possible questions, the list of topics (the science content) is far too general to provide the guidance that curriculum designers or classroom teachers need in deciding what students should learn and be able to do.
For example: the sub-head “motions and forces (e.g., velocity, friction…)” under “Physical systems” implies an enormous range and depth of content, far more than is indicated in the physical science expectations listed in our minimum content list for fifteen-year-olds. (See Appendix A: Methods, Criteria, and Grading Metric; State of State Science Standards 2012.) But such breadth of statement is not a virtue: it fails because of what is left out about motions and forces. Similarly, an entry such as “Structures of Earth systems” implies almost the whole of geology. And “Humans” and its sub-topics imply an entire course in human anatomy and physiology. In short, despite its exhaustive examination of the meaning of “scientific literacy,” the Framework gives no indication of what level of substantive knowledge in the listed scientific disciplines will be required if a student is to perform creditably on the PISA test.
For a clearer indication of what students should actually know and be able to do, one must go to the released assessment items (test questions). But while those are uniformly interesting, competently organized, and thoughtfully written, they reveal that some of the actual content knowledge students are expected to have is rather basic (as in the set, for example, on tobacco smoking). Further, the questions are very often topical (i.e., actively in the news). Thus among these questions are sets having to do with biodiversity and its challenges, cloning, climate change, the ozone layer, water-quality issues, and the like. There is nothing wrong with topicality, but it can, if overdone, crowd out fundamental science content in key subjects like physics, astronomy, cosmology, genetics, and geology. The latter are necessarily covered in a good, comprehensive science curriculum, but they are far less likely to be newsworthy than science issues with political implications, such as climate change and environmental quality.
Thus, while the test questions are good, as is the content knowledge needed to do well on the test (and much of it would normally be included in a sound science curriculum), that knowledge does not go nearly far enough and is surely insufficient to comprise standards for such a sound science curriculum.
Content and Rigor Conclusion
As written, the Framework provides no straightforward way to relate the PISA topics to a set (such as our own) of minimum content expectations for fifteen-year-olds. There would need to be many more short statements indicating what, exactly, is to be learned within each of the science-discipline items in Figure 2, and much more guidance would need to be given about when (in which grades) these important topics should be taught. Consider, for example, “evolution” in that table. Yes, this is an essential component of any rigorous, college-preparatory science curriculum.
But when, between kindergarten and grade ten, is it to be introduced? In what context? (Narrative? Descriptive? Mechanistic, including genetics? Chemical and biochemical? Geologic-historical?) And at what levels of sophistication, if introduced more than once, as the main ideas of science should be? Unfortunately, no such guidance for this or any of the content categories is provided.
Therefore the rigor of PISA’s learning expectations is manifest only indirectly, in the released test questions. There, to be sure, rigor (meaning, in this context, reasonably demanding, but also logical and consistent within the chosen level of detail) is usually evident. And it is adequate to the stated purposes of assessing the literacy, as here defined, of fifteen-year-olds. As to subject-matter depth, however, the scientific “literacy” expected for success on these tests is a hoped-for common literacy. There is no reason why, for serious standards, the bar couldn’t be higher, allowing the assessments to identify truly outstanding achievement as well as its alternatives. If the PISA Framework were a set of science standards for K-12, which it is not, an appropriate score for content and rigor would be three out of seven. (See Appendix A: Methods, Criteria, and Grading Metric; State of State Science Standards 2012.)
Clarity and Specificity
The prose of PISA’s Framework document can be dense and is in places repetitive. Much of it is devoted to the argument for science literacy as opposed to scientific knowledge. Despite PISA’s effort to clarify that argument, some vagueness remains as to the expected contribution of personal values to “scientific literacy.” Those values (including styles and attitudes toward science issues) were important when science literacy was the major domain of PISA (in 2006). In 2009, with science a “minor” domain, personal characteristics were less important, but clearly not unimportant. This issue, discussed in the Framework, could be confusing to some members of the intended audience (teachers, for example).
Clarity and specificity are therefore adequate on the whole, given PISA’s expressed undertakings and purposes, at least so far as the organization and writing of the main document are concerned. But the Framework fails to provide truly clear and applicable guidance for curriculum planners and teachers, guidance of the kind that good standards must supply. That deficiency, however, is reflected in our score for content and rigor. For clarity and specificity, if “specificity” refers only to the announced purposes of the Framework, a score of two out of three is warranted. (See Appendix A: Methods, Criteria, and Grading Metric; State of State Science Standards 2012.)