Progress in International Reading Literacy Survey (PIRLS)
English and Language Arts
The Progress in International Reading Literacy Survey (PIRLS) is a large-scale, matrix assessment of reading literacy administered by the International Association for the Evaluation of Educational Achievement (IEA) every five years to students in their fourth year of school. In 2011, PIRLS's tenth anniversary, students in approximately fifty-five countries will take the assessment.
Designed to measure trends in literacy achievement within and across countries, the most recent PIRLS assessment framework builds on the previous frameworks. The IEA embeds reading passages and items from previous assessments to allow for the tracking of achievement trends since 2001. Because PIRLS only addresses reading literacy at the fourth-grade level, we evaluate the standards against only our fourth-grade reading criteria.
The assessment framework is divided into four chapters: an overview, a summary of reading purposes and processes, a chapter on contexts for learning to read, and a chapter on the design of the assessment. In none of these sections are any standards specifically articulated. Instead, the framework merely describes, in very general prose, the content and skills to be assessed.
The framework focuses on the following three "aspects" of reading literacy:
- Purposes for reading;
- Processes of comprehension; and
- Reading behaviors and attitudes.
Only the first two aspects are actually assessed on the PIRLS. The third aspect is examined through questionnaires sent to students, teachers, parents, and other school officials and is not factored into student scores.
The actual content of the framework is minimal, as the three aspects above are described only in very general prose, making its rigor nearly impossible to judge. What little content does appear is sporadic and lacks any meaningful specifics. For example, in "Focus on and Retrieve Explicitly Stated Information," the framework explains that this process may entail:
- Identifying information that is relevant to the specific goal of reading;
- Looking for specific ideas;
- Searching for definitions of words or phrases;
- Identifying the setting of a story (e.g., time, place); and
- Finding the topic sentence or main idea (when explicitly stated).
Similarly, in "Interpret and Integrate Ideas and Information," the framework merely suggests that this process might include:
- Discerning the overall message or theme of a text;
- Considering an alternative to actions of characters;
- Comparing and contrasting text information;
- Inferring a story's mood or tone; and
- Interpreting a real-world application of text information.
While the framework does provide some sample assessment items, none of these samples are the kinds of content-based tasks that could more thoroughly address the concepts mentioned here.
Glimpses of the assessment's rigor can perhaps be extrapolated from the sample passages and items offered in an appendix, but only two are included: one literary and one expository passage. Worse, these passages are insipid and are noteworthy neither for their literary significance nor their enduring appeal.
Additionally, while the framework does indicate that informational and literary texts will be equally weighted in the assessment, it falls woefully short in describing content. In two pages of prose, the framework describes only very general qualities of each of the text types, but details no specific objectives for students to address while reading them. For example:
In literary reading, the reader engages with the text to become involved in imagined events, setting, actions, consequences, characters, atmosphere, feelings, ideas, and to enjoy language itself.
Through informational text, one can understand how the world is and has been, and why things work as they do. Readers can go beyond the acquisition of information and use it in reasoning and action.
These are fine, if simplistic, descriptions of what students should be able to do, but the framework never goes beyond these general descriptions. It fails to delineate specific expectations for students to recognize and interpret genres, structures, literary elements, or stylistic devices.
The framework spends more time explaining the reading skills it assesses than it does clarifying the level of rigor of the texts used throughout the assessment. Specifically, the framework identifies these four "processes" as priorities:
- Focus on and retrieve explicitly stated information;
- Make straightforward inferences;
- Interpret and integrate ideas and information; and
- Examine and evaluate content, language, and textual elements.
The first two of these "processes" are often, and rightfully, treated as one skill in other reading standards, but are arbitrarily split here. The remaining two are commonly identified reading skills for assessment at this and other grade levels.
Furthermore, according to the assessment blueprint itself, PIRLS spends too little time (a mere 20 percent) on questions that require students to "examine and evaluate content, language, and textual elements" and far too much time (50 percent) on the lower-level questions focused on basic recall and making straightforward inferences.
Content and Rigor Conclusion
In short, the PIRLS framework specifically addresses virtually none of our content-specific criteria for reading in Kindergarten through fourth grade. Without describing more explicit aspects of vocabulary development, outlining more specific expectations for reading and analyzing literary and non-literary texts (e.g., recognizing and interpreting genres, structures, literary elements, and stylistic devices), and detailing choices about the quality and complexity of reading passages, this framework cannot earn higher than a one out of seven for Content and Rigor.
Clarity and Specificity Conclusion
While the PIRLS assessment framework is clearly written and organized, its failure to delineate any of the specific content to be assessed makes the clarity of the document superficial at best. Since the framework provides insufficient detail, and the passages and sample questions, while helpful, do not adequately clarify student expectations, PIRLS can earn no higher than a one out of three for Clarity and Specificity.