Test Development
The development of the PISA 2006 assessment instruments was an interactive process among the PISA Consortium, various expert committees, and OECD members. The assessment was developed by international experts and PISA Consortium test developers, and items were reviewed by representatives of each jurisdiction for possible bias and relevance to PISA’s goals. The intention was to reflect the national, cultural, and linguistic variety among OECD jurisdictions. The assessment included items submitted by participating jurisdictions as well as items that were developed by the Consortium’s test developers.
The final assessment consisted of 140 science items, 48 mathematics items, and 28 reading items allocated to 13 test booklets. Each booklet was made up of 4 test clusters. Altogether there were 7 science clusters (S1–S7), 4 mathematics clusters (M1–M4), and 2 reading clusters (R1–R2). The clusters were allocated in a rotated design to the 13 booklets. The average number of items per cluster was 20 items for science, 12 items for mathematics, and 14 items for reading. Each cluster was designed to average 30 minutes of test material. Each student took one booklet, with about 2 hours’ worth of testing material. Approximately one-third of the science literacy items were multiple choice, one-third were closed or short response types (for which students wrote an answer that was simply either correct or incorrect), and about one-third were open constructed responses (for which students wrote answers that were graded by trained scorers using an international scoring guide). In PISA 2006, every student answered science items. Mathematics and reading items were spread throughout the other booklets. The United States did not use the optional 1-hour test booklet that included lower difficulty items designed for use in special education classrooms. This booklet
was used by the Czech Republic, Germany, the Netherlands, Slovakia, and Slovenia. For more information on assessment design, see the OECD’s PISA 2006 Technical Report (Adams in press).

In addition to the cognitive assessment, students also received a 30-minute questionnaire designed to provide information about their backgrounds, attitudes, and experiences in school. Principals in schools where PISA was administered also received a 20- to 30-minute questionnaire about their schools. Results from the school survey are not discussed in this report but are available in PISA 2006: Science Competencies for Tomorrow’s World (Vols. 1 and 2) (OECD, 2007a, 2007b).

Some page references in the reading section were incorrect. In some passages, students were incorrectly instructed to refer to the passage on the “opposite page” when the passage now appeared on the previous page. Because of the small number of items in the reading section, it was not possible to recalibrate the score to exclude the affected items. No incorrect page references appeared in the mathematics or science sections of the assessments. However, in some instances math and science items could be more difficult because the question required information provided previously that now required the student to turn back a page. In a few instances, items could be somewhat easier because of the pagination. ACER examined the potential impact of this on the math and science scales and estimated the scores would change by one point if the items that may have