International Tests Present Misleading Picture of U.S. Student Performance
By Tim Walker
When the Organization for Economic Cooperation and Development released the 2009 results of the Program for International Student Assessment (PISA), Secretary of Education Arne Duncan called them a “wake-up call.” U.S. students placed in the middle of the rankings in science and math, well behind high-performing countries such as Finland, Canada, Singapore, China, and New Zealand. Other lawmakers and experts joined in the chorus of knee-jerk reactions: U.S. students are “mediocre,” our schools are “failing,” the U.S. is “falling behind.” Calls to accelerate the implementation of any and all education “reforms” became deafening.
Why? Because PISA surveys, we are told, provide solid, irrefutable data, and numbers never, ever lie. But too often influential people don’t (or won’t) look beneath the headlines and dig deeper into the data. Without understanding what the rankings mean and how they are tabulated, it’s too easy for lawmakers and journalists alike to glance at the table and jump to the same conclusion: schools are failing our students.
Richard Rothstein, a research associate at the Economic Policy Institute (EPI), and Martin Carnoy of the Stanford Graduate School of Education recently analyzed the 2009 PISA database and concluded that the performance of U.S. students isn’t as dire as initially reported. They identified the disproportionate share of test-takers from economically disadvantaged backgrounds as the main culprit: ignoring this factor made it easier to misread the findings and declare that the sky is falling.
“Such conclusions are oversimplified, frequently exaggerated and misleading,” Rothstein and Carnoy said. “They ignore the complexity of test results and may lead policymakers to pursue inappropriate and even harmful reforms.” Last week, Rothstein and Carnoy released their findings in a report titled “What Do International Tests Really Show About U.S. Student Performance?”
Carnoy and Rothstein compared U.S. results by social class to those of the three top performers on PISA (Canada, Finland and South Korea) and of England, France and Germany, countries with socioeconomic profiles similar to that of the United States. According to the two scholars, the relatively low ranking of U.S. students can be attributed in no small part to a disproportionate number of test-takers from high-poverty schools. After adjusting the U.S. score to account for social class composition and possible sampling flaws, they estimate that the United States placed fourth in reading and 10th in math – up from 14th and 25th, respectively, in the official PISA rankings.
Carnoy and Rothstein also found that, although every PISA country has an achievement gap, the gap is actually smaller in the United States than in England, France and Germany, and not significantly larger than in the highest-ranked nations. In addition, the achievement of disadvantaged students in the United States has been improving – in stark contrast to Canada, Finland and Korea, where achievement among this group has been falling.
“Our main message is a cautionary tale,” Carnoy explained. “If you don’t make some attempt to look at everything by social-class groups, you are headed for lots of mistakes in your policy conclusions.”