The release of national test scores in reading and math on September 25 was an embarrassment for the New York State Education Department. While scores nationally and in many individual states showed modest gains between 2005 and 2007, New York State’s did not. Only in fourth-grade math was there significant improvement.

The federally sponsored National Assessment of Educational Progress (NAEP), known in the education world as the gold standard of testing, has been testing samples of students in the states since 1992. In the No Child Left Behind (NCLB) Act of 2002, Congress authorized NAEP testing in every state to serve as an external monitor of the states’ own claims about their progress. The states develop and administer their own tests, and Congress rightly worried that states would dumb down their tests to inflate the results.

The latest NAEP had very little good news for New York State. Only a few months ago, the state’s Education Department celebrated large test-score gains for eighth-grade students in both reading and math. In May and June, The New York Times ran front-page stories heralding major improvements in the state test scores for eighth-graders: “Eighth Graders Show Big Gain in Reading Test” and “City Students Lead Big Rise on Math Tests.” The Education Department reported that in grade 8, the proportion of students meeting state reading standards jumped from 49.3 percent to 57 percent, a remarkable increase in a single year, especially in a grade where academic performance had stagnated for several years. Similarly, the state reported that the proportion of eighth-graders meeting the state’s math standards jumped from 53.9 percent to 58.8 percent. These are very impressive gains.

Unfortunately, all of the state’s gains in eighth grade disappeared in the NAEP results—a fact the New York Times mentioned not on its front page but at the end of a story on page A20. The NAEP measures performance in two ways: by scale scores (on a scale from 0 to 500) and by achievement levels (“below basic,” “basic,” “proficient,” and “advanced”), which are supposed to show what students ought to know and be able to do in their grade. The results:

Only in fourth-grade mathematics did New York students post a solid gain, from a scale score of 238 in 2005 to 243 in 2007. In eighth-grade mathematics, where the state claimed big increases on its own tests, the NAEP scale score was flat: 280 in 2005 and 280 in 2007.

In fourth-grade reading, New York’s scale score went from 223 in 2005 to 224 in 2007, not a significant change. In eighth-grade reading, New York’s scale score went from 265 in 2005 to 264 in 2007, again not a significant change.

The achievement levels demonstrate the severity of the “achievement gap” among different groups in New York. In fourth-grade reading in 2005 and 2007, 19 to 20 percent of white and Asian students were “below basic,” as compared with 48 to 50 percent of black and Hispanic students. These numbers did not change significantly from 2005 to 2007. The size of the gap is similar in eighth-grade reading and eighth-grade math, where about half of black and Hispanic students score “below basic.” Only in fourth-grade math did a majority of black (69 percent) and Hispanic (74 percent) students score at or above the basic level of performance, while 94 percent of white and Asian students ranked at basic or above in 2007.

The New York State Education Department has already suffered a series of embarrassments. A federal study released in June found that New York’s tests were not as rigorous as those administered in many other states. The New York Daily News reported early in September that the state’s math tests in 2005 were easier than those given in 2002. A few days later, The New York Sun reported on a study that found the reading tests of 2005 were also easier than those of previous years.

The disparity between state and national test scores points to one conclusion: New York State—with its multi-billion-dollar annual investment in public education—needs an independent, nonpartisan, professional audit agency to administer tests and report results to the public. Such an agency should be staffed by testing professionals who have no vested interest in whether the scores go up or down. At present, when scores go down, the public is told that the test was harder this year; but when scores go up, state officials never speculate that the test might have been easier. Instead, they high-five one another and congratulate the Regents for their wise policies and programs.

What the public needs are the facts. No spin, no creative explanations, no cherry-picking of data for nuggets of good news. Just the facts.
