Results from the National Assessment of Educational Progress exams, given in 2018 to eighth-graders in different parts of the country, revealed scores not unlike those we have seen for several decades on these same exams:
- The 16,400 eighth-graders who took the U.S. history test in 2018 — and who are said to be randomly selected and representative of all American eighth-graders — averaged 263 on a 500-point scale, four points lower than in 2014 but four points higher than in 1994.
- The NAEP geography average dropped, too. The 12,900 eighth-graders who took that test in 2018 averaged 258 on a 500-point scale, three points lower than in 2014 but still close to the average in 1994, the first year of the assessment.
- And in all-important civics, the 13,400 students who took the test in 2018 averaged 153 on a 300-point scale, a point lower than in 2014 but three points higher than in 1998, the first year the civics test was given.
As always happens when scores from NAEP exams in various subjects are released, many of the reactions are hyperbolic.
NAEP has been referred to as “the nation’s report card” or the “gold standard” in student assessment because it is seen as the most consistent, nationally representative measure of U.S. student achievement since the 1990s — at least for those who believe that a single standardized test score can broadly assess what students know and can do in a particular subject.
It is administered to groups of U.S. students in the fourth and eighth grades, and less frequently to high school students. It’s given every two years in math and reading, and less frequently in science, writing, the arts, civics, economics, geography, technology and engineering literacy, and U.S. history.
Critics have long questioned the value of the NAEP results, noting that its "proficient" achievement markers are fuzzy and that at least some of the exams don't measure what they claim to assess. Nevertheless, the results get a lot of attention, and every time the reaction is similar: The sky is falling, or about to, because American public schools aren't getting the job done in whatever subject happened to be measured by NAEP.
Education Secretary Betsy DeVos used the results as proof that "we need to fundamentally rethink education in America." The problem is the "antiquated approach to education," she said.
“The results,” she said in a statement, “are stark and inexcusable. A quarter or more of America’s 8th graders are what NAEP defines as ‘below basic’ in U.S. history, civics and geography. In the real world, this means students don’t know what the Lincoln-Douglas debates were about, nor can they discuss the significance of the Bill of Rights, or point out basic locations on a map. And only 15% of them have a reasonable knowledge of U.S. history. All Americans should take a moment to think about the concerning implications for the future of our country.”
It’s worth noting: In the 1980s, an NAEP test found that only about half of the students who took it could place the Civil War in the correct half-century.
It’s also worth noting the ignorance of key U.S. history moments shown by some in the Trump administration. That includes President Trump, who spoke in 2017 about black abolitionist and statesman Frederick Douglass — who died in 1895 — as if he were still alive. And DeVos herself once said that historically black colleges and universities were pioneers of school choice, when in fact they were created because white schools wouldn’t accept black students.
There were other reactions, too, to the NAEP scores. In an email with the subject line “Alarming NAEP Results,” the nonprofit Center for Education Reform said the results “should startle America,” as if America has the capacity to be startled by anything new these days.
Sen. Lamar Alexander (R-Tenn.), chairman of the Senate Education Committee, said in a statement, “These results from the ‘Nation’s Report Card’ are sobering. They remind us that the worst scores for American high school students often are not in science and math, but in United States history. And, if our children do not learn United States history, they will not grow up learning what it means to be an American.”
The results are often misinterpreted, with NAEP’s proficiency levels considered interchangeable with grade-level assessment. They aren’t. NAEP’s three markers of progress are NAEP Basic, NAEP Proficient and NAEP Advanced — and the news release announcing the results made that clear: “The NAEP Proficient achievement level does not represent grade-level proficiency as determined by other assessment standards.” If you look for especially detailed explanations of the three levels on the NAEP website, you won’t find them.
There’s more: The NAEP website makes a point of saying that NAEP achievement levels should be seen as still being part of a “trial” and says the most recent review of them happened in 2016.
“The evaluation concluded that further evidence should be gathered to determine whether the NAEP achievement levels are reasonable, valid, and informative,” the website says. “Accordingly, the NCES commissioner determined that the trial status of the NAEP achievement levels should be maintained at this time.”
The Education Department did not respond to a question about whether there has been any movement on the trial status of the NAEP achievement levels. Neither did James Lynn Woodworth, the commissioner of the National Center for Education Statistics — which administers the NAEP as part of the Education Department — nor Peggy Carr, associate commissioner in the National Center for Education Statistics’ Assessments Division.
The news release from the National Center for Education Statistics announcing the new NAEP results says this: “The NAEP achievement levels are set by the National Assessment Governing Board, which sets policy for the NAEP program. The NAEP achievement levels are used on a trial basis and, therefore, should be interpreted with care to ensure a proper understanding of performance.”
In 2017, three education experts from Stanford University conducted an experiment to test whether NAEP actually does what it claims it can do — and what other standardized tests cannot: measure problem solving, critical thinking and other high-level skills. The results were devastating:
For example, in history, NAEP claims to test not only names and dates, but critical thinking — what it calls “Historical Analysis and Interpretation.” Such questions require students to “explain points of view,” “weigh and judge different views of the past,” and “develop sound generalizations and defend these generalizations with persuasive arguments.” In college, students demonstrate these skills by writing analytical essays in which they have to put facts into context. NAEP, however, claims it can measure such skills using traditional multiple-choice questions.
We wanted to test this claim. We administered a set of Historical Analysis and Interpretation questions from NAEP’s 2010 12th-grade exam to high school students who had passed the Advanced Placement (AP) exam in U.S. History (with a score of 3 or above). We tracked students’ thinking by having them verbalize their thoughts as they solved the questions.
What we learned shocked us.
In a study they published in the American Educational Research Journal, Sam Wineburg, Mark Smith and Joel Breakstone showed “that in 108 cases (27 students answering four different items), there was not a single instance in which students’ thinking resembled anything close to ‘Historical Analysis and Interpretation.’ Instead, drawing on canny test-taking strategies, students typically did an end run around historical content to arrive at their answers.”
The researchers did other tests as well, which are detailed in the published study.