One of the things that has influenced me most strongly to call for radical school reform has been the results of the National Assessment of Educational Progress (NAEP) examinations. These exams have been testing the achievement of our 9-, 13- and 17-year-olds in a number of basic areas over the past 20 years, and the results have been almost uniformly dismal.
According to NAEP results, no 17-year-olds who are still in school are illiterate or innumerate -- that is, all of them can read the words you would find on a cereal box or a billboard, and they can do simple arithmetic. But very few achieve what a reasonable person would call competence in reading, writing or computing.
For example, NAEP's 20-year overview, Crossroads in American Education, indicated that only 2.6 percent of 17-year-olds taking the test could write a good letter to a high school principal about why a rule should be changed. And when I say good, I'm talking about a straightforward presentation of a couple of simple points. Only 5 percent could grasp a paragraph as complicated as the kind you would find in a first-year college textbook. And only 6 percent could solve a multi-step math problem like this one: "Christine borrowed $850 for one year from Friendly Finance Company. If she paid 12 percent simple interest on the loan, what was the total amount she repaid?"
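In case you want to check that last problem yourself: with simple interest, one year's interest is just the principal times the rate, $850 × 0.12 = $102, so the total Christine repaid is $850 + $102 = $952. It takes only two steps -- a multiplication and an addition -- and still only 6 percent of our 17-year-olds could get there.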
These results are troubling not only because they suggest that few of our students reach high levels of competency by the time they graduate from high school but also because they compare very poorly with the results students in other industrialized countries must achieve even to get into college.
But as I've gone all over the country talking about how NAEP shows we are in the midst of an education crisis involving the great majority of our students, I've found that most people don't believe me. They are willing enough to believe that disadvantaged kids living in the inner cities or in poor rural areas are performing poorly, but they consider these kids unfortunate exceptions. The students in their own schools, they tell me, are doing fine. And they see no reason for making big changes.
Recently some New York friends questioned the validity of the NAEP results by bringing up the New York State Regents examinations. They pointed out that New York has offered Regents examinations in most academic high school subjects for many years. Copies of the exams are available, and it seems clear that anyone who can pass, for example, the Mathematics III Regents or the English III or IV would rank near the top on NAEP. The English examinations, in addition to multiple-choice and matching questions, require some well-written essays. The math exams present students with challenging problems that can take 10 or 15 minutes to solve. And, my friends told me, of the 37 percent of 11th graders who took the Math III Regents exam last year, 80 percent passed -- a pass rate that suggests achievement far above the NAEP figures.
Obviously, students who take Regents exams have spent a year mastering the material, and they know what to expect. They've practiced with questions from old tests, and they've taken entire tests under timed conditions and graded themselves so they can see their strengths and weaknesses. NAEP exams, on the other hand, have no direct connection with any course work, and students who take them are not forewarned or given a chance to practice. Nevertheless, the discrepancy between the Regents and NAEP results is troubling.
One of the most frequently offered theories about the low NAEP scores is that kids know the tests don't count. In the jargon of the testing business, they are low-stakes tests -- quite different from the Regents exams, which can affect a student's high school average and acceptance into college.
Of course, high-stakes testing can have some very bad effects. When everyone knows that a test is important, students may be tempted to cheat and teachers to teach just what's going to be on the test. But low-stakes testing may have serious disadvantages of its own. If students know that what they do on a test doesn't matter, they may decide it's not worth their while to put forth any effort. That could explain the low level of achievement we have seen on NAEP examinations.
NAEP is an important source of information about what U.S. students know and can do, so we ought to clear up this question about its validity. Why not conduct an experiment over a period of several years? Students taking NAEP exams would be divided into three groups. For one group, NAEP would go on in the same low-stakes way as now. A second group would be asked to put their names on their exam booklets and told that test results would be taken into account in their grades. And a third group would be offered some kind of prize or honor if they did well. Then, if all three groups performed about equally, we could continue to have great confidence in NAEP results. And we could also convince many people who are still in doubt that the situation in our schools is indeed disastrous and that we need radical change.