In most fields, statistics are put together in the same way over time, so people know what unemployment or trade-deficit figures mean and can rely on them. Not so in education. We spend $250 billion a year on education in the U.S., and it's important that we know what is going on. But people often lack confidence in the figures they get because statistics have so often been manipulated to make things look better than they are.

Some school districts with high dropout rates have been known to change the way they counted dropouts so that, even if the actual figures did not change, they could claim a big improvement. Other districts with poor test scores have altered the way they reported those scores to make them look better. That's one reason Congress created the National Assessment of Educational Progress (NAEP)--to give Americans information they could rely on about the performance of our students.

Since the late 1960s, NAEP has been giving examinations in subjects like mathematics, reading and science to samples of U.S. students and reporting the results of each exam in one or two thick volumes. A NAEP report offers much more than raw scores. In writing, for example, it classifies student writing as minimal, adequate or elaborated, and it describes what each of these levels involves. It also gives examples of the kinds of questions youngsters at various levels can answer--whether a student taking the math exam can figure what he'll owe in principal and simple interest on a loan, for instance, or whether a student taking the reading exam can comprehend an editorial from a good newspaper.

The full results of the 1992 NAEP math exam were not due to be released until next summer. Unfortunately, last week outgoing Secretary of Education Lamar Alexander jumped the gun. He rushed some bare-bones results into print, with none of the examples or accompanying information, and announced them at a well-publicized press conference. It's not as though the results had to be released right away. But they did contain a little good news: test scores rose slightly. For example, although the proportion of twelfth graders at the advanced level remained at only 2 percent, 5 percent more of them reached the basic and proficient levels than had in 1990. Probably Secretary Alexander wanted to tell the world, "This happened on my watch."

What's wrong with that? Several things. One is that releasing raw scores without the accompanying material runs the risk of turning these NAEP results into the meaningless kind of number juggling we usually associate with standardized test scores.

And while there's nothing wrong with expressing pleasure at even a slight improvement in test scores, why call for a celebration, as Secretary Alexander did, when the overall results are still a disaster? Thirty-six percent of those about to graduate from high school are below what the test calls basic competency--more than one in every three. Does that mean they can't add, subtract and divide? Perhaps, but we can't be sure, because Secretary Alexander's little book doesn't explain. And the picture is worse for minority students: 66 percent of African-American twelfth graders, 55 percent of Hispanics and 54 percent of Native Americans failed to achieve basic competency. Celebrating results like these is like fiddling while Rome burns because the firemen have one or two buildings under control.

There's another problem. In addition to knowing how well our kids perform in relation to our own levels of achievement, we must know how they compare with students in other industrialized countries. We adopted, as one of our education goals, being first in the world in math and science by the year 2000. Everybody knows that will not happen, but there is no way we will achieve this goal by 2005 or 2010--or even know how close we are--unless our standards bear some relationship to the standards in other countries.

Nobody likes to be the bearer of bad news. But if we are to believe what we hear about education from our leaders, they need to tell us the bad news as well as the good. Perhaps they could learn a lesson from the BBC.

During the early days of World War II, the BBC nightly news was full of details about how many miles the Allies had retreated that day, how many Allied soldiers had been killed, etc. Some listeners thought the BBC was crazy for reporting all that bad news, but when the tide eventually turned, people had confidence in BBC reports. If the BBC told the truth when times were bad, it could be trusted now. We need leaders in American education who can establish that kind of trust.