
The Poverty Rate: America's Worst Statistical Indicator

Nicholas Eberstadt

On The Issues

March 2002

American Enterprise Institute for Public Policy Research

Due to fundamental defects in the method by which the U.S. poverty rate is calculated, it is incapable of offering an accurate assessment of living standards at any point in time, much less a dependable view of longer-term trends.

To the quantitatively inclined, the United States' federal statistical service stands as one of the wonders of the modern world. Yet despite all the extraordinarily useful and reliable information it produces, that same service regularly showcases a dreadful numerical embarrassment: the now-famous U.S. "poverty rate."

In theory, the poverty rate is meant to measure material deprivation within the American population. In practice, due to the fundamental defects of the method by which it is calculated, it is incapable of offering an accurate assessment of living standards at any point in time, much less a dependable view of longer-term trends.

In 1965, the poverty rate was devised to help fight the recently declared "War on Poverty" and was perforce fashioned out of readily available data: the Census Bureau's annual household income figures, some 1961 food budgets prepared by the U.S. Department of Agriculture, and an old USDA food consumption survey.

Relying on a 1955 finding that American families of three or more devoted about a third of their after-tax income to food (and assuming the poor paid virtually no taxes), an initial "poverty threshold" was created simply by tripling the cost of the USDA's "economy food plan" (its cheapest nutritionally adequate menu). Then, with additional "thresholds" to cover other household sizes, a national "poverty line" was drawn. Anyone with an annual reported income below that line was officially "in poverty." The rate was updated annually by adjusting the poverty thresholds for changes in the consumer price level.

This seemingly ingenious technique promised to identify a specified and absolute condition of want in America, and to measure that fixed condition over time. Unfortunately, the flawed index soon began spinning out numbers depicting a society with no recognizable correspondence to real-world America.
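The arithmetic is simple enough to spell out. Below is a minimal sketch of the threshold construction just described; the food-plan costs and price-index values are invented placeholders rather than official figures, and only the method itself (triple the cost of the economy food plan, then index by consumer prices) comes from the account above.

```python
# Illustrative sketch of the original poverty-threshold arithmetic.
# The food-plan costs and CPI values below are placeholders, not official
# figures; only the method (tripling, then CPI indexing) is from the text.

# Hypothetical annual cost of the USDA "economy food plan" by household
# size, in base-year dollars.
ECONOMY_FOOD_PLAN_COST = {2: 700.0, 3: 850.0, 4: 1050.0}

FOOD_SHARE = 1 / 3  # 1955 finding: food took about a third of after-tax income

def base_threshold(household_size: int) -> float:
    """Threshold = food-plan cost divided by the food share (i.e., tripled)."""
    return ECONOMY_FOOD_PLAN_COST[household_size] / FOOD_SHARE

def updated_threshold(household_size: int, cpi_base: float, cpi_now: float) -> float:
    """Annual update: scale the base-year threshold by consumer-price inflation."""
    return base_threshold(household_size) * (cpi_now / cpi_base)

def in_poverty(reported_income: float, household_size: int,
               cpi_base: float, cpi_now: float) -> bool:
    """A household is officially in poverty if reported income is below the line."""
    return reported_income < updated_threshold(household_size, cpi_base, cpi_now)

# Example with placeholder numbers: the threshold for a family of four is
# 3 * 1050 = 3150 base-year dollars, or 10,500 after a CPI rise from 30 to 100.
print(in_poverty(reported_income=9_000.0, household_size=4,
                 cpi_base=30.0, cpi_now=100.0))  # True: 9,000 < 10,500
```

Note that the whole apparatus rests on two inputs: a base-year food budget and a single food-share multiplier. Everything the index says thereafter is a mechanical consequence of those two choices.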

Figures for the years 1973 and 2000 make the point. According to the Census Bureau, per capita income jumped by almost 60 percent over that period. The Labor Department says unemployment was lower in 2000 than in 1973 (4.0 percent versus 4.9 percent). And the 2000 edition of the congressional "Green Book" reports that income-tested social spending nearly tripled between 1973 and 1998—leaping from $136 billion to $392 billion (in constant 1998 dollars).

Despite all that, the official poverty rate actually maintains that a higher proportion of Americans lived in poverty in 2000 than in 1973 (11.3 percent versus 11.1 percent). Indeed, to judge solely by the poverty rate, America's poverty situation has never again been as good as it was back in 1973.

Does anyone remember 1973—the Nixon-era recession year when U.S. per capita income was 35 percent lower than today? Surely only a fatally flawed poverty measure could yield such a result.
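Because the comparison turns on only a handful of numbers, a short arithmetic check makes the paradox explicit. Every figure below is taken from the paragraphs above; the constant-dollar conversion is already built into the cited spending totals.

```python
# All figures as cited above (Census Bureau, Labor Department, "Green Book").
income_growth = 0.60                       # per capita income, 1973 -> 2000
unemployment = {1973: 4.9, 2000: 4.0}      # percent
social_spending = {1973: 136, 1998: 392}   # income-tested, constant 1998 $ billions
poverty_rate = {1973: 11.1, 2000: 11.3}    # official rate, percent

print(f"Per capita income up {income_growth:.0%}")                    # 60%
print(f"Income-tested spending up "
      f"{social_spending[1998] / social_spending[1973]:.1f}x")        # 2.9x
print(f"Unemployment down {unemployment[1973] - unemployment[2000]:.1f} points")

# The paradox: every input improved, yet the official rate got worse.
print(poverty_rate[2000] > poverty_rate[1973])                        # True
```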

 

Measuring the Wrong Thing

Why does the poverty rate generate absurd findings? Over the years, economists and statisticians have raised many technical issues about the index: the inflation measure it uses, the undervaluation of homeownership, and the exclusion or underweighting of government welfare benefits. Important as they are, all those objections miss a larger point: The poverty rate measures the wrong thing.

"Material want" means inadequate consumption; ultimately, after all, consumption patterns define living standards. The U.S. poverty rate, however, is based exclusively on income data, which can only hint at actual consumption patterns. For lower-income people especially, income tends to be an unreliable predictor of true living standards.

The jarring mismatch between income and consumption is highlighted annually in the Bureau of Labor Statistics' Consumer Expenditure Survey. For the bottom fifth of households sampled, expenditures typically exceed income by more than 100 percent. In the latest survey, for every dollar of reported pretax income, the poorest fifth of American households reported spending $2.31!
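Those two figures are the same fact stated in different units, as a one-line check confirms; both numbers come straight from the survey result just cited.

```python
# $2.31 of spending per $1.00 of pretax income means expenditures
# exceed income by 131 percent, i.e., by "more than 100 percent."
ratio = 2.31
print(f"Expenditures exceed income by {ratio - 1:.0%}")  # 131%
```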

How can this be? Quite simply, because America is a land of tremendous income variability—and economic mobility. According to research by the current chairman of the Council of Economic Advisers, the churning is so constant that only a seventh of American households in the bottom annual income quintile will still be there ten years later. At any given time, most people on the lowest rungs of the American income ladder are having a bad year—and expect better times ahead.

Accordingly, instead of brutally rationing their intake of food, goods, and services against results from one bad year, they plan against the long run. They draw down assets or take out loans. If necessary, they borrow from family and friends—or get help from the government.
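A stylized simulation captures the smoothing behavior described above. All of the numbers in it are invented for illustration: a household whose income swings from year to year consumes against its long-run average, so in any bad year its measured spending exceeds its measured income, which is exactly the pattern the expenditure survey picks up.

```python
import random

# Invented numbers: a household earns a volatile income around a stable
# long-run ("permanent") level and smooths consumption against that level,
# drawing down assets or borrowing in lean years.
random.seed(0)

PERMANENT_INCOME = 40_000.0
YEARS = 10

for year in range(1, YEARS + 1):
    income = PERMANENT_INCOME * random.uniform(0.5, 1.5)  # good and bad years
    spending = PERMANENT_INCOME                           # smoothed consumption
    ratio = spending / income
    flag = "  <- bad year: spending tops measured income" if ratio > 1 else ""
    print(f"year {year:2d}: income ${income:9,.0f}, spending/income = {ratio:.2f}{flag}")
```

An income snapshot taken in one of the flagged years would classify this household as far worse off than its actual consumption, which never changes, would suggest.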

No wonder, then, that spending for the "poorest" always tops snapshot measures of reported income—or that annual income numbers badly misstate actual material circumstances in our land. If the Consumer Expenditure Surveys are to be trusted, the relationship between annual income and annual spending in America is considerably weaker today than it was back in the 1960s. Income, in other words, is an even poorer predictor of living standards now than in the past.

 

Taking Living Standards into Account

That might help explain another striking finding in our national statistics: Contrary to the assumptions of our poverty rate, the objective living standards of the population below the poverty line have been increasing steadily.

According to the Census Bureau, in 1970 almost 30 percent of African American households below the poverty level lacked some plumbing facilities (hot water, flush toilet, and/or shower-bathtub). By 1999 the corresponding figure was less than 3 percent. By the same token, between 1970 and 1999, the proportion of poverty-level black households with telephones and clothes dryers jumped by more than 30 percentage points. Between 1985 and 1999 alone, median per capita floor space for poor black households rose by about 25 percent. In 1999, nearly 36 percent of all "poverty-level" African American households had central air conditioning, well over twice the figure for America's white nonpoverty population in 1970.

It is reassuring to know that consumption levels and material living standards for America's most disadvantaged citizens have increased substantially over the past three decades. The same cannot be said of our reliance on an official "poverty index" that fails to recognize any such progress, much less track it with precision.

The original poverty rate calculations were an inventive effort to fashion an index of material want under real data constraints. Today—almost four decades later—no similar excuse for that index exists. The poverty rate misleads the public and our representatives, and it thereby degrades the quality of our social policies. It should be discarded for the broken tool that it is—and a poverty rate worthy of the name should be crafted anew in its place.

 

Nicholas Eberstadt is the Henry Wendt Scholar in Political Economy at AEI.