Measuring wage inequality within and across U.S. metropolitan areas, 2003–13

This article shows that location, size, and occupational composition play important roles in determining the level of wage inequality within and across U.S. metropolitan areas. Larger areas, especially in the Northeast and on the West Coast, typically have greater wage inequality, while smaller areas, many of which are in the South and Midwest, have less inequality. Metropolitan areas with high concentrations of employment in higher paying occupations also tend to have greater inequality.

Rising wage inequality in recent years has brought increased focus on the disparity between the highest wage earners and the lowest wage earners. Less attention, however, has been paid to how wage inequality varies by location. By one measure, the ratio of the 90th wage percentile to the 10th wage percentile (sometimes called the "90–10" ratio), inequality increased by 7 percent in the United States between 2003 and 2013. But this increase varied widely by area. The 90–10 ratio increased by over 20 percent in Oakland, CA, and Corvallis, OR, for example, while it declined in several other metropolitan areas in the United States, including three areas in Florida. This article examines how wage inequality varies by metropolitan area and how average wages, occupational composition, geographic location, and the size of the area contribute to the variation in this inequality measure.

The data used in this article are from the Occupational Employment Statistics (OES) program. The OES program produces employment and wage estimates annually for more than 800 occupations. These data are available for the nation as a whole, for individual states, and for metropolitan and nonmetropolitan areas; national occupational employment and wages for specific industries are also available. The most recent data show that the 90th-percentile annual wage in the United States for all occupations combined was $88,330 in 2013, and the 10th-percentile wage was $18,190. In other words, the highest paid 10 percent of wage earners in the United States earned at least $88,330 per year, while the lowest paid 10 percent earned less than $18,190 per year. By this measure, the 90–10 ratio in the United States was 4.86 in 2013, compared with 4.54 in 2003, an increase of about 7 percent over that 10-year period.
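The ratio calculation is straightforward arithmetic; the sketch below reproduces it from the national figures cited above (the function and variable names are illustrative, not OES code):

```python
# 90-10 ratio sketch using the national OES figures cited in the text.
# (Function and variable names are illustrative, not part of any OES tool.)

def ratio_90_10(wage_90th, wage_10th):
    """Return the 90th-percentile wage divided by the 10th-percentile wage."""
    return wage_90th / wage_10th

ratio_2013 = ratio_90_10(88_330, 18_190)  # 2013 national annual wages
ratio_2003 = 4.54                         # 2003 national ratio, as reported above

print(round(ratio_2013, 2))                        # -> 4.86
print(round((ratio_2013 / ratio_2003 - 1) * 100))  # -> 7 (percent increase)
```

The same calculation, applied area by area, underlies the metropolitan comparisons in the rest of the article.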

Differences in the 90–10 ratio across time or geography can arise from several factors. They may result from differences in wages for the highest paid workers, the lowest paid workers, or both groups. Differences in the 90th and 10th wage percentiles may also reflect differences in wage levels within the same occupations, differences in the occupational makeup of an area, or some combination of the two. Finally, wage inequality may rise because the wages of the highest paid 10 percent of workers (those with wages at or above the 90th percentile) grow faster than those of the lowest paid 10 percent (those with wages below the 10th percentile). In fact, between 2003 and 2013, after adjusting for inflation, the 90th-percentile wage increased 4.6 percent while the 10th-percentile wage decreased 2.2 percent. Figure 1 shows the change in real annual wages at the national level for the 10th, 25th, 50th (median), 75th, and 90th percentiles over the 2003–13 period. As the figure shows, real annual wages generally increased for the highest paid workers and decreased for the lowest paid workers. Nominal wages at both percentiles increased, but the 90th-percentile wage grew faster than the 10th-percentile wage, a pattern that holds for most, but not all, metropolitan areas. The remainder of this article examines nominal wages by metropolitan area.