The urban-rural death divide is getting alarmingly wider for working-age Americans
In the 1960s and 1970s, people who lived in rural America fared a little better than their urban counterparts. The rate of deaths from all causes was a tad lower outside of metropolitan areas. In the 1980s, though, things evened out, and in the early 1990s, a gap emerged, with rural areas seeing higher death rates—and the gap has been growing ever since. By 1999, the gap was 6 percent. In 2019, just before the pandemic struck, the gap was over 20 percent.
While this news might not be surprising to anyone following mortality trends, a recent analysis by the Department of Agriculture’s Economic Research Service drilled down further, finding a yet more alarming chasm in the urban-rural divide. The report homed in on a key indicator of population health: natural-cause mortality (NCM) among prime working-age adults (people ages 25 to 54), measured as deaths per 100,000 residents from chronic and acute diseases, excluding external causes of death such as suicides, drug overdoses, violence, and accidents. On this metric, rural areas saw dramatically worsening trends compared with urban populations.
The federal researchers compared NCM rates of prime working-age adults in two three-year periods: 1999 to 2001 and 2017 to 2019. In the earlier period, the NCM rate among 25- to 54-year-olds in rural areas was 6 percent higher than the rate for the same age group in urban areas. By 2017 to 2019, the gap had grown to a whopping 43 percent. In fact, prime working-age adults in rural areas were the only age group in the US to see an increased NCM rate over this period; in urban areas, working-age adults’ NCM rate declined.
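For readers curious how such a relative gap is derived, here is a minimal sketch. The rates below are hypothetical placeholders, not the USDA report’s actual figures; they only illustrate the arithmetic of comparing two per-100,000 rates.

```python
# Hypothetical NCM rates (deaths per 100,000 residents), for illustration only;
# these are not the figures from the USDA Economic Research Service report.
def relative_gap(rural_rate: float, urban_rate: float) -> float:
    """Return how much higher the rural rate is than the urban rate, in percent."""
    return (rural_rate - urban_rate) / urban_rate * 100

# Example: a rural rate of 286 vs. an urban rate of 200 per 100,000
# works out to a gap of 43 percent.
print(f"{relative_gap(286, 200):.0f}% higher in rural areas")
```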