Researchers J. Elliott Campbell and Andrew Zumkehr looked at every acre of active farmland in the U.S., regardless of what it's currently used for, and imagined that instead of growing soybeans or corn for animal feed or corn syrup, it was used to grow vegetables. (Currently, only about 2 percent of American farmland is used to grow fruits or vegetables.) And not just any vegetables: They used the USDA's dietary recommendations to imagine that all of those acres were planted to feed everyone within 100 miles a balanced diet, supplying enough from each food group. Converting real yields (say, an acre of hay or corn) to imaginary ones (tomatoes, legumes, greens) is tricky, but existing yield data from farms, along with a model created by a team at Cornell University, gave them a pretty realistic figure.

The researchers assigned each active acre of farmland in the country to a nearby city. On average, they found, about 90 percent of Americans could rely on an entirely locally grown diet.
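The allocation idea can be sketched roughly like this. Everything here is illustrative, not the authors' actual model: the acres-per-person figure, the city and parcel data, and the simple nearest-city rule are all assumptions standing in for the study's USDA diet guidelines and Cornell yield model.

```python
from math import radians, sin, cos, asin, sqrt

# Assumed cropland needed per person for a balanced diet (illustrative,
# not the study's number).
ACRES_PER_PERSON = 0.5

def miles_between(a, b):
    """Great-circle (haversine) distance in miles between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(h))

def local_food_potential(cities, parcels, radius=100):
    """Assign each cropland parcel to the nearest city within `radius` miles,
    then report what fraction of each city's population that acreage could
    feed (capped at 1.0)."""
    acres = {name: 0.0 for name in cities}
    for loc, parcel_acres in parcels:
        dists = [(miles_between(loc, c_loc), name)
                 for name, (c_loc, _pop) in cities.items()]
        close = [d for d in dists if d[0] <= radius]
        if close:
            _, nearest = min(close)
            acres[nearest] += parcel_acres
    return {name: min(1.0, acres[name] / (pop * ACRES_PER_PERSON))
            for name, (_loc, pop) in cities.items()}

# Toy data: a big city with some farmland nearby, and a small town
# surrounded by more acreage than it needs.
cities = {
    "Bigtown":    ((41.9, -87.6), 1_000_000),
    "Smallville": ((40.0, -89.0), 50_000),
}
parcels = [
    ((41.5, -87.8), 300_000),  # within 100 miles of Bigtown only
    ((40.1, -88.9), 40_000),   # within 100 miles of Smallville only
]
result = local_food_potential(cities, parcels)
```

Under these toy numbers the big city's nearby acreage covers only 60 percent of its needs while the small town is fully fed, which mirrors the study's pattern: the binding constraint is how much cropland sits inside the radius relative to population.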

[I]t’s a national average: In some parts of the country, people could have all of their needs met, but in, say, New York City, only about 30 percent of the people could have their food needs met by local food (assuming that we tear up all current crops and plant more smartly). Oddly enough, not all major cities have this problem. Chicago, for example, is a wonderland in terms of local food potential. “Chicago stands out. All the high-population cities seem to have lower potential, but Chicago has a lot of cropland around it,” said Campbell. Chicago’s advantage is partly that, unlike cities in the Northeast, Southern California, or even South Florida, it has no major satellite cities nearby competing for the same farmland. But it’s also that there are a ton of farms within even 50 miles of Chicago, many more than in the Northeast, for instance.

The study isn’t perfect, but it is a good start. Nosowitz explains:

Right now it doesn’t include any climate data, for example: An acre of land in Michigan does not have the same growing season as an acre of land in California’s Central Valley. (Currently, the model takes an average of the annual production of each acre, but it doesn’t account for storing or preserving the harvest so that it can feed people above the Mason-Dixon Line through the winter.) Another issue: Our food preferences now are significantly global, and there are lots of important and popular foods that can’t be grown in the U.S. at all (think coffee or chocolate).

Still, imagine the potential! It’s a bit naive to imagine an entirely localized American food system; we’d have to completely upend U.S. ag policy, for one thing. And the global economic effects of converting our country’s farm acreage to a locally grown food system would be huge: In 2010, the U.S. produced 32 percent of the world’s corn supply on 84 million acres of farmland. But at least this study shows that, theoretically, it could be possible for America to eat local, and just how much of a runaway train our subsidized, Big Ag mess of a food system has become.