By Dan Nosowitz
Eating a local diet—restricting your sources of food to those within, say, 100 miles—seems enviable but near impossible to many, thanks to lack of availability, lack of farmland, and sometimes short growing seasons. Now, a study from the University of California, Merced, indicates that it might not be as far-fetched as it sounds. “Although we find that local food potential has declined over time, our results also demonstrate an unexpectedly large current potential for meeting as much as 90 percent of the national food demand,” write the study’s authors. Ninety percent! What?
Researchers J. Elliott Campbell and Andrew Zumkehr looked at every acre of active farmland in the U.S., regardless of what it’s used for, and imagined that instead of growing soybeans or corn for animal feed or syrup, it was used to grow vegetables. (Currently, only about 2 percent of American farmland is used to grow fruits or vegetables.) And not just any vegetables: They used the USDA’s recommendations to imagine that all of those acres of land were designed to feed people within 100 miles a balanced diet, supplying enough from each food group. Converting the real yields (say, an acre of hay or corn) to imaginary yields (tomatoes, legumes, greens) is tricky, but using existing yield data from farms, along with a helpful model created by a team at Cornell University, gave them a pretty realistic figure.
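The basic arithmetic behind such an estimate can be sketched in a few lines. This is a toy illustration, not the study's actual model, and every number in it (caloric yields, diet shares) is an invented placeholder rather than a figure from the paper:

```python
# Toy sketch: estimate how many people a tract of cropland could feed
# per year if replanted for a balanced diet. All yields and diet shares
# below are illustrative placeholders, not figures from the study.

KCAL_PER_PERSON_PER_DAY = 2000   # USDA-style reference intake
DAYS_PER_YEAR = 365

# Hypothetical annual caloric yield per acre for each food group
yield_kcal_per_acre = {
    "vegetables": 2.0e6,
    "legumes": 3.5e6,
    "grains": 7.0e6,
}

# Hypothetical fraction of acreage devoted to each food group
diet_shares = {"vegetables": 0.4, "legumes": 0.3, "grains": 0.3}

def people_fed(total_acres):
    """Estimate people fed per year by `total_acres` of converted cropland."""
    annual_kcal = sum(
        total_acres * share * yield_kcal_per_acre[group]
        for group, share in diet_shares.items()
    )
    return annual_kcal / (KCAL_PER_PERSON_PER_DAY * DAYS_PER_YEAR)

print(round(people_fed(10_000)))  # people fed by 10,000 hypothetical acres
```

The real calculation is far more involved, drawing on county-level yield data and the Cornell diet model, but the shape of it is the same: acres times caloric yield, divided by per-person demand.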
Still, the study involves quite a few major leaps of faith because it seeks not to demonstrate what is possible for a given American right now but to lay out a basic overview of the ability of local food to feed all Americans. It’s not just projecting yields for vegetables grown on land that is today dominated by corn and soy. The biggest leap of faith is perhaps an unexpected one and is surprisingly underreported: Why do we even want to adjust our food supply to be local in the first place?
“Local food is kind of largely rejected by a lot of scientists from earth and environmental fields because the greenhouse gas emissions from the transportation of food from the farm to the retailer is actually really small compared to all the other emissions,” said Campbell, an associate professor at UC Merced. (Zumkehr is one of his students; the two combined their research to attempt to answer this question.) We take it for granted that eating locally must provide a huge boost to our environmental bona fides, but if the only consideration is emissions from the trucks, trains, and planes that bring us food from elsewhere, we’re mistaken. Across our diet as a whole, transportation accounts for only about 10 percent of total emissions, hardly the biggest factor. The bulk of emissions comes from the farm itself, from the actual growing and production of the food.
So, Why Should You Care?

Campbell thinks there’s a distinct connection between eating locally and tackling those farm-based emissions. The elephant in the room, he said, is the move from an animal-based diet to a plant-based one. Environmental and food scientists trying to reduce emissions are focused much more intently on that switch than on local food, but Campbell sees the two as related, largely because those who eat locally also tend to eat a much higher concentration of plants. “You walk into a farmers market and into a grocery store, and it’s like two different worlds, you know?” he said. “A grocery store has some vegetables hidden off to the side, and at a farmers market it’s all about the vegetables. That’s not a trivial issue.”
To tie all of those new acres of vegetables imagined in the study to local consumers, each acre was assigned to a nearby city, with no overlaps. This is tricky, especially in dense megalopolises like the Northeast Corridor and Southern California; land in, say, northeastern Pennsylvania lies within 100 miles of both New York City and Philadelphia. “We added this optimization model that decided which units of land to allocate to which particular cities to maximize the total number of people in the U.S. who could be fed locally,” said Campbell.
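One way to picture the allocation problem: each parcel of land may sit within 100 miles of several cities but can be assigned to only one. The study solves this with a genuine optimization model; the greedy sketch below is a simplified stand-in, with made-up parcels and populations, just to show the shape of the problem:

```python
# Toy sketch of the land-to-city allocation problem. The study uses a
# real optimization model; this greedy version and all its data are
# illustrative only. Each parcel feeds exactly one in-range city.

parcels = {
    "parcel_A": {"capacity": 300_000, "cities": ["NYC", "Philadelphia"]},
    "parcel_B": {"capacity": 200_000, "cities": ["Philadelphia"]},
    "parcel_C": {"capacity": 500_000, "cities": ["NYC"]},
}
demand = {"NYC": 8_400_000, "Philadelphia": 1_600_000}  # hypothetical populations

def allocate(parcels, demand):
    """Greedily assign each parcel to the in-range city with the most unmet demand."""
    unmet = dict(demand)
    fed = {city: 0 for city in demand}
    for name, parcel in parcels.items():
        # choose the reachable city with the largest remaining unmet demand
        city = max(parcel["cities"], key=lambda c: unmet[c])
        served = min(parcel["capacity"], unmet[city])
        fed[city] += served
        unmet[city] -= served
    return fed

print(allocate(parcels, demand))
```

A greedy pass like this can leave people unfed that a true optimizer would reach; maximizing the national total is exactly why the researchers needed a proper optimization model rather than a rule of thumb.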
So that 90 percent number doesn’t mean that any given American can have 90 percent of his or her food needs met by local food, nor does it mean that 90 percent of all Americans will have all of their needs met by local food. Instead it’s a national average: In some parts of the country, people could have all of their needs met, but in, say, New York City, only about 30 percent of the people could have their food needs met by local food (assuming that we tear up all current crops and plant more smartly). Oddly enough, not all major cities have this problem. Chicago, for example, is a wonderland in terms of local food potential. “Chicago stands out. All the high-population cities seem to have lower potential, but Chicago has a lot of cropland around it,” said Campbell. Chicago’s advantage is partly because, unlike in the Northeast, Southern California, or even South Florida, it doesn’t have any major satellite cities nearby. But it’s also because there are a ton of farms within even 50 miles of Chicago, many more than in the Northeast, for instance.
Dense cities aren’t just difficult to feed because they’re dense; the Northeast also suffered a huge collapse in nearby farmland as farming moved to the Midwest in the 20th century. But that farmland, or a lot of it, anyway, could still be resuscitated and used to feed the cities. Campbell sees that as a possibility with a huge amount of potential. “If you put the farms close to the cities, it opens up new opportunities to basically recycle water and nutrients between the cities and farms instead of relying on things that might require fossil fuels,” he said. A robust urban composting program, for example, could supply nearby farms easily, reducing the reliance on fertilizers that maybe aren’t so good for the environment. (Cheap synthetic nitrogen fertilizers put a massive strain on the environment in about a dozen ways; using less of them can only help.)
“This is kind of the first attempt to quantify what the potential is, so we decided with the first number to just see what the upper limit is, the greatest possibility,” Campbell said. This isn’t a change that we could just put into effect with a few clever laws or behavioral changes; it would require an overhaul of the entire economic system and would probably cause the collapse of the world economy as we know it.
But that isn’t the point. The point is to have a baseline, an upper theoretical potential, of whether feeding the country locally is even possible. It certainly seems that it is. The next step, both for Campbell and Zumkehr and for the others who will inevitably riff on their work, is to refine this data. Right now it doesn’t include any climate data, for example: An acre of land in Michigan does not have the same growing season as an acre of land in California’s Central Valley. (Currently, the model takes an average of the annual production of each acre, but it doesn’t account for how to preserve the harvest so that it feeds people above the Mason-Dixon Line during the winter.) Another issue: Our food preferences now are significantly global, and there are lots of important and popular foods that can’t be grown in the U.S. at all (think coffee or chocolate).
It’s important to understand the limits of this study, but it would be equally foolish to disregard it. This is research that thoughtfully begins the conversation about legitimately feeding the country locally. It’s a conversation that’s going to get louder and more important in the years to come.
Author Dan Nosowitz is a freelance writer based in Brooklyn. He has written for Popular Science, The Awl, BuzzFeed, Modern Farmer, Gawker, Fast Company, and elsewhere.