Wednesday, November 18, 2015

diet-driven microbial extinction event - the Western diet

Nautilus | Burgers and fries have nearly killed our ancestral microbiome
For the microbiologist Justin Sonnenburg, that career-defining moment—the discovery that changed the trajectory of his research, inspiring him to study how diet and native microbes shape our risk for disease—came from a village in the African hinterlands.
A group of Italian microbiologists had compared the intestinal microbes of young villagers in Burkina Faso with those of children in Florence, Italy. The villagers, who subsisted on a diet of mostly millet and sorghum, harbored far more microbial diversity than the Florentines, who ate a variant of the refined, Western diet. Where the Florentine microbial community was adapted to protein, fats, and simple sugars, the Burkina Faso microbiome was oriented toward degrading the complex plant carbohydrates we call fiber.
Scientists suspect our intestinal community of microbes, the human microbiota, calibrates our immune and metabolic function, and that its corruption or depletion can increase the risk of chronic diseases, ranging from asthma to obesity. One might think that if we coevolved with our microbes, they’d be more or less the same in healthy humans everywhere. But that’s not what the scientists observed.
“It was the most different human microbiota composition we’d ever seen,” Sonnenburg told me. To his mind it carried a profound message: The Western microbiome, the community of microbes scientists thought of as “normal” and “healthy,” the one they used as a baseline against which to compare “diseased” microbiomes, might be considerably different from the community that prevailed during most of human evolution.
And so Sonnenburg wondered: If the Burkina Faso microbiome represented a kind of ancestral state for humans—the Neolithic in particular, or subsistence farming—and if the transition between that state and modern Florence represented a voyage from an agriculturalist’s existence to 21st-century urban living, then where along the way had the Florentines lost all those microbes?
Earlier this year I visited Sonnenburg at Stanford University, where he has a lab. By then he thought he had part of the answer. He showed me, on his computer, the results of a multigenerational experiment dreamed up by his wife, Erica, also a microbiologist.
When the Burkina Faso study was published, in 2010, the question of what specific microbes improved human health remained maddeningly elusive, but evidence was beginning to suggest that diversity itself was important. So despite their relative material poverty, these villagers seemed wealthy in a way that science was just beginning to appreciate.
Where did that diversity come from? Humans can’t digest soluble fiber, so we enlist microbes to dismantle it for us, sopping up their metabolites. The Burkina Faso microbiota produced about twice as much of these fermentation by-products, called short-chain fatty acids, as the Florentine. That gave a strong indication that fiber, a raw material fermented only by microbes, was somehow boosting microbial diversity in the villagers.
Indeed, when Sonnenburg fed mice plenty of fiber, microbes that specialized in breaking it down bloomed, and the ecosystem became more diverse overall. When he fed mice a fiber-poor, sugary, Western-like diet, diversity plummeted. (Fiber-starved mice were also meaner and more difficult to handle.) But the losses weren’t permanent. Even after weeks on this junk food-like diet, an animal’s microbial diversity would mostly recover if it began consuming fiber again.
This was good news for Americans—our microbial communities might re-diversify if we just ate more whole grains and veggies. But it didn’t support the Sonnenburgs’ suspicion that the Western diet had triggered microbial extinctions. But then they saw what happened when pregnant mice went on the no-fiber diet: temporary depletions became permanent losses.
When we pass through the birth canal, we are slathered in our mother’s microbes, a kind of starter culture for our own community. In this case, though, pups born to mice on American-type diets—no fiber, lots of sugar—failed to acquire the full endowment of their mothers’ microbes. Entire groups of bacteria were lost during transmission. When Sonnenburg put these second-generation mice on a fiber-rich diet, their microbes failed to recover. The mice couldn’t regrow what they’d never inherited. And when these second-generation animals went on a fiberless diet in turn, their offspring inherited even fewer microbes. The microbial die-outs compounded across generations.
Many who study the microbiome suspect that we are experiencing an extinction spasm within us that parallels the extinction crisis gripping the planet. Numerous factors are implicated in these disappearances. Antibiotics, widely available after World War II, can work like napalm, indiscriminately flattening our internal ecosystems. Modern sanitary amenities, which began in the late 19th century, may limit sharing of disease- and health-promoting microbes alike. Today’s houses in today’s cities seal us away from many of the soil, plant, and animal microbes that rained down on us during our evolution, possibly limiting an important source of novelty.
But what the Sonnenburgs’ experiment suggests is that by failing to adequately nourish key microbes, the Western diet may also be starving them out of existence. They call this idea “starving the microbial self.” They suspect that these diet-driven extinctions may have fueled, at least in part, the recent rise of non-communicable diseases. The question they and many others are now asking is this: How did the microbiome of our ancestors look before it was altered by sanitation, antibiotics, and junk food? How did that primeval collection of human microbes work? And was it somehow healthier than the one we harbor today?
