If you put a lab mouse on a diet, cutting the animal’s caloric intake by 30 to 40 percent, it will live, on average, about 30 percent longer. Calorie restriction, as the intervention is technically called, can’t be so extreme that the animal is malnourished, but it must be aggressive enough to trigger some key biological changes.
Scientists first discovered this phenomenon in the 1930s, and over the past 90 years it has been replicated in species ranging from worms to monkeys. The subsequent studies also found that many of the calorie-restricted animals were less likely to develop cancer and other chronic diseases related to aging.
But despite all the research on animals, there remain a lot of unknowns. Experts are still debating how it works, and whether it’s the number of calories consumed or the window of time in which they are eaten (also known as intermittent fasting) that matters more.
And it’s still frustratingly uncertain whether eating less can help people live longer as well. Aging experts are notorious for experimenting on themselves with different diet regimens, but actual longevity studies are scant and difficult to pull off because they take, well, a long time.
Here’s a look at what scientists have learned so far, mostly through seminal animal studies, and what they think it might mean for humans.
Why would cutting calories increase longevity?
Scientists don’t know exactly why eating less would cause an animal or person to live longer, but many hypotheses have an evolutionary bent. In the wild, animals experience periods of feast and famine, as did our human ancestors. Their (and conceivably our) biology therefore evolved to survive and thrive not only during seasons of abundance but also during seasons of deprivation.