Why do people get old? History offers up some very weird theories.

The idea of “older people” as a stand-alone population, complete with its own distinct set of stereotypes in terms of behavior, appearance and mind-set, seems like something that has been around forever. But in reality, our current narrative of aging — the story we tell ourselves about the normal progression of life, including its second half — is a relatively recent invention.

Growing old was once something experienced on an individual basis: not at a set age for everyone and not according to a single set of rules. However, in the second half of the 19th century, a more monolithic idea of “the aged” began to take form, shaped in great part by a medical theory that has long since been debunked.

During this period, doctors believed that old age occurred when the body ran out of “vital energy” — which was no mere metaphor. The stuff was thought to be tangible, literally present in the body and its fluids. Everyone had a finite reservoir of vital energy that gradually became depleted over a lifetime. When you began to run low on vitality, you were old; death followed when the tank was empty.

For the era’s doctors, the concept conveniently solved the mystery of why illness seemed far more curable in the young than the old. Physicians supposed that the loss of vitality created a “predisposing debility,” as one historian has put it, making the older body vulnerable to a host of secondary maladies. The theory also fit with American religious thought as influenced by the Second Great Awakening, which peaked in the 1830s. The amount of vitality you were endowed with at birth was simply your lot. Whether you used it well or squandered it, however, was your personal responsibility.

And which activities spent vitality most profligately, leading to premature disability and death? All the fun ones, of course. The specifics varied depending on which expert you consulted. “Some endorsed the use of wines; others demanded abstinence; still others debated the merits of vegetables or red meat,” writes historian Carole Haber. Regardless of whom you asked, moderation was always the key: “If death resulted from an exhausted supply of energy, then the goal was to retain it at all cost . . . by eating the correct foods, wearing the proper clothes, and performing (or refraining from) certain activities.”

Sex — specifically, sex of non-procreative or self-pleasing varieties — was to be avoided at all costs. For men in particular, it was evident when one’s vitality was on the wane: Things stopped working as they once had in the marital bed. Doctors inevitably told these poor fellows that it was all their fault. Personal indiscretions — whether of recent vintage or way back in semi-forgotten youth — had added up.

In continental Europe in the 1850s and 1860s, vitality theory began to wane as French and German pathologists realized that the lesions, fibrous tissue and calcium deposits they discovered in older people’s cadavers could provide an explanation for some of the complaints of old age. But in the United States and Britain, many of those aware of these continental findings simply doubled down on their existing beliefs: Any wasting observed in cadavers was simply due to the loss of vital energy.

Perhaps the best evidence for that point of view was the moment in a patient’s life when vitality began to appreciably decline, which English-speaking physicians named the “climacteric period,” or “climacteric disease.” In women, the climacteric period was believed to begin between ages 45 and 55 and was associated with menopause; in men, it took place between 50 and 75 and was indicated by such signs as wrinkles, white hair and complaints of feebleness.

In some cases, this “extraordinary decline of corporeal powers,” as one physician termed it, according to the book “Beyond Sixty-Five: The Dilemma of Old Age in America’s Past,” seemed to progress rapidly, even violently — surely the result of vitality levels reaching dangerous lows. If you were in the right age bracket and betrayed any signs of the climacteric, the implication was clear: You needed to immediately drop whatever you were doing to conserve what was left of your energy — avoiding “excesses and undue exertions,” as one doctor wrote in 1853. And unfortunately, if you were a living, breathing human, you probably exhibited several of the warning signs. “Headache, vertigo, faintness, ‘heat flushes,’ emotional waves, phases of moral perversity, irritability, querulous impatience, even intellectual disturbance (especially of memory and of attention) prevail,” a doctor catalogued in 1899.

Insanity was the most troubling possible climacteric diagnosis, in part because it was loosely defined. Earlier, at the start of the 19th century, physicians had attempted to differentiate abnormal dementia from the more standard, benign cognitive effects of aging, but by century’s end, most agreed (incorrectly) that almost all older adults experienced the same sort of insanity, differing only in degree.

As oldness became synonymous with the loss of mental flexibility, joie de vivre and self-control, the centuries-old vision of the aged as fonts of wisdom went out the window. One writer blamed the Crédit Mobilier scandal — a massive case of railroad graft that rattled American politics in 1872 — on the advanced age of the participants.

Even Sigmund Freud, who was otherwise busily upending all conventional wisdom about the human mind, insisted in 1904 that “old people are no longer educable” and that those “near or over the age of fifty lack . . . the plasticity of the psychic processes upon which the therapy depends.” Tellingly, he was 48 years old at the time.

By the dawn of the 20th century, once you’d become visibly old, no matter your apparent health, no matter how sharp your mind seemed, all you could hope to do was withdraw and rest, saving your vitality for that final sprint. Crucially, you could no longer work; old age now changed you from an economic producer to a consumer, from healthy-as-a-default to a patient-despite-health.

Eventually, insights from the medical field of pathology discredited vital-energy theory, but only after it had molded the development of a long-lasting set of social, cultural and economic institutions. The first dedicated old-age homes, the rise of public and private pensions, the normalization of retirement both as something bad your boss could do to you and as a new stage of life — these all marinated in vital-energy theory for decades before emerging fully baked into the 20th century, complete with implications for what it meant to be an “older person.”

Perhaps the most inaccurate aspect of the resulting notion of oldness, one we still hold onto, is its specificity. “The old” make up a population so diverse that it almost defies characterization. Depending on when you decide old age begins, the group can include people found anywhere along a 50-plus-year span of life, with every imaginable level of physiological health, cognitive ability and wealth represented, along with every type of personality; ideologies of every stripe; and every race, nationality, creed, gender and sexual identity to be found on this blue Earth. The notion that there exists one single state of older being defies all logic.

Today, populations around the world are aging rapidly, and the need to jettison misconceptions from history is becoming urgent. It’s long past time to build a better old age: one that is less arbitrary and more intentional than our past.

This is an excerpt from Coughlin’s book “The Longevity Economy: Unlocking the World’s Fastest-Growing, Most Misunderstood Market,” published by PublicAffairs Books. The excerpt was adapted for The Washington Post by Yoquinto.