On July 30, 1965, when President Lyndon B. Johnson signed the bill that created Medicare, he outlined an ethical vision for the nation’s obligations to its older citizens. “No longer will older Americans be denied the healing miracle of modern medicine,” he said. “No longer will illness crush and destroy the savings that they have carefully put away over a lifetime so that they might enjoy dignity in their later years. No longer will young families see their own incomes, and their own hopes, eaten away simply because they are carrying out their deep moral obligations to their parents, and their uncles, and their aunts.”

That was then. Remember that, in 1965, 19 million Americans 65 years and older were eligible for Medicare. Life expectancy for men was only 66.8 years and, for women, 73.7 years.

This is now. Almost 44 million people are 65 and older. Women who reach the age of 65 now have a life expectancy of 20.4 more years; for men, it’s 17.8 more. That’s a boon to the healthy and well-off, but a challenge for those who are sick and poor. Continue reading…

The last time I was directly responsible for treating diabetes was fifty years ago, when I was an intern in medicine at UCLA. In my subsequent career as a psychiatrist I was not directly responsible for diabetes care, and as an individual, I don’t have the condition. As a result, I haven’t kept up on diabetes treatment, so a June 11 article on “Diabetes Overtreatment in Elderly Individuals: Risky Business in Need of Better Management” was news to me.

The opening two sentences of the American Diabetes Association’s article on “Tight Diabetes Control” make it sound as if “tight control” should be the goal of treatment:

“Keeping your blood glucose levels as close to normal as possible can be a lifesaver. Tight control can prevent or slow the progress of many complications of diabetes, giving you extra years of healthy, active life.”

In my uninformed state, that’s how I understood how diabetes should be managed, even for over 65ers. But I was wrong. Continue reading…

The classic cartoon picture of someone with hearing loss is a bent, old man with a giant ear horn held to his ear: not a pretty picture. Things have come a long way. Modern hearing aids are highly sophisticated electronic devices, some so small that they can fit into the ear canal and be virtually invisible. Some have directional microphones that can help you sort out voices that you want to hear from background noise.

Yet, most people are reluctant to wear a hearing aid. Some even buy them and don’t use them. Why is that? Continue reading…

This winter, three good friends and four admired colleagues died. As my generation winks out, there is plenty of time at funerals to think about grief and comforting the brokenhearted. It isn’t easy.

When death cuts down a life intertwined with mine, I’m depleted. Waves of pain and powerlessness wash over me and weigh me down. Premature, violent deaths are the hardest to bear. After 9/11, I went to the funeral of a young man last seen helping his coworkers down a fiery staircase. Patrick was mourned by his pregnant wife, two toddlers, parents, four siblings, and a sorrowful church full of friends and neighbors.

Dr. Arnold (“Bud”) Relman died yesterday at 91. He was the most esteemed leader among those who have been dismayed by the commercialization, fragmentation, excessive cost, and relatively poor quality of the U.S. health “system.” In 1980, as editor of the New England Journal of Medicine, Bud sounded the alarm about a danger he fought against almost to the day of his death: “We should not allow the medical-industrial complex to distort our health care system to its own entrepreneurial ends…[Medicine must] serve patients first and stockholders second.”

Bud was a champion – perhaps the champion – on behalf of patient care values and ethical medical practice. But in this post I want to write about him as a model of aging for those of us who are over 65.

My contact with Bud was solely at meetings we attended together over the years. He was 16 years older than I, and as I joined him in the over 65 set, I increasingly admired him for his passionate commitment to his core values and the generative way in which he tried to support those who held the same commitments. I especially admired the way in which he could disagree with others in a respectful, friendly, humorous manner. To my eye he was a master of constructive debate and collaborative conflict.

I didn’t know Bud well enough to ask him the questions I posed to my beloved late father-in-law, who died at 91 sixteen years ago. I marveled at my father-in-law’s zest for learning about the fields his grandchildren were working in as journalist, psychologist, teacher, and environmental activist. When I asked him about the basis for his enthusiasm for new learning he seemed puzzled – “What else is there to be interested in but the future?”

I had a similar exchange with the grandmother of one of my daughters-in-law when she was 99. We discovered that we both wished we could return to earth in 500 years – not out of a wish for reincarnation but out of curiosity about how our species and the planet would evolve.

Some years ago when I was talking with a friend about my belief that a substantial number of over 65ers are worried about the impact of runaway Medicare costs on future generations, he responded with an aphorism I’ve treasured ever since: “The true meaning of life is to plant trees under whose shade you do not expect to sit.”

Bud Relman embodied this attitude for me. He didn’t expect to sit in the shade of a clinician-led, patient-oriented single payer system, but he worked tirelessly on behalf of that vision. I understood Bud to be following a “moral faith” that had the force of “religious faith.” Bud was a teacher to the end of his life, via both the content of his ideas and the example of his person.

[The aphorism comes from the title of a book that Wes Henderson (1928-2003), a third-generation Canadian, wrote about his father, Nelson. It’s the advice Nelson gave Wes when Wes graduated from high school. For an extensive obituary for Dr. Relman, see here.]

Jim Sabin, M.D., 75, is an organizer of Over 65, a professor of population medicine and psychiatry at Harvard Medical School, and a Fellow of the Hastings Center.

Disclaimer: This blog is not an endorsement for Hillary Clinton. Far from it. We don’t even know if she is running for President yet. But we do know that she is projected to be a grandmother. Who knows? Even if she does run, she may run against someone who is also a grandmother or grandfather. Does that matter?

The last President of the United States to be a grandparent in office was George H.W. Bush. Probably there were many more in history, though of course they were all grandfathers. One point of view is that we don’t know more about that because it is irrelevant, as Rebecca Traister wrote for the New Republic in “How to Be Less Stupid About Hillary Clinton’s Future Grandchild”. She emphatically concludes:

“ . . . no one in the history of presidents has ever cared about whether or not they have grandchildren or ever will have grandchildren because it is truly one of the dumbest things to care about in the universe.”

As someone who has spent many years writing about end-of-life care, I have long been intrigued by one question that is rarely posed in that context. When is a good, or tolerable, time to die? I do not mean when one is in pain or suffering, which is the way that question usually comes up. Like most others, I don’t want useless and painful care or needless suffering. My question is more speculative: even if one is in good health, medically and physically, when might one consider that one’s life has been sufficiently long? By “sufficiently” here I mean when death would not be judged an evil in my eyes or that of others. Continue reading…

A recent survey of 1200 adults found that Alzheimer’s disease (AD) is the most feared disease among older Americans, with 44% identifying it as their biggest worry. Cancer was second, with 33% fearing it most. Over 65ers will not be surprised by these findings.

Given the high level of fear, it’s natural and inevitable that many people will want to know their risk for developing AD. The best known risk factor for development of AD is the apolipoprotein E (APOE) gene, which is found on chromosome 19. APOE contains the instructions for making a protein that helps carry cholesterol and other types of fat in the bloodstream. One form (allele) of APOE – APOE4 – is associated with increased risk of AD in the over 65 phase of life. Depending in part on whether it is inherited from one or both parents, APOE4 increases the risk for developing AD fourfold to twelvefold. APOE4 is a statistical predictor, not a fate. Having APOE4 does not mean that a person will develop AD, and not having it does not mean they won’t. (A National Institute of Aging fact sheet on the genetics of AD can be found here.)

A recent research report in the American Journal of Psychiatry found that older adults who knew they were positive for the APOE4 gene had more subjective distress about their memory and performed worse on objective tests of memory than a comparable group of adults who had the APOE4 gene but did not know it. The researchers concluded that knowledge of APOE4 status “could have a serious clinical impact by increasing the likelihood of false positive diagnosis of dementia or mild cognitive impairment.” In other words, the test is potentially harmful to mental function. Knowing one’s risk for developing AD made one appear – in self-perception and in external function – more demented! Continue reading…

Time is measured in many ways. For Jane, the big turkey and overflowing trays of Thanksgiving fixings served up on her long dining room table each year were the indication of how much her family had grown. But this year, her daughter and daughter-in-law handled the side dishes and the turkey had to be delivered, because Jane no longer drives. Still, Jane – a real story, but not her real name – celebrated her holidays at home, and that says everything about how eldercare has changed.

According to research by AARP, nearly 90% of seniors want to stay in their own homes as they age. That’s a lot of us who want to “age in place” – 1 in 5 Americans (more than 72 million people) will be over 65 by 2030. But, as Jane’s story illustrates, this preference is not necessarily the course of action first pursued by families and doctors. Continue reading…

When I graduated from Yale in 1952, I began getting the alumni magazine. Our class was the latest in a long list of class notes, the new kids on the block. Most impressive were those at the beginning of the chronological listing. I was amazed by that long list of predecessors, graduates in the 1890s and early 1900s, and born in the 1870s and 1880s. Many had lived in the legendary Dink Stover era, when the males (that’s all there were then) smoked pipes, dressed as collegiate dandies, and wandered about the campus arm-in-arm singing “to the tables down at Mory’s…” and the civil war song “Aura Lee.”

At the front of the chronological list now are “the greatest generation,” rapidly dying. Invariably, mention is made of their service in World War II. Hardly any did not serve. I first met that generation, by then veterans, as a freshman in 1948. A few were in my classes, but most were juniors and seniors, 4-5 years older than I was—and a generation light years ahead in terms of life and experience. One of them was William F. Buckley, the big man on campus those days. Another, an heir to the New York Herald Tribune family, Ogden Reid, came to class in an immaculately tailored blue suit with a subdued rep tie. I was awed.