Contents
Cover
About the Book
About the Author
Title Page
Dedication
Introduction
1. Facing, or Fearing, Aging
2. “I have, so to say, lost myself”
3. Has Alzheimer’s Always Been with Us?
4. The Case of Jonathan Swift
5. The Biology of Aging
6. A Natural Life
7. The Aging Brain
8. Plaques and Tangles
9. “I only retire at night”
10. A Deadly Progression
11. The Brain Fights Back
12. Is the Epidemic Slowing?
13. Am I going to get it? And if so, when?
14. Treatment: Candidates But No Champions
15. Men, Women and Alzheimer’s
16. Was It Really the Aluminum?
17. The Many Faces of Dementia
18. Where You Live, What You Eat
19. What’s Next?
Notes
Index
Acknowledgements
Copyright
This ebook is copyright material and must not be copied, reproduced, transferred, distributed, leased, licensed or publicly performed or used in any way except as specifically permitted in writing by the publishers, as allowed under the terms and conditions under which it was purchased or as strictly permitted by applicable copyright law. Any unauthorized distribution or use of this text may be a direct infringement of the author’s and publisher’s rights and those responsible may be liable in law accordingly.
Epub ISBN: 9781473529656
Version 1.0
1 3 5 7 9 10 8 6 4 2
Rider, an imprint of Ebury Publishing
20 Vauxhall Bridge Road,
London SW1V 2SA
Rider is part of the Penguin Random House group of companies whose addresses can be found at global.penguinrandomhouse.com
Copyright © Jay Ingram 2014
Jay Ingram has asserted his right to be identified as the author of this Work in accordance with the Copyright, Designs and Patents Act 1988
First published in Canada by HarperCollins Publishers Ltd in 2014
First published in the UK by Rider in 2016
www.penguin.co.uk
A CIP catalogue record for this book is available from the British Library
ISBN 9781846045066
To my father, Ralph Ingram,
for demonstrating dignity, patience and love
in caring for my mother through her years of dementia
I FORGET EXACTLY what I was looking for when I came across an editorial in the journal Neurology titled, “Mom and Me.” It referenced a really cool piece of research showing that people whose mothers had had Alzheimer’s disease could exhibit the same disruptions of brain metabolism as patients with Alzheimer’s and yet be cognitively intact. That is, in the absence of symptoms of any kind, their brains nonetheless seemed to be on the way to Alzheimer’s. At the same time, the brains of children of fathers with Alzheimer’s were ticking along just fine. This convinced the researchers that because the mitochondria (the so-called power plants of the cell) are inherited through the maternal line, Alzheimer’s is a disease of the mitochondria. It was good science, but it was more than that.
Those data hit home for me because at the end of her life, my mother was bedridden and unaware. She was said to have Alzheimer’s, but since no autopsy was done, that was pretty much a guess. A guess with the odds on its side, true, but not a diagnosis. Not that it would really have mattered. With the exception of some drugs that slow the process for a year, more or less, there is no way of delaying the cognitive decline that is dementia. And now it seemed as if I was running the risk of getting the same thing.
I’ve had what I think is a pretty typical exposure to Alzheimer’s. I learned most from helping take care of my favourite aunt as she dwindled away. My aunt, my mother’s sister, visited all the familiar checkpoints: forgetting to eat, not being able to use the pill minder because she didn’t know today was Tuesday, wandering when first moved into a home because we had waited too long to move her. But she would have protested loudly—refused actually—if we had tried to move her any earlier. Eventually, even her good humour deserted her. My father-in-law declined a little faster but in roughly the same way.
But I didn’t write this book because I’ve had family members die of dementia (likely Alzheimer’s). Most people have had some sort of experience with the disease—and many have been eloquent about the experience, some of them giving first-person accounts of what it’s actually like, others telling their stories from a caregiver’s or family member’s point of view. As I began to think more about Alzheimer’s, I wanted an anatomy of the disease, a natural history. Not a guide to caregiving or diet recommendations or a description of an individual’s experience. But a scientific account: Where does it come from? What causes it? Is it a natural part of aging? How are we trying to combat it?
The science of Alzheimer’s disease is complex and extremely challenging. As fascinating as any medical mystery, it is unique among them. The emotionally draining personal experience of the disease and the very real threat to health care systems as the numbers of Alzheimer’s patients worldwide accelerate have combined to place enormous pressure on Alzheimer’s science to come up with a treatment. Alzheimer’s is, after all, “the plague of the twenty-first century.”
But, of course, it isn’t just of the twenty-first century. Long before Alois Alzheimer’s name was attached to the disease, the medical world was aware of dementia and described it in terms that are immediately familiar to us. It was first called “Alzheimer’s disease” about a hundred years ago, and while that coinage created a brief flurry of interest at the time, it wasn’t until the mid-1970s that it became recognized as a disease, rather than a common companion of old age. Since that time, we’ve been living in a nearly unprecedented era of concentration on a single illness. Or at least it felt that way to me until I saw the stats. In the U.S., the National Institutes of Health (NIH) spends over $6 billion a year on cancer research, $4 billion on heart disease and $3 billion on HIV/AIDS. Alzheimer’s? Just $480 million—nothing compared to the cost of care, which is rising inexorably.1
So where do we stand? What exactly is the nature of the beast? That’s what this book is about.
The first chapter takes a step back to give a sense of how people have thought about aging and death in the past. Sinning was a recurrent theme; dementia was the punishment. On the other hand, your honest striving for salvation could earn you a long and vibrant life and a virtual lock on a place in heaven. Today, we dread Alzheimer’s, but in the past, it was usually considered to be a normal part of aging for some. Chapters 2 and 3 tell the story of the beginnings of Alzheimer’s disease, from Auguste Deter, the first patient, through some quiet decades until the 1970s, when Alzheimer’s was finally recognized as a worldwide threat.
Chapter 4 takes an extremely rare look at dementia from both the outside and the inside; in both cases, the “patient” is Jonathan Swift. In Chapter 5, you’ll meet, likely for the first time, Abraham Trembley, a genius scientist who grabbed everyone’s attention with his “immortal” hydra and helped kick-start studies of the biology of aging. Then, in Chapter 6, a closer look at one amazing phenomenon, the inexorable increase in life expectancy over the last 175 years, and James Fries’ theory of the “compression of morbidity.”
Chapters 7 and 8 are paired. The first is a look at what happens as the brain ages naturally—not with dementia, just healthy aging (assuming these processes are actually different). In Chapter 8, we return to Alzheimer’s lab and look over his shoulder to see what he saw in Auguste Deter’s brain—the crucial differences that set it apart from healthy aging brains, the differences that are still the basis of an Alzheimer’s diagnosis (called “plaques” and “tangles”). They dominated Alzheimer’s description of what he saw on his slides, and they dominate thinking about the disease today.
I introduce the Nun Study in Chapter 9, partly because it is one of my favourites (a brilliant idea for a long-term study) and partly because its results underline the fact that Alzheimer’s is a complex disease. What might look simple at first glance (if there are plaques in the brain, there is Alzheimer’s) turns out not to be (many completely healthy, whip-smart elders have brains absolutely ridden with plaques). The most astonishing result from the Nun Study is that essays written by young novitiates in their early twenties predicted with surprising accuracy who among them would get Alzheimer’s sixty years later. The Nun Study has added enormously to our understanding of the disease.
The significance of the Nun Study is made crystal clear by contrasting the apparently inexorable spread of Alzheimer’s in the brain, as I do in Chapter 10, with the resistance to damage that has been labelled “brain reserve” (described in Chapter 11). Much is known about where neurons begin to break down in the brain and how plaques and tangles apparently conspire to spread in all directions from those initial sites. But brain autopsies conducted as part of the Nun Study revealed that many cognitively intact nuns had plaques and tangles in their brains when they died. This observation, combined with those from other studies, led to the brain reserve concept—a mysterious something that protects some individuals from dementia. It turns out there’s a long list of factors that might be part of brain reserve, a list that will likely grow over time. Education is one of the most important, and it is thought to be a key player in the small number of studies now emerging which suggest that in some places, especially Europe, the incidence of dementia might be decreasing. This is the subject of Chapter 12; it is a surprising and important, though early, trend that bears watching.
With the shadow of Alzheimer’s over all of us, we want to know our chances of getting it and what sorts of treatments will be available if we do. That is the subject of Chapter 13. In a way, the study of the genetics of Alzheimer’s is in its infancy, but it’s already clear that some genes are important for both early- and late-onset Alzheimer’s, and more are being announced all the time. In the long run, the hope is that some of these will lead to preventive treatments. Sadly, at the moment, none of those exist. However, as I describe in Chapter 14, there is a rush of clinical trials underway at the moment, most of which—so far anyway—have come up empty. But to an optimist, every failed trial at least yields information you didn’t have before. That’s the way it works.
Chapters 15 through 18 depart somewhat from the mainstream by taking four individual features of the disease and exploring each one. There are two female Alzheimer’s patients for every male, and while superior female longevity accounts for much of this, other, not-well-characterized influences are in the picture, connected, perhaps, to differences in the male and female brains. One of them might be the female hormone estrogen, but again, its role isn’t perfectly clear. Once thought to be the key to maintaining cognitive health through menopause and beyond, estrogen is now generally believed by experts to have much more limited benefits.
Many will remember the aluminum scare, the idea popular in the 1980s and 1990s that aluminum was the cause of Alzheimer’s and all aluminum kitchenware should be thrown out immediately. A substantial body of science lay behind the initial worries about aluminum, but inconsistent research results eventually turned most scientists away. All the same, that episode is a lovely example of how a scientific idea can rise—and then fall.
Alzheimer’s is by no means the only form of dementia, although it probably represents 75 per cent of the total. The majority are subtle variations on a theme, with rogue proteins (eerily similar to the infectious prions of mad cow disease) accumulating in different parts of the brain and playing a central role in the damage done there. But the most puzzling—and therefore the most scientifically attractive—is the mysterious dementia found on the island of Guam. Is it caused by dietary toxins? Are those toxins delivered in extreme doses by the consumption of bats? This, too, is an area of ongoing research.
Chapter 18 focuses on where you live and what you eat. It is simply impossible, and would be very tedious, to review the support for every single claim on behalf of a “dementia-protective” food. There are just too many, and the evidence is scattered and scant. But two are worth examining: one is the assertion that turmeric is responsible for the extremely low rates of Alzheimer’s found in India; the other, the incontrovertible evidence that large amounts of sugar in the diet are not good. Today, this evidence is considered so central to the problem of Alzheimer’s that some are calling the disease type 3 diabetes.
I’ve tried to describe the state of our knowledge of Alzheimer’s as broadly and evenly as I could. I’m sure there will be researchers who won’t like my emphasis on this or that or the fact that I didn’t write about their approach to the disease. I’m also confident that those who advocate a particular set of vitamin supplements or dietary guidelines will find my omission of them scandalous. My hope is that when you read this book, you’ll gain a much broader and deeper understanding of this disease, which preoccupies us so much. Knowing more might even help in that most difficult of tasks: caring for those who are struggling with the affliction.
And by the way, shortly after I read “Mom and Me,” I was fortunate enough to come across another article, this one titled “Exceptional Parental Longevity Associated with Lower Risk of Alzheimer’s Disease and Memory Decline.”2 Well, that was a pick-me-up! My mother, demented as she was, lived to ninety-four and my dad, a month shy of ninety-eight. In this study, anything over the age of eighty-five was considered exceptional longevity, putting me in a somewhat safer zone. But it would be simplistic to think that my risk of becoming demented is slightly higher because of my mother’s dementia and lower because of my parents’ longevity, because those are only two risk factors out of dozens, if not hundreds. My experience illustrates a point: in the twenty-first century, we face aging in a way that has never happened before—one eye on the clock and the other on Alzheimer’s.
YOU CANNOT THINK about aging today without the shadow of Alzheimer’s disease intruding. Many of us—most of us—are afraid of it: James Watson, co-discoverer of the structure of DNA, had a set of Alzheimer’s gene locations redacted from his genome because he didn’t want to know whether he was prone to the disease. He was seventy-nine at the time.
The amounts of money being spent on combating the disease or caring for patients suffering from it are already astronomical, but overwhelming increases threaten our future. Even with extensive global research going on, Alzheimer’s is still mysterious and complex. So dependable treatment, let alone a cure, might be distant.
These are things we already know, and this is what aging in the twenty-first century is all about, particularly in the Western world. But in the “time before Alzheimer’s”—before dementia became an issue for all of us—what people thought about aging was very different. Beginning hundreds of years ago, thoughts and proclamations about sin, vitality, God’s will and the stages of life all fought for the public’s attention. A complicated mix, but the one main difference between then and now is this: when people in centuries past grappled with the inevitability of aging and death, religion was the place to turn. Religion has a weaker hold on us now; instead, we place our faith, or at least our hope, in medical science. We want research to allow us to enjoy a happy and extended life.
But it hasn’t been like that for very long. Once you get a glimpse of how different the experience of aging was centuries ago, it’s easier to stand back and assess how overwhelming Alzheimer’s is. The disease demands an outsider’s vantage point, and oddly enough, it is our ancestors who can furnish this perspective.
In the fourteenth and fifteenth centuries, most people didn’t even know exactly how old they were but measured life, if they did at all, in terms of ages or stages. These might be four (childhood, youth, maturity and old age, sometimes linked to the four seasons) or seven, made famous by the lines in Shakespeare’s As You Like It:
All the world’s a stage,
And all the men and women merely players;
They have their exits and their entrances,
And one man in his time plays many parts,
His acts being seven ages.
The seven stages originated centuries before with the astronomer Ptolemy, in his astrological work, Tetrabiblos, based on the influences of the sun, the moon and the five known planets.1 The Ptolemaic influences were precise: the moon was responsible for guiding the first four years of life, Mercury the next ten, Venus the following eight, the Sun the nineteen years of “young manhood” and Mars, Jupiter and finally Saturn the later years. A planet’s qualities were evident in its influence: the moon, so changeable as seen from earth, governs the first four years of life, when brain and body develop dramatically. But near the end of a person’s years, the slow-moving Saturn presides over the deceleration of life: “the movements both of body and soul are cooled and impeded in their impulses, enjoyments, desires, and speed; for the natural decline supervenes upon life, which has become worn down with age….”2
The idea of stages of life dominated thinking about aging for centuries, though the numbers of stages expanded beyond the original four and/or seven. One of the most persistent elaborations, making appearances in one form or another from the fifteen hundreds to the eighteen hundreds, arrayed people on a pyramidal set of stairs, infants on the first step on the left, fifty-year-olds at the top and greater and greater ages descending the stairs on the right. In some versions, centenarians weren’t even accorded their own step but lay horizontal beside the last, the ninety-year-old, step on the right. The American printing company Currier and Ives distributed scores of such images as late as the mid-eighteen hundreds.
Variations on the central theme were abundant: for the longest time, the pyramid featured only men; women first appeared as faithful wives and only as themselves in the nineteenth century. Each stage or step on the pyramid was accompanied by the appropriate symbols: the grim reaper holding an hourglass, saplings on the left mirrored by dead trees on the right, an aged cat dozing beside the fire. These were the nineteenth-century equivalent of the popular graphic of human evolution, with our hominid ancestors on the left transitioning to upright-walking Homo sapiens on the right.
There were even board games that played on the idea of life as a series of stages. In 1860 American entrepreneur Milton Bradley (whose eponymous company was eventually acquired by Hasbro in 1984) launched The Checkered Game of Life, which ran the course from Infancy to Happy Old Age. Only the odd lucky roll allowed players to avoid Ruin or Poverty, but there was no square labelled Death—though notably, there was a risk of Suicide. Bradley sold tens of thousands of copies of his game.
I dug around in my cupboards and found the 2002 version of Bradley’s invention. It bears scant resemblance to the original and has none of the darkness of the one produced in 1860: no squares labelled Crime, Idleness, Disgrace or Poverty. Instead, we have Join Health Club, Buy Sport Utility Vehicle and Have Cosmetic Surgery. That’s The Game of Life today.
Much art has been devoted to the subject of life’s stages: among the most important is a set of four huge canvases called The Voyage of Life by American painter Thomas Cole.
I first saw these works in the National Gallery in Washington, DC, years ago, long before I had any interest in the subject, but for a few minutes, I was entranced. The four paintings show a trip down a river, beginning with an infant in a boat emerging from a cave and ending with an old man, still in the boat, setting out on the open ocean. It is all religion: a guardian angel accompanies the man throughout his life (though for the most part, without his knowledge); a shiny white castle hovers in the sky; companion angels flit here and there. Exactly what you might expect from a religious, mid-nineteenth-century artist’s rendering of “life as a voyage.”3
It wasn’t just Cole, of course: for centuries, religion had been the only significant influence on thinking about the passage of life. Yes, people thought of aging as a series of steps or stages, but that was just the calculation and anticipation part. Religion provided motivation: for instance, to counter the view that aging simply draws one further and further from usefulness and closer and closer to death, the Puritans argued that old age actually had an important purpose. It brought one nearer to salvation, something that no forty-year-old could experience. Therefore, there was an incentive, and a powerful one, to live every last day of one’s life in a moral way.
Actually, it seemed to be a good idea to get an early start on that moral modus vivendi. One widely held belief was that a full, healthy and enjoyable life to the end was possible only through consistent purity of mind and faithfulness to God; those who either died earlier than they should have or, worse, suffered through their last years were seen as the deserving victims of their own immoral lives. They had only themselves to blame, not Providence. So if you were a sinner, old age would inevitably be miserable. Unfortunately, even if you weren’t, there were no guarantees.
So the prominence given to angels and heaven in Cole’s paintings was no surprise. But there is much more to these canvases: the boat passes through absolutely fantastical landscapes. The carefree youth gliding on calm waters gives way to a troubled middle-aged man deep in prayer as he is tossed about by the waves. There is no doubt that in the end, nature subdues man, but still, the skies toward which the old voyager drifts are heavenly lit.
An abrupt change occurs between the first two and the last two canvases. For the most part, the first and second paintings represent the dreams and hopes of the young (although even in the second canvas, where the youth sails confidently on smooth waters toward a shining castle in the skies ahead, a glance at the extreme right of the painting reveals an upcoming curve in the river, where the waters are choppy, promising a much rougher voyage). The last two canvases are completely grim and dark, the autumn and winter of life, a period that Cole himself described as characterized by trouble.
The Voyage of Life does not even allow for the possibility of choice; the river ensures that there is only one path, a helpless drift toward the sea, albeit watched over by celestial beings. Whether ensured by them or not, the voyage seems to turn out well, with heaven beckoning in the distance.
I have read other accounts by people who were captivated at first sight by The Voyage of Life, but it’s still not clear to me why this happens. Maybe the paintings force us to address the “threat” of getting old instead of pretending the phenomenon doesn’t exist. Or it might be something as pedestrian as the sheer size of the canvases: each is about one and a half metres by two metres (five feet by six feet). Regardless, ever since the four canvases of The Voyage of Life were first exhibited publicly in 1840, they have attracted crowds. A decade later, engraved reproductions were hung in homes just as the steps of life had been decades before. Even today, thousands view The Voyage of Life, even though the twenty-first-century attitude toward aging has been thoroughly secularized since the paintings were executed about 175 years ago.
Note that these varied visual treatments of life’s stages, despite putting wildly different interpretations on what happened to people as the years passed (and how much responsibility they bore for their fate), all bumped up against the same ceiling: no one could escape the inevitability of it all. You climbed the pyramid, then descended step by step. The passenger on Cole’s Voyage through a landscape apparently unaffected by the presence of humans was propelled by currents and buffeted by the weather; he had no role other than to hang on and pray for his salvation. Even as you moved through The Checkered Game of Life, you had to be passive: you might escape the worst or you might not, but you had no control.
The best expression of this helplessness in the face of inevitable old age and death (and the pre-eminent role of religion) was provided by the theologian and preacher Nathanael Emmons. He sermonized in New England for more than sixty years, dying in 1840 at the impressive age of ninety-five. According to his theology, people could influence to some degree whether or not they might ascend to heaven, but above all, they were dependent on God, and God had absolute authority over who died and when. In the timing of death, human behaviour would not influence Him.
Emmons even argued that God could deliberately end a person’s life to underline the fact that He was in total control—the vengeful God. This belief led Emmons to argue, surprisingly, that because God therefore ruled over even the laws of nature, it was impossible to know the “natural” life span of a human. That idea allowed for the possibility that human lives, if God weren’t tempted to tamper with them, might be much longer than anyone had ever known:
As we are not perfectly acquainted with the laws of nature, we cannot absolutely determine that any of those who are dead did actually reach the natural bounds of life. We may, however, form some conjecture upon this subject, by the very few instances of those who have lived an [sic] hundred and twenty, or thirty, or forty, or fifty years…. Hence we have great reason to conclude that God has most commonly deprived mankind of the residue of their years. And never allowed one in a thousand or one in a million of the human race to reach the bounds of life which nature has set.4
Despite Emmons’s assertion that many more years of human life might be possible, the fact that God held them in his hand didn’t offer much hope of ever obtaining them. And anyway, there had never been much talk of 150-year-olds; immortality had been forfeited in the Garden of Eden. Centenarians, if they were portrayed at all, were virtually comatose. The limits on human life were all too obvious.
By the mid-eighteen hundreds, though, religion’s grip on thoughts of aging and death was, at least in some quarters, beginning to loosen. Practitioners of something called “health reform” argued that while God was definitely still in the picture, it made perfect sense for people to live in responsible and healthy ways in order to maximize the years available to them. In fact, living to extreme old age would merely be a return to those fantastic life spans uncritically recorded in the Bible (Methuselah, for example), so this approach was anything but a rebuff to God. People were advised to avoid tobacco, alcohol, coffee and tea (or at least to moderate their indulgence in these things), and they were told not to engage in excessive sex. On the other hand, they were encouraged to bathe and change their clothes more often and to increase their consumption of vegetables.
Many of the health reform movement’s recommendations don’t sound out of place today, but some of its most enthusiastic champions didn’t know when to stop: they foresaw lives of two or three hundred years or more, based, again, on those biblical claims. Indeed, religion was slow to loosen its grip entirely. In his 1857 book, Laws of Health, William Alcott wrote: “It is assumed finally that old age must necessarily be wretched. But old age, whenever it is wretched, is made so by sin. Suffering has no necessary connection with old age, any more than with youth or manhood.”5
In the mishmash of thinking, emoting, rationalizing and sermonizing about death that characterized the nineteenth century, there was one unique, bizarre, not-sure-whether-to-laugh-or-cry strand of thought that emerged in the late eighteen hundreds. Enunciated by none other than the great Canadian-born physician William Osler, the idea had its roots in Anthony Trollope’s 1882 novel, The Fixed Period. Everywhere described as “dystopian,” The Fixed Period portrays the country of Britannula, which has established a fixed period for a life: sixty-seven years. At that point, people are sent to a place called “the college” in the town of Necropolis, where, within a year, they are euthanized and then cremated. Trollope is said to have derived inspiration for the book from a seventeenth-century play, The Old Law, which he had apparently read shortly before writing his novel. But in doing so, he overlooked a much more recent and forcefully argued version of the same idea in George Miller Beard’s 1874 publication, Legal Responsibility in Old Age.6
Trollope was being satirical; Beard, not really. A doctor, Beard based his book on an address he gave to the Medico-Legal Society of the City of New York in March 1873. At the beginning of this talk, he announced he would speak about the impact of aging on the “mental faculties” and indicate whether that impact impaired the elderly to the point where the legal system would be forced to take notice: “The method by which I sought to learn the law of the relation of age to work was to study in detail the biographies of distinguished men and women of every age.”7
“All the greatest names of history” were included in Beard’s analysis. He collated the ages at which each performed his or her most important work: statesmen their legislation, architects their monuments, philosophers their systems. Here are Beard’s key conclusions:
• Eighty per cent of the “work of the world” is accomplished before the age of fifty.
• The most productive fifteen-year period is between thirty and forty-five.
• A large number of Beard’s subjects lived to be over seventy, but their final twenty years were, on average, unproductive.
Beard, wanting to ensure that his message hit home, attached precious metals and other materials to the decades, beginning with twenty to thirty as brazen, thirty to forty as gold, then silver, iron and tin. The final decade (between seventy and eighty) was “wooden.”fn1 To give some heft to his research, Beard pointed out that this principle was universal: horses’ best years, according to him, were between eight and fourteen, and hunting dogs were most effective from two to six. Hens hit their egg-laying peak at the age of three, though they might produce eggs for several more years.
Beard also answered some criticisms. Earlier, for instance, it had been thought that the mind was most active between forty and eighty. When asked why that notion had prevailed until he set things right, he claimed not only that people had been excessively reverent toward the aged but also that it takes time for fame to take hold, with the result that we venerate people whose best work was completed decades before. He also held artists responsible: they immortalized, through paintings and busts, men who had been famous but were now way past their prime.
Finally, he came through on his promise to address the issue of whether the legal system should take into account the impact of aging on the mind. His conclusion: the courts should have experts on “cerebro-pathology” on hand to consider the probability that testimony was being adversely affected by the accumulated damage of excessive age.
George Miller Beard was thirty-four at the time he wrote Legal Responsibility in Old Age, so he could be forgiven for a certain short-sightedness. After all, he wasn’t proposing anything terribly radical. But he cast doubt on the veracity of his results by never defining the size of his sample, changing numbers each time he quoted them and not publishing the data or calculations leading to his conclusions. Nonetheless, he received a lot of attention; some have argued that his work paved the way for mandatory retirement.
Trollope then wrote the sci-fi version. There these ideas might have languished had not the pre-eminent physician Dr. William Osler chosen “The Fixed Period” as the title of his last address to the Johns Hopkins University Medical School in 1905.8 Osler was leaving Johns Hopkins for Oxford, and he was making the point that young faculty were good for a medical school, that in fact most people don’t contribute much after forty and that maybe they should be forced to vacate the university at the age of sixty. “As it can be maintained that all the great advances have come from men under 40,” he claimed, “so the history of the world shows that a very large proportion of evils may be traced to the sexagenarians—nearly all the great mistakes politically and socially, all of the worst poems, most of the bad pictures, a majority of the bad novels, and not a few of the bad sermons and speeches.”9
Apparently, he was just getting warmed up because a few moments later, he said: “The teacher’s life should have three periods—study until 25, investigation until 40, profession until 60, at which age I would have him retired on a double allowance. Whether Anthony Trollope’s suggestions of a college and chloroform should be carried out or not I am a little dubious, as my own time is getting so short.”10
Today, social media would have been all over that comment. We are used to nearly daily examples of people (who should know better) making untimely or thoughtless remarks, but the public response to Osler’s joke about chloroform more than a century ago was actually on the same scale: over-the-top newspaper headlines and public pronouncements, especially from those over sixty who felt that their usefulness, at least as they estimated it, was being denigrated. There were even three deaths that seemed suspiciously linked to Osler’s comments, including one of a man who apparently chloroformed himself out of this life days after discussing Osler’s speech with others and making it clear that as far as he was concerned, the theories should be put into practice.
The funny thing was that apparently no one who was present when Osler made his remarks took him seriously; he was savaged only in the following day’s newspaper reports. Yet the damage was done: a new verb, “to oslerize,” briefly came into vogue.
The furor Osler created would have been as incomprehensible to Americans a hundred years earlier as it is unremarkable a hundred years later. Where is God? Are not all human lives held in God’s hands? Nathanael Emmons would have been red-faced at the impudence of it all. But as sloppy as George Miller Beard’s science appears to have been, it was symbolic of the change felt throughout the nineteenth century. Age was now the province of science.
And what could science potentially do? Manipulate nature. Today, more than a century and a half after the health reform movement bragged about extending human life, there is renewed talk of people reaching the age of 150 years. Not because God is out of the picture or back in the picture but because science, or what Emmons called “the laws of nature,” is now much better understood. Cole’s natural landscapes have been changed forever. In the eyes of the scientific optimists, the passenger in The Voyage of Life will no longer be helplessly tossed around by the currents. And while scientific knowledge is far from perfect, it has advanced enough to encourage some experts on aging to argue that we will soon be able to tinker with the human life span.
We’ll go into more detail about the biology of aging in Chapter 5, but for now, there are so many different approaches to the science of aging that a clear path to life extension does not, and may never, exist. However, a growing number of credible experts believe that we can expose and clarify the factors that initiate aging in the human body and even extend life as a result. We really are in the age of science.
But even as we dream about living longer, we worry. Alzheimer’s now dominates the discussion about aging as no other disease has done. A short while ago, this wasn’t the case: heart disease, stroke and cancer were considered to be the principal hazards of, or threats to, a long life. Now, when nearly one of every two people over eighty-five is demented, Alzheimer’s casts a cold light on the prospects of living long. It’s as if Cole’s river of life has suddenly opened a new and hazardous side channel, and a growing number of individuals are drifting down it.
As a result, those brave enough to claim that we will one day live much longer are always careful to make the point that at the same time, we will be obliged to ensure that we’ll be healthy and mentally intact at those great ages. Predictions that the debilitating, chronic disease of Alzheimer’s would suddenly come to our attention when the killing diseases, like pneumonia, were eliminated or controlled have come true.11 After all, pneumonia was called “the old man’s friend.”
But this is all twenty-first century: had you lived in the nineteenth century, you wouldn’t have been very concerned about extending life or about the accompanying risk of a much diminished quality of that extended life.
Alzheimer’s disease today affects up to 10 per cent of those over sixty-five and nearly 50 per cent of those over eighty-five.12 In Canada and the United States, with a total population of nearly 350 million, there are nearly 6 million cases of Alzheimer’s disease. In 1800 the entire population of the United States (of European descent) was less than that. At the same time, Canada probably had no more than half a million people. And there was no baby boom cohort, no bulge of older people moving through the population. So two hundred years ago, with life expectancy dramatically lower than it is today, with a much smaller population of people over sixty-five and with God, sin and salvation uppermost in people’s minds, thoughts of old age didn’t allow much room for dementia. But a little over a hundred years ago, things changed.
fn1 In fact, graphing his data produces a shape not unlike the steps-of-life pyramid except that the peak is reached sooner and the decline, as a result, is longer.
ON NOVEMBER 26, 1901, in Frankfurt, Germany, a young clinician met with a woman who had been admitted to the municipal mental asylum the day before. She was fifty-one. In the months before her arrival, her behaviour had become increasingly bizarre and disordered. She had become paranoid and irrationally jealous of her husband of twenty-eight years, and her memory was declining precipitously. The doctor made careful notes of his interview with the woman, named Auguste Deter:1
“What is your name?” “Auguste.”
“Last name?” “Auguste.”
“What is your husband’s name?” “Auguste, I think.”
“Your husband?” “Ah, my husband.”
She was inconsistent—able to name a pencil, pen, diary and cigar correctly—but further questioning revealed the depth of her confusion.
“What did I show you?” “I don’t know, I don’t know.”
“It’s difficult, isn’t it?” “So anxious, so anxious.”
Some of the objects first named correctly were quickly forgotten. When eating cauliflower and pork, she identified them as spinach. The doctor questioned her further:
“What month is it now?” “The 11th.”
“What is the name of the 11th month?” “The last one, if not the last one.”
“Which one?” “I don’t know.”
The doctor noted that her difficulties extended beyond being unable to name things correctly. When reading, she repeated the same line three times. Even though she could identify the individual letters, she seemed not to understand what she was reading and even stressed the words in an unusual way. Then, out of the blue, she said:
“Just now a child called, is he there?”
And here and there came short phrases that provide a brief window into an agonizing decline:
“I do not cut myself.”
“I have, so to say, lost myself.”
Auguste’s decline continued unabated. Eventually, her speech became incomprehensible, and the only sounds she made were shouting or humming. For the last year of her life, she remained virtually mute, apathetic and immobile. She died in April 1906, shortly before her fifty-sixth birthday.
Her case would likely have remained unremarkable, and even unnoticed, had it not been for the tenacity of her doctor. By the time of Auguste’s death, he had moved on from Frankfurt to Munich, where he worked at the Royal Psychiatric Clinic at the university. But he had never forgotten her, and when informed of her death, he asked that her brain be sent to him for study. What he found made Auguste D. well known in neurological circles at the time, and the doctor himself famous. He was Alois Alzheimer.
For a man whose name is now attached to one of the best-known and most-feared diseases of the twenty-first century, Alzheimer was anything but a larger-than-life character. Compassionate, yes, and a good mentor, wandering from one co-worker to the next in his lab, offering advice on what they were seeing through the microscope between puffs on his omnipresent cigar—the cigar that was often left to burn out on the lab bench as he lost himself in concentration. Various biographers have tried to make something more colourful of this diffident, hard-working, quiet man, but the best anyone has been able to do is to point out that when he was a student in medical school in Würzburg, Germany, he was a member of a fencing club that augmented its thrusts and parries with a full social life, including good German beer (although that is not properly documented). And in 1887, when he was still a student, he was fined for activity that was vaguely defined as “improperly aroused disturbance of the peace in front of the police station.”2
But Alzheimer was definitely not one of the high-profile, empire-building psychiatrists and neuroscientists of the day like Sigmund Freud, Carl Jung, Emil Kraepelin or Kraepelin’s archrival in Prague, Arnold Pick. As one writer put it, Alzheimer was one of those people who truly had fame thrust upon them, although to be fair, he ran a powerful anatomy lab, and several scientists whose names we associate today with important diseases worked there, including both Hans Gerhard Creutzfeldt and Alfons Maria Jakob of Creutzfeldt-Jakob disease (CJD), and Frederic Lewy. (His “Lewy bodies” are aggregates of protein that collect inside brain cells, in both Parkinson’s disease and dementia with Lewy bodies.)
But then Auguste Deter entered hospital, Alzheimer became her clinician and his name, if not his science, became widely known. To be accurate, however, not even in Alzheimer’s lifetime did his examination of Auguste Deter appear to be the defining moment of his career. She was simply a case in which he was interested, and once he had received her brain, he proceeded to slice it thinly and apply a variety of revolutionary chemical stains that made microscopic structures pop out from the background. These stains had been invented recently, some by his colleague and close friend Franz Nissl, who had also been best man at his wedding.
Alzheimer’s examination of Auguste Deter’s brain revealed several abnormalities. First, the number of neurons, the brain cells, was dramatically reduced. Second, among those cells, but outside them, Alzheimer discovered dark deposits of an unusual composition. And third, using stains containing silver, he was able to discern dark fibrils sitting in the middle of what appeared to be otherwise normal brain cells. Three features, a triad that even today characterizes the disease.
Alzheimer wasn’t the first to identify the deposits (which are now called “plaques”) or the fibrils (“tangles”). Both had been spotted in brain tissue by other researchers before him. But it was Alzheimer who made the point that they, together with a significantly reduced population of neurons, coexisted in the brain of someone whose premature symptoms of mental decline he had witnessed himself. This observation was enough to move him to report his findings to a wider audience, which, as it turned out, was largely unappreciative.
It was at the November 1906 meeting of the Southwest German Psychiatrists in Tübingen. Some ninety of Alzheimer’s colleagues attended, and he reported what he’d seen in Auguste Deter’s brain, apparently failing to create even a ripple of interest. (I have seen a claim that the audience was more interested in the report that followed, on compulsive masturbation, but have been unable to verify this.) No questions were asked, and the chairman of the meeting said, “So then, respected colleague Alzheimer, I thank you for your remarks, clearly there is no desire for discussion.”3
Alzheimer might understandably have felt let down, although it’s been pointed out that even he wasn’t sure that he’d discovered anything worthy of bearing his name. At most, he thought Auguste Deter had suffered from a strange, early-onset form of senile dementia. Nevertheless, the history-making was out of his hands. This single case, along with a handful of related instances in the next few years, prompted Alzheimer’s boss, the powerful Emil Kraepelin, to include the cases in the eighth edition of his immensely influential Handbook of Psychiatry (a series of texts on which he built a huge reputation) and to call them “Alzheimer’s disease.” But even Kraepelin himself admitted that the disease, its interpretation and indeed even its importance were “unclear.”
Alzheimer’s discovery came at a dynamic time in psychiatry and neuroscience, especially in Europe. Sigmund Freud was aggressively—and persuasively—pursuing psychotherapy and the idea that mental illness could be addressed by talk therapy; scientists like Alzheimer were tackling similar issues but much more biologically, trying to connect the dots between what they saw under the microscope and the symptoms a patient had when alive, an approach made possible only by the recent invention of staining techniques that highlighted different features of the brain landscape.
There were bigger ideas in play too, like competing theories as to the nature of the brain itself: Was it one huge interconnected web, or were neurons actually individual units, communicating with each other but remaining distinct? It was an exuberant time for brain science, although you’d never know it from the generally dour expressions on the mustachioed faces in group photos of the time.