“I don’t think we will find one chemical in something like a blueberry that is the active ingredient,” Bickford says. “I think we are going to find it is a lot of chemicals working together, that it is more synergistic.”
And just as we search for what we should eat, it is important to know what we shouldn't. Again, very few studies show that specific foods, like steak, are bad for the brain, but we do know that certain patterns of eating can lead to conditions, such as obesity or type 2 diabetes, that can be harmful. Most of this new research is based on linkage studies and does not necessarily prove cause and effect, but it does indicate what we should pay attention to.
There are recent suggestions, for instance, that type 2 diabetes, the most common kind, which is often related to obesity, may increase the risk for dementia. It's still unclear whether it's the diabetes or the obesity or both that may increase the risk, because not everyone with diabetes gets Alzheimer's and not everyone who gets Alzheimer's is diabetic.
But in recent years, a number of large studies have found that those with type 2 diabetes are twice as likely to develop Alzheimer’s. It may be that the cardiovascular problems caused by diabetes block blood flow to the brain or cause strokes, contributing to dementia. The same kind of plaque that builds up in the brain with Alzheimer’s also accumulates in the pancreas with type 2 diabetes. It’s also possible that abnormalities of glucose metabolism and insulin levels in the brain may be harmful. Those with type 2 diabetes often have insulin resistance—when their cells cannot use insulin well—so the pancreas makes extra insulin, which builds up in the blood and can lead to inflammation and possible harm to the brain.
One of the new studies, by researchers from the Karolinska Institute in Sweden, found that even people who had borderline diabetes were 70 percent more likely than those with normal blood sugar to develop Alzheimer's. Another study, in Finland, published in the *Archives of Neurology* in 2005, found that being overweight in midlife—even without diabetes—increased the risk of dementia. The researchers looked at the records of 1,449 randomly selected men and women when they were fifty-one and then again when they were seventy-two and found that midlife obesity, like high blood pressure and high cholesterol, doubled the risk for dementia—and that those who had all three risk factors were six times as likely to become demented.
Again, the reasons are unknown, but with a number of studies showing the same findings, it begins to appear that being obese in middle age is not the best thing you can do for your brain. One of the most recent studies, by Scott Small at Columbia, used brain scanners and found a tie between glucose levels and—again—that tiny area of the hippocampus, the dentate gyrus, that is so crucial to memory. Small found that unregulated spikes in glucose were linked to lower blood volume in the brain’s dentate gyrus. The effect came with levels of glucose that aren’t necessarily seen with diabetes, but with the normal aging process as we reach middle age. And it is known that physical activity and a proper diet—one that leans much more on fruits and vegetables than on highly sugared sodas and snacks—can help regulate blood sugar.
About 20 million people in the United States have type 2 diabetes. The number has doubled in the past two decades and is expected to keep increasing because rates of obesity are rising. Worldwide, diabetes is also increasing, up to 230 million cases from 30 million in the past twenty years.
Obesity rates, too, remain high. According to the Centers for Disease Control and Prevention, 34 percent of U.S. adults aged twenty and over were obese in 2008. Alzheimer’s now affects one in ten people over age sixty-five and nearly half the people over eighty-five. About 4.5 million Americans have it, and taking care of them costs $100 billion a year. The number of patients is expected to grow, possibly reaching 11.3 million to 16 million by 2050.
But those projections about dementia do not include a possible increase from obesity-related diabetes. In fact, when scientists talk about the potential for those in middle age now to avoid the destructive deterioration of dementia and retain their high levels of cognitive function into old age, there’s often a caveat. That is, if current trends toward increasing obesity continue, then, as one nutritional researcher put it, “all bets are off.”
11 The Brain Gym
Toning Up Your Circuits
At first, the task seems as simple as a preschool lesson.
You're sitting in front of a computer and the word *apple* flashes on the screen, then disappears. A minute later, the word *apple* appears again.
The question: Did you *see* that word before?
The next test is just as easy. This time, the computer *says* "apple." Then you hear a string of random words: "lamp," "pen," "dog." Then you hear "apple" again.
The question: Have you *heard* that word before?
No problem.
Now it gets dicier. The word *apple* either flashes on the screen or is read out loud. Then you read or hear a longer list of unrelated words. Then you hear "apple" again.
The question: Did you *hear* "apple" before, as you're hearing it now? Or did you *see* the word? Your instructions are to push a button only if the word *apple* appears in the same form it had earlier.
Oh, dear. You know you've encountered *apple* before. But how? Did you hear it? Did you see it?
You have no idea.
And therein lies one of the most intriguing aspects of our brains. As we get older, most of us, even those in the early stages of dementia, still recognize the familiar. If we look at a word and then see that same word again, we say, "Aha!" Show us *apple*, then show us *apple* again, and, sure enough, we know our apples.
But take it a step further and, starting sometime in middle age, a subset of our memory can grow murkier. We're certain we've come across something—or someone—before, but our recollection of *how* or *when* is lost.
This is that feeling you get at a party—the deep and usually accurate conviction—that you *know* someone, but you don't know where you know him from. Is that guy your daughter's soccer coach? The guy from church? The guy you met with the dog in the park? Did you hear "apple" or see *apple*? This is a task of memory but, again, memory in context that relies on a whole host of brain functions. We have to fire up the proper parts of our brain, recruit more brainpower if we need it, resist the urge to let our minds wander this way and that—and recall what—and how—we have seen or heard something.
Over the past few years—as it has become clear that memory itself is not one thing but many and that some parts age better than others—the question is, can we fix the parts that need a little help? Exercise and food may help. But can we also zero in on our brain’s weakest areas and, through training, buck them up?
It is, when you think of it, quite odd. If we have normal healthy brains in middle age, we don’t forget that we have a brother in Phoenix or that we once lived in California—basic autobiographical details stay with us. We build richer vocabularies well into old age, proving that even newly acquired knowledge can stick.
We also excel at basic recognition—"Yes, I've seen this word before"—and familiarity, a kind of recognition memory that's so deep-seated we often mistake it for an emotion—"I *feel* like I know that guy."
But other types of memory don't weather as well. Memories for events—how or when something happened—grow hazier. Did I go to cousin Harry's for Thanksgiving last year? Did I buy bread? Did I turn off the stove? Did I *see* or *hear* apple?
Often called episodic memory, this kind of recall is not as automatic as other types of memory. It takes more effort. We have to connect dots, put something in context, like remembering the sequence of a story. It takes a more sophisticated, wide-ranging neural machinery, and, for a variety of reasons, brains, as they age, start to balk.
At middle age, our brains are negotiating the world with finesse, but a few neurons here and there are feeling their years. So can we push those wayward brain cells back in line?
Using More of the Brain
Tricks work. There are a number of strategies aimed at improving overall basic memory functions. Lists are good and may be all some of us need. Studies show that adding contextual detail can help. If you’re trying to remember a word, for instance, you can ask yourself if the word is abstract or concrete. If you’re trying to register a face, you can focus on a visual detail—a large forehead or a crooked nose. Added information prompts more brain activity in more areas, more connections, and better recall later on.
Indeed, even our imaginations can be helpful. In one fascinating study, neuroscientist Denise Park and a colleague, Linda Liu, found that older people trying to remember to check their blood glucose levels at a certain time did considerably better by first *visualizing* themselves doing that chore. Those who spent three minutes every morning imagining themselves testing their blood sugar were 50 percent more likely to actually do the test later on in the day than those who used other strategies, such as actually practicing the action. They had, through visualization, created what Park calls a stronger "neural footprint" of what they wanted to remember.
Park suspects that using our imaginations may be effective as we age because, again, it relies on a part of our brain’s machinery—automatic memory, a more primitive part of memory—that does not decline quickly.
Given that, if we really want to keep our brains in better condition for the long haul, we may need to fundamentally change the way they operate. A small percentage of people may have no memory problems well into their seventies and eighties (and if that runs in families, it could be genetic), but the rest of us may need to coax our brains—shove them, even—back into younger, more efficient patterns.
At a lab at the University of Toronto, neuroscientist Nicole Anderson is trying to do just that. Using the latest knowledge about how brains work best as they age, she is teaching people to boost their performance by, again, using two sides of their brains instead of one—training them to be bilateral.
The idea here is that those who adapt and learn to use both sides of their brains, or call on their powerful frontal lobes more efficiently when necessary, will be able to stay in better cognitive shape. The questions are: Can those techniques be taught? And can they be taught to those in middle age and beyond in a way that will last?
Anderson is betting that the answer to both questions is yes, and so she is giving it a try. Day after day, men and women file into her Toronto lab and, sitting in front of computers, try to recall whether they've *heard* or *seen* various words, such as *apple*, *bucket*, *lamp*. As Anderson explained when I spoke with her before one recent lab session, this exercise specifically targets episodic memory, the ability to remember something in context. How did I encounter that word *apple*? Did I see it? Did I hear it?
To be successful at this, we need to recruit our most elite brain region, our frontal cortex. When we're younger, we use only one side of our brain's frontal areas to deal with complex contextual information, such as how we encountered the word *apple* or how we know that guy at the party. But as we age, the best and brightest among us start to call on both sides to do this. These brains power up. They become bilateral, using more power to get the job done. Smart brains unconsciously adapt when necessary. They call in the reinforcements.
When I spoke with Anderson, she was in the middle of trying to train a group of adults to make such adjustments. As part of her study, which is ongoing, participants’ brains are scanned before the training and afterward. Anderson wants to see if those who initially do not use both sides of their brains learn, as they improve in the task, to do so. Can a brain be taught to use more of itself when necessary?
Obviously, this goes way beyond crossword puzzles. Rather, as Anderson explains, such wholesale brain rehab aims at the underlying processes and trains them.
“We think as people improve they’ll start to use more of their brains, that we’ll be able to induce a pattern of bilateralization,” Anderson said. “We are tapping into something different here. We’re trying to train the older brains to do this when they need to, to use their most appropriate and powerful mechanisms.”
Video Game Training
If the participants in Anderson’s study are explorers on the outer edge of brain enhancement, increasingly they have company. At a lab at Columbia University one recent morning, another brain explorer, Martin Goldblum, settled into a chair to play a video game called Space Fortress, which Columbia neuroscientist Yaakov Stern hopes will train older brains to retain, or return to, more efficient youthful patterns.
An artist, the sixty-six-year-old Goldblum doesn't usually spend his days playing video games. Nevertheless, he was the lab's superstar. As I watched the game, I was surprised that anyone could learn it well, let alone those in middle age or older. But Goldblum had mastered it.
“It’s challenging,” said Goldblum, a trim, youthful man dressed in jeans and a black T-shirt who was happy to chat as he mouse-clicked along in the game. “You have to multitask; you have to do a lot of different things at the same time—and there is an insistence to it all.”
To play, Goldblum grabbed a black joystick with his right hand and a computer mouse with his left. A large green hexagon flashed on the screen, along with a gun that tried to shoot down Goldblum's spaceship; he had to keep the ship inside the hexagon while shooting down small asteroids that flew by. The game had dozens of rules about when you were allowed to shoot, how often, and where.