Can Personality Change Or Does It Stay The Same For Life? A New Study Suggests It's A Little Of Both


Does personality stay the same from birth for the rest of your life, or can it be changed? A new study tapping into 50 years of data suggests that it's quite possibly a blend of both.

For decades personality was considered as unmalleable as concrete – who you were at 15 is who you’d be at 75. But within the last 20 or so years, as cognitive and behavioral science have revealed dynamic insights about the human brain and corresponding behaviors, we’ve come to see personality as at least marginally changeable, and possibly much more so.

The latest study tracked personality changes over five decades, and the results suggest that while certain personality elements remain stable over time, others change in distinct ways.  In other words, personality is both relatively stable and changeable, and the degree of change is specific to each person.

The good news from the research is that for those of us who experience significant personality change, the shift is mostly in a positive direction.

“On average, everyone becomes more conscientious, more emotionally stable, and more agreeable,” said lead study author Rodica Damian, assistant professor of psychology at the University of Houston.

At the same time, people who were especially conscientious, agreeable and emotionally stable at a young age were also more likely to be so later on.

"People who are more conscientious than others their age at 16 are likely to be more conscientious than others at 66,” said Damian.

The study mined data from the Project Talent Personality Inventory, a repository of personality data on more than 400,000 people (in total) gathered over a 50-year period.  The value of the findings comes from the expansive timespan, which allows researchers to measure changes in personality traits like conscientiousness, extroversion and neuroticism over time.

As to what influences personality stability or malleability, both genetics and environmental factors play lead roles, with previous research suggesting that each contributes equally to the outcome. The relatively new wrinkle in this understanding is epigenetic influence, in which genes for certain factors may be “switched on” by environmental influences.

The study also found that while some personality elements seem more gender-specific, women and men change at pretty much the same rates over their lifespans. Neither has an edge on “personality maturity” over time.

A big takeaway from the findings, the researchers emphasized, is that when it comes to personality change, we shouldn’t compare ourselves to others.  Your especially likable and gregarious friend in middle school is still probably going to be more likable and gregarious than most people you know in mid-life, so don't let the social mirror draw you into a comparison. What matters is how much you’ve changed – and that, according to this study, is very much a person-specific evaluation.

The study was published in the Journal of Personality and Social Psychology.

The newly revised and updated 2018 edition of What Makes Your Brain Happy and Why You Should Do the Opposite is now available. 

Posted on November 16, 2018 .

Stress Doesn't Only Affect the Minds Of the Stressed


Science just bolstered the wisdom that it’s never a good idea to bring your stress home with you. A new study suggests that stress changes certain brain structures, and those changes are mirrored in the brains of others. If that's true, the effect would help explain why we seem susceptible to “catching” another’s stress.

This was a mouse study, so the customary cautions about drawing too many conclusions for humans are warranted. But what makes these results interesting for us humans is that the same neuronal structures affected in the mouse brains are also present in our brains, and a similar emotional contagion effect could be at play.

We know from past research, for example, that humans “spread” a grab-bag of emotions, including anger, fear and happiness (to greater or lesser extents, depending on the research you’re referencing), and there’s at least a strong theoretical framework involving specific brain areas to explain why this happens.

In the latest study, the researchers paired sets of mice together and then removed one mouse from each pair and subjected them to a mild amount of stress. They then returned the stressed mouse to the pair and observed the brains of both mice. The results showed that the stressed mouse experienced changes in a group of neurons located in the hippocampus, a brain area that plays a central role in memory and emotional response. The brain of the other mouse that hadn’t been stressed, but was now in the presence of its stressed partner, rapidly showed the same neuronal changes in its hippocampus. In effect, the brains of the unstressed mice mirrored the brains of the stressed mice.

"The neurons that control the brain's response to stress showed changes in unstressed partners that were identical to those we measured in the stressed mice,” said Toni-Lee Sterley of the Hotchkiss Brain Institute and the study’s first author.

The researchers think the mechanism behind this effect in mice is the release of a "putative alarm pheromone" from the stressed mouse that signals a response in other mice. That's not so surprising, since we know that animals give off a variety of cues, chemical and otherwise, that signal a response in others to avoid danger (think of a bird signaling to the flock to abruptly change direction). What's newer here, and the part of this study that may be revealing for humans, is an observation of changes in brain structure in response to stress that are then mirrored in other brains. How the signals are transmitted between human brains is still an open question.

"Stress circuits in mice and humans are very similar. In particular, the cells we investigated in mice play the exact same role in humans - they control the hormonal response to stress," senior study author Jaideep Bains, PhD told me in an email.

Bains says it's possible that chemical communication is also happening between humans. "Although pheromones or chemical signals are not widely studied in humans, there are recent observations suggesting that they transmit emotional information in subtle, perhaps even subconscious ways." 

The good news is that the effect seems reversible—at least for female mice.

The researchers noticed that when the female partners of the stressed mice were placed among other mice, the changes in the hippocampus were reversed.  Social interaction erased the brain-altering effects of stress, but not for the male mice. They instead held onto the stress and the brain changes accompanying it regardless of how many other mice they visited.

Again, with due caution about applying these results to humans, the researchers think there are some clues here that could help us develop more effective ways of treating stress.

"This (reversal effect) suggests that there are sex-specific differences that could be useful when thinking about approaches for treating stress disorders," added Bains. "What we can begin to think about is whether other people's experiences or stresses may be changing us in a way that we don't fully understand."

The study was published in the March edition of the journal Nature Neuroscience.


Copyright 2018

Posted on April 25, 2018 .

What Connects Altruists and Psychopaths: An Interview with Dr. Abigail Marsh, Author of "The Fear Factor"


The must-read brain books of 2017 included an exploration of an emotion rooted so deeply in all living species, it's arguably the most basic of emotions. The Fear Factor traces the origins of fear and its connectivity across species, most especially how it connects all of us—from the far poles of psychopathy to extreme altruism, and all points in between. I spoke with the author of The Fear Factor, Dr. Abigail Marsh, about the center of fear in the brain, whether humans are wired for altruism, psychopathic kids, and several other topics.

Your new book, The Fear Factor, traces a connection between two extremes that don’t initially seem to have much in common: altruism and psychopathy. Give us the 30-second version of what this connection is about.

My interest in studying psychopathy is in part due to my interest in studying the origins of care and compassion. Psychopathy makes a nice clinical case of what a brain that is lacking care and compassion looks like. This approach (studying people who lack some phenomenon to understand that phenomenon) is common—like studying people with amnesia to understand memory, or studying people with face-blindness to understand face recognition. In any case, we found that people who are psychopathic are reliably characterized by three traits: amygdalas that are smaller than average, amygdalas that are less reactive to the sight of others' fear, and a reduced ability to recognize others' fear.

These issues may explain why people who are psychopathic don't respond compassionately to others' distress—the brain structure most important for recognizing and generating an emotional response to others' fear isn't working right. Conversely, we found that highly altruistic people are in some ways "anti-psychopaths"—their amygdalas are larger than average, more responsive to the sight of others' fear, and they are relatively better at recognizing others' fear. This may help explain their heightened compassion—their increased sensitivity to others' distress.

You mentioned a part of the brain that we hear much about in relation to fear, anxiety and stress – the amygdala. What makes this little brain area so central to so much of our emotional lives?

The amygdala is an ancient brain structure under the cortex that is involved in many processes but essential for only a few. One of them is fear responding. People whose amygdalas are damaged or absent are notoriously fearless. In addition, they have difficulty recognizing others' fear, or responding to it appropriately. They literally have difficulty empathizing with an emotion in others that they don't feel themselves. There is good evidence from animal studies that within the amygdala lie neural pathways that help us recognize when others are in danger and need help, and may also help generate the motivation to help them. This may be why people who are unusually callous in response to others' distress have amygdalas that are dysfunctional, whereas people who are unusually compassionate have amygdalas that are, in a sense, hyperfunctional.

There’s ongoing debate about whether humans are “wired” to be altruistic (and by extension whether there’s an evolutionary advantage to altruism). Some argue that what we think of as altruism is really just a form of selfishness, because being altruistic makes us feel better about ourselves or provides other self-serving advantages. What does your research lead you to conclude about altruism as a hardwired human trait, and whether it’s really just selfishness in disguise?

There is no doubt that humans are, as a species, wired with the capacity to care for others—to be altruistic. We need that capacity to motivate us to care for our children, as do all mammals, which is why all mammals seem to share common neurohormonal functions that motivate them to care for their babies for months or years at a time. But across species, or within a species, we see wide variation in caring behavior, probably due to variation in how this circuitry is functioning—both as a result of genetic variation and different life experiences (both are important). We also see wide variation in who gets cared for. Some mammals care only for their own babies; others, like most humans, provide care for anything that even faintly resembles a baby or communicates distress.

It's true that caring for those who are in need is reinforcing—just as successfully performing any goal-oriented behavior is reinforcing. That's a feature, not a bug. The philosopher Matthieu Ricard has argued, and I agree with him, that the fact that we find caring for others rewarding is proof that we are altruistically motivated—because if we were not, why would we find caring for others rewarding? People who are truly selfish and care only about feeling good tend to be psychopathic and antisocial. They are the opposite of altruists.

It seems like we’re endlessly fascinated with psychopathy–with what makes someone a psychopath and how we can recognize psychopathic traits in others (or ourselves). Your book places psychopathy in at least partly a biological light, as an almost predictable outcome of certain chemical processes playing out. What surprised you the most about what your research with psychopathic kids revealed about psychopathy? 

I was surprised by how incredibly normal—sometimes supernormal—the psychopathic children I've worked with often seem initially. If I lined up all the children I've worked with who had psychopathic traits and mixed them in with a bunch of healthy children, you'd never know who was who. It's a good reminder that what's on the outside doesn't always line up with what's on the inside. I was also surprised by how paradoxically optimistic working with them made me feel about human nature more broadly. Working with children who are clinically lacking in care and compassion highlights how not-normal that is—how truly capable of care and compassion the average person really is, even if we don't always express it as well as we could.

And I agree with the 'almost predictable' part. Nearly every human trait there is varies continuously among people. Some people are more extraverted and some more introverted. Some have more self-control, some have less. So if compassion is a typical human trait, there will inevitably be some people who have very little of it and some who have a lot. I should emphasize that that's not to say that any of these traits are fixed! Our brains and minds are capable of growth and change throughout life.

You discuss several examples from nature that defy our expectations about predatory behavior (like a lioness caring for a baby baboon as if it was her own), but make sense in light of your book’s thesis. What do these examples tell us about our neural similarities with other species?

These studies reinforce the fact that there is nothing inherently contradictory about a single species, or even a single individual, having the capacity for both care and aggression. Lions are wonderful mothers who will, as you allude to, care tenderly not only for each other's babies but occasionally for the babies of other species, like baboons and antelopes, who trigger their maternal response. But of course they are also fearsome hunters. So are they really savage and fierce, or really caring and tender? They are both. There is no need to choose. Humans are no different. And much of the same neural hardware supports tender maternal care in every mammalian species, humans and lions included, which shows how deeply rooted our capacity for care is.

What does your research lead you to believe about the human capacity to change–to become more caring, more giving and less self-centered?  Is there a workable path forward for us to become a better species, or are we hemmed in by our biology?

We clearly have the capacity to become more caring, because we are becoming more caring. Research I did for the book found fascinatingly similar positive trendlines for various kinds of caring behavior—particularly altruism toward strangers—all around the world. Donating blood, donating money, volunteering, helping strangers in everyday ways—all of these behaviors are becoming more common, not less (despite what we see in the news). Preliminary evidence suggests that what supports these changes are widespread changes in flourishing and well-being that promote increasing valuation of strangers' welfare. There are also a number of lab studies showing that compassion can be increased at the individual level via a variety of methods, including compassion meditation and even, maybe, reading literature.

What’s the most important takeaway you’d like people to receive from reading The Fear Factor?

That the capacities for compassion and cruelty are both deep-seated parts of human nature. Both are equally real and important, and both have clear biological origins. Studying the brain basis of these phenomena has helped illuminate for me what it means to be human, and also how we may be able to make humans better.



Posted on April 18, 2018 .

The Good and Bad News About Coffee


Health research on coffee swings back and forth between good and bad news more frequently than almost any other topic. When you hear about one study claiming health benefits while another harps on a list of negatives, it’s easy to get jaded and stop paying attention. To bring some focus, here’s a summary of both good and bad findings from a selection of coffee studies, with some perspective about why these findings are worth the time.

The good news: Coffee seems to help prevent an early demise.

Let’s start with the big one: the latest round of research suggests that drinking a few cups of coffee a day is associated with a decreased chance of early death from several causes. It's important to note that none of these studies proves coffee extends life. These are observational studies that found correlations between drinking two to four cups a day and lower mortality. The reasons why are debatable. It could be coffee’s high concentration of antioxidants protecting cells from oxidative stress and inflammation, or it could be reasons that haven’t been uncovered yet. Whatever the reasons, enough of these studies have found similar enough results that they’re worth the attention.

The bad news: Coffee can cause insomnia.

Another health topic that gets a fair amount of press is sleep, and when it comes to sleeping well, caffeinated coffee isn’t our friend. At least not if you’re drinking it later in the day. The rule of thumb is to avoid anything with caffeine after about 2 pm, because it’s a deceptively enduring chemical. The half-life of caffeine is about six hours, which means it takes that long to eliminate about half of the chemical from your system. Hence, drinking coffee later in the day is strongly linked to insomnia, which is in turn linked to a list of health negatives. Keep the coffee for the morning unless you’re drinking decaf, but even then make sure your decaf is truly decaffeinated (because often it isn’t).
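To see what that half-life actually means at bedtime, here's a quick back-of-the-envelope sketch of the math (a simple exponential-decay model; the six-hour half-life comes from the paragraph above, while the 100 mg cup and the specific times are round-number assumptions for illustration):

```python
# Rough sketch: how much caffeine from one cup lingers hours later,
# assuming a 6-hour half-life and ~100 mg of caffeine per cup.
HALF_LIFE_HOURS = 6.0

def caffeine_remaining(dose_mg: float, hours_elapsed: float) -> float:
    """Return the caffeine (mg) still in the body after hours_elapsed."""
    return dose_mg * 0.5 ** (hours_elapsed / HALF_LIFE_HOURS)

# A 100 mg cup at 2 pm, checked at midnight (10 hours later):
print(round(caffeine_remaining(100, 10), 1))  # → 31.5
```

In other words, under these assumptions roughly a third of that 2 pm cup is still circulating at midnight, which is why the cutoff rule is earlier than intuition suggests.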

The good news: Coffee may preserve your liver.

A recent study found a correlation between drinking both coffee and tea and a healthier liver. Again, this was an observation across a span of data and not cause-and-effect proof, but it’s a decently strong correlation. The reason why isn’t well understood, but both coffee and tea contain a wealth of compounds with tissue-protecting effects, and the liver—the body’s central filtration system—may benefit from these compounds charging through our bloodstream.

The bad news: Coffee can trigger anxiety.

It’s a little unfair to blame coffee for this, since anything with enough caffeine could do it, but coffee is the way most of us get our daily jolt, and caffeine is propane gas for those suffering from anxiety. The effect is twofold: there’s the immediate trigger, and then there’s the longer-term, slower-burn trigger. The second effect is of particular concern because it hangs around for as long as caffeine remains in your system (which is more or less all the time if you’re a daily coffee drinker). Caffeine seems to decrease levels of GABA, the neurotransmitter that helps regulate anxiety, and it amplifies the effects of our two main stress hormones, cortisol and epinephrine. Bottom line: anyone with an anxiety condition should sip judiciously and infrequently.

The good news: Coffee may provide protection against diabetes.

Another finding in the “coffee protects…” category shows that daily consumption could decrease the risk of developing type 2 diabetes. It’s hard to identify exactly why, but the compounds in coffee seem to protect cells from the accumulation of toxic proteins that play a role in the onset of diabetes.

The bad news: Coffee may intensify your craving for sweets.

A new study revealed that caffeine changes our taste perception, making sweet things seem less sweet. The twist is that this subtle change may result in craving more sweets. This finding explains why coffee goes so well with doughnuts and pastries, and serves as a caution light when you’re sipping hot java with a plate of baked goods in front of you.

The good news: Coffee may prevent dementia.

A few studies suggest that three or more cups a day may ward off cognitive decline leading to dementia. The reason in this case could be caffeine itself, but more likely it’s caffeine working with coffee’s host of other compounds (like polyphenols), because comparable results weren’t found with the same amount of tea. A notable study in this category found that coffee may even delay the onset of Alzheimer’s; the reason may be similar to why coffee protects against diabetes, by preventing the accumulation of toxic proteins.

The bad news: Coffee can make GERD worse.

If you suffer from gastroesophageal reflux disease (GERD), it's a good idea to avoid coffee or at least limit your intake. Coffee stimulates the secretion of gastric acid, which can bring on heartburn even if you don't have a bad case of GERD, and if you do, it's going to be much worse.

The good news: Coffee may improve memory.

A number of studies have linked coffee with improved memory, mainly because caffeine is a mental acuity enhancer. And the really good news is that this effect may not only be short-term; some research suggests that it lasts much longer.

And some bonus good news: Coffee can make you happy.

Let’s end with one that’s both scientifically valid and anecdotally true – a major reason why we like to drink coffee is that it’s a potent psychoactive brain stimulant. Drinking it elevates mood and bolsters energy, at least for a little while, and it doesn't hurt that it's also delicious. In other words, drinking it makes us feel happy. Do we really need a better reason than that?

© David DiSalvo


Posted on January 1, 2018 .

Why Walking is Better Brain Medicine than Anything You'll Find on the Supplement Shelf


Sometimes science jibes with ancient wisdom on simple but deceptively powerful things. Case in point: walking. A wealth of research bolsters the Zen of putting one foot in front of the other, with stronger science than any supplement marketer or brain trainer could hope for. Walking is potent mood medicine that enhances thinking, sharpens memory and safeguards brain health. Here are six reasons why you should make it a regular part of your day if you're able.

1. Walking boosts your mood even when you’re not expecting it. In a recent study, researchers conducted three experiments on hundreds of college students to find out if they’d experience a positive mood boost while walking, without knowing that walking could be the reason. The researchers disguised each experiment as an alleged test of something else, all the while tracking mood changes linked to the simple act of taking a stroll. They found that just 12 minutes of walking resulted in an increase in joviality, vigor, attentiveness and self-confidence versus the same time spent sitting.

2. Walking enhances creativity, especially when you’re seeking a solution. A Stanford study found that walking increased creative inspiration by an average of 60% versus sitting. The effect was evident both during and shortly after walks anywhere from five to 16 minutes long. The enhancement was specific to a flavor of creativity called “divergent thinking,” defined as a thought process used to generate creative ideas by exploring many possible solutions.

3. Walking sparks connections between brain cells. Never underestimate the power of walking when it comes to sparking communication between neurons and improving brain health. Such were the results of a study on older adults that included walking along with other forms of exercise, finding that:  "One year of walking increased functional connectivity between aspects of the frontal, posterior, and temporal cortices within the Default Mode Network and a Frontal Executive Network, two brain networks central to brain dysfunction in aging."

4. Walking improves working memory. If you want to sharpen your recall, lace up and hit the sidewalk. A German study found that performance on challenging working memory tasks improved for participants allowed to walk at their own steady pace, as opposed to a slower pace set by the researchers. The results were more pronounced for the youngest of the study participants, but everyone’s working memory improved enough to give walking yet another smiley face.

5. Walking yields the right rhythm for thinking. One of the more intriguing areas of walking research delves into effects of its steady rhythm on how we think. Studies have examined everything from the brain-spinal-cord connection with respect to this rhythm to the interplay of neurological function, biomechanics and the forces of gravity. The bottom line here is still equal parts science and intuition, but all signs point to walking inducing the right rhythm for getting thinking done.

6. Walking is a powerful way to mainline nature. Research supports what we already intuitively know: regularly spending time outside is a brain elixir like few others. And when you pair walking with your nature boost, it’s a win-win, not to mention a pleasant way to break up the tyranny of sitting. If you don’t have scenic horizons waiting outside your door, no worries–any outside time featuring a steady pace and a little green will work.

You can find David DiSalvo on Twitter, Facebook, Google Plus, and at his website.

Posted on January 2, 2017 .

Why Breaking Habits is Even Harder Than We Think

We tend to think that habits like eating junk food are a matter of willpower–too little willpower, too much sugar and fat in our diets. But the more we learn about the malleability of the brain, the more we know that “willpower” is a far too easy explanation for what’s really going on. The truth is that habits change how our brains work. What begins as a behavior morphs into changes in brain circuitry, and with repetition and time those changes strengthen and endure.

A new study by Duke University researchers helps clarify the matter by showing how a sugar habit changes specific brain circuits, and how those changes produce cravings that reinforce the habit.

The research team began by getting a group of healthy mice hooked on sugar. Similar to classic studies on drug addiction, the mice in this study were trained to press a tiny lever to receive doses of sweets. Once the mice were hooked, they continued pressing the lever even when the sweets were removed. So that was step one, establishing a behavioral pattern to get the goods.

Then the researchers compared the brains of the sugar-dependent mice to another group of healthy mice not hooked on sugar, focusing particularly on the network of brain areas known as the basal ganglia. In studies of drug addiction, electrical activity in this brain network is traceable along two main pathways: one that carries a “go” signal, which triggers action to pursue the object of the addiction, and one that carries a “stop” signal, which puts the brakes on the pursuit. It's the give and take between these pathways that regulates how we pursue just about anything (food, sex, goals, name it).

In the brains of the mice hooked on sugar, the researchers found stronger, more active go and stop signals than in the mice not hooked on sugar. And they found that in the sugar-dependent group, the go signal consistently appeared before the stop signal. In the non-dependent group, the stop signal appeared before the go signal. In other words, the brains of the sugar-hooked mice lost the braking capacity to regulate their behavior.

These brain circuitry changes were so strong, the researchers could predict which mice became hooked on sugar just by examining samples of their brains without knowing which mouse group the sample came from.

What this tells us is that once a behavioral pattern was established and reinforced (pressing the lever to get more sugar), the brains of the mice changed in a significant and enduring way. Both “stop” and “go” signals in their basal ganglia networks became hyper-charged, but they also flipped, with the go signal bumping out the stop signal.

The result of those changes is that the mice continued pursuing sugar even when it was gone. The circuitry changes in their brains produced a strong signal to get more sweets. We humans call that signal a craving. Filling the craving reinforces the very system that’s propelling the brain to want more.

“One day, we may be able to target these circuits in people to help promote habits that we want and kick out those that we don’t want,” said the study’s senior investigator Nicole Calakos, M.D., Ph.D., an associate professor of neurology and neurobiology at the Duke University Medical Center.

The study was published in the journal Neuron.


Posted on March 13, 2016 .

Why Your Brain Doesn't Care If You Have Thousands of Facebook Friends


Your Facebook page may boast a small army of friends, but to your brain it's the same as it ever was.

Back in the early 1990s, when the web and social media were still in their infancies, a psychologist named Robin Dunbar uncovered something intriguing about brain size and structure. He found that the size of the most recently evolved part of the brain, the neocortex, correlates closely with the relative size of social groups. First he identified the correlation in primates, then reasoned that the same should apply to humans. He was right. In humans throughout history, he observed, the sweet spot for social group size has consistently been between 100 and 200 relationships.

Thus, what we know as “Dunbar's Number” was born: the human brain seems calibrated to effectively handle around 150 relationships of varying depths (some shallow, some strong).

But our social media experience seems to contradict Dunbar’s findings. As tools for extending our relationship reach, Facebook and other sites theoretically allow us to leverage our limited resources to maintain more relationships in less time. Perhaps technology has enabled us to turn a corner, cognitively speaking, and go beyond the constraints that have kept human social groups relatively contained for centuries.

Well, who better to test that theory than Robin Dunbar himself?  To do so he relied on a survey from an unlikely source, the Thomas J. Fudge biscuit company in Britain. The survey was well-structured, randomly sampling 2,000 regular social network users across the country and another 1,375 who weren’t necessarily regular social media users (the second group was more representative of the population overall). Survey questions delved into how many friends respondents had on Facebook and the relative closeness of those friendships.

After analysis, the results showed that despite the numbers on our Facebook pages, we modern humans are much like humans of centuries past. The true number of friendships across the studied groups still fell between about 150 and 180 people. Respondents in the most active social media group said that on average only about 28% of their Facebook friends could be considered “genuine.” And drilling down further, respondents confirmed that really only about 15 of those people could be called “close friends.”

The bottom line: no matter what social media numbers publicly portray, our brains still hew to a relatively constrained number of relationships, much as they always have.

Dunbar commented in the study: “The fact that people do not seem to use social media to increase the size of their social circles suggests that social media may function mainly to prevent friendships decaying over time in the absence of opportunities for face-to-face contact.”

The study was published in the journal Royal Society Open Science.

You can find David DiSalvo on Twitter @neuronarrative and at his website

Posted on February 14, 2016 .

Your Brain in the Zone: An Interview with Dr. Barry Sears


Twenty years ago this year, former Massachusetts Institute of Technology researcher Dr. Barry Sears published Enter the Zone: A Dietary Road Map, a book that helped change the discussion about diet and health. Against the established nutritional powers--still clutching the "high-carb, low-fat" mantra--Sears championed what he called an "anti-inflammatory" dietary approach to undermining obesity and a host of chronic diseases, including diabetes, cardiovascular disease and brain-based diseases. At the time, little was being said about the connection between diet and diseases like Alzheimer's and Parkinson's, but Sears pointed to trends in the modern American diet that were fueling the expansion of these and other conditions.

Considering that cases of diabetes have risen nearly 180% in the last 30 years, and both the incidence and severity of new Alzheimer's cases have skyrocketed in the same period, Sears's points are well worth revisiting. I corresponded with him recently about his latest book, The Mediterranean Zone, and his evidence-based position on what our diet is doing to our bodies and brains.

DiSalvo: In your past work and your latest book you say that there is a direct link between a series of dire health conditions, two of which are already epidemics--obesity and diabetes--and a third that is an epidemic in the making: Alzheimer's. What’s the line connecting the three? 

Sears: The linkage between all three chronic conditions is increased inflammation in the adipose tissue, pancreas or the brain.  What you see is the movement of cellular inflammation initially from the adipose tissue to the pancreas and then to the brain, respectively.  This is similar to the metastatic spread of a cancer.  Virtually all chronic disease involves increased inflammation.

Most of us are familiar with "inflammation" as it relates to injuries, but here you’re talking about a particular sort of inflammation that you argue plays a major role (albeit unseen) in the health of our bodies and brains.    

There are two types of inflammation.  The first type is classical inflammation, the kind that hurts. That’s typically why you go to a doctor.  The other type is below the perception of pain. This is cellular inflammation.  Initially it causes disruption of hormonal signaling in the cells.  However, since there is no indication of its presence, it will continue causing increased cellular damage until there is enough accumulated damage that you can call it chronic disease.  It could be obesity, diabetes, heart disease, cancer, or Alzheimer’s, but they are all ultimately caused by cellular inflammation.

Is this type of inflammation reversible?

The best approach to increased inflammation is to follow an anti-inflammatory diet.  This is one that maintains a balance of low-fat protein, low-glycemic carbohydrates (i.e. fruits and vegetables), and moderate amounts of fat that are low in both omega-6 and saturated fats (both of which can increase cellular inflammation) at every meal.  This is because the hormonal responses to any meal will last only five hours.  Such an anti-inflammatory diet can be supplemented with anti-inflammatory supplements rich in omega-3 fatty acids, polyphenols (the chemicals that give fruits and vegetables their color), or ideally both.  Clinical data suggests that cellular inflammation can be rapidly reduced with such a dietary approach. The secret is to follow it for a lifetime.

In your latest book you talk quite a bit about the distinction between omega-6 and omega-3 fatty acids, and the problems associated with an imbalance between the two. What is the main issue here?  

Omega-6 fatty acids are the building blocks to make pro-inflammatory hormones, whereas omega-3 fatty acids are the building blocks to make anti-inflammatory hormones.  You need a balance of both to maintain a healthy immune response.  However, an excess of omega-6 fatty acids or a deficiency of omega-3 fatty acids causes an increase in cellular inflammation.  This increase in cellular inflammation is accelerated in the presence of elevated insulin levels which comes from a high-carbohydrate diet.

Give us a sense of the imbalance in terms of how much omega-6 we’re consuming.

The average American now consumes 7-8% of their total calories as omega-6 fatty acids.  This is nearly a 400% increase in the past century.  A hundred years ago the average omega-6 to omega-3 fatty acid content of the American diet was about 2:1.  Today, it is closer to 20:1.
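Those figures are easy to sanity-check: a "nearly 400% increase" means today's level is roughly five times the old one. A quick sketch, taking the 7.5% midpoint of the 7-8% range quoted above as an assumed starting value:

```python
# Back-of-envelope check of the "nearly 400% increase" claim.
# today_pct is the midpoint of the 7-8% range quoted above (an assumption).
today_pct = 7.5
century_ago_pct = today_pct / 5.0            # a 400% increase = 5x the old level
increase = (today_pct - century_ago_pct) / century_ago_pct * 100
print(century_ago_pct, increase)  # 1.5 400.0
```

That implies omega-6 fats made up only about 1.5% of calories a century ago.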

And you say in your recent book that fast food is a major source of the imbalance.

The typical fast food meal has an even higher level of omega-6 fatty acids than store-bought grocery foods. That’s because omega-6 oils are the cheapest source of calories known and make any food taste better (especially fried chicken or French fries).  Industrialized beef, chicken, and pork products are also rich in omega-6 fatty acids as they also drive the fattening process.

I've read research suggesting that a super saturation of omega-6 fatty acids in our diets may be linked to psychological-emotional issues, even higher rates of violence. Do you think there's credible evidence for this?  

The answer is yes from both epidemiological studies and intervention studies using high-dose omega-3 fatty acids (especially in the treatment of depression, ADHD, and anxiety).  I believe the increase in the cellular inflammation in the brain is causing disruption of neurotransmitter signaling patterns, thus preventing the transmission of the appropriate signals to the interior of the nerve cell.

Many people think that eating more fish, like salmon, will boost omega-3 levels, but you’ve said that there are problems with this approach.    

First, fish are being hunted to extinction.  Second, all fish are contaminated by pollutants we have put into the environment over the past two generations.  These range from mercury (from burning coal) to industrial toxins such as PCBs, dioxins, and flame retardants.  All of these are known neurotoxins and carcinogens.  Third, most people like lean fish, which means they are low in omega-3 fatty acids.  Salmon is the only popular fish that is rich in omega-3 fatty acids.  Finally, only the Japanese eat enough fish to maintain an appropriate balance of omega-6 to omega-3 fatty acids in their blood.

The solution is to use highly purified omega-3 concentrates derived from anchovies and sardines to reduce cellular inflammation.  The first use of fish oil to reduce inflammation was reported in 1989 by Harvard Medical School investigators in The New England Journal of Medicine.  Fortunately, omega-3 fatty acid concentrates today have much higher purity and potency than those used in 1989.

What about farmed fish, particularly the restaurant standbys – farmed salmon and tilapia?

Farmed tilapia has very high levels of omega-6 fatty acids and very low levels of omega-3 fatty acids.  This is because tilapia can grow on cheap vegetable oils instead of the more expensive crude omega-3 oils needed for farmed salmon growth.  This was discussed in a 2008 article in the Journal of the American Dietetic Association.  In fact, the levels of omega-6 fatty acids in farmed tilapia were similar to those found in a hamburger from a fast food restaurant.

Farmed salmon have higher levels of omega-3 fatty acids because they require omega-3 fatty acids for growth.  However, the crude fish oil used in farming is rich in PCBs, so the PCB levels of farmed salmon are about five times those of wild salmon. Meanwhile, with the increasing price of crude fish oil, the levels of omega-3 fatty acids in farm-raised salmon are dropping as cheaper alternatives are used.  These include vegetable oil (two-thirds of the oil used in Norwegian farm-raised salmon is now vegetable oil rich in omega-6 fatty acids) as well as algae, insects, barley protein and trimmings from seafood processing plants.

If someone wants to tackle inflammation and turn things around, where do they begin?  

Try to remember what your grandmother told your parents. Eat small but balanced meals (at least in terms of calories) throughout the day, never consume more low-fat protein at a meal than you can fit in the palm of your hand, and never leave the table without eating all your vegetables. Her final advice was to take a tablespoon of cod liver oil (rich in omega-3 fatty acids) before you left the house.  Who knew she was at the cutting edge of 21st-century anti-inflammation nutrition?  If a person wants more detail on modern anti-inflammatory diets, then I would recommend going to

You can find David DiSalvo on Twitter @neuronarrative and at his website

Posted on June 28, 2015 .

Stress Makes You Want It, But Will You Enjoy It?


Life is lived in loops. Here’s one you may know: we experience stress; to relieve the stress we do something pleasurable; when that pleasure exhausts itself, we experience more stress. Sound familiar? Psychologists tell us that when we run in this loop long enough, we may encounter something called “anhedonia” – the inability to experience pleasure from things we’d normally enjoy. Does binging on chocolate do it for you? Binge long enough and it probably won’t.

To this stinging realization we can add another, and—apologies ahead of time—it’s also a bit of a pill. Researchers have shown that not only does stress predispose us to wanting pleasure, it makes our desire for it drastically out of proportion to our enjoyment. The reward never reaches the level of our want.

To demonstrate this, researchers recruited two groups of study participants—all of whom were chocolate lovers—for some fun with water and sweets. (The chocolate-lover part will make sense in a moment.)

Members of the first group were made to experience stress by holding their hands in ice water (a well-tested means of inducing stress in psych research) while they were observed by the researchers. Those in the other group placed their hands in lukewarm water. After a little while, both groups were told to squeeze a handgrip that, they were instructed, would give them a nice stout whiff of chocolate.

As you might predict, the stressed group squeezed considerably harder for their chocolate – three times harder. Having received their dose of reward, the groups were then asked to rate their satisfaction. You might think that the group desiring the chocolate with three times the intensity would rate it proportionally higher, but in the end the groups’ ratings were really no different. Using stress to spike desire did nothing to increase enjoyment.

Quoting Dr. Tobias Brosch, professor of psychology at the University of Geneva and one of the study’s authors: “Stress seems to flip a switch in our functioning: If a stressed person encounters an image or a sound associated with a pleasant object, this may drive them to invest an inordinate amount of effort to obtain it.”

Which explains why stress is a consistent trigger in everything from failing to stay on diets to addiction relapses. Whatever the object of our desire, feeling stress makes us think we need it like we need air to breathe.

The same dynamic has held true in animal studies with our friends the rats. It seems that if you get rats hooked on cocaine, and then get them clean from their addiction, you can induce a rapid relapse back into addiction by stressing them out with icy cold water. The same principle applies to us: stress is the trigger that makes us want “it” more.

One takeaway: keep a close eye on how you are reacting to stress -- and the earlier the better. Your best chance of affecting your "loop" is well before you've reached the point of smoldering hot desire.

The latest study was published in the Journal of Experimental Psychology: Animal Learning and Cognition.

The newly revised and updated 2018 edition of What Makes Your Brain Happy and Why You Should Do the Opposite is now available. 

You can find David DiSalvo on Twitter @neuronarrative.

Posted on March 17, 2015 .

Five Minutes with Author Dan Ariely on How to Manage Your Time

If you’ve spent any time in the psychology and self-improvement sections of any bookstore, you know that Dan Ariely quite literally wrote the book on human irrationality. With bestsellers like Predictably Irrational and The Honest Truth About Dishonesty, Ariely is a go-to source for knowledge about why we do what we do, even when doing it just doesn’t make much sense.

Now Ariely, professor of psychology and behavioral economics at Duke University, has turned his attention to another topic that vexes the best of us: time management. He’s part of a team that created a smartphone app called Timeful, which I’d describe as an intuitive self-management tool that goes a few steps beyond typical schedulers and time planning apps. I recently spent some time chatting with Ariely about time management and related topics.

DiSalvo:  Time management -- we’re obsessed with it, yet we generally seem really bad at mastering it. Where do you think people often fail in getting it right?

Ariely: I think we are obsessed with time management precisely because we are so bad at it, but the reality is it’s no wonder we are bad at it because it’s a really, really hard thing to do.  Not only is it hard to manage multiple things, but you also have dynamic changes throughout the day in which you have some hours where you are more alert and have high cognitive capacity and some hours where you are more tired. And then things change dynamically like deadlines or additional requests come in, so the prioritization problem is incredibly hard. In fact, I think it’s not humanly possible. So what do we do? We do our best. We say “Ok, what’s the next best thing to do?”

And is technology helping or hurting us in this regard?

Technology is actually making things worse. Take a To Do list for example. I personally have over 729 things on my list in Evernote, and one of those To Dos is to go in and organize all of the other things and figure out what I should do and what I should abandon. The problem is that it’s such a big task, it’s probably not going to happen.  The world is rich, there are lots of things we can do, and we have an easier and easier time of putting things on our plate, and searching through all of those things and figuring out what is the next best thing to do is incredibly tough. We do a terrible job at it and the consequences are that we are stressed, unhappy and not as productive as we could be.

Which explains why when I talk to people about how they manage their time, the overriding vibe I get is stress. Figuring out how to make best use of time is stressful, and dwelling on that, it seems to me, can become its own time-consuming monster.  

If you were a farmer and worked from sunrise to sunset, and farming includes very basic things to do with no real questions, life would be very simple. But we live in an incredibly wonderful age with lots of things vying for our time, more than we can handle, and on top of that we aren’t limited to sunrise to sunset. All of this richness, while wonderful, also creates very strong constraints.

This is why we started Timeful. Imagine that your life is like a factory with a productivity function – we need to figure out, from all of the jobs that can be done, which are the ones that are most productive to do right now.  This depends on deadlines, and reporting requirements, and whether you need maintenance or need to be reinvigorated, etc., and once we understand all of those factors we can do a much better job of managing them. The stress is inevitable because of the richness of our lives, and our challenge is to harness technology to help us figure it out.

I’ve heard you say, in so many words, that our sense of the world “not being on our side”—not acting in our best interests—is sort of correct. Our attention is flooded with distractions on all sides. How do we defend our attention, and our time, against these unrelenting forces?

It’s true that every organization wants our attention. Not only do they control our shopping environment, but they control our phones, (just think about every app that’s trying to get your attention), and they compete. Sometimes this competition yields improved results, but sometimes it creates negative outcomes. It’s really all about trying to understand the “attention economy.” And, of course, right now Android and Apple control the rules for the attention economy to a large degree. Sure we can turn things off ourselves, but we still have a lot of apps trying to get us to do different things all the time.

We don’t always understand how limited our attention is, or how when we switch attention things become even harder. For example, when you check your email, after you finish reading it and go back to your work it takes an extra 15 minutes before you can actually focus again. Shifting of attention to a different mindset is not quick. This is something we need to understand.

Your app, Timeful, is easy to use, straightforward, and integrates well with other programs. My question (to myself and now to you) is, what’s going to make me want to use it more than I did the other apps that I started and then eventually stopped using? 

The question is whether the app would give you sufficient benefits, which are basically to take scheduled things away from people’s lives. There are lots of things that don’t fit the characterization of the calendar. For example, if you have to do laundry, you might need to do it any day this week, sometime in the evening when you have time, but it doesn’t have to be on any particular day from 7-9. So it’s useful to be on your list of things to do and the app can suggest when you should be doing it. As we learn more, we’ll be able to be more helpful in suggesting and making peoples’ lives less stressful.

One of the features that we find most useful is called Good Habits. As an example, you can enter that you want to run three times a week and call your mother once a week and maybe do a five-minute meditation. After setting it once, we can recommend when you should do it depending on your schedule.

Since we’re not far into the New Year, give me your thoughts on New Year’s resolutions. Worth it?

I think New Year’s resolutions are great. You know, we tend to think of ourselves in binary terms. Either good or bad, and once we start being bad we figure what the hell, we might as well enjoy it.  If you think about dieting, for example, someone will be on a diet and then eat a muffin and say “Oh well, I’m not really a dieter, I might as well enjoy it.” New Year’s gives you a chance to start with a clean slate.

The real issue is how can we create rules that we won’t break too quickly? How can we make the rules specific enough that we’ll follow them? A rule like “going on a diet” isn’t helpful because it’s too general. We need something specific like “no dessert during the week” or something along those lines. And then, how do we make the rules not too restrictive so that when we do break one occasionally it doesn't collapse us into bad behavior? These are all things we need to keep in mind to make the most of our resolutions.

You can find David DiSalvo on Twitter @neuronarrative, at Forbes, TIME and Psychology Today. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Posted on January 17, 2015 .

Deconstructing the Power of Overconfidence

Overconfidence. The word itself is irritating. Knowing that it often works is infuriating.

We’d prefer to believe that, like pride, overconfidence will reliably trip and stumble its way to a predictable fall and clear the path for those with level-headed confidence, gilded with humility, to climb the ladder. But that isn’t even how it usually goes.

Psychology research has often asked why that is, and churned up a few possible answers. A 2012 study concluded that even when overconfidence produces subpar results, its charm still wins the day. We might expect someone with more confidence than ability to underperform when pressed. The study tested that expectation and found it more or less accurate—but also found that it really doesn’t matter. Overconfidence may not deliver when objectively tested, but it has a knack for seducing people to such a degree that they ignore the results in favor of keeping the golden child on a pedestal.

If you had to isolate why this happens, it appears to come down to a matter of status—a commodity that overconfidence is expert at creating and nurturing. When managed well, the social status conferred by overconfidence has an aura just shy of magical, capable of keeping our attention diverted from measurable results.

That’s a jarringly paradoxical conclusion when you consider the average person’s gut reaction to "that overconfident jerk." How can we be both repulsed and seduced by the same thing? The question gets stranger in light of another study showing how even rudeness gets a pass if a person’s overconfidence has already alchemized sufficient status.

In one of the study’s experiments, participants watched a video of a man at a sidewalk café put his feet on another chair, tap cigarette ashes on the ground and rudely order a meal. Participants rated the man as more likely to “get to make decisions” and able to “get people to listen to what he says” than participants who saw a different video of the same man behaving politely. Through a few other experiments in the study, the same results prevailed—people tended to rate the rule breakers as more in control and more powerful than people who toed the line.

And what’s the essential ingredient in believing oneself to be above the rules? Overconfidence, of course. (This may also help explain why rude sales associates outsell others at luxury stores.)

Those studies circle the question of why we’re prone to falling for the chutzpah of overconfidence, but say little about why the overconfident are so good at pulling it off. The most recent study on the subject has an answer that’s not likely to lessen our irritation, but, irritatingly, it makes some sense, and can be summarized like this: Belief sells, whether it’s true or not. In the case of overconfidence, the belief in one’s ability—however out of proportion to reality—generates its own infectious energy. Self-deception is a potent means of convincing the world to see things your way.

Participants in the study (a group of college students) were asked to rate their own and their peers' abilities at the beginning of a 6-week course. About half were under-confident and a little less than half were overconfident (with the small balance of students accurately judging their abilities). They were then asked to reassess at the end of the course, after everyone had a chance to perform and real results were there for all to see.

The study showed that at the start of the course, overconfident students received higher peer ratings—and at the end of the course, regardless of how well or poorly they did, the overconfident students were still rated higher than others. In effect, the students' actual performance hardly mattered compared to the seductive signals sent by those believing they were the best.

This study added the wrinkle of gauging participants’ level of self-deception about their abilities, and found a strong correlation between the social sway of overconfidence and depth of self-deception. As the researchers summed it up, “Our findings suggest that people may not always reward the more accomplished individual but rather the more self-deceived.”

We may not like that conclusion, but it’s difficult to argue that it isn’t in evidence around us every day. People who don’t believe in themselves—whether that belief is well-grounded or not—aren’t likely to convince others to buy in. That’s partly what the psychological dynamic of self-efficacy is about: If you expect others to think you’re capable, you’d better believe it yourself. That's as true for healthy confidence as it is for its inflated alter-ego.

What the latest study and elements of the others are telling us is that self-deception is an especially potent brand of status fertilizer. When packaged with personality, it makes others want to believe even when the results would counsel otherwise. And though that doesn’t make the topic any less infuriating, it does throw some light on what makes overconfidence effective, despite its reputation.

The latest study was published in the online journal PLOS ONE.

You can find David DiSalvo on Twitter @neuronarrative, at Forbes, TIME and Psychology Today. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Posted on November 15, 2014 .

Your Profile Photo is a Liar


Be honest. Do you draw conclusions about someone based on his or her profile photo?

Whether it’s on dating websites or Facebook or any other social media venue, the power of a single photo is immense. Looking at a person’s profile photo, we develop first impressions that frame the rest of what we see and read.

Psychology researchers want us to know something about our profile photo-centrism: it’s a lie, and it’s leading us to draw conclusions that likely have zero basis in reality.

"Our findings suggest that impressions from still photos of individuals could be deeply misleading," says psychological scientist and study author Alexander Todorov of Princeton University, an expert on the personality dynamics underlying first impressions.

Todorov and his research team conducted a series of studies to demonstrate how easily swayed we are by profile photos, and how even slight variations in photos can significantly change our opinions of a person’s personality.

Researchers asked participants in an online survey to view and rate headshots on personality characteristics, including attractiveness, competence, creativity, cunning, extraversion, meanness, trustworthiness, and intelligence.  The photos were all taken in similar lighting, but headshots of some of the people were varied to show slightly different facial expressions.

The results showed that participants’ personality ratings of these slightly changed facial expressions varied just as much as their ratings of different people.  In other words, virtually any change in photos of the same person altered personality impressions as much as viewing photos of different people.

In another study, the researchers asked participants to rate headshots shown in different contexts. The results in this case showed that participants’ ratings changed solely based on which context the photo appeared in.  According to the research team, “(participants) tended to prefer one shot of an individual when they were told the photo was for an online dating profile, but they preferred another shot when they were told the individual was auditioning to play a movie villain, and yet another shot when they were told he was running for political office.”

The studies also examined how long it took for someone to make a personality judgment based on a profile photo, and found that strong preferences for specific images developed even when the photos were shown for a fraction of a second – a result that underscores just how sure we are that a profile photo tells a true story.

The takeaway from these studies is that our impressions formed by looking at profile photos are extremely malleable, no matter how sure we are that the photos are telling us something accurate about someone’s personality. That's worth keeping in mind especially on dating sites, where we're tempted to draw sweeping personality conclusions based on a passing glance at a photo.

The study was published in the journal Psychological Science.

You can find David DiSalvo on Twitter @neuronarrative, at Forbes, and at Psychology Today. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Posted on September 13, 2014 .

Study: Fish Oil Prevents Brain Shrinkage and Cognitive Decline in Older Adults


Three more cases of Alzheimer’s disease will have been diagnosed by the time you finish reading this article. More than 5 million people have Alzheimer’s in the United States alone (44 million worldwide), and the rate of new diagnosis is about one patient every minute, with no cure on the horizon. Now a new study adds evidence to the argument that fish oil supplementation could be one of the best preventives we have against the disease--at least for people not at genetic risk of developing it.

Researchers from Rhode Island Hospital studied three groups of older adults, ages 55-90, using neuropsychological tests and brain magnetic resonance imaging (MRI) every six months. The group included 229 adults with no signs of the disease; 397 who were diagnosed with mild cognitive impairment; and 193 with Alzheimer’s. All participants were part of the Alzheimer’s Disease Neuroimaging Initiative (ADNI), which began in 2003 and ended in 2010.

Results showed that adults taking fish oil, who had not yet developed Alzheimer’s, experienced significantly less cognitive decline and brain shrinkage than adults not taking fish oil. Cognitive decline was measured using the Alzheimer's Disease Assessment Scale (ADAS-cog) and the Mini Mental State Exam (MMSE). (Unfortunately, the study did not specify the amount of fish oil taken, nor the percentage of EPA and DHA in the supplements.)

These are promising results, but they have one notable caveat: the benefits of taking fish oil only held true for people lacking the main genetic risk factor for developing Alzheimer’s, known as APOE ε4. The researchers think that people with APOE ε4 are incapable of metabolizing DHA, the fatty acid in fish oil thought to promote cognitive benefits.

The researchers add, however, that it’s still possible that starting fish oil supplementation during or before middle age could protect against developing Alzheimer’s even for people with the genetic marker. If you think of the gene for Alzheimer’s as a light switch, taking fish oil earlier in life could prevent the switch from being flicked on.

At least that’s the hope, and given the fact that Alzheimer’s—the sixth leading cause of death in the U.S.—still evades a cure, fish oil will continue to be a hot target of cognitive research as a possible shield against developing the disease.

The study was published in the journal Alzheimer's & Dementia.

You can find David DiSalvo on Twitter @neuronarrative and at Forbes. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Posted on August 30, 2014 .

How Feeling Grateful Improves Your Decision Making


Ancient wisdom traditions have long held that gratitude is a prerequisite for fulfillment. Focusing on what we have, instead of what we think we need, fortifies the mind against rampant desire that ultimately leaves us feeling empty.

The difficulty we face in living out that wisdom comes in the form of challenges to self-control – our perilous dance with instant gratification and temptation. Now new research suggests that gratitude can help us out here as well, fortifying the patience that underpins sound decision-making.

Researchers tested this theory by putting study participants through a test of financial self-control after they were pre-conditioned to feel one of three emotional states: (1) Grateful, (2) Happy, or (3) Neutral. The pre-conditioning was achieved by having the participants write about a life experience that made them feel either grateful, happy, or left them feeling a lot of nothing.

The financial test was very basic: participants could either choose to receive $54 now or $80 in three days. Alternatively, they could negotiate to receive a lesser or greater amount now instead of more later (for example, a participant could choose to take a $60 payout now instead of an $85 payout later).
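It's worth pausing on just how steep that trade-off is. As a back-of-the-envelope illustration using the offer amounts above (the study itself reports choices, not discount rates – this arithmetic is mine, not the researchers'):

```python
# Illustrative arithmetic only: the per-day discount factor implied by
# valuing $54 today the same as $80 three days from now.

def implied_daily_discount(now_amount, later_amount, delay_days):
    """Daily discount factor d such that later_amount * d**delay_days == now_amount."""
    return (now_amount / later_amount) ** (1 / delay_days)

d = implied_daily_discount(54, 80, 3)
# d is roughly 0.88: each day of waiting shaves about 12% off the
# money's subjective value.
annualized = d ** 365  # compounds to essentially zero over a year
```

Anyone consistently grabbing the $54 is, in effect, applying a roughly 12-percent-per-day discount – an astronomically impatient rate if sustained for any length of time.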

The typical result in these tests is that most people opt for less money upfront instead of waiting for more. Participants in this study who were pre-conditioned to feel happy or no particular emotion mostly behaved exactly as expected – they wanted the cash now.

But participants feeling grateful showed more restraint, with significantly more in this group opting to wait longer for the larger amount of money. And the more grateful participants reported feeling, the more patient they were.

“Showing that emotion can foster self-control and discovering a way to reduce impatience with a simple gratitude exercise opens up tremendous possibilities for reducing a wide range of societal ills from impulse buying and insufficient saving to obesity and smoking,” said study co-author Assistant Professor Ye Li from the University of California, Riverside School of Business Administration.

It’s not entirely clear why feeling more grateful increases patience, but it may simply be that gratitude is an emotional counterbalance to selfishness.

Juliann Wiese, an executive coach who consults with several top-tier companies, says the study buttresses something she’s found to be repeatedly true in her experience.  "In my coaching practice, I've observed that once people shift their focus to what's already worthwhile in their lives--instead of what they think they are missing--their decision-making skills rapidly improve."

Just as people in this study were pre-conditioned to feel grateful, there’s likely benefit in putting ourselves in a grateful mindset to alter our perspective. According to Wiese: "Filtering decisions through a gratitude-centered perspective isn't only important for making better decisions on the job, but across all aspects of our lives. People are simply more patient and less prone to jumping at the first offer or fleeting hint of temptation when they’re grounded in gratitude.”

The study was published in the journal Psychological Science.

You can find David DiSalvo on Twitter @neuronarrative and at Forbes. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Posted on August 9, 2014.

Why Our Brains Love The Curve


You’ve seen the advertisements all around the Web: the curve is coming to a TV near you.  It seems at first glance a simple innovation, in some ways even a predictable one. Watching commercials for Samsung’s new line of televisions, I find myself wondering why it’s taken this long for curved screens to arrive. And it’s altogether possible that I’m asking that question because to my brain—and quite likely to yours—the curve simply fits.

Behavioral researchers have known this for some time; people consistently show a preference for curves over hard lines and angles. Whether the object is a wristwatch, a sofa, or a well-designed building, curves curry favor. Neuroscience, following the lead of behavioral science, is on the hunt for a neurally hardwired preference for curvy elegance.

Of course, televisions with curved screens offer technical advantages beyond aesthetics. Curvature reduces reflection, making the viewing experience easier on the eyes, and simultaneously creates the illusion that the viewer is surrounded by the screen. The so-called “sweet spot” at the center of the illusion is a comfortable magnet for our attention. (Most of this has been known since the 1950s with the introduction of the first curved movie theater screen, the Cinerama.)

But aside from those advantages, studies suggest that merely viewing the curves of an object triggers relief in our brains – an easing of the threat response that keeps us on guard so much of the time.  While hard lines and corners confer a sense of strength and solidity, they are also subtly imposing. Something about them keeps our danger-alert system revved.

Research shows that we subjectively interpret sharp, hard visual cues as red flags in our environment (with corresponding heightened activity in our brain’s threat tripwire, the amygdala).  Even holding a glass with pronounced hard lines and edges has been shown to elevate tension across the dinner table. Curves take the perceptual edge off.

The softness of contour may also play out in our brains not unlike an emotionally satisfying song or poem. A new discipline known as neuroaesthetics—an ambitious vector between neuroscience and the fine arts—is exploring this idea, shedding light on why the love of curves is seducing technology manufacturers. Recent research under this new banner suggests that curved features in furniture, including TVs, trigger activity in our brains’ pleasure center. We derive a buzz from curves much as we do when viewing a beautiful work of art.

The overlap between the visual impact of things as commonplace as chairs, tables and TVs and an emotional impact at such a high level (a level we’d normally reserve for art and music) suggests that the ordinary elements in our environments aren’t so ordinary after all. Skilled industrial designers have known this for quite some time, but now science is adding an explanatory dimension that makes the point all the more compelling.

And if it’s true, as the research indicates, that the curve is an emotional elixir for anxiety-prone brains, the latest trend is likely to take hold and transform our interface with all things digital. We may eventually look back on the non-curved days of technology the way we think of black-and-white television now. The term “flat-screen TV” will go the way of “rotary dial phone.”

Having said that, the true test of the curvy trend’s appeal won’t happen until the price point drops considerably. Curves may calm, but the prices on many curved screen TVs are anything but calming. We’ll have to wait a while for that to play out to know whether the brain's love of the curve translates into an enduring shift in technology.

You can find David DiSalvo on Twitter @neuronarrative and at Forbes. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Posted on July 21, 2014.

How Your Blood Sugar Could Be Wrecking Your Relationships


We’ve all known people who should have to wear a flashing red DANGER! sign if they miss lunch, though even without the warning we instinctively know to steer clear if someone is running on empty. A grumbling stomach means a drop in blood sugar, and through excruciating experience most of us realize that means trouble. But could the blood sugar-anger connection lurk behind more relationship conflicts than we realize?

A new study probed that question with a research methodology as painfully funny as it was effective. Researchers rounded up 107 married couples for a 21-day couples’ boot camp to draw a direct line between blood glucose (aka circulating blood sugar) and aggression.

First they asked the couples to complete a relationship questionnaire that evaluated their level of satisfaction with their marriages, which allowed the research team to control for variables like how rocky the marriage was to begin with. They also measured all of the participants’ blood glucose levels to set a benchmark, and continued to measure the levels throughout the 21-day study.

The researchers predicted that drops in blood sugar would consistently correlate with heightened aggression between the spouses. Aggression was defined in two ways: aggressive impulse and aggressive behavior.  The distinction was meant to identify aggression in thought versus action, because aggression rarely happens in a vacuum—there’s usually a thought impulse that precedes it, even if that impulse doesn’t occur immediately before the action but compounds over time.

To test aggressive impulse, the researchers gave participants a voodoo doll and 51 pins, with instructions to place as many pins in the doll every night as needed to show how angry they were with their spouse. A light conflict day might get just a couple pokes, while a “cover the kids' eyes and ears” day might warrant the full 51 to the head.

To test aggressive behavior, the researchers had the spouses wear headphones while they competed against each other in 25-part tasks. After each task, the winner decided how loudly and for how long to blast the loser with a noise through the headphones.

At the end of the 21 days, with riddled voodoo dolls and ringing ears aplenty, the hypothesis was borne out. The lower the blood glucose level, the more pins the spouses poked, and the louder and longer they blasted their partners through the headphones.

The study provides a couple of worthwhile takeaways. First, quoting Brad Bushman, professor of psychology and communication at Ohio State University and lead study author, “Before you have a difficult conversation with your spouse, make sure you're not hungry."  Simple to say, harder to do.

Second, and the reason why that’s such good advice, is that our brains are energy hogs. "Even though the brain is only two percent of our body weight, it consumes about 20 percent of our calories. It is a very demanding organ when it comes to energy," added Bushman. When the brain is short on energy, it’s also short on self-control, and the door is opened for aggressive impulses and behavior to take center stage. And if the study results are a true indication, we’re redlining our self-control more often than we realize.

I’d love to see a follow-up study that attempts to track these results against the blood sugar rollercoaster associated with fast food-laden diets. I have a suspicion that glucose-related aggression isn’t solely about how much or little food we eat, but also the sorts of food we eat. Just a hunch, but it stands to reason that shoveling in foods that cause our blood sugar levels to spike and crash day after day may also trigger spousal (and other) explosions. A little food for thought while you're sitting in the drive-thru.

The study was published in the Proceedings of the National Academy of Sciences.

You can find David DiSalvo on Twitter @neuronarrative, at his website The Daily Brain, and on YouTube at Your Brain Channel. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Posted on June 15, 2014.

When It Comes To Choosing Mates, Women And Men Often Get Framed


If I tell you that seven of ten doctors believe a medication is helpful, the positive weight of the seven endorsements will trump potential negatives. But if I tell you that three of ten doctors believe that a medication should be avoided, the weight of those three negative critiques will overpower potential positives.  The information in either case is the same – the only difference is how it's framed.

Our susceptibility to the framing bias has been demonstrated in study after study (most notably by Nobel Prize-winning psychologist Daniel Kahneman), and now a new study by Concordia University researchers shows how framing influences our selection of love interests.

Hundreds of study participants were given positively and negatively framed descriptions of potential partners. For example:

"Seven out of 10 people who know this person think that this person is kind." [positive frame]


"Three out of 10 people who know this person think that this person is not kind." [negative frame]

The researchers tested the framing effect across six attributes that previous research shows rank high in importance to men and women: four that matter more to one sex than to the other, and two that matter equally to both:

  • Attractive body (usually more important to men)
  • Attractive face (usually more important to men)
  • Earning potential (usually more important to women)
  • Ambition (usually more important to women)
  • Kindness (equally important to both)
  • Intelligence (equally important to both)

Participants evaluated both “high-quality” (e.g. seven out of 10 people think this person is kind) and “low-quality” (e.g. three out of 10 people think this person is kind) prospective mates for each of these attributes, in the context of a short-term fling or a long-term relationship.

What the research team found is that more often than not, women were significantly less likely to show interest in men described with the low-quality frame, even though they were being presented with exactly the same information as they were in the high-quality frames.

"When it comes to mate selection, women are more attuned to negatively framed information due to an evolutionary phenomenon called 'parental investment theory,'" says study co-author and Concordia marketing professor Gad Saad, a noted researcher on the evolutionary and biological roots of consumer behavior.

"Choosing someone who might be a poor provider or an unloving father would have serious consequences for a woman and for her offspring. So we hypothesized that women would naturally be more leery of negatively framed information when evaluating a prospective mate.”

In particular, women were most susceptible to the framing bias when evaluating a man’s earning potential and ambition.

Men, on the other hand, fell prey to framing most often when evaluating a woman’s physical attractiveness.

While these results at first seem to reinforce stereotypes about how women and men seek mates, they make sense in light of what we know about evolutionary psychology. And they provide an important takeaway for both sexes: before you draw a final conclusion about a would-be mate, consider whether you’re being overly influenced by how his or her good or bad attributes have been framed, either by others or by the person in question. Better to check your bias early than suffer its consequences later.

The study was published in the journal Evolution and Human Behavior.

You can find David DiSalvo on Twitter @neuronarrative, at his website The Daily Brain, and on YouTube at Your Brain Channel. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Posted on June 8, 2014.

Could Cooperation and Corruption Originate with the Same Hormone?

We humans contend with quite a few wicked flip sides in our personal and interpersonal lives. Gratitude can transform into resentment. Concern can morph into apathy. Love can quickly become hate. New research digs deeper into a similar neurobiological duality that can, and frequently does, run rampant in groups: the Jekyll and Hyde of cooperation and corruption.

Researchers hypothesized that oxytocin—the same hormone that previous studies have linked to collaboration and altruism—can predispose us to acting dishonestly if we think doing so will benefit our group of choice. A “group” in this case means anyone to whom we feel some sense of obligation, be it family, coworkers, peers, political cronies or our Friday night craft beer buddies.

In our day-to-day lives, oxytocin is thought to play a big role in how closely bonded we feel to our group. It isn’t just the “cuddle hormone” (often discussed in studies about love and affection) but also the group-cohesion hormone.

To test the hypothesis, the research team gave one group of healthy male participants a dose of oxytocin via nasal spray and another group a placebo nasal spray (neither the participants nor the researchers knew which participants received which spray). The participants were then asked to toss a coin multiple times and make predictions on whether they’d flip heads or tails, and then self-report on the results. How well they did, they were told, would win or lose money for their fellow group members. How they reported—honestly or dishonestly—was kept anonymous, assuring the participants that how they chose to respond wouldn’t reflect back on them personally.
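Because reports were anonymous, no single lie could be pinned on anyone – but dishonesty still surfaces in the aggregate, since honest predictions of a fair coin are right about half the time, and group-level reported accuracy well above 50% implies misreporting. A rough simulation of that logic (purely illustrative; the lie rate and group sizes below are invented, not the study's):

```python
import random

random.seed(0)

def reported_accuracy(n_participants, n_flips, lie_prob):
    """Fraction of coin-flip predictions reported as correct.

    Each honest prediction of a fair coin is right with probability 0.5.
    With probability lie_prob, a miss is (anonymously) reported as a hit.
    """
    hits = 0
    total = n_participants * n_flips
    for _ in range(total):
        hit = random.random() < 0.5  # honest outcome of the prediction
        if not hit and random.random() < lie_prob:
            hit = True  # the anonymous lie
        hits += hit
    return hits / total

honest = reported_accuracy(60, 10, lie_prob=0.0)     # hovers near 0.50
dishonest = reported_accuracy(60, 10, lie_prob=0.3)  # noticeably above 0.50
```

No individual report is suspect on its own; only the group average gives the lying away – which is exactly what let the researchers keep responses anonymous.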

We might guess that participants would lie more often about the results only if they, individually, could benefit – but instead participants given oxytocin lied significantly more about the coin flip than the placebo group only if doing so gained money for their fellow group members. And they lied for the group even if they thought that the favor wouldn't be reciprocated.

To find out how participants would react if they thought they’d benefit individually, the researchers put another group through the same testing conditions but told participants that the results of their predictions would only win or lose them money, with no group benefit or loss attached. The results showed that oxytocin did not influence participants to lie any more than those in the placebo group.

In other words, oxytocin promoted lying for group but not individual benefit.

The study has a few limitations, the most obvious of which is that it used only male participants. Whether or not oxytocin would influence females toward group dishonesty is impossible to tell from these results.

But, at least for men, it seems that higher levels of oxytocin potently affect decisions to lie for the group’s benefit. This may help explain the “you go, I go, we all go” nature of fraternal groups. And the results highlight the role of group bonding in forging hard-to-crack corruption. Last year's hit movie, The Wolf of Wall Street, a true tale about a group of corrupt stockbrokers making an obscene amount of ill-gotten money and lying to ensure that no one got caught (at least for a while), comes to mind as a vivid illustration.

The study was published in The Proceedings of the National Academy of Sciences.

You can find David DiSalvo on Twitter @neuronarrative, at his website The Daily Brain, and on YouTube at Your Brain Channel. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Posted on May 18, 2014.

Your Brain Channel Has Launched!

Hi everyone! Just wanted to let you know that I've launched a new science video channel on YouTube called Your Brain Channel. We'll be featuring brief "News You Can Use" video segments on a range of science-related topics. Please check out the first couple of entries on the green tea-memory connection and testing the sleep debt theory. Plenty more videos to come, so check back often. Thanks! 


The Connection Between Playing Video Games and a Thicker Brain

For all the negative news about the alleged downsides of playing video games, it’s always surprising to come across research that shows a potentially huge upside. A new study fills the bill by showing that heavy video game play is associated with greater “cortical thickness” – a neuroscience term meaning greater density in specific brain areas.

Researchers studied the brains of 152 adolescents, both male and female, who averaged about 12.6 hours of video gaming a week. As one might guess, the males, on average, played more than the females, but all of the participants spent a significant amount of time with a gaming console. The research team wanted to know if more time spent gaming correlated with differences in participants’ brains.

What they found is that the brains of adolescents who spent the most time playing video games showed greater cortical thickness in two brain areas: the left dorsolateral prefrontal cortex (DLPFC) and the left frontal eye field (FEF).

The prefrontal cortex is often referred to as our brain’s command and control center. It’s where higher-order thinking takes place, like decision-making and self-control. Previous research has shown that the DLPFC plays a big part in how we process complex decisions, particularly those that involve weighing short-term objectives against their long-term implications. It’s also where we make use of our brain’s working memory resources – the information we keep “top of mind” for quick access when making a decision.

The FEF is a brain area central to how we process visual-motor information and make judgments about how to handle external stimuli. It’s also important in decision-making because it allows us to efficiently figure out what sort of reaction best suits what’s happening around us. What we call “hand-eye coordination” depends in part on this process.

Together, the DLPFC and FEF are crucial players in our brain’s executive decision-making system. Greater “thickness” in these brain areas (in other words, more connections between brain cells) indicates a greater ability to juggle multiple variables, whether those variables have immediate or long-term implications, or both.

While this study doesn’t quite show that playing hours of video games each week causes these brain areas to grow thicker, the correlation is strong – strong enough to consider the possibility that gaming is sort of like weight lifting for the brain.

And that, even more than the video game connection, is what makes this study really interesting. It suggests that the popular terms “brain training” and "brain fitness" are more than marketing ploys to sell specialized software. If it’s true that playing video games is not unlike exercise that beefs up our brain’s decision-making brawn, then it logically follows that we can improve our brains not only functionally but physically, with practices designed for the purpose. Future research will continue exploring precisely that possibility.

The study was published in the online journal PLoS ONE.

You can find David DiSalvo on Twitter @neuronarrative and at his website, The Daily Brain. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Posted on May 1, 2014.