The Daily Brain is a website for people who love asking questions and searching for answers – especially when the answers lead to new questions.

Sunday, April 6, 2014

How A Tiny Bit Of Procrastination Can Help You Make Better Decisions

Decision-making is what you might call “practical science.” Findings on how we make decisions have direct applicability to life outside the psychology lab, and in recent years quite a lot has been said about this most commonplace, yet complicated, feat of mind. A new study adds to the discussion by suggesting that a wee bit of procrastination can make us better decision-makers.

Researchers from the Columbia University Medical Center wanted to know if they could improve decision accuracy by inserting just a smidgen more time between people’s observation of a problem and their decision on how to respond.

The research team conducted two experiments to test this hypothesis. First, they asked study participants to make a judgment about the direction of a cluster of rapidly moving dark dots on a computer monitor. As the dots (called the “target dots” in the study) traveled across the screen, participants had to determine if the overall movement was right or left. At the same time, another set of brighter colored dots (the “distractor dots”) emerged on the screen to obscure the movement of the first set. Participants were asked to make their decisions as quickly as possible.

When the first and second set of dots moved in generally the same direction, participants completed the task with near-perfect accuracy. When the second set of dots moved in a different direction than the first, the error rate significantly increased. Simple enough.

The second experiment was identical to the first, except this time participants were told to make their decisions when they heard a clicking sound. The researchers varied the clicks to sound between 17 and 500 milliseconds after the participants began watching the dots – a timespan chosen to mimic real-life situations, such as driving, where events unfold so quickly that the intervals between them are almost imperceptible.

The research team found that when participants’ decisions were delayed by about 120 milliseconds, their accuracy significantly improved.

"Manipulating how long the subject viewed the stimulus before responding allowed us to determine how quickly the brain is able to block out the distractors and focus on the target dots," said Jack Grinband, PhD, one of the study authors. "In this situation, it takes about 120 milliseconds to shift attention from one stimulus, the bright distractors, to the darker targets."

The researchers were careful to distinguish “delaying” from “prolonging” the decision process. There seems to be a sweet spot that gives the brain just enough time to filter out distractions and focus on the target. With too little time, the brain tries to decide while it’s still processing the distractions. With too much time, the process can be derailed by new distractions.
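To make that sweet spot concrete, here’s a toy simulation of the idea (my own illustration, not the study’s model or analysis; every number in it is an assumption): evidence about the dots accumulates over time, but for an assumed 120 milliseconds it is dominated by the distractors, and only after attention shifts does it start favoring the targets.

```python
import random

# Toy evidence-accumulation sketch (illustration only, not the study's model).
# Before attention shifts to the target dots (assumed here to take ~120 ms),
# the accumulator integrates misleading evidence from the bright distractors;
# afterward it integrates evidence from the darker targets. The "decision" at
# the cue is simply the sign of the accumulated evidence.

ATTENTION_SHIFT_MS = 120   # assumed latency to refocus on the targets
DISTRACTOR_DRIFT = -0.5    # evidence/ms pulling toward the wrong answer
TARGET_DRIFT = 1.0         # evidence/ms pulling toward the right answer
NOISE = 8.0                # momentary noise in the evidence signal

def accuracy(cue_delay_ms, trials=2000):
    """Fraction of correct decisions when the cue sounds at cue_delay_ms."""
    correct = 0
    for _ in range(trials):
        evidence = 0.0
        for t in range(cue_delay_ms):
            drift = DISTRACTOR_DRIFT if t < ATTENTION_SHIFT_MS else TARGET_DRIFT
            evidence += drift + random.gauss(0, NOISE)
        correct += evidence > 0
    return correct / trials

for delay in (17, 60, 120, 250, 500):
    print(f"cue at {delay:3d} ms -> accuracy {accuracy(delay):.0%}")
```

In this sketch, accuracy sits below chance until viewing time exceeds the assumed shift latency and then climbs quickly. It deliberately captures only the “too little time” half of the story; a fixed drift can’t model how an overly long delay invites new distractions.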

If you’re wondering how anyone can actually do this with so little time to make a decision, the answer—suggested by this study—is practice. Just as the participants were cued by the clicks to make a decision, it would seem that we have to train ourselves to delay just long enough to filter distractions.

Said another way, doing nothing--for just a tiny amount of time--gives the brain an opportunity to process and execute. (In my book, Brain Changer, I refer to this as the "awareness wedge" because it's a consciously inserted "wedge" between the immediacy of whatever situation we're facing and our next action.)

This research also underscores just how dangerous it can be to add distractions to the mix—like using a phone while driving—when our brains already need a cushion of time to filter the normal array of distractions we experience all the time.

The study appears in the online journal PLOS ONE.

You can find David DiSalvo on Twitter @neuronarrative and at his website, The Daily Brain. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Saturday, March 15, 2014

The Good and Bad News About Your Sleep Debt

Sleep, science tells us, is a lot like a bank account with a minimum-balance penalty. You can short the account a few days a month as long as you replenish it with fresh funds before the penalty kicks in. This understanding, known colloquially as “paying off your sleep debt,” has held sway over sleep research for the last few decades, and has served as a comfortable context for popular media to discuss sleep with weary-eyed readers and listeners.

The question is – just how scientifically valid is the sleep debt theory?

Recent research targeted this question by testing the theory against a few things that sleep, or the lack of it, is known to influence: attention, stress, daytime sleepiness, and low-grade inflammation. The first three are widely known for their link to sleep, while the last—inflammation—isn’t, but should be. Low-grade tissue inflammation has been increasingly linked to a range of health problems, with heart disease high on the list.

Study participants were first evaluated in a sleep lab across four nights of eight-hour sleep to establish a baseline. This gave the researchers normal levels of attention, stress, sleepiness and inflammation to measure against.

The participants then endured six nights of six-hour sleep (a decent average for someone working a demanding job and managing an active family and social life). They were then allowed three nights of 10-hour catch-up sleep. Throughout the study, participants’ health and ability to perform a series of tasks were evaluated.
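Taken literally as accounting, the study’s schedule works out like this (a trivial sketch of the bank-account analogy in hours; the study itself measured biomarkers and task performance, not hour totals):

```python
# Sleep debt as naive bank-account arithmetic, using the study's schedule
# and its eight-hour lab baseline.

BASELINE = 8                  # hours/night established during the baseline phase
deprivation_nights = [6] * 6  # six nights of six-hour sleep
recovery_nights = [10] * 3    # three nights of ten-hour catch-up sleep

debt_accrued = sum(BASELINE - h for h in deprivation_nights)  # 12 hours
debt_repaid = sum(h - BASELINE for h in recovery_nights)      # 6 hours

print(f"accrued {debt_accrued} h, repaid {debt_repaid} h, "
      f"still owing {debt_accrued - debt_repaid} h")
```

Even by this naive arithmetic, the three catch-up nights repay only half the debt in raw hours; the results below suggest the functional debt is stickier still.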

Sleep debt theory predicts that the negative effects from the first six nights of minimal sleep would be largely reversed by the last three nights of catch-up sleep – but that’s not exactly what happened.

The analysis showed that the six nights of sleep deprivation had a negative effect on attention, daytime sleepiness, and inflammation as measured by blood levels of interleukin-6 (IL-6), a biomarker for tissue inflammation throughout the body — all as predicted. It did not, however, have an effect on levels of the stress hormone cortisol—the biomarker used to measure stress in the study—which remained essentially the same as baseline levels.

After three nights of catch-up sleep, daytime sleepiness returned to baseline levels – score one for sleep debt theory. Levels of IL-6 also returned to baseline after catch-up – another score in the theory’s corner. Cortisol levels remained unchanged, but that’s not necessarily a plus for the theory (more on that in a moment).

Attention levels, which dropped significantly during the sleep-deprivation period, didn't return to baseline after the catch-up period. That’s an especially big strike against the theory, since attention, perhaps more than any other measurement, directly affects performance. Combined with the many other draws on attention—like using a smartphone while trying to drive—minimal sleep isn’t just a hindrance, it’s dangerous, and this study tells us that sleeping in heavily on the weekends won’t restore it.

Coming back to the stress hormone cortisol, the researchers point out that its level remaining relatively unchanged probably indicates that the participants were already sleep deprived before they started the study. Previous research has shown a strong connection between cortisol and sleep; the less sleep we get, the higher the level of the stress hormone circulating in our bodies, and that carries its own set of health dangers. This study doesn’t contradict that evidence, but also doesn’t tell us one way or the other if catch-up sleep decreases cortisol levels.

The takeaway from the study is that catch-up sleep helps us pay off some, but by no means all, of our sleep debt. And given the results on impaired attention, another takeaway is that it’s best to keep your sleep-deprived nights to a minimum. Just because you slept in Saturday and Sunday doesn’t mean you’ll be sharp Monday morning.

You can find David DiSalvo on Twitter @neuronarrative and at his website, The Daily Brain. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Friday, February 28, 2014

Balancing the Self-Control Seesaw

Imagine a seesaw in your brain. On one side is your desire system, the network of brain areas related to seeking pleasure and reward. On the other side is your self-control system, the network of brain areas that throw up red flags before you engage in risky behavior. The tough questions facing scientific explorers of behavior: what makes the seesaw too heavy on either side, and why is balance so difficult to achieve?

A new study from researchers at the University of Texas at Austin, Yale and UCLA suggests that for many of us, the issue is not that we’re too heavy on desire, but rather that we’re too light on self-control.

Researchers asked study participants hooked up to a magnetic resonance imaging (MRI) scanner to play a video game designed to simulate risk-taking. The game, the Balloon Analogue Risk Task (BART), has been shown in past research to correlate well with self-reported risk-taking such as drug and alcohol use, smoking, gambling, driving without a seatbelt, stealing and engaging in unprotected sex.

The research team used specialized software to look for patterns of activity across the brain that preceded someone making a risky or safe decision while playing the game.

The software was then used to predict what other subjects would choose during the game based solely on their brain activity. The results: the software accurately predicted people's choices 71 percent of the time.

What this means is that there’s a predictable pattern of brain activity associated with choosing to take or not take risks.
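To illustrate the general technique (a minimal sketch of pattern classification with leave-one-subject-out validation, run on synthetic data; the study’s actual software, features, and preprocessing aren’t described here), the approach looks something like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Sketch: learn a pattern of activity across many "brain regions" that
# separates risky from safe choices, then test it on a subject the model
# has never seen (leave-one-subject-out cross-validation). Synthetic data.

rng = np.random.default_rng(0)
n_subjects, trials_per_subject, n_regions = 20, 80, 100

# A shared "executive control" pattern that is dampened on risky trials (label 1).
pattern = rng.normal(size=n_regions)
X, y, subject = [], [], []
for s in range(n_subjects):
    labels = rng.integers(0, 2, trials_per_subject)
    signal = np.outer(1.0 - 0.6 * labels, pattern)  # weaker when choosing risk
    X.append(signal + rng.normal(scale=2.0, size=(trials_per_subject, n_regions)))
    y.append(labels)
    subject.append(np.full(trials_per_subject, s))
X, y, subject = np.vstack(X), np.concatenate(y), np.concatenate(subject)

accuracies = []
for s in range(n_subjects):
    train, test = subject != s, subject == s
    clf = LogisticRegression(max_iter=1000).fit(X[train], y[train])
    accuracies.append(clf.score(X[test], y[test]))
print(f"mean accuracy on held-out subjects: {np.mean(accuracies):.1%}")
```

The key move, mirrored in the quote below, is testing on a subject the model has never seen: good held-out accuracy means the pattern generalizes across brains rather than memorizing one person’s quirks.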

"These patterns are reliable enough that not only can we predict what will happen in an additional test on the same person, but on people we haven't seen before," said Russ Poldrack, director of UT Austin's Imaging Research Center and professor of psychology and neuroscience.

The especially intriguing part of this study is that the researchers were able to “train” the software to identify specific brain regions associated with risk-taking. The activity patterns fell within what are commonly known as the “executive control” regions of the brain, which encompass functions like mental focus, working memory and attention. The patterns identified by the software suggest a decrease in intensity across the executive control regions when someone opts for risk, or is simply thinking about doing something risky.

"We all have these desires, but whether we act on them is a function of control," says Sarah Helfinstein, a postdoctoral researcher at UT Austin and lead author of the study.

Coming back to the seesaw analogy, this research suggests that even if the desire side holds steady, the self-control side appears to ease off in the face of risk, and less weight on that side of the seesaw naturally tips the balance toward desire.

And that’s under normal conditions. Add variables like peer pressure, sleep deprivation and drug and alcohol use to the equation--all of which further handicap self-control--and the imbalance can only become more pronounced.

That’s what the next phase of this research will focus on, says Helfinstein. “If we can figure out the factors in the world that influence the brain, we can draw conclusions about what actions are best at helping people resist risks.”

Ideally, we'd be able to balance the seesaw -- enabling consistently healthy discretion as to which risks are worth taking. While it's evident that too much exposure to risk is dangerous, it's equally true that too little exposure to risk leads to stagnation.

We are, after all, an adaptive species. If we're never challenged to adapt to new risks, we stop learning and developing, and eventually sink into boredom, which, ironically, sets us up to take even more radical risks. 

The study appears in the journal Proceedings of the National Academy of Sciences.

You can find David DiSalvo on Twitter @neuronarrative and at his website, The Daily Brain. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Friday, February 21, 2014

How to Squeeze Snake Oil From Deer Antlers and Make Millions

If I offered to sell you a liquid extract made from the velvety coating of deer antlers, claiming that it will catalyze muscle growth, slow aging, improve athletic performance and supercharge your libido – I’d expect you'd be a little skeptical. But what if I added that a huge percentage of professional athletes are using the stuff, paying top dollar ($100 or more an ounce), and swearing up and down that just a few mouth sprays a day provide all the benefits as advertised? Would you be willing to give it a try?

Ever since former Baltimore Ravens star Ray Lewis was linked a few months ago to using deer antler spray (a claim he subsequently denied), the market for the stuff has exploded. Some estimates say that close to half of all professional football and baseball players are using it, as are a hefty percentage of college players, to say nothing of the army of weightlifters and bodybuilders who have made the spray a daily part of their routines.

TV journalism bastion 60 Minutes recently ran a special sports segment about "Deer Antler Man" Mitch Ross, the product's highest-profile salesman, the tsunami of buyers for oral deer antler spray, and its growing list of celebrity devotees. Without question, deer antler spray has captured the attention of the sports world and is rapidly pushing into mainstream markets.

Let’s take a look at the science behind the claims and try to find out what’s really fueling the surge in sales for this peculiar product.

The velvety coating of deer antlers is a chemically interesting material. For centuries it’s been used in Eastern traditions as a remedy for a range of maladies, and there’s an underlying rationale for why it theoretically could be useful for certain conditions. The velvet contains small amounts of insulin-like growth factor 1, or IGF-1, which has been studied for several decades and is a clinically proven treatment for certain growth disorders in humans. For example, in children born with Laron Syndrome—a disorder that causes insensitivity to growth hormone, resulting in dwarfism—treatment with IGF-1 has been shown to dramatically increase growth rates. IGF-1 appears to mediate many of the growth-promoting effects of growth hormone released by the pituitary gland, and in sufficient amounts even synthetically derived IGF-1 can help boost physical growth.

That’s why certain forms of IGF-1 have been banned by the Food and Drug Administration (FDA) and the World Anti-Doping Agency: its effects are similar to those of human growth hormone and anabolic steroids. The forms these agencies have banned, however, are high-dosage, ultra-purified liquids administered by injection.

Why can’t the FDA and anti-doping agencies ban IGF-1 outright? For the simple reason that the chemical, in trace amounts, is found in things we eat every day: red meat, eggs and dairy products. Every time you eat a juicy ribeye or have a few eggs over easy, you’re ingesting IGF-1.

In the tiny amounts found in these foods, the substance may have a cumulative, positive effect on muscle repair over time, but you’ll never be able to drink enough whole milk in one sitting to experience the anabolic effects you’d get from a syringe full of concentrated, purified IGF-1.
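Some rough arithmetic shows the scale of the gap (both figures below are my own order-of-magnitude assumptions for illustration, not numbers from this article or a clinical reference):

```python
# Order-of-magnitude comparison: IGF-1 in milk vs. a clinical injection.
# Both figures are rough assumptions chosen only to illustrate the gap.

MILK_IGF1_NG_PER_ML = 4.0  # assume a few nanograms of IGF-1 per mL of milk
INJECTION_DOSE_MG = 3.0    # assume a clinical dose on the order of milligrams

ng_needed = INJECTION_DOSE_MG * 1_000_000            # mg -> ng
milk_liters = ng_needed / MILK_IGF1_NG_PER_ML / 1000
print(f"~{milk_liters:,.0f} liters of milk to match one injection")  # ~750 L
```

And since, as noted below, oral IGF-1 is largely broken down in the gut, even that absurd volume would overstate what actually reaches the bloodstream.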

As I mentioned, the velvety substance on growing deer antlers also contains trace amounts of IGF-1, and (along with oddities like powdered tiger bone) has been sold in China for centuries as a traditional cure for several ailments. In traditional Chinese medicine, the antler is divided into segments, each segment targeted to different problems.  The middle segment, for example, is sold as a cure for adult arthritis, while the upper section is sold as a solution for growth-related problems in children. The antler tip is considered the most valuable part and sells for top dollar.

The main source for the market explosion in deer antler spray is New Zealand, which produces 450 tons of deer velvet annually, compared to the relatively small amount produced by the US and Canada: about 20 tons annually. Deer can be killed outright for their antlers, but in New Zealand the more accepted procedure is to anesthetize the deer and remove the antlers at the base. The antlers are then shipped overseas to the growing market demanding them.

The reason deer antler velvet is usually turned into an oral liquid spray rather than a pill (although it is also sold in pill form around the world) is that the trace proteins in the substance are rapidly broken down by the digestive system, so only a fraction of the already tiny amount actually makes it into the bloodstream. In spray form, IGF-1 can potentially penetrate the mucosal membranes and enter the bloodstream intact more quickly. The spray can run anywhere from about $20 for a tiny bottle to $200 for two ounces, and at standard doses of several sprays per day, the monthly cost of using the product is exorbitant.

The question is: does using deer antler spray deliver the benefits its sellers claim? These alleged benefits include accelerated muscle growth and muscle repair, tendon repair, enhanced stamina, slowing of the aging process, and increased libido – a virtual biological panacea of outcomes.

The consensus opinion from leading endocrinologists studying the substance, including Dr. Roberto Salvatori at the Johns Hopkins School of Medicine and Dr. Alan Vogol at the University of Virginia,  is that the chances of it delivering on any of these benefits are slim to none.  The reason is simply that there's far too little of the substance in even the purest forms of the spray to make any difference.

Think of it this way: if a steak contains roughly the same trace amount of IGF-1 as deer antler velvet, is there any evidence to suggest that eating steak can provide the same array of benefits claimed for deer antler spray? No, there’s not a shred of clinical evidence to support that claim.

And yet, thousands of people are paying close to $200 a bottle for the spray believing it will deliver these benefits. With high-profile celebrity connections like Ray Lewis and golf superstar Vijay Singh, it’s little wonder the craze has picked up momentum. But in light of the scientific evidence, there’s no credible reason to pay $200, or any amount, for a bottle of deer antler spray.

Aside from the lack of evidence supporting the claimed benefits, it’s unclear what the negative effects of using the product long-term may be. WebMD reports that the compounds in the spray may mimic estrogen in the body, which could contribute to a variety of cancers or worsen conditions such as uterine fibroids in women. Elevated estrogen levels in men can throw off hormonal balance and lead to a thickening waistline and a host of related metabolic problems.

The takeaway is this: deer antler spray is the latest high-priced snake oil captivating the market. Not only will it cost you a lot of money without delivering the promised benefits; it could also lead to negative health outcomes. Let the deer keep their antler velvet, and keep your cash in your wallet.

You can find David DiSalvo on Twitter @neuronarrative and at his website, The Daily Brain. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.

Sunday, February 9, 2014

The Era Of Genetically Altered Humans Could Begin This Year

By the middle of 2014, the prospect of altering DNA to produce a genetically modified human could move from science fiction to science reality. At some point between now and July, the UK parliament is likely to vote on whether a new form of in vitro fertilization (IVF)—involving DNA from three parents—becomes legally available to couples. If it passes, the law would be the first to allow pre-birth human DNA modification, and another door to the future will open.

The procedure involves replacing mitochondrial DNA (mtDNA) to avoid destructive cell mutations. Mitochondria are the power plants of human cells, converting energy from food into a form our cells can use to function, and they carry their own DNA, separate from the nuclear DNA in our chromosomes where most of our genetic information is stored. Only the mother passes mtDNA on to the child, and it occasionally contains mutations that can lead to serious problems.

According to the journal Nature, an estimated 1 in 5,000 to 10,000 people carry mtDNA with mutations leading to blindness, diabetes, dementia, epilepsy and several other impairments (the equivalent of 1,000 to 4,000 children born each year in the U.S.). Some of the mutations lead to fatal diseases, like Leigh Syndrome, a rare neurological disorder that emerges in infancy and progressively destroys the ability to think and move.

By combining healthy mitochondria from a donor with the nucleus of a prospective mother’s egg, researchers can in theory produce a newborn free of the mutations that would eventually lead to one or more of these disorders. While the procedure has never been tried in humans (human cell research on mtDNA has so far been confined to the lab), researchers have successfully tested it in rhesus monkeys.

Last March, the UK Human Fertilisation and Embryology Authority wrapped up a lengthy study of safety and ethical considerations and advised parliament to approve the procedure in humans. According to New Scientist magazine, parliament is likely to vote on the procedure by July of this year. If the procedure clears that hurdle, it will still take several months to pass into law, but the initial vote will allow researchers to begin recruiting couples for the first human mtDNA replacement trials.

The U.S. is not nearly as close to approving mtDNA replacement as the UK appears to be; the U.S. Food and Drug Administration will start reviewing the data in earnest in February. Among the concerns on the table is whether the mtDNA donor could be considered a true “co-parent” of the child and, if so, whether she could claim parental rights.

Even though the donor would be contributing just 0.1 percent of the child’s total DNA (according to the New Scientist report), we don’t as yet have a DNA benchmark to judge the issue. Who is to say what percentage of a person’s DNA must come from another human to constitute biological parenthood?
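For a sense of where a figure like 0.1 percent comes from, here’s the back-of-envelope version (the genome sizes and gene counts below are widely cited ballpark values, not numbers from the New Scientist report):

```python
# Ballpark arithmetic: how much of a person's DNA is mitochondrial?

MTDNA_BP = 16_569            # length of the human mitochondrial genome
NUCLEAR_BP = 3_200_000_000   # haploid nuclear genome, roughly 3.2 billion bp
MT_GENES = 37                # genes encoded on mtDNA
NUCLEAR_GENES = 20_000       # approximate protein-coding nuclear genes

print(f"by sequence length: {MTDNA_BP / NUCLEAR_BP:.4%}")     # ~0.0005%
print(f"by gene count:      {MT_GENES / NUCLEAR_GENES:.2%}")  # ~0.2%
```

However you count it, the donor’s contribution is tiny, which is exactly why a percentage alone is a shaky basis for settling the parenthood question.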

Other scientists have raised concerns about the compatibility of donor mtDNA with the host nucleus and believe the push to legalize human trials is premature. By artificially separating mtDNA from the nucleus, these researchers argue, we may be short-circuiting levels of genetic communication that we're only beginning to fully understand.

These are but two of the many issues this procedure will raise in the coming months. One thing is certain: we’re rapidly moving into new and deeper waters, and chances are we’re going to need a bigger boat.

You can find David DiSalvo on Twitter @neuronarrative and at his website, The Daily Brain. His latest book is Brain Changer: How Harnessing Your Brain’s Power To Adapt Can Change Your Life.