Does Yo-Yo Dieting Drive Compulsive Eating?

New research in rats suggests a connection between yo-yo dieting and compulsive eating.

According to researchers at the Boston University School of Medicine (BUSM), a chronic cyclic pattern of overeating followed by undereating reduces the brain’s ability to feel reward and may drive compulsive eating.

The finding suggests that future research into the treatment of compulsive eating behavior should focus on rebalancing the mesolimbic dopamine system, the part of the brain responsible for feeling reward or pleasure, researchers say.

“We are just now beginning to understand the addictive-like properties of food and how repeated overconsumption of high-sugar foods — similar to taking drugs — may affect our brains and cause compulsive behaviors,” said corresponding author Pietro Cottone, Ph.D., an associate professor of pharmacology and experimental therapeutics at BUSM and co-director of the Laboratory of Addictive Disorders.

To better understand compulsive and uncontrollable eating, Cottone and his research team performed a series of experiments on two groups of rats. One, the cycled group, received a high-sugar, chocolate-flavored diet for two days each week and a standard control diet the remaining days of the week, while the control group received the control diet all of the time.

The group that cycled between the palatable food and the less palatable food spontaneously developed compulsive binge eating of the sweet food and refused to eat regular food, the researchers discovered.

Both groups were then injected with the psychostimulant amphetamine, a drug that releases dopamine and produces reward, and their behavior was observed in a battery of behavioral tests.

While the control group predictably became very hyperactive after receiving amphetamine, the cycled group did not.

Furthermore, in a test of the conditioning properties of amphetamine, the control group was attracted to environments where it had previously received amphetamine, whereas the cycled group was not.

Finally, when measuring the effects of amphetamine while directly stimulating the brain reward circuit, the control group was responsive to amphetamine, while the cycled group was not, according to the findings.

After investigating the biochemical and molecular properties of the mesolimbic dopamine system of both groups, the researchers determined that the cycled group had less dopamine overall, released less dopamine in response to amphetamine, and had dysfunctional dopamine transporters — proteins that carry dopamine back into brain cells — due to deficits in the mesolimbic dopamine system.

“We found that the cycled group display similar behavioral and neurobiological changes observed in drug addiction: specifically, a crash in the brain reward system,” Cottone said. “This study adds to our understanding of the neurobiology of compulsive eating behavior.

“Compulsive eating may derive from the reduced ability to feel reward. These findings also provide support to the theory that compulsive eating has similarities to drug addiction.”

“Our data suggest that a chronic cyclic pattern of overeating will reduce the brain’s ability to feel reward — feeling satiated. This results in a vicious circle, where diminished reward sensitivity may in turn be driving further compulsive eating,” said lead author Catherine (Cassie) Moore, Ph.D., a former graduate student in the Laboratory of Addictive Disorders at BUSM.

The study was published in the journal Neuropsychopharmacology.

Source: Boston University School of Medicine

Osteoarthritis May Play a Role in Social Isolation

When older adults become socially isolated, their health and well-being can suffer. Now a new study suggests a link between being socially isolated and osteoarthritis (arthritis), a condition that causes joint pain and can limit a person’s ability to get around.

The findings are published in the Journal of the American Geriatrics Society.

Arthritis patients often have other health issues which may increase their risk of becoming socially isolated. These include anxiety and depression, being afraid to move around (because arthritis makes moving painful), physical inactivity and being unable to take care of themselves.

About 30 percent of adults over 65 have arthritis to some degree, especially in their leg joints. Despite that, until now there has been little research on the link between arthritis and social isolation.

Researchers analyzed data from the European Project on OSteoArthritis (EPOSA) study. They wanted to examine any potential links between arthritis and social isolation, and to identify the disease’s contribution to social isolation.

EPOSA is a study of 2,942 adults between the ages of 65 and 85 who live in six European countries: Germany, Italy, the Netherlands, Spain, Sweden, and the UK. In all, 1,967 of them, with an average age of around 73, took part in the current study. Half of the participants were women, and almost 30 percent had arthritis.

The researchers looked at whether the participants were socially isolated at the beginning of the study as well as 12 to 18 months later. The participants completed questionnaires that kept track of how often they connected socially with friends and family members and how often they volunteered or participated in social activities.

At the start of the study, almost 20 percent were socially isolated. Those who weren’t socially isolated tended to be younger and to have higher incomes and more education. They were also more likely to be physically active, to have less physical pain and faster walking times, and to be in better all-around health.

Of the 1,585 participants who weren’t considered socially isolated at the beginning of the study, 13 percent had become socially isolated 12 to 18 months later. They reported that their health and osteoarthritis had worsened, they were in more pain, had become less physically active, had slower walking times, and had depression and problems with thinking and making decisions.

The researchers say the findings suggest that osteoarthritis can increase the risk of social isolation. In particular, having problems with thinking and making decisions, as well as having slower walking times, is associated with an increased risk of becoming socially isolated.

Since social isolation can lead to poorer health, the researchers suggest that older adults with arthritis may benefit from engaging in physical activity and social activities. Specifically, they suggest that health care providers might refer people to senior centers where activities are specially designed for people with arthritis.

Source: American Geriatrics Society

Hope Can Aid in Recovery from Anxiety Disorders

New research suggests hope is a trait that can predict resilience and recovery from anxiety disorders.

In a new study, clinical psychologist Dr. Matthew Gallagher and colleagues examined the role of hope in predicting recovery in a clinical trial of adults receiving cognitive behavior therapy (CBT) for common anxiety disorders.

The concept of hope has long stirred opinion. In the 16th century, German theologian Martin Luther celebrated its power, claiming “Everything that is done in this world is done by hope.” Two centuries later, Benjamin Franklin warned that “He that lives upon hope will die fasting.”

In the study, Gallagher, a University of Houston associate professor of clinical psychology, assessed the role of hope in predicting recovery in a clinical trial of 223 adults. The participants were receiving CBT for one of four common anxiety disorders: social anxiety disorder, panic disorder, generalized anxiety disorder and obsessive-compulsive disorder.

Gallagher discovered that psychotherapy can result in clear increases in hope and that changes in hope are associated with changes in anxiety symptoms. His findings appear in the journal Behavior Therapy.

“In reviewing recovery during CBT among the diverse clinical presentations, hope was a common element and a strong predictor of recovery,” said Gallagher. He also reports that moderate-to-large increases in hope were consistent across the five separate CBT treatment protocols.

In terms of psychotherapy, hope represents the capacity of patients to identify strategies or pathways to achieve goals and the motivation to effectively pursue those pathways.

Significantly, the results of this study indicate that hope gradually increases during the course of CBT, and increases in hope were greater for those in active treatment than for those in the waitlist comparison.

The magnitude of these changes in hope was consistent across different CBT protocols and across the four anxiety disorders examined, which underscores the broad relevance of instilling hope as an important factor in promoting recovery during psychotherapy.

“Our results can lead to a better understanding of how people are recovering and it’s something therapists can monitor. If a therapist is working with a client who isn’t making progress, or is stuck in some way, hope might be an important mechanism to guide the patient forward toward recovery,” said Gallagher.

Hope is closely related to other positive psychology constructs, such as self-efficacy and optimism, that have also been shown to have clear relevance to promoting resilience to and recovery from emotional disorders, said Gallagher.

Gallagher’s research is part of a larger project examining the efficacy of CBT for anxiety disorders led by Dr. David H. Barlow, founder and director emeritus of the Boston University Center for Anxiety and Related Disorders.

Source: University of Houston

Emotional Eating After Bad Breakup May Not Lead to Weight Gain

Going for that pint of ice cream after a bad breakup may not do as much damage as you think. A new study shows that despite the emotional turmoil, people on average do not report gaining weight after a breakup.

The study, which included researchers from Penn State, investigated the German concept of “kummerspeck” — excess weight gain due to emotional eating — which literally translates to “grief bacon.”

According to the researchers, although hoarding food after a breakup may have made sense for humans thousands of years ago, modern humans may have grown out of the habit.

“Food was much scarcer in the ancestral environment, so if your partner abandoned you, it could have made gathering food much harder,” said Dr. Marissa Harrison, associate professor of psychology at Penn State Harrisburg.

“It may have made sense if our ancestors hoarded food after a breakup. But our research showed that while it’s possible people may drown their sorrows in ice cream for a day or two, modern humans do not tend to gain weight after a breakup.”

The findings are published in the Journal of the Evolutionary Studies Consortium.

The researchers say it is well documented that people sometimes use food as a way to cope with negative feelings and that emotional eating can lead to unhealthy food choices. Because breakups can be stressful and emotional, they could potentially trigger emotional eating.

In addition, ancient relationship dynamics may have made packing on the pounds after a breakup evolutionarily advantageous.

“Modern women of course have jobs and access to resources now, but back then, it was likely that women were smaller and needed more protection and help with resources,” Harrison said.

“If their partner left or abandoned them, they would be in trouble. And the same could have gone for men. With food not as plentiful in the ancestral world, it may have made sense for people to gorge to pack on the pounds.”

Harrison also noted that the existence of the word “kummerspeck” itself suggested that the phenomenon existed.

The research team conducted two studies to test the theory that people may be more likely to gain weight after a relationship breakup. In the first experiment, they recruited 581 people to complete an online survey about whether they had recently gone through a breakup and whether they gained or lost weight within a year of that breakup.

Most of the participants — 62.7 percent — reported no weight change. The researchers were surprised by this result and decided to perform an additional study.

For the second experiment, the researchers recruited 261 new participants to take a different, more extensive survey than the one used in the first study. The new survey asked whether participants had ever experienced the dissolution of a long-term relationship, and whether they gained or lost weight as a result.

The survey also asked about participants’ attitudes toward their ex-partner, how committed the relationship was, who initiated the breakup, whether the participants tended to eat emotionally, and how much participants enjoy food in general.

While all participants reported experiencing a breakup at some point in their lives, the majority of participants — 65.13 percent — reported no change in weight after relationship dissolution.

“We were surprised that in both studies, which included large community samples, we found no evidence of kummerspeck,” Harrison said. “The only thing we found was in the second study, women who already had a proclivity for emotional eating did gain weight after a relationship breakup. But it wasn’t common.”

Harrison added that the results may have clinical implications.

“It could be helpful information for clinicians or counselors with patients who tend to eat emotionally,” Harrison said. “If your client is going through a breakup and already engages in emotional eating, this may be a time where they need some extra support.”

Victoria Warner, a Penn State Harrisburg graduate student, was the lead author of this study. Samantha Horn from Penn State Harrisburg and Susan Hughes from Albright College also participated in this work.

Source: Penn State

Solitary Confinement Tied to Greater Risk of Death After Prison Release

Prisoners who are held in restrictive housing (i.e., solitary confinement) face an increased risk of death after their release, according to a new study led by researchers from the University of North Carolina (UNC) at Chapel Hill.

The findings show that incarcerated individuals who were placed in restrictive housing in North Carolina from 2000 to 2015 were 24% more likely to die in the first year after their release, compared to those who were not held in restrictive housing.

In addition, those held in restrictive housing were 78% more likely to die from suicide, 54% more likely to die from homicide, and 127% more likely to die from an opioid overdose in the first two weeks after their release.

Further, repeated restrictive housing placements and stays of more than 14 consecutive days in restrictive housing were associated with even greater increases in the risk of death and reincarceration.

“For the first time ever, using data shared with us from our partners at the North Carolina Department of Public Safety, we’ve been able to demonstrate a connection between restrictive housing during incarceration and increased risk of death when people return to the community,” said lead author Lauren Brinkley-Rubinstein, PhD, an assistant professor of social medicine in the UNC School of Medicine.

“In addition, our study found that the more time people spent in restrictive housing the higher the risk of mortality after release. This study provides empirical evidence to support ongoing nationwide reforms that limit the use of restrictive housing.”

“North Carolina is a leader in this thinking as the Department of Public Safety has preemptively implemented multiple reforms that have resulted in the limited use of restrictive housing.”

“We appreciate this research collaboration and recognize the importance of these results in shaping policy and practice,” said Gary Junker, PhD, Director of Behavioral Health for the N.C. Department of Public Safety Adult Correction and Juvenile Justice.

“Since 2015, the department has initiated several programs to divert people from restrictive housing, including Therapeutic Diversion Units for those with mental illness. While safety and security must remain our top priority, we recognize that reduced use of restrictive housing will likely improve post-release outcomes.”

These findings, published in the journal JAMA Network Open, are from a retrospective cohort study conducted by Brinkley-Rubinstein and co-authors from UNC, Emory University, the N.C. Department of Public Safety and the N.C. Department of Public Health.

Incarceration data for people who were confined in North Carolina between 2000 and 2015 were matched with death records from 2000 to 2016.

“We also found that non-white individuals were disproportionately more likely to be assigned to restrictive housing than their white counterparts,” said co-author Shabbar Ranapurwala, PhD, MPH, an assistant professor of epidemiology in the UNC Gillings School of Global Public Health and a core faculty member of the UNC Injury Prevention Research Center.

“In fact, the mortality and reincarceration outcomes after release were also quite different between these racial groups. The post-release opioid overdose and suicide death outcomes among those receiving restrictive housing were more pronounced among white individuals compared to non-whites, while the all-cause and homicide death and reincarceration outcomes were higher among non-white Americans compared to whites.”

Given the observational nature of the study, establishing cause and effect is difficult. Still, the strength and consistency of the findings point to restrictive housing as an important marker of increased mortality risk among formerly incarcerated individuals.

Source: University of North Carolina Health Care

Slower Walkers Have Older Brains and Bodies at 45

A new study shows that people with a lower walking speed at the age of 45 have accelerated aging of both their bodies and their brains.

Using a 19-measure scale, researchers at Duke University found that slower walkers’ lungs, teeth and immune systems tended to be in worse shape than those of faster walkers. MRI exams showed several indications that their brains were also older.

“The thing that’s really striking is that this is in 45-year-old people, not the geriatric patients who are usually assessed with such measures,” said lead researcher Line J.H. Rasmussen, a post-doctoral researcher in the Duke University Department of Psychology and Neuroscience.

“Doctors know that slow walkers in their seventies and eighties tend to die sooner than fast walkers their same age,” said senior author Terrie E. Moffitt, the Nannerl O. Keohane University Professor of Psychology at Duke University, and Professor of Social Development at King’s College London. “But this study covered the period from the preschool years to midlife and found that a slow walk is a problem sign decades before old age.”

The data come from a long-term study of nearly 1,000 people who were born during a single year in Dunedin, New Zealand. The 904 research participants in the current study have been tested, quizzed, and measured their entire lives, most recently from April 2017 to April 2019, at age 45.

Researchers note that neurocognitive testing that these individuals took as children predicted who would become slower walkers. At age 3, their scores on tests of IQ, language understanding, frustration tolerance, motor skills, and emotional control predicted their walking speed at age 45, according to the researchers.

MRI exams during their last assessment showed the slower walkers tended to have lower total brain volume, lower mean cortical thickness, less brain surface area and higher incidence of white matter “hyperintensities,” small lesions associated with small vessel disease of the brain. In short, their brains appeared somewhat older, they said.

Adding insult to injury, the slower walkers also looked older to a panel of eight screeners who assessed each participant’s “facial age” from a photograph, the researchers reported.

Walking speed has long been used as a measure of health and aging in geriatric patients, but what’s new in this study is the relative youth of these study subjects and the ability to see how walking speed matches up with health measures the study has collected during their lives, the researchers explained.

“It’s a shame we don’t have gait speed and brain imaging for them as children,” Rasmussen said. (MRI was invented when they were five, but was not used with children until many years later.)

Some of the differences in health and cognition may be tied to lifestyle choices these individuals have made, the researchers noted.

But the study also suggests that there are already signs in early life of who would become the slowest walkers, Rasmussen said.

“We may have a chance here to see who’s going to do better health-wise in later life.”

The study was published in JAMA Network Open.

Source: Duke University

Photo: A long-term study has found that signs of aging may be detected by a simple walking test at age 45, and that cognitive test scores at age 3 predicted who would become the slower walkers. Credit: Duke University Communications.