Culture

How information is like snacks, money, and drugs -- to your brain

Can't stop checking your phone, even when you're not expecting any important messages? Blame your brain.

A new study by researchers at UC Berkeley's Haas School of Business has found that information acts on the brain's dopamine-producing reward system in the same way as money or food.

"To the brain, information is its own reward, above and beyond whether it's useful," said Assoc. Prof. Ming Hsu, a neuroeconomist whose research employs functional magnetic imaging (fMRI), psychological theory, economic modeling, and machine learning. "And just as our brains like empty calories from junk food, they can overvalue information that makes us feel good but may not be useful--what some may call idle curiosity."

The paper, "Common neural code for reward and information value," was published this month by the Proceedings of the National Academy of Sciences. Authored by Hsu and graduate student Kenji Kobayashi, now a post-doctoral researcher at the University of Pennsylvania, it demonstrates that the brain converts information into same common scale as it does for money. It also lays the groundwork for unraveling the neuroscience behind how we consume information--and perhaps even digital addiction.

"We were able to demonstrate for the first time the existence of a common neural code for information and money, which opens the door to a number of exciting questions about how people consume, and sometimes over-consume, information," Hsu said.

The paper is rooted in the study of curiosity and what it looks like inside the brain. While economists have tended to view curiosity as a means to an end, valuable when it can help us get information to gain an edge in making decisions, psychologists have long seen curiosity as an innate motivation that can spur actions by itself. For example, sports fans might check the odds on a game even if they have no intention of ever betting.

Sometimes, we want to know something, just to know.

"Our study tried to answer two questions. First, can we reconcile the economic and psychological views of curiosity, or why do people seek information? Second, what does curiosity look like inside the brain?" Hsu said.

To understand more about the neuroscience of curiosity, the researchers scanned the brains of people while they played a gambling game. Each participant was presented with a series of lotteries and needed to decide how much they were willing to pay to find out more about the odds of winning. In some lotteries, the information was valuable--for example, when what seemed like a longshot was revealed to be a sure thing. In other cases, the information wasn't worth much, such as when little was at stake.

For the most part, the study subjects made rational choices based on the economic value of the information (i.e., how much money it could help them win). But that didn't explain all their choices: people tended to overvalue information in general, and particularly in higher-valued lotteries. It appeared that the higher stakes increased people's curiosity about the information, even when the information had no effect on their decisions.

The researchers determined that this behavior could only be explained by a model that captured both economic and psychological motives for seeking information. People acquired information based not only on its actual benefit, but also on the anticipation of its benefit, whether or not it had use.
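As a toy illustration only (not the authors' actual model), a valuation rule that adds a stake-dependent anticipation term to the purely economic value reproduces the pattern described above: worthless information still commands a price when the stakes are high. The function and its `curiosity_weight` parameter are hypothetical.

```python
# Toy sketch (NOT the paper's model): value of information as an economic
# term plus a psychological anticipation term that grows with the stakes.

def information_value(expected_gain, stakes, curiosity_weight=0.05):
    """Willingness to pay for information.

    expected_gain: how much the information improves expected winnings
    stakes: size of the prize at stake (anticipation scales with this)
    curiosity_weight: hypothetical weight on the anticipation term
    """
    anticipation_bonus = curiosity_weight * stakes
    return expected_gain + anticipation_bonus

# Useless information (zero expected gain) is still valued at high stakes:
print(information_value(expected_gain=0.0, stakes=100.0))
print(information_value(expected_gain=0.0, stakes=10.0))
```

A purely economic model would price useless information at zero; the anticipation term is what captures the overvaluation the subjects showed.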

Hsu said that's akin to wanting to know whether we received a great job offer, even if we have no intention of taking it. "Anticipation serves to amplify how good or bad something seems, and the anticipation of a more pleasurable reward makes the information appear even more valuable," he said.

How does the brain respond to information? Analyzing the fMRI scans, the researchers found that information about the games' odds activated the regions of the brain specifically known to be involved in valuation (the striatum and ventromedial prefrontal cortex, or VMPFC), the same dopamine-producing reward areas activated by food, money, and many drugs. This was the case whether or not the information was useful and changed the person's original decision.

Next, using a machine learning technique called support vector regression, the researchers determined that the brain uses the same neural code for information about the lottery odds as it does for money. The technique allowed them to characterize the neural code for how the brain responds to varying amounts of money, and then ask whether that same code could predict how much a person would pay for information. It could.
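The decoding logic can be sketched on synthetic data (this is NOT the study's fMRI data or exact pipeline; the linear "neural code" and noise levels are invented for illustration). A support vector regression is trained to map simulated voxel activity to money amounts, and the same fitted decoder is then asked to recover willingness to pay for information.

```python
# Sketch of cross-decoding with support vector regression on synthetic data.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 50

# Hypothetical shared neural code: activity is a noisy linear encoding.
code = rng.normal(size=n_voxels)

# Money trials: train the decoder on activity evoked by money amounts.
money = rng.uniform(0, 20, n_trials)
X_money = np.outer(money, code) + rng.normal(scale=0.5, size=(n_trials, n_voxels))
decoder = SVR(kernel="linear").fit(X_money, money)

# Information trials: if willingness to pay (wtp) for information reuses
# the same code, the money-trained decoder should recover it.
wtp = rng.uniform(0, 20, n_trials)
X_info = np.outer(wtp, code) + rng.normal(scale=0.5, size=(n_trials, n_voxels))
predicted = decoder.predict(X_info)

r = np.corrcoef(predicted, wtp)[0, 1]
print(f"correlation between decoded and actual willingness to pay: {r:.2f}")
```

The key move, as in the study, is that the decoder is never trained on the information trials: a high correlation only emerges if the two reward types genuinely share a code.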

In other words, just as we can convert such disparate things as a painting, a steak dinner, and a vacation into a dollar value, the brain converts curiosity about information into the same common code it uses for money and other concrete rewards, Hsu said.

"We can look into the brain and tell how much someone wants a piece of information, and then translate that brain activity into monetary amounts," he said.

While the research does not directly address overconsumption of digital information, the fact that information engages the brain's reward system is a necessary condition for the addiction cycle, he said. And it explains why we find those alerts saying we've been tagged in a photo so irresistible.

"The way our brains respond to the anticipation of a pleasurable reward is an important reason why people are susceptible to clickbait," he said. "Just like junk food, this might be a situation where previously adaptive mechanisms get exploited now that we have unprecedented access to novel curiosities."

Credit: 
University of California - Berkeley Haas School of Business

New species of rock-eating shipworm identified in freshwater river in the Philippines

image: This microscopy image shows a 4-inch-long L. abatanica.

Image: 
Marvin Altamia and Reuben Shipway

A newly identified genus and species of worm-like, freshwater clam, commonly known as a shipworm, eats rock, expels sand as scat, and acts as an ecosystem engineer as it burrows in the Abatan River in the Philippines.

Local residents of Bohol Island tipped off an international group of scientists, including University of Massachusetts Amherst post-doctoral researcher Reuben Shipway, to the watery location of the bivalve, which the scientists named Lithoredo abatanica, using the Latin words for rock (litho) and the last two syllables of shipworm (teredo). Locals call the shipworm "antingaw," and new mothers are said to eat them in an effort to enhance lactation, Shipway says.

"These animals are among the most important in the river and in this ecosystem," says Shipway, a marine biologist working in the microbiology lab of professor Barry Goodell and lead author of the paper that describes L. abatanica, published in the Proceedings of the Royal Society B. "As they bore elaborate tunnels in the limestone bedrock, these animals change the course of the river and provide a really rich environment for other aquatic species to live in. So far, this is the only place on earth that we know these animals exist."

Co-authors include Marvin Altamia and Daniel Distel of Ocean Genome Legacy Center at Northeastern University, where Shipway previously worked; Gary Rosenberg of Drexel University; Gisela Concepcion of the University of the Philippines; and Margo Haygood of the University of Utah.

"Most other shipworms are as skinny as your finger," Shipway says. "These animals are quite chubby, robust. They look really different. Where they get their nutrition we don't know."

It's the second new genus and species of shipworm recently discovered in the Philippines by this international team of researchers known as the Philippine Mollusk Symbiont International Collaborative Biodiversity Group, funded by the National Institutes of Health. The other new bivalve, the impossibly elongated, pink and pinstriped Tamilokus mabinia, eats wood like most shipworms and was found to be filled with bacteria that provide its nutrition. The team of scientists produced a video abstract detailing its identification.

The researchers analyzed L. abatanica's bored and ingested rock and used micro-computed tomography (CT) to create three-dimensional scans of the shipworm, revealing its unique anatomy. DNA was extracted for sequencing and whole animals up to four inches long were taken and preserved.

The new shipworm also may provide new insights for paleontologists. Until recently, fossil borings in rocky substrates were thought to be a marker for ancient marine habitats, Shipway explains. L. abatanica shows that such fossils might mark ancient freshwater sites as well.

Shipway says the international collaborative aims to understand and preserve biodiversity, boost research capabilities of partners in the Philippines and, most importantly, use the biodiversity the animals reflect for new drug discovery.

He concludes, "This is one of the most noble projects I've ever been involved in."

Credit: 
University of Massachusetts Amherst

Biochar may boost carbon storage, but benefits to germination and growth appear scant

image: Illinois Sustainable Technology Center researchers Elizabeth Meschewski, left, and Nancy Holm and collaborators developed a systematic study to test the effectiveness of the soil additive biochar and found that it may not be as effective as previously thought.

Image: 
Photo courtesy Nancy Holm

CHAMPAIGN, Ill. -- Biochar may not be the miracle soil additive that many farmers and researchers hoped it to be, according to a new University of Illinois study. Biochar may boost the agricultural yield of some soils - especially poor-quality ones - but there is no consensus on its effectiveness. The researchers tested different soils' responses to multiple biochar types and were unable to verify biochar's ability to increase plant growth. However, the study did show biochar's ability to affect soil greenhouse gas emissions.

The new findings are published in the journal Chemosphere.

Biochar additives - particles of organic material burned in a controlled oxygen-free process - provide soil with a form of carbon that is more resistant to microbial action than traditional, uncharred biomass additives. In theory, this property should allow soil to hold onto carbon for long-term storage, the researchers said, because it does not degrade as rapidly as other forms of carbon.

"There are conflicting reports on the effectiveness of biochar for use to increase crop production as well as its potential as a carbon-storage reservoir," said Nancy Holm, an Illinois Sustainable Technology Center researcher and study co-author. "We came into this study suspecting that variations in types of biochar feedstock, preparation methods and soil composition were the cause of the conflicting results."

Addressing past research inconsistencies, the team designed a systematic study using 10 common Illinois soil types to test the effects of mixing in varying concentrations of biochars from three different feedstocks - corn, Miscanthus and hardwood.

To add a dimension to the study that is common in real-world agricultural settings, the team also examined how two other sources of carbon - plant material burned in an uncontrolled open-atmosphere setting and corn stover - affect soils. Corn stover is composed of raw stalks, leaves and cobs that remain in the field after harvest.

Factoring in each scenario, triplicate analysis and control samples, the experiment produced 429 soil samples in which the researchers planted two corn seeds each.

After a 14-day germination period, the study showed that adding biochar from any of the feedstocks or production techniques had no substantial influence on the output of greenhouse gas production, plant growth dynamics or microbial community activity. However, the researchers did see some important differences in the soils that included corn stover and burnt plant material.

"The addition of corn stover - which simulates actual field conditions - led to a dramatic increase in greenhouse gas emissions, as well as a change in the soil microbial community," said Elizabeth Meschewski, an ISTC researcher and lead author of the study. "But, initial seedling growth was not affected when comparing these results to the soils with no additives. Addition of burnt plant material did show reduced plant biomass above ground, increased production of the greenhouse gas nitrogen oxide and altered soil microbial community."

The team concluded that biochar might improve the quality of highly degraded or poor quality soils, but does not appear to provide any quality benefit to the soils used in this study. However, the researchers said that using biochar as an additive instead of raw biomass or burnt plant material could prevent microbe-generated greenhouse gas emissions.

The team acknowledges that a longer-period study is needed for a more comprehensive understanding of how biochar may benefit agriculture.

"For future studies, we recommend performing a similar study in many different soil types for the whole growing season for corn - not just 14 days - and possibly over several growing seasons," said ISTC researcher and study co-author B.K. Sharma.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Fatty fish without environmental pollutants protect against type 2 diabetes

image: Lin Shi, Postdoc in the Division of Food and Nutrition Science, Chalmers University of Technology

Image: 
Johan Bodell, Chalmers University of Technology

If the fatty fish we eat were free of environmental pollutants, they would reduce our risk of developing type 2 diabetes. However, the pollutants in the fish appear to have the opposite effect, eliminating the protective effect of fatty fish intake. This has been shown by researchers at Chalmers University of Technology in Sweden, using innovative methods that could be used to address several questions about food and health in future studies.

Research on the effect of fish consumption on diabetes risk has produced contradictory results in recent years. Some studies show that eating a lot of fish reduces the risk of developing type 2 diabetes, while others show it has no effect, and some studies show it even tends to increase the risk. Researchers at Chalmers University of Technology conducted a study with an entirely new design and have now arrived at a possible explanation for this puzzle.

"We managed to separate the effect of the fish per se on diabetes risk from the effect of various environmental pollutants that are present in fish," says Lin Shi, a Postdoc in Food and Nutrition science. "Our study showed that fish consumption as a whole has no effect on diabetes risk. We then screened out the effect of environmental pollutants using a new data analysis method based on machine learning. We were then able to see that fish themselves provide clear protection against type 2 diabetes."

"Protection is provided primarily by consumption of fatty fish. However, at the same time, we saw a link between high consumption of fatty fish and high contents of environmental pollutants in the blood."

Environmental pollutants measured in the present study are persistent organic pollutants (POPs), for example dioxins, DDT and PCB. Previous research has shown that they may be linked to increased risk of type 2 diabetes. The varying effect of fish on diabetes risk in different studies could therefore be due to varying levels of consumption of fish from polluted areas in the different studies.

According to the Swedish National Food Agency, food is the main source of exposure to dioxins and PCBs. These substances are fat soluble and are primarily found in fatty animal foods such as fish, meat and dairy products. Particularly high contents are found in fatty fish such as herring and wild salmon from polluted areas. In Sweden, for example, this means the Baltic Sea, the Gulf of Bothnia and the biggest lakes, Vänern and Vättern.

The Chalmers researchers also used a new method to find out what the study participants had eaten, as a complement to questionnaires on dietary habits. Previous research has often relied entirely on questionnaires. This produces sources of error that may also have contributed to the contradictory results concerning fish and type 2 diabetes.

"Using a technique known as mass spectrometry based metabolomics, we identified around 30 biomarkers in blood samples, i.e. specific molecules that could be used to objectively measure of how much fish the study participants had consumed," says Lin Shi.

Overall, the new methods provide considerably better tools for this research field. They can be used to better discern which dietary factors are the actual causes of different types of health effects.

"Metabolomics and the new way of analysing data give us new opportunities to distinguish between effects from different exposures that are correlated," says Rikard Landberg, Professor of Food and Nutrition Science at Chalmers. "This is very important as otherwise it is difficult to determine whether it is diet, environmental pollutants or both that affect the risks of disease."

More about the study:

The study is a case-control study nested in a prospective cohort in Västerbotten in northern Sweden. The participants had completed questionnaires on dietary habits and lifestyle, and provided blood samples, which were frozen. A total of 421 people who had developed type 2 diabetes after an average of 7 years were included, and they were compared with 421 healthy control individuals. The original blood samples were then analysed. In addition, blood samples were analysed that had been provided ten years after the first blood samples by 149 of the case-control pairs.

Credit: 
Chalmers University of Technology

Frog protein may mitigate dangers posed by toxic marine microbes

A new study from UC San Francisco suggests that a protein found in the common bullfrog may one day be used to detect and neutralize a poisonous compound produced by red tides and other harmful algal blooms. The discovery comes as these waterborne toxic events are becoming increasingly common, a consequence of climate change making the world's oceans more hospitable to the microbes responsible for these formerly infrequent flare-ups.

Aquatic microbes manufacture an array of toxic compounds, but few are as formidable as the nerve agent saxitoxin, often found in algal blooms. Though not a household name like cyanide, saxitoxin is far more potent -- it's deadly at doses a thousand times smaller than a lethal dose of cyanide. And because it accumulates in filter-feeding shellfish, saxitoxin can work its way up the food chain and land on our dinner tables.

"Saxitoxin is among the most lethal natural poisons and is the only marine toxin that has been declared a chemical weapon," said Daniel L. Minor Jr., PhD, senior author of the new study and professor at UCSF's Cardiovascular Research Institute.

The research, a collaboration between Minor's lab and the laboratory of Stanford University Chemistry Professor Justin Du Bois, PhD, was published June 19 in Science Advances and provides the first high-resolution molecular structures of saxiphilin, a remarkable protein that is thought to render bullfrogs resistant to the neurotoxic effects of saxitoxin.

These structures -- detailed renderings that depict saxiphilin both by itself and bound to saxitoxin -- help explain how the protein confers this protective benefit and offer key insights into the evolutionary origins of toxin resistance. These findings provide a more thorough understanding of saxiphilin that will allow scientists to begin exploring ways to repurpose the protein for use as a tool to detect the toxin, both in the environment and in the body, and as an antitoxin therapy.

Frogs Are Resistant to Red Tide Toxin That Kills Victims by Paralyzing Nerve Cells

When ingested by humans, saxitoxin disrupts nerve signaling and can lead to a potentially fatal condition known as paralytic shellfish poisoning. Without immediate medical attention, the muscles that control breathing are rapidly incapacitated, leading to death by asphyxiation.

In the wild, the effects can be even more extreme. During red tides and other saxitoxin outbreaks, the lifeless carcasses of aquatic animals -- from fish and turtles to whales and dolphins -- are often found washed ashore by the hundreds. But you won't find frogs among the dead and dying.

Frogs are unusually saxitoxin resistant, a phenomenon first observed by Hermann Sommer, PhD, a researcher at the George Williams Hooper Foundation at UCSF from 1924 until his death in 1950. From 1928 to 1932, Sommer collected bivalves along the San Francisco coast and isolated what was then called "mussel poison." He tested its effects on lab animals and found that frogs didn't exhibit severe symptoms until they were given at least 15 times the lethal dose for mice. At the time, however, Sommer couldn't account for this result.

Later efforts by other researchers showed that saxitoxin's main target is a class of proteins known as voltage-gated sodium channels. These proteins, which are embedded in the surface of nerve and muscle cells, control the flow of electrically charged particles in and out of the cell.

"The interaction of the toxin with voltage-gated sodium channels is the reason why saxitoxin is so poisonous, as this interaction blocks electrical signals in nerves," Minor explained.

Though scientists understood this mechanism of neurotoxicity -- and also knew that saxiphilin could somehow counteract these effects -- Minor and colleagues had to deduce saxiphilin's molecular structure before they were able to explain exactly how the protein protected frogs from saxitoxin poisoning.

Molecular Architecture Explains Protein's Antitoxic Properties

To do this, the researchers turned to a technique known as X-ray crystallography, which is among the most common methods used for solving the structures of complex biomolecules. It was used in the 1950s to determine the structure of DNA, and remains the gold standard in structural biology to this day. And it was X-ray crystallography that provided the first saxiphilin structures -- maps that detail the three-dimensional arrangement of the atoms that comprise the protein.

With these biochemical blueprints in hand, Minor and colleagues were able to account for the bullfrog's remarkable saxitoxin resistance. The structures revealed a catcher's mitt-like "pocket" that ensnares saxitoxin in a web of powerful electrostatic interactions. Unable to escape from this electric vise, saxitoxin is no longer free to bind to voltage-gated sodium channels and disrupt nerve signaling.

"Saxiphilin acts as a 'toxin sponge' that sops up and sequesters this deadly microbial poison," said Minor.

Saxiphilin Ancestor Reveals How Proteins Evolve Novel Anti-Toxic Functions

The researchers also noticed an uncanny structural similarity between saxiphilin and transferrins, a family of proteins that transport iron to red blood cells from sites in the body where iron is absorbed or stored. When superimposed, the two proteins look nearly identical. But a closer inspection of their structures revealed key differences that explain their divergent function and also suggests that the reason they look so much alike is because saxiphilin evolved from a member of the transferrin family.

Transferrins have two iron binding sites, one of which abuts what Minor and colleagues refer to as a "proto-pocket." Though this proto-pocket is too small to bind saxitoxin -- and isn't known to bind any other molecules -- the scientists were able to show that replacing a small handful of positively charged amino acids with negatively charged amino acids transformed this functionless structural feature into a life-saving saxitoxin sponge. But they also discovered that the molecular architecture responsible for saxitoxin binding isn't unique to saxiphilin.

"Our findings uncover a remarkable convergence in how saxiphilin and voltage-gated sodium channels recognize the toxin. Both proteins share the same general blueprint for saxitoxin recognition," Minor said.

When Minor and colleagues compared their saxiphilin structures against existing structures that show saxitoxin bound to voltage-gated sodium channels, they discovered that these two classes of protein share a common architecture that facilitates their interaction with saxitoxin, even though they're otherwise structurally and evolutionarily unrelated.

An analysis of protein sequences from other animals revealed that similar saxitoxin binding pockets also appear in proteins from distantly related species, including the High Himalaya frog. This finding suggests that a common saxitoxin binding pocket evolved multiple times in the course of evolutionary history, though scientists are not entirely sure why these proteins evolved in the first place.

"This work is an important first step to understanding how organisms can evolve resistance to toxic environments," Minor said, also noting that the research has important practical applications that may be of particular interest to Bay Area residents.

"This problem is increasing with climate change and is a serious public health and commercial fishing hazard. Red tide warnings have closed California fisheries multiple times in the past few years and affect San Francisco, Marin and San Mateo counties. Our efforts may lead to new ways to detect saxitoxin and counter saxitoxin poisoning, for which there are currently no approved treatments."

Credit: 
University of California - San Francisco

US beekeepers lost over 40% of colonies last year, highest winter losses ever recorded

image: Bees at Roberson Farm

Image: 
FarmersGov

Beekeepers across the United States lost 40.7% of their honey bee colonies from April 2018 to April 2019, according to preliminary results of the latest annual nationwide survey conducted by the University of Maryland-led nonprofit Bee Informed Partnership. The survey results indicate winter losses of 37.7%, which is the highest winter loss reported since the survey began 13 years ago and 8.9 percentage points higher than the survey average. Honey bees pollinate $15 billion worth of food crops in the United States each year, so their health is critical to food production and supply.

"These results are very concerning, as high winter losses hit an industry already suffering from a decade of high winter losses," says Dennis vanEngelsdorp, associate professor of entomology at the University of Maryland and president for the Bee Informed Partnership.

During the 2018 summer season, beekeepers lost 20.5% of their colonies, which is slightly above the previous year's summer loss rate of 17.1%, but about equal to the average loss rate since the summer of 2011. Overall, the annual loss of 40.7% this last year represents a slight increase over the annual average of 37.8%.

"Just looking at the overall picture and the 10-year trends, it's disconcerting that we're still seeing elevated losses after over a decade of survey and quite intense work to try to understand and reduce colony loss," adds Geoffrey Williams, assistant professor of entomology at Auburn University and co-author of the survey. "We don't seem to be making particularly great progress to reduce overall losses."

Since beekeepers began noticing dramatic losses in their colonies, state and federal agricultural agencies, university researchers, and the beekeeping industry have been working together to understand the cause and develop Best Management Practices to reduce losses. The annual colony loss survey, which has been conducted since 2006, has been an integral part of that effort.

The survey asks commercial and backyard beekeeping operations to track the survival rates of their honey bee colonies. Nearly 4,700 beekeepers managing 319,787 colonies from all 50 states and the District of Columbia responded to this year's survey, representing about 12% of the nation's estimated 2.69 million managed colonies.

The Bee Informed Partnership team says multiple factors are likely responsible for persistently high annual loss rates and this year's jump in winter losses. They say a multi-pronged approach to research, Extension, and Best Management Practices is needed to combat the problem.

The number one concern among beekeepers and a leading contributor to winter colony losses is varroa mites, lethal parasites that can readily spread from colony to colony. These mites have been decimating colonies for years, with institutions like the University of Maryland actively researching ways to combat them. "We are increasingly concerned about varroa mites and the viruses they spread," says vanEngelsdorp. "Last year, many beekeepers reported poor treatment efficacy, and limited field tests showed that products that once removed 90% of mites or more are now removing far fewer. Since these products are no longer working as well, the mite problem seems to be getting worse."

"But mites are not the only problem," continues vanEngelsdorp. "Land use changes have led to a lack of nutrition-rich pollen sources for bees, causing poor nutrition. Pesticide exposures, environmental factors, and beekeeping practices all play some role as well."

Karen Rennich, executive director for the Bee Informed Partnership and senior faculty specialist at the University of Maryland, elaborates on land use and environmental factors that may be significant, including increases in extreme weather.

"The tools that used to work for beekeepers seem to be failing, and that may be evident in this year's high losses. A persistent worry among beekeepers nationwide is that there are fewer and fewer favorable places for bees to land, and that is putting increased pressure on beekeepers who are already stretched to their limits to keep their bees alive," says Rennich. "We also think that extreme weather conditions we have seen this past year demand investigation, such as wildfires that ravage the landscape and remove already limited forage, and floods that destroy crops causing losses for the farmer, for the beekeeper, and for the public."

According to Rennich and Williams, more research is needed to understand what role climate change and variable weather patterns play in honey bee colony losses.

"Beekeepers have to be very dynamic in their response to weather and environmental conditions," explains Williams. "If it is a cold, long winter, beekeepers need to be very diligent and make sure they have enough food for their bees to survive. On the other hand, warm winters can create favorable conditions for varroa mites, which means beekeepers need to know how to manage them properly."

Williams and the other researchers on the survey team agree that in addition to understanding the impact of weather conditions, beekeepers need to stay current on science-based Best Management Practices.

"One of the best things that a beekeeper can do is implement Best Management Practices for their region, and they can find those through the Bee Informed website," emphasizes vanEngelsdorp.

The survey is conducted by the Bee Informed Partnership with data collected and analyzed by the University of Maryland and Auburn University. Survey results are available on the Bee Informed website, with a summary provided below.

Winter Loss Estimates:

1 October 2018 - 1 April 2019: 37.7% losses
7 percentage points higher than winter 2017-2018
8.9 percentage points higher than average winter loss (2006-2019)

Summer Loss Estimates:

1 April 2018 - 1 October 2018: 20.5% losses
3.4 percentage points higher than summer 2017
Equal to average summer loss since summer survey began in 2011: 20.5%

Total Annual Loss Estimates:

1 April 2018 - 1 April 2019: 40.7% losses
0.6 percentage points higher than 2017-2018
2.9 percentage points higher than the average annual loss (37.8%) since the annual survey began in 2010-2011

Winter Loss Comparison by Beekeeper Category:

Backyard beekeepers (manage 50 or fewer colonies): 39.8%
Sideline (manage 51-500 colonies): 36.5%
Commercial (manage more than 500 colonies): 37.5%
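The comparisons in the summary above are percentage-point differences, that is, simple subtractions of one survey rate from another rather than relative changes. A quick sketch using the figures reported above (the averages are implied by the stated gaps):

```python
# Loss rates reported in the survey summary above (percent).
winter_2018_19 = 37.7
winter_avg = 28.8      # average winter loss 2006-2019, implied by the 8.9-point gap
annual_2018_19 = 40.7
annual_avg = 37.8      # average annual loss since the survey began in 2010-2011

# "Percentage points higher" is an arithmetic difference, not a ratio.
print(round(winter_2018_19 - winter_avg, 1))   # 8.9
print(round(annual_2018_19 - annual_avg, 1))   # 2.9
```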

Credit: 
University of Maryland

Simple scan could direct treatments for angina

A 40-minute test for angina could help patients avoid an overnight hospital stay, according to research funded by the NIHR Guy's and St Thomas' Biomedical Research Centre.

The MR-INFORM trial looked at whether magnetic resonance imaging (MRI) could be used to guide treatment decisions for angina patients, rather than performing a more invasive procedure.

Angina is chest pain caused by reduced blood flow to the heart muscle. The condition affects two million people in the UK and is a warning sign that you could be at risk of a heart attack or stroke. Treatments include drugs to lessen the pain and changes toward a healthier lifestyle.

Currently, patients diagnosed with the condition are usually sent for invasive angiography, a procedure that involves taking X-rays of the patient's arteries and requires multiple hospital visits, including an overnight stay. If the condition is severe, patients have a procedure called revascularisation to improve blood flow to the heart.

The MR-INFORM trial looked at 918 patients with angina and risk factors for coronary heart disease, who were divided into two groups. One received the standard invasive angiography. The other had a 40-minute MRI perfusion scan of the heart, which was used to decide whether to send the patient on for invasive angiography.

The two groups had similar health outcomes, with under 4% of patients in each group having cardiac events (such as heart attacks) in the following year. However, the group whose treatment was guided by the MRI scan had significantly fewer procedures: only 40% of this group went on to have invasive angiography, and only 36% went on to have revascularisation, compared with 45% in the angiography group.

Professor Eike Nagel, a consultant cardiologist and chair in Clinical Cardiovascular Imaging at King's College London, led the research. He said: "Personalising patients' treatment for angina will mean that we can target the more invasive treatments only to those patients that really need them.

"We have shown that MRI imaging, which is less invasive than current diagnostic tests, could mean that patients' initial visits to hospital are quicker and more patient friendly, and that they are less likely to have further procedures. But crucially the outcomes for patients of taking this approach were similar, so there is no negative impact of directing treatment in this way only to those we're sure need it.

"We are really grateful to patients across Europe who participated in the study."

Credit: 
NIHR Biomedical Research Centre at Guy’s and St Thomas’ and King’s College London

Patients of surgeons with unprofessional behavior more likely to suffer complications

Patients of surgeons with higher numbers of reports from co-workers about unprofessional behavior are significantly more likely to experience complications during or after their operations, researchers from Vanderbilt University Medical Center (VUMC) reported today in JAMA Surgery.

"Surgical teams require every team member to perform at their highest level. We were interested in understanding whether surgeons' unprofessional behaviors might undermine culture, threaten teamwork, and potentially increase risk for adverse outcomes of care," said the study's corresponding author, William O. Cooper, MD, MPH, vice president for Patient and Professional Advocacy at VUMC.

Cooper, the Cornelius Vanderbilt Professor of Pediatrics and Health Policy at Vanderbilt University, was lead author of a previous study that found that recording and analyzing patient and family reports about rude and disrespectful behavior can identify surgeons with higher rates of surgical site infections and other avoidable adverse outcomes.

In the current study, the researchers conducted a retrospective cohort study of outcome data from two academic medical centers that participate in the National Surgical Quality Improvement Program. The cohort included more than 13,600 adult patients who underwent operations by 202 surgeons between 2012 and 2016.

Reported unprofessional behaviors included poor or unsafe practices in the operating room, communicating disrespectfully with co-workers, or not following through on expected professional responsibilities, such as signing verbal orders.

Compared with patients whose surgeons had no reports, those whose surgeons were reported for unprofessional behavior in the 36 months before their operations were more likely to have wound infections and other complications including pneumonia, blood clots, renal failure, stroke and heart attack.

Patients whose surgeons had one to three reports of unprofessional behavior were at 18% higher estimated risk of experiencing complications, and those whose surgeons had four or more reports were at nearly 32% higher estimated risk compared to patients whose surgeons had no reports.

There was no difference, however, between study groups in the percentage of patients who died, required a second operation or who were readmitted to the hospital within 30 days of their first operation.

A greater percentage of surgeons who had no reports of unprofessional conduct were women, suggesting that female surgeons were less likely than their male counterparts to generate co-worker concerns.

"This study provides additional evidence of the important association between unprofessional behaviors and team performance by directly measuring patient outcomes," the researchers concluded.

"It's really about common sense," said Gerald Hickson, MD, the Joseph C. Ross Professor of Medical Education and Administration, professor of Pediatrics and the study's senior author. "If someone is disrespectful to you, how willing are you to share information or ask for advice or help from that individual?

"Unprofessional behavior modeled by the team leads reduces the effectiveness of the team," Hickson said.

Poor behavior can be modified. "Future work should assess whether improved interactions with patients, families and co-workers by surgeons who receive interventions for patterns of unprofessional behavior are also associated with improved surgical outcomes for their patients," the researchers concluded.

Credit: 
Vanderbilt University Medical Center

Synthetic joint lubricant holds promise for osteoarthritis

ITHACA, N.Y. - A new type of treatment for osteoarthritis, currently in canine clinical trials, shows promise for eventual use in humans.

The treatment, developed by Cornell University biomedical engineers, is a synthetic version of a naturally occurring joint lubricant that binds to the surface of cartilage in joints and acts as a cushion during high-impact activities, such as running.

"When the production of that specific lubricant goes down, it creates higher contact between the surfaces of the joint and, over time, it leads to osteoarthritis," said David Putnam, a professor in the College of Engineering with appointments in the Meinig School of Biomedical Engineering and the Smith School of Chemical and Biomolecular Engineering.

The study focuses on a naturally occurring joint lubricant called lubricin, the production of which declines following traumatic injuries to a joint, such as a ligament tear in a knee.

The knee is lubricated in two ways - hydrodynamic mode and boundary mode.

Hydrodynamic mode lubrication occurs when the joint is moving fast and there isn't a strong force pushing down on it. In this mode, joints are lubricated by compounds like hyaluronic acid (HA) that are thick and gooey, like car oil. There are numerous HA products on the market, approved by the Food and Drug Administration, for treating hydrodynamic mode lubrication disorders.

But HA is ineffective when strong forces are pushing down on the joint, such as those that occur during running or jumping. In these instances, thick gooey HA squirts out from between the cartilage surfaces, and boundary mode lubrication is necessary. Under these forces, lubricin binds to the surface of the cartilage. It contains sugars that hold on to water, to cushion hard forces on the knee.

In the paper, the researchers describe a synthetic polymer they developed that mimics the function of lubricin and is much easier to produce. "We are in clinical trials, with dogs that have osteoarthritis, with our collaborators at Cornell's College of Veterinary Medicine," Putnam said.

"Once we finalize the efficacy study in dogs, we will be in a very good position to market the material for veterinary osteoarthritis treatment," Putnam said. From there, the human market for a lubricin substitute should follow, just as HA has been made available for human use, mainly in knees.

Credit: 
Cornell University

New platform flips traditional on-demand supply chain approach on its head

TROY, N.Y. -- Imagine you are heading to the grocery store and receive a phone alert asking if you'd also be willing to bring your neighbor's groceries home. Or you are on your way to a concert and see you could fill the seats of your car--and your wallet--if you picked up a few other music fans along the way. As the supplier in these scenarios, you have the choice of which services you provide and when. This may very well be the way commerce is headed.

Research recently published in Transportation Research Part B: Methodological, by systems engineers at Rensselaer, demonstrated how a hierarchical model that provides suppliers with a certain amount of choice could improve supply and demand matching for underutilized resources--and may even transform what's become known as the sharing economy.

In this research Jennifer Pazour, associate professor of industrial and systems engineering at Rensselaer Polytechnic Institute, and Seyed Shahab Mofidi, who recently received his Ph.D. from Rensselaer, built a ride-sharing environment simulation and plugged simulated data into the algorithms they created. However, the same approach could be applied to other scenarios, such as businesses that wish to share warehouse space or nonprofits looking to fill volunteer hours with the use of an on-demand application.

"What is exciting to me is that this proof of concept shows the model works," Pazour said. "This laid the foundation that this way of giving people recommendations and choices can actually help all entities in the system."

Approaches currently being used, Pazour said, may match a supplier with a demand request based on what's best for the client without much choice from the supplier. This may result in a fast response but, she points out, it prevents some suppliers from participating.

Inversely, other existing platforms may show all available demand requests to a supplier, allowing them to sift through the options and choose what works for them. It's an approach that has the supplier in mind, but it results in a much slower response for the client.

The team's platform tries to strike a balance between supply and demand by giving the supplier some--but not all--choices based on previous supplier behavior. Pazour compares this approach to how other platforms may suggest a set of movies you may like to watch, or products you may like to buy based on your previous decisions.

For example, a driver--or supplier--with a car will be given some choices of potential riders they could pick up. They can then make a decision based on their plans for the day and the route they are already about to take. These decisions will inform which potential riders they are offered in the future.

"Our approach is more proactive," Pazour said. "We're not going to ask you anything initially. We're going to push you notifications, options, and let you choose and then the model will deal with the consequences."
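As an illustration only (this is not the Rensselaer team's hierarchical model), the core idea of pushing a limited, personalized menu of options can be sketched as ranking demand requests by an acceptance probability estimated from a supplier's past choices:

```python
# Illustrative sketch, not the published model: offer each supplier a short
# menu of demand requests, ranked by an estimated acceptance probability
# learned from that supplier's previous decisions.
def offer_menu(requests, accept_prob, k=3):
    """Return the k requests this supplier is most likely to accept."""
    ranked = sorted(requests, key=lambda r: accept_prob.get(r, 0.0), reverse=True)
    return ranked[:k]

# Hypothetical acceptance probabilities inferred from past behavior.
history = {"ride_A": 0.9, "ride_B": 0.2, "ride_C": 0.6, "ride_D": 0.4}
print(offer_menu(list(history), history))   # ['ride_A', 'ride_C', 'ride_D']
```

In this sketch the supplier sees a few good candidates instead of a single forced match or the full request list, which is the balance the researchers describe.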

What the researchers found is that this approach performed better than the alternatives when the platform doesn't have much information about the supplier and their preferences.

"This methodology is most useful when the platform isn't able to perfectly predict people's actions," Pazour said. "That is reality, but I think that's one thing that's missing in a lot of the other apps."

Pazour also hopes more people will be inclined to opt in to a platform that provides more choice, which in turn could lead to an increase in the use of underutilized resources like empty seats in a car.

"If we give more choice, maybe we will get more people who are willing to do this," Pazour said.

In addition to looking at this challenge in terms of efficiency, Pazour also has equity in mind.

For example, an on-demand grocery delivery service could help those who don't have a grocery store nearby. In that case, simply matching a supplier with their closest neighbor may exclude some clients from being served.

"If it's designed for efficiency of resources, that's potentially a different algorithm than if it's designed for equity. So we're thinking about how we can make sure everyone gets this service at some equitable level," Pazour said.

Now that they know their methodology works, Pazour and her team are able to expand their research. They plan to improve the platform models and algorithms and apply them to other areas of supply and demand, including volunteerism. Pazour said her team is also exploring opportunities to work with companies to analyze actual data and evaluate if this unique approach could benefit them.

Credit: 
Rensselaer Polytechnic Institute

Study finds similar cardiovascular outcomes for generic, brand-name drugs for hypothyroidism

JACKSONVILLE, Fla. -- A new study by Mayo Clinic researchers may have broad implications for treatment of patients with predominantly benign thyroid disease and newly treated hypothyroidism.

The study, to be published in Mayo Clinic Proceedings in July, looked at whether generic and brand-name levothyroxine therapy affected hospitalization for cardiovascular events for those patients who are more at risk of coronary heart disease and heart failure. Levothyroxine is the most prescribed medication in the U.S., with more than 23 million prescriptions written annually.

The retrospective analysis, using deidentified claims data from a large private U.S. health plan, found that cardiovascular event rates were similar for generic and brand-name levothyroxine therapy, with lower pharmacy costs for the generic drug. The findings, if confirmed with research into longer-term event rates, suggest that generic or brand-name levothyroxine may be used to treat hypothyroidism due to benign thyroid disorders. The average 30-day cost of the generic drug was about half the cost of brand-name medication for patients and insurers.

"More than 90% of thyroid prescriptions are for levothyroxine, and there has been disagreement as to whether generic levothyroxine and branded thyroxine preparations are equivalent," says Robert Smallridge, M.D., a Mayo Clinic endocrinologist and the study's principal investigator. "These findings suggest that generic and brand levothyroxine therapy are similar as related to cardiovascular events risk."

Dr. Smallridge says the findings require confirmation with longer-term follow-up and study of subsets of patients, such as those with a history of thyroid cancer, who frequently receive higher doses of levothyroxine.

Hypothyroidism, or underactivity of the thyroid gland, affects the function of many organs in the body, elevating blood cholesterol levels and increasing the risk of heart attacks, heart failure and stroke. Levothyroxine is used to reduce elevated cholesterol and reverse symptoms of hypothyroidism. U.S. brand names for levothyroxine include Levothroid, Levoxyl, Synthroid, Tirosint and Unithroid.

The analysis was unusual in that it used information from an administrative claims database provided by OptumLabs Data Warehouse. OptumLabs was co-founded by Optum Inc. and Mayo Clinic in 2012. This data warehouse contains deidentified administrative claims data, as well as deidentified electronic health record data from a nationwide network of provider groups.

The study reviewed records for 87,902 patients followed for a mean period of one year, focusing on hospitalization for heart attacks, congestive heart failure, atrial fibrillation or strokes. The analysis found no difference in event rates for the four types of cardiovascular events.

Credit: 
Mayo Clinic

New research finds increased CT use for suspected urolithiasis patients in ED

Reston, VA (June 17, 2019) - A new study performed in conjunction with the Harvey L. Neiman Health Policy Institute examines changing characteristics of utilization and potential disparities among US emergency department (ED) patients undergoing CT of the abdomen and pelvis (CTAP) for suspected urolithiasis. The study is published online in the Journal of the American College of Radiology (JACR).

Dr. Balthazar and team used the Nationwide Emergency Department Sample, the largest publicly available all-payer ED database in the United States, to study patients from 2006 to 2015 with a primary diagnosis of suspected urolithiasis. The annual numbers of ED visits for suspected urolithiasis and associated CTAP examinations per visit were determined, along with patient demographics, payer status and hospital characteristics as potential independent predictors of utilization.

"Overall, CT utilization rates in the ED continue to increase over time despite government and medical specialty organization initiatives to restrain the growth of advanced imaging services," stated first author Patricia Balthazar, MD, diagnostic radiology resident, Emory University. "Although the US population grew by 6.9% from 2006 to 2014, the annual ED visits for suspected urolithiasis increased by 17.9%, and the number of visits for suspected urolithiasis involving advanced imaging increased by 100.8%."

"The relative use of CTAP in ED patients presenting with suspected urolithiasis doubled between 2006 and 2014 and showed marked geographic variation," noted Richard Duszak, MD, senior author and Neiman Institute Affiliate Senior Research Fellow. "Among ED patients with suspected urolithiasis, CTAP was more frequent in patients from higher household income ZIP codes, with private insurance, in the Northeast, and at urban and nonteaching hospitals."

These findings provide important information to practicing clinicians, researchers, and policymakers interested in optimizing the use of advanced medical imaging in the ED.

Credit: 
Harvey L. Neiman Health Policy Institute

Concordia researchers develop new method to evaluate artificial heart valves

Image: Lyes Kadem (left) and Ahmed Darwish. (Credit: Concordia University)

Researchers at Concordia have devised a technique to detect obstructions in a type of mechanical heart valve they believe will contribute to safer follow-up methods for cardiologists and their patients.

The team led by Lyes Kadem, professor in the Department of Mechanical, Industrial and Aerospace Engineering at the Gina Cody School of Engineering and Computer Science, published their findings in the journal Artificial Organs. PhD candidate Ahmed Darwish was lead author, and Giuseppe Di Labbio, assistant professor Wael Saleh and Othman Smadi of Hashemite University in Jordan contributed.

The researchers used high-tech equipment to look at the flow downstream of a bi-leaflet mechanical heart valve (BMHV). The equipment included a custom-made double-activation left heart duplicator designed and created in their lab by Concordia undergraduate students, a high-speed camera and a laser.

Despite the impressive name, the BMHV is a simple ring with an inner diameter of about 2.5 cm. Two carbon-based leaflets inside the ring open and close as the heart pumps blood out of the left ventricle and into the aortic arch, which sends the blood out into the body.

BMHVs replace damaged aortic valves and are installed via open-heart surgery. An obstructed BMHV can be catastrophic.

Mapping blood flow

The method the team designed maps simulated blood flow patterns that result from six different heart valve blockages. The researchers photographed particles immersed in a liquid that mimics blood and pumped the fluid through the heart duplicator.

Using a technique called particle image velocimetry, they were able to determine the flow velocity. It allowed them to simulate what blood flow would look like with the leaflets completely clear of obstruction, when they were partially obstructed and when fully obstructed.
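Particle image velocimetry works by cross-correlating small interrogation windows from two images taken a short time apart; the location of the correlation peak gives the particle displacement, and dividing by the time between frames gives velocity. A minimal, self-contained sketch on synthetic data (illustrative values only, not the team's actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
frame1 = rng.random((64, 64))        # first exposure: random particle pattern
dx, dy = 3, 5                        # known shift (pixels) between exposures
frame2 = np.roll(frame1, (dy, dx), axis=(0, 1))  # second exposure: shifted pattern

# Cross-correlate via FFT; the correlation peak sits at the displacement.
corr = np.fft.ifft2(np.fft.fft2(frame2) * np.conj(np.fft.fft2(frame1))).real
peak = np.unravel_index(np.argmax(corr), corr.shape)
shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

dt = 1e-3                            # time between exposures (s), illustrative
pixel_size = 1e-4                    # metres per pixel, illustrative
velocity = [s * pixel_size / dt for s in shift]
print(shift)                         # [5, 3]  (row, column displacement)
```

Real PIV divides the images into many small windows and repeats this correlation per window, producing a velocity field rather than a single vector.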

“Imagine you are outside a stadium and the crowd is leaving from three gates next to each other,” says Kadem, the Concordia Research Chair for Cardiovascular Engineering and Medical Devices.

“If the gates are open, you will see a uniform distribution of people leaving from all three openings. If one gate is closed, you will see more people leaving from the two others, and none from the one that is closed. Therefore, you will deduce that there is a blockage.”

When applied using phase-contrast magnetic resonance imaging (MRI), the method is both non-invasive and radiation-free, says Kadem. That means doctors can use it for BMHV dysfunction detection and follow-up.

“Currently, ultrasound is the best way to detect valve dysfunction,” he says. “The next step is cinefluoroscopy, which uses radiation. You can’t use this method as a follow-up because it exposes the patient to radiation and increases their risk of cancer.”

Darwish and Kadem note that artificial heart valves are generally safe but not risk-free. There is a 0.1 to 6 per cent chance of dysfunction, which can occur anywhere from one hour to 20 years after the valve replaces the organic one. These dysfunctions can be fatal, with a 28.6 per cent mortality rate when one results in an emergency.

This research is supported by a grant from the Natural Sciences and Engineering Research Council of Canada.

Read the cited paper: Experimental investigation of the flow downstream of a dysfunctional bileaflet mechanical aortic valve.

Journal: Artificial Organs

DOI: 10.1111/aor.13483

Credit: 
Concordia University

People with multiple physical conditions have faster brain decline, higher suicide risk

Having arthritis, diabetes, or heart disease can change a person's life, getting in the way of daily activities and requiring special diets and medicines.

But what happens when new conditions get stacked on top of that first one, creating a burden of multiple diseases that need daily managing?

As millions of Americans cope with just such a combination of conditions, a new approach to measuring what their lives are actually like has emerged.

Multimorbidity scores can help doctors understand their patients' overall prognosis - and can help researchers identify special risks faced by people with multiple chronic illnesses.

In fact, new research by a University of Michigan team shows that people with higher multimorbidity scores had a much faster decline in their thinking and memory abilities than those with lower scores.

Even though most of the chronic conditions included in the index have no direct relationship to brain health, the higher a person's score, the faster they declined over a 14-year period in their ability to recall words and do simple math.

The results, published online in the Journals of Gerontology: Series A, used data from more than 14,260 people studied multiple times over a decade or more through the Health and Retirement Study based at U-M.

Meanwhile, just months ago, the same index revealed that people with higher scores were more than twice as likely to die by suicide as those with lower scores, and that they had worse mental health-related quality of life in general.

Those findings, made by calculating the multimorbidity index for participants in three long-term studies of more than 250,000 health professionals including dentists, podiatrists, chiropractors and nurses, were published in the Journal of the American Geriatrics Society. They show the mental and physical burden of living with multiple diseases.

U-M researcher and Michigan Medicine primary care physician Melissa Wei, M.D., M.P.H., M.S., has spearheaded the development of the scoring system, called the multimorbidity weighted index or MWI.

Assessing the total impact of a person's health conditions is important because 80% of adults over the age of 65 have more than one condition, and 45% of all adults have more than one, says Wei.

Careful tool development

Over the course of years, she compiled and tested a way to assess what life is like for people with multiple chronic conditions, from glaucoma and heart arrhythmias to multiple sclerosis and a history of knee, hip and spinal disc problems.

But it's not as simple as counting the number of diseases and conditions a person has been diagnosed with, the researchers caution.

Rather, the risk of cognitive decline, suicide or poor mental well-being has to do with the total impact that their unique combination of conditions has on their quality of life.

Because different conditions affect people in different ways, the scoring system takes into account how that happens - and how those effects might interact with one another.

Early this year, Wei and colleagues published a study showing that the risk of dying rose 8% for every single-point rise in MWI score, and that the rise in score tracked closely with the decrease in physical abilities of people with multiple conditions.
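For intuition, if the reported 8% rise per point is treated as a multiplicative, hazard-ratio-like effect (an assumption for illustration, not necessarily the paper's exact model), the relative risk at a given score compounds as 1.08 per point:

```python
# Assumption: the 8% rise in mortality risk per MWI point compounds
# multiplicatively, as a per-point hazard ratio of 1.08 would.
def relative_risk(mwi_points, per_point=1.08):
    return per_point ** mwi_points

print(round(relative_risk(1), 2))   # 1.08 -> one point, 8% higher risk
print(round(relative_risk(5), 2))   # 1.47 -> five points, ~47% higher risk
```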

That study, also in Journals of Gerontology: Series A, also used Health and Retirement Study data, from 18,174 people over the age of 51 who took part in the study over 11 years.

How to calculate a multimorbidity score

While the MWI scoring approach has been useful in research, Wei and her colleagues now hope that clinicians can use it to help them understand the needs and manage the care of patients with multiple conditions.

A free MWI scoring tool is now available for clinician use at the website ePrognosis, run by the University of California, San Francisco.

Any clinician can enter a few pieces of anonymous information about a patient over age 54 into the calculator and come up with a score. The results page also gives a breakdown of how likely patients with that score are to die within the next 10 years, or to experience a decline in their physical functioning in the next four to eight years.

While Wei cautions clinicians not to use the score as the sole indicator of any one patient's prognosis, she hopes that the score can help guide discussions about a range of decisions from preventive care to elective surgery to living arrangements and end-of-life care preferences.

Using multimorbidity research

The research that Wei and colleagues have done on the impacts of high MWI scores across groups of patients could also help guide care.

For instance, the finding that suicide risk rose sharply as MWI score rose could help clinicians think about which patients might be most in need of depression and suicide screening. As patients develop more conditions with age, physicians may want to monitor their mental health more closely, and offer appropriate lifestyle advice and treatment.

"As clinicians, we are more likely to assess suicide risk in people with known depression or other mental health or substance use issues, but we may not automatically consider that those with more 'physical' conditions only could also be at higher risk," says Wei. "Multimorbidity has several downstream consequences. Physical impairments are just the beginning. As conditions accumulate and physical functioning deteriorates, we have found this is closely linked to worse mental health, social health, and eventually premature mortality."

In short, she says, "The association between the MWI score with suicide risk and overall mental well-being warrants attention."

Having a high MWI score, she says, makes someone functionally older than their "calendar" or chronologic age would suggest. Clinicians can use the score to help them think about the "biologic age" of the patient before them based on the life span expectations for people with similar scores.

Using scores clinically could also help providers ensure that patients with high scores receive care management services, or other support to help them live their best life and keep on top of the tests, treatments and lifestyle changes that can help them do so.

"We want patients to have good insight into how the conditions they've developed over the years are affecting their well-being, and be open to communicating with their care teams about how those conditions affect their functioning, quality of life and overall health now and in the future," says Wei. "We also know that social support, and having a strong purpose in life, can protect against some of the detrimental effects of multiple conditions. We need to help patients understand these connections, foster their development early on, and sustain them through each stage of life and changes in health."

Credit: 
Michigan Medicine - University of Michigan

Serotonin linked to somatic awareness, a condition long thought to be imaginary

An international team spearheaded by researchers at McGill University has discovered a biological mechanism that could explain heightened somatic awareness, a condition where patients experience physical discomforts for which there is no physiological explanation.

Patients with heightened somatic awareness often experience unexplained symptoms - headaches, sore joints, nausea, constipation or itchy skin - that cause emotional distress, and are twice as likely to develop chronic pain. The condition is associated with illnesses such as fibromyalgia, rheumatoid arthritis and temporomandibular disorders, and has long been thought to be of psychological origin.

"Think of the fairy tale of the princess and the pea," says Samar Khoury, a postdoctoral fellow at McGill's Alan Edwards Centre for Research on Pain. "The princess in the story had extreme sensitivity where she could feel a small pea through a pile of 20 mattresses. This is a good analogy of how someone with heightened somatic awareness might feel; they have discomforts caused by a tiny pea that doctors can't seem to find or see, but it's very real."

Thanks to an existing study on genetic association, Samar Khoury and her colleagues might have found the elusive pea capable of explaining somatic awareness.

Their work, recently published in the Annals of Neurology, used data available through the Orofacial Pain: Prospective Evaluation and Risk Assessment cohort and demonstrates that patients who suffer from somatic symptoms share a common genetic variant. The mutation leads to the malfunctioning of an enzyme critical for the production of serotonin, a neurotransmitter with numerous biological functions.

"I am very happy and proud that our work provides a molecular basis for heightened somatic symptoms," says Luda Diatchenko, lead author of the new study and a professor in McGill's Faculty of Dentistry. "We believe that this work is very important to patients because we can now provide a biological explanation of their symptoms. It was often believed that there were psychological or psychiatric problems, that the problem was in that patient's head, but our work shows that these patients have lower levels of serotonin in their blood."

The results of their study have laid the groundwork for the development of animal models that could be used to better characterize the molecular pathways in heightened somatic awareness. Above all, Diatchenko and Khoury hope their work will pave the way for treatment options.

"The next step for us would be to see if we are able to target serotonin levels in order to alleviate these symptoms," says Diatchenko, who holds the Canada Excellence Research Chair in Human Pain Genetics.

Credit: 
McGill University