Culture

Gender bias alive and well in health care roles, study shows

New Orleans, LA -- Results of a multi-center study of patients' assumptions about health care professionals' roles based on gender show significant stereotypical bias towards males as physicians and females as nurses. The research team, led in New Orleans by Lisa Moreno-Walton, MD, LSU Health New Orleans Emergency Medicine at University Medical Center (UMC), found patients recognized males as physicians nearly 76% of the time. Female attending physicians were recognized as physicians only 58% of the time. The research paper is published in the Journal of Women's Health and is available online.

"Despite the fact that about 52% of all medical students are women, unconscious/ implicit bias is so strong that even when women introduce themselves to patients as the doctor, patients fail to recognize the female as a physician," notes Dr. Moreno-Walton.

The researchers analyzed the responses of 150 patients to an anonymous survey conducted in the Emergency Department (ED) of a teaching hospital in Miami Beach, Florida. Each patient was seen by a nurse, a resident physician, and an attending physician, all of whom introduced themselves, explained their exact roles, and wrote their names on the exam-room whiteboard next to the corresponding nurse, resident, or attending physician designation. Volunteers then administered the survey, which asked patients to recall the gender of their nurse, resident physician, and attending physician.

The patients' gender did not affect their recognition of physicians and nurses: both male and female patients recognized female physicians less often than male physicians, and female nurses more often than male nurses. Patients' age, however, did make a difference. Younger patients correctly identified male nurses as nurses, and female attending physicians as physicians, more often than older patients did.

"At UMC, the vast majority of female ED physicians report that on our monthly patient satisfaction reports, only about 50% of our patients state that they were seen by a physician," adds Moreno-Walton. "All of us know we are seeing all of our patients, but the patients do not know that we are doctors."

This experience is a common one.

"When I do my career development seminars all over the country, I ask, 'If there is any woman in the room who has NEVER been mistaken for a nurse, please raise your hand,'" says Moreno-Walton. "In seven years of doing these seminars, I have yet to have even one woman physician raise her hand. Implicit bias, microaggressions and the sometimes present explicit bias can do significant damage to the morale, self-esteem and confidence of health care professionals who are females and/or underrepresented minorities. This is not trivial."

The research team advises further research be conducted to understand the implications of implicit gender biases on patient satisfaction, patient compliance, physician burnout, compassion fatigue and job satisfaction, among other issues.

Credit: 
Louisiana State University Health Sciences Center

How you and your friends can play a video game together using only your minds

image: University of Washington researchers created a method for two people to help a third person solve a task using only their minds. Heather Wessel, a recent UW graduate with a bachelor's degree in psychology (left), and Savannah Cassis, a UW undergraduate in psychology (right) sent information about a Tetris-like game from their brains over the internet to UW psychology graduate student Theodros Haile's (middle) brain. Haile could then manipulate the game with his mind.

Image: 
Mark Stone/University of Washington

Telepathic communication might be one step closer to reality thanks to new research from the University of Washington. A team created a method that allows three people to work together to solve a problem using only their minds.

In BrainNet, three people play a Tetris-like game using a brain-to-brain interface. This is the first demonstration of two things: a brain-to-brain network of more than two people, and a person being able to both receive and send information to others using only their brain. The team published its results April 16 in the Nature journal Scientific Reports, though this research previously attracted media attention after the researchers posted it in September to the preprint site arXiv.

"Humans are social beings who communicate with each other to cooperate and solve problems that none of us can solve on our own," said corresponding author Rajesh Rao, the CJ and Elizabeth Hwang professor in the UW's Paul G. Allen School of Computer Science & Engineering and a co-director of the Center for Neurotechnology. "We wanted to know if a group of people could collaborate using only their brains. That's how we came up with the idea of BrainNet: where two people help a third person solve a task."

As in Tetris, the game shows a block at the top of the screen and a line that needs to be completed at the bottom. Two people, the Senders, can see both the block and the line but can't control the game. The third person, the Receiver, can see only the block but can tell the game whether to rotate the block to successfully complete the line. Each Sender decides whether the block needs to be rotated and then passes that information from their brain, through the internet and to the brain of the Receiver. Then the Receiver processes that information and sends a command -- to rotate or not rotate the block -- to the game directly from their brain, hopefully completing and clearing the line.

The team asked five groups of participants to play 16 rounds of the game. For each group, all three participants were in different rooms and couldn't see, hear or speak to one another.

The Senders each could see the game displayed on a computer screen. The screen also showed the word "Yes" on one side and the word "No" on the other side. Beneath the "Yes" option, an LED flashed 17 times per second. Beneath the "No" option, an LED flashed 15 times a second.

"Once the Sender makes a decision about whether to rotate the block, they send 'Yes' or 'No' to the Receiver's brain by concentrating on the corresponding light," said first author Linxing Preston Jiang, a student in the Allen School's combined bachelor's/master's degree program.

The Senders wore electroencephalography caps that picked up electrical activity in their brains. The lights' different flashing patterns trigger unique types of activity in the brain, which the caps can pick up. So, as the Senders stared at the light for their corresponding selection, the cap picked up those signals, and the computer provided real-time feedback by displaying a cursor on the screen that moved toward their desired choice. The selections were then translated into a "Yes" or "No" answer that could be sent over the internet to the Receiver.
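The release does not spell out the signal processing, but the core step -- deciding which of the two flicker frequencies dominates the recorded EEG -- can be sketched in a few lines of Python. Everything below (sampling rate, window length, synthetic data) is an illustrative assumption, not the study's actual pipeline:

    import numpy as np

    def classify_ssvep(eeg, fs=250.0, f_yes=17.0, f_no=15.0, bandwidth=0.5):
        """Toy steady-state visually evoked potential (SSVEP) classifier.

        eeg : 1-D array of samples from one occipital EEG channel
        fs  : sampling rate in Hz (assumed value)
        Returns "yes" if spectral power near f_yes exceeds power near f_no.
        """
        spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg)))) ** 2
        freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)

        def band_power(f0):
            mask = (freqs > f0 - bandwidth) & (freqs < f0 + bandwidth)
            return spectrum[mask].sum()

        return "yes" if band_power(f_yes) > band_power(f_no) else "no"

    # Synthetic check: 2 seconds of a noisy 17 Hz oscillation should read "yes"
    fs = 250.0
    t = np.arange(0, 2.0, 1.0 / fs)
    fake_eeg = np.sin(2 * np.pi * 17.0 * t) + 0.5 * np.random.randn(t.size)
    print(classify_ssvep(fake_eeg, fs=fs))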

"To deliver the message to the Receiver, we used a cable that ends with a wand that looks like a tiny racket behind the Receiver's head. This coil stimulates the part of the brain that translates signals from the eyes," said co-author Andrea Stocco, a UW assistant professor in the Department of Psychology and the Institute for Learning & Brain Sciences, or I-LABS. "We essentially 'trick' the neurons in the back of the brain to spread around the message that they have received signals from the eyes. Then participants have the sensation that bright arcs or objects suddenly appear in front of their eyes."

If the answer was, "Yes, rotate the block," then the Receiver would see the bright flash. If the answer was "No," then the Receiver wouldn't see anything. The Receiver received input from both Senders before making a decision about whether to rotate the block. Because the Receiver also wore an electroencephalography cap, they used the same method as the Senders to select yes or no.

The Senders got a chance to review the Receiver's decision and send corrections if they disagreed. Then, once the Receiver sent a second decision, everyone in the group found out if they cleared the line. On average, each group successfully cleared the line 81% of the time, or for 13 out of 16 trials.

The researchers wanted to know if the Receiver would learn over time to trust one Sender over the other based on their reliability. The team purposely picked one of the Senders to be a "bad Sender" and flipped their responses in 10 out of the 16 trials -- so that a "Yes, rotate the block" suggestion would be given to the Receiver as "No, don't rotate the block," and vice versa. Over time, the Receiver switched from being relatively neutral about both Senders to strongly preferring the information from the "good Sender."

The team hopes that these results pave the way for future brain-to-brain interfaces that allow people to collaborate to solve tough problems that one brain alone couldn't solve. The researchers also believe this is an appropriate time to start to have a larger conversation about the ethics of this kind of brain augmentation research and developing protocols to ensure that people's privacy is respected as the technology improves. The group is working with the Neuroethics team at the Center for Neurotechnology to address these types of issues.

"But for now, this is just a baby step. Our equipment is still expensive and very bulky and the task is a game," Rao said. "We're in the 'Kitty Hawk' days of brain interface technologies: We're just getting off the ground."

Credit: 
University of Washington

'Planting green' cover-crop strategy may help farmers deal with wet springs

image: Planting green involves planting main crops into living cover crops. An example shown here: Cereal rye is rolled and soybeans planted green in the same pass at Penn State's Russell E. Larson Agricultural Research Center.

Image: 
Heidi Reed/Penn State

Allowing cover crops to grow two weeks longer in the spring and planting corn and soybean crops into them before termination is a strategy that may help no-till farmers deal with wet springs, according to Penn State researchers.

The approach -- known as planting green -- could help no-till farmers counter a range of problems they must deal with during wet springs like the ones that have occurred this year and last year. These problems include soil erosion, nutrient losses, soils holding too much moisture and causing a delay in the planting of main crops, and main-crop damage from slugs.

"With climate change bringing the Northeast more extreme precipitation events and an increase in total precipitation, no-till farmers especially need a way of dealing with wet springs," said Heather Karsten, associate professor of crop production ecology, whose research group in the College of Agricultural Sciences conducted a three-year study of planting green. "We wanted to see if farmers could get more out of their cover crops by letting them grow longer in the spring."

As cover crops continue to grow, they draw moisture from the soil, creating desired drier conditions in wet springs for planting corn and soybeans. With planting green, after those main crops are planted into the cover crops, the cover crops are typically terminated by farmers with an herbicide. The decomposing cover crop residues then preserve soil moisture for the corn and soybean crops through the growing season.

The study took place at five sites over three years -- on three cooperating Pennsylvania farms that plant no-till in Centre, Clinton and Lancaster counties; at Penn State's Russell E. Larson Agricultural Research Center in Centre County; and at the University's Southeast Agricultural Research and Extension Center in Lancaster County.

At each location, researchers compared the results of planting green to the traditional practice of terminating cover crops 10 days to two weeks before planting the main crops of corn and soybeans.

Cover crops included in the study were primarily rye and triticale, as well as a mixture of triticale, Austrian winter pea, hairy vetch and radish in one location.

Findings of the research, published online in Agronomy Journal, were mixed, according to study leader Heidi Reed, a doctoral student in agronomy when the research was conducted who is now an educator with Penn State Extension, specializing in field and forage crops.

Reed noted that planting green appeared to benefit soybean crops more than corn.

Planting green increased cover crop biomass by 94 percent in corn and by 94 to 181 percent in soybean.

However, because planting green results in more cover crop residues acting as mulch on the surface, it also cooled soils by 1.3 to 4.3 degrees Fahrenheit at planting.

At several of the sites during the study years, main-crop plant populations were reduced when planted green, possibly due to the cooler temperatures slowing crop emergence and nutrient cycling, and/or from cover crop residue interference with the planter. In corn, in a few cases, crop damage by slugs was also increased when corn was planted green.

No-till farmers struggle with slugs damaging corn and soybean seeds and seedlings because no-till doesn't disturb the soil and kill slugs or bury their eggs the way tillage does.

"No-till with cover crop residues also provides habitat for some crop pests and keeps the soil moist -- so no-till cover crop systems tend to be great slug habitat," Karsten said.

"We had hoped that letting cover crops grow longer in the spring would supply alternative forage for the slugs, as well as habitat for slug predators such as beetles -- and these factors would reduce slug damage of the main crop seedlings. But we did not see a consistent reduction in slug damage on main crops as we expected."

When researchers compared crop-yield stability between the two cover crop termination times across the multiple locations and years, corn yield was less stable and reduced by planting green in high-yielding environments; however, soybean yield was not influenced by planting green.

"We concluded that corn was more vulnerable to yield losses from conditions created by planting green than soybeans, " Reed said. "Since soybean yield was stable across study locations, and not affected by cover crop termination date, we suggest that growers who want to extend cover crop benefits and avoid the risk of crop-yield reduction from planting green should consider trying it first with soybean."

Credit: 
Penn State

Analysis finds US ecosystems shifting hundreds of miles north

image: University of Nebraska-Lincoln researchers (from left) Caleb Roberts, Craig Allen and Dirac Twidwell have found evidence that multiple ecosystems in the U.S. Great Plains have moved substantially northward during the past 50 years.

Image: 
Craig Chandler/University Communication/University of Nebraska-Lincoln

Whole ecosystems are shifting dramatically north in the Great Plains, a phenomenon likely linked to human influences such as climate change, says new University of Nebraska-Lincoln research that analyzed nearly 50 years' worth of data on bird distributions.

The northernmost ecosystem boundary shifted more than 365 miles north, with the southernmost boundary moving about 160 miles from the 1970 baseline.

The findings could inform the development of an early-warning system that would give land managers decades to prepare for ecosystem shift or collapse, allowing them to accommodate or foster the change rather than simply reacting, the researchers said.

Early warning, long a mainstay of forecasting for extreme weather events such as tornadoes, is likewise an emerging goal in ecology. Ecologists long thought that ecosystems respond to external pressures -- climate changes, invasive species -- in idiosyncratic, largely unpredictable ways.

But the team's new study, published June 24 in the journal Nature Climate Change, managed to quantify the spatial component of that change for the first time. In doing so, it suggests that ecological responses are much more ordered and predictable than previously thought.

"If we can work toward prevention (of changes), we're going to save ourselves so much money and time," said Caleb Roberts, lead author and postdoctoral researcher at Nebraska. "We won't have to worry about specific endangered species, perhaps, because we will be protecting the system they require."

To arrive at their conclusions, the researchers analyzed 46 years' worth of avian data collected for the North American Breeding Bird Survey, a U.S. Geological Survey program designed to track bird populations. That survey included more than 400 bird species found within a 250-mile-wide transect stretching from Texas to North Dakota.

The team then separated bird species into groups based on their body masses and searched for gaps in the distribution of the groups. Those gaps effectively act like the DNA signature of an ecosystem, said co-author Craig Allen, allowing the team to identify where one ecosystem ends and another begins.

By analyzing the geographic movement of the distinct body-mass signatures over the 46-year period, the team managed to measure how much and how fast each ecosystem shifted north.

"All (these breaks) are saying is that there are a lot of animals with the small body size; then there's a gap with nothing in this middle body size; then you have another group and another group," said Allen, director of the university's Center for Resilience in Working Agricultural Landscapes. "And since these reflect the domains of scale in an ecosystem, it's like a signature -- the DNA -- of a given ecosystem."

Over their study area, and over time, the researchers identified three distinct ecosystem boundaries, with a fourth -- and thus a fourth ecosystem regime -- appearing in the final decade.

The fact that the northernmost boundary shifted more than its southernmost counterpart reflects a well-documented phenomenon known as Arctic amplification, suggesting that climate change is at play, the researchers said. But the movement also aligns with other global change drivers that include wildfire trends; the invasion of woody plants such as eastern red cedar trees; energy development; agricultural land conversion; and urbanization.

"Like most things in ecology, (these shifts) likely have multiple causations," Allen said. "And I think it's fairly intractable to try to separate, say, tree invasion from climate change, because it has to do with fire but also with changing climate. All of these things are highly related."

Grasslands are the most endangered ecosystem in the world, Roberts said, partially due to woody-plant encroachment. That encroachment, he said, is something people can work to control by increasing burning, increasing tree removal and decreasing planting.

"Those are all things we can do and use the early warning to say, 'We're coming to the edge of this grassland's resilience. It's about to collapse, especially in our area. What can we do to stop that?' That's the kind of power this tool would have," he said. "You don't have to wait until it gets to you. You can see it coming and act pre-emptively."

When land managers do wait until the problem arrives at their backdoor, Allen said, it's often too late to alter the outcome. Given that urgency, the researchers plan to expand the range of their ecosystem analysis both east and west -- potentially picking up forestlands and mountain ranges -- while further clarifying how neighboring ecosystems move in relation to one another and in relation to global drivers.

Eventually, the researchers said, they intend to develop tools usable by land managers and conservationists ranging from private industry to the military.

"We are working closely with a long list of partners to understand how to navigate these types of transitions and increase the performance of conservation investments," said Dirac Twidwell, associate professor of agronomy and horticulture. "Large-scale transitions should not be underestimated. Restoring what has been lost has proven extraordinarily difficult when the challenge spans large geographic regions."

Credit: 
University of Nebraska-Lincoln

Location-based data can provide insights for business decisions

CORVALLIS, Ore. - Data from social commerce websites can provide essential information to business owners before they make decisions that could determine whether a new venture succeeds or fails, a study from Oregon State University shows.

Social commerce sites such as the review and recommendation site Yelp collect large amounts of data from a variety of users, including customer opinions, geographical distribution of businesses in a given area, and customer "check-ins" that provide a sense of the foot traffic.

That information can give business owners valuable insight into the competitive environment in which they operate or are considering operating, said the study's lead author, Xiaohui Chang, an assistant professor in OSU's College of Business.

Chang and co-author Jiexun Li of Western Washington University developed a tool that uses data collected through a social commerce site, including details such as types of businesses in a neighborhood, their hours, parking availability and other consumer features, to help determine whether one location is more likely to be successful than another.

"Small business owners, in particular, have a lot of choices when opening a new business, including where to locate," Chang said. "With this model, we use existing social commerce data to help you determine which location is going to perform the best."

The findings are published in the July issue of the journal Expert Systems With Applications.

The study was conceived as a way to address the age-old question of why some businesses succeed and others do not, Chang said. The work is particularly applicable to small businesses. While large companies can devote resources to collecting and analyzing financial data, small businesses may not have all of those tools available when researching where to open or what operating hours to keep.

The researchers focused on restaurants because the majority of new small businesses are restaurants, and many fail within the first year of opening.

For the study, the researchers looked at the accuracy of four different business performance prediction models. The attribute affinity model is a basic model that looks at businesses' intrinsic attributes without taking into account location or competition.

The geographic model, which has been used and tested by other researchers, suggests that businesses that are close to each other and share similar attributes are likely to do equally well. The contextual model, which is a new model, looks at the attributes of the business and the environment that might contribute to the success of a business; two businesses hundreds of miles apart with similar attributes and surrounding neighborhoods could achieve similar performance. The hybrid model uses both contextual and geographic models, which each also include aspects of the affinity model.
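The study's models are statistical and considerably more elaborate than this, but the hybrid idea -- blending a prediction from geographically nearby businesses with one from businesses that are similar in attributes and surroundings -- can be sketched as a toy example. All data, feature choices and the mixing weight below are invented for illustration:

    import numpy as np

    def knn_estimate(target, pool, scores, k=5):
        """Average the scores of the k rows in `pool` closest to `target`."""
        d = np.linalg.norm(pool - target, axis=1)
        return scores[np.argsort(d)[:k]].mean()

    def hybrid_estimate(target_xy, target_feats, train_xy, train_feats,
                        train_scores, weight=0.5, k=5):
        """Blend a geographic estimate (nearby businesses) with a contextual
        estimate (businesses with similar attribute/neighborhood features).
        `weight` is an illustrative mixing parameter."""
        geographic = knn_estimate(target_xy, train_xy, train_scores, k)
        contextual = knn_estimate(target_feats, train_feats, train_scores, k)
        return weight * geographic + (1 - weight) * contextual

    # Tiny synthetic example: 20 existing restaurants with coordinates,
    # attribute vectors (e.g. hours, parking, price level) and ratings.
    rng = np.random.default_rng(0)
    xy, feats = rng.random((20, 2)), rng.random((20, 3))
    scores = rng.uniform(2.0, 5.0, 20)
    print(hybrid_estimate(np.array([0.4, 0.6]), np.array([0.2, 0.8, 0.5]),
                          xy, feats, scores))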

The researchers used Phoenix-area restaurant data from Yelp, a social commerce site that helps consumers find businesses using location-based services, to test each model. Yelp has made some of its data available to researchers and this study used data from 2013.

They found that the hybrid model did the best job of predicting whether a restaurant would be successful. Both business attributes and surrounding environments play important roles, Chang said.

Additional research is needed to fully test how the model might be used to help a new business make decisions, and to determine if it also works for other types of businesses, Chang said. In addition, social commerce companies such as Yelp, Trip Advisor or Foursquare, which collect a trove of location-based data, could use the model to help companies improve their businesses.

"You could regularly get new performance predictions and the data could be used to help businesses solve problems or keep themselves vibrant," Chang said. "If a similar business is more successful and you can use location-based data to pinpoint that the success is due in part to parking availability, hours or price point, you can make decisions based on that information."

Credit: 
Oregon State University

Geisel study finds downside risk contracts still less common for ACOs

Findings from a new study conducted by a team of researchers at Dartmouth's Geisel School of Medicine and published in the July issue of Health Affairs show that while the number and variety of contracts held by Accountable Care Organizations (ACOs) have increased dramatically in recent years, the proportion of those bearing downside risk has seen only modest growth.

ACOs, which use financial incentives in an effort to improve patient care and reduce healthcare costs, have become one of the most commonly implemented value-based payment models by payers. In 2018, there were more than 1,000 ACOs nationally, covering an estimated 33 million lives and including more than 1,400 different payment arrangements.

"Debates continue around the impact of the ACO model, including the contribution of downside risk--where ACOs that fail to meet their financial targets share responsibility with payers for losses," explains Carrie Colla, PhD, an associate professor at The Dartmouth Institute for Health Policy and Clinical Practice and senior author on the study.

To help improve understanding of the rapid growth and evolution of ACOs, the research team analyzed ACO structure and contracts over a six-year period (2012-18) using data from the National Survey of Accountable Care Organizations.

They found that while the number of ACOs had grown fivefold during that time period, the proportion of ACOs taking on downside risk remained relatively stable--increasing from 28 percent in 2012 to 33 percent in 2018. Overall, the majority were upside-only risk contracts, which reward cost and quality improvements but do not financially penalize poor performance. There is concern among industry experts that these kinds of contracts might not provide adequate incentives to boost ACO performance.

When examining the leadership, services, and size of ACOs, the researchers reported that those bearing downside risk were less likely to be physician-led or physician-owned, more likely to be part of larger, integrated delivery systems (that included hospitals), had more participating physicians, and were more likely to provide services such as inpatient rehabilitation, routine specialty care, and palliative or hospice care.

In addition, the researchers found that ACOs with downside risk contracts were more likely to have participating providers who had experience with other forms of payment reform (such as bundled or capitated payments) and had more ACO contracts across payer types (Medicare, commercial, and Medicaid).

"Overall, the increasing number of ACO payment contracts per ACO suggest an increase in the breadth of value-based financial incentives," says Colla. "However, there has been relative stagnation in the proportion of ACOs with deeper financial incentives--with only a third (in 2018) choosing contracts with downside risk.

"Understanding the potential importance of downside risk contracts in increasing the impact of the ACO model, the hesitancy of ACOs to adopt them, and the levers that could be used to strengthen both the breadth and the depth of incentives will be key to moving the ACO model forward," Colla says.

Credit: 
The Geisel School of Medicine at Dartmouth

Wood products mitigate less than 1% of global carbon emissions

image: Historical (black) and projected (in color) trends for global production of wood products and the carbon sequestered in them. The sequestration gap is the gap between all the carbon locked up in woody products (panel F, dotted line) and what is accounted for under current UN guidelines (black line).

Image: 
Craig Johnston / UW-Madison

MADISON, Wis. -- The world's wood products -- all the paper, lumber, furniture and more -- offset just 1 percent of annual global carbon emissions by locking away carbon in woody forms, according to new research.

An analysis across 180 countries found that global wood products offset 335 million tons of carbon dioxide in 2015, 71 million tons of which were unaccounted for under current United Nations standards. Wood product carbon sequestration could rise more than 100 million tons by 2030, depending on the level of global economic growth.

The results provide countries with the first consistent look at how their timber industries could offset their carbon emissions as nations search for ways to keep climate change manageable by severely curbing emissions.

Yet the new research also highlights how wood products account for just a small fraction of the needed offsets for all but a select few timber-heavy countries.

Craig Johnston, a professor of forest economics at the University of Wisconsin-Madison, and Volker Radeloff, a UW-Madison professor of forest and wildlife ecology, published their findings July 1 in the Proceedings of the National Academy of Sciences.

"Countries are looking for net-negative emissions strategies. So it's not just about lowering our emissions but pursuing strategies that might have storage potential, and harvested wood products are one of those options," says Johnston. "It's nice because you can pursue options that don't hinder growth. The question is, can we continue to consume wood products and have climate change benefits associated with that consumption?"

To address that question, Johnston worked with Radeloff to develop a consistent, international analysis of the carbon storage potential of these products, which countries must now account for under the global Paris Agreement to reduce carbon emissions.

They used data on lumber harvests and wood product production from 1961 to 2015, the most recent year available, from the U.N. Food and Agriculture Organization. The researchers modeled future carbon sequestration in wood products using five broad models of possible economic and population growth, the two factors that most affect demand for these products.

Although the production of wood products in 2015 offset less than 1 percent of global carbon emissions, the proportion was much higher for a handful of countries with large timber industries. Sweden's pool of wood products, for example, offset 9 percent of the country's carbon emissions in 2015, which accounted for 72 percent of emissions from industrial sources that year.

But for most countries, including the U.S., wood products mitigated a much smaller fraction of overall emissions in 2015, and this proportion is not expected to increase significantly through 2065, the researchers found.

Current U.N. guidelines only allow countries to count the carbon stored in wood products created from domestic timber harvests, not the timber grown locally and shipped internationally, nor products produced from imported lumber. These regulations create a gap between the actual amount of carbon stored in the world's wood products and what is officially counted.

In 2015, that gap amounted to 71 million tons of carbon dioxide, equivalent to the emissions from 15 million cars. If those guidelines remain unchanged, by 2065 another 50 million tons of carbon dioxide may go unaccounted for due to this gap. But this additional, uncounted carbon does not significantly increase the proportion of global emissions offset by wood products.

Johnston and Radeloff also found that the level of carbon stored in wood products is extremely sensitive to economic conditions. Slow or negative growth could significantly reduce the amount of carbon offset by these industries.

"As wood products are produced, you're adding to this carbon pool in the country, but these products do eventually decay. There's carbon emissions today from furniture or lumber that was produced 50 or 75 years ago," says Johnston. "So if we're not producing at a rate that at least offsets those emissions, then we'll actually see that carbon pool become a net source of emissions."

For example, the Great Recession in 2008 and 2009 turned America's wood products from a net sink of carbon into a net emitter. A similar effect released millions of tons of carbon dioxide from wood products for years after the Soviet Union collapsed, Johnston and Radeloff found.

All five of the study's projections for future economic growth predict that more carbon will be captured in wood products, but unforeseen economic shocks could temporarily reverse that trend for particular countries.

The current study offers a chance to assess current obligations and help countries predict future emissions. The results may also inform the next round of emissions targets and negotiations, the researchers say.

"We're making these data public. The whole model for all countries, for all wood products, for all scenarios is available," says Johnston. "Now we know what it looks like for every country under a common model and common assumptions moving forward."

Credit: 
University of Wisconsin-Madison

Yellow fever virus responsible for current epidemic in Brazil originated in Amazon in 1980

The origin of the virus responsible for the ongoing yellow fever epidemic in Brazil, the worst for 40 years, has just been identified by scientists affiliated with two Brazilian institutions: Adolfo Lutz Institute (IAL) and the University of São Paulo (USP).

By means of a molecular study of yellow fever viruses found in dead monkeys and in mosquitoes, the group discovered that the strain behind the current epidemic originated in Pará State in North Brazil in 1980.

The virus infected monkeys in Pará and spread from there throughout the Amazon region until it reached Venezuela and Suriname. From 2000 on, always via infection of monkeys, the disease migrated to the Center-West and Southeast of Brazil, finally reaching São Paulo State in 2013. The first deaths of humans in São Paulo occurred in 2016.

Findings of the study, which was supported by São Paulo Research Foundation - FAPESP, are published in Scientific Reports.

The investigation was led by Mariana Sequetin Cunha, a researcher in IAL's Vector-Borne Disease Group. Scientists at the University of São Paulo's Tropical Medicine Institute (IMT-USP), the Federal University of Pará (UFPA) and the Federal University of São Paulo (UNIFESP) also took part. The project was also funded by Brazil's National Council for Scientific and Technological Development (CNPq).

Since mid-2016, when the ongoing yellow fever epidemic began, 2,245 cases of the disease have been confirmed, with 764 deaths, according to the Health Ministry. The largest number of cases since 1980, when the government made notification mandatory, had previously been reported in 2000. In that year, 40 people died from yellow fever.

Another face of the problem is the infection of monkeys by the same mosquitoes that transmit the virus to humans. Since 2016, public health authorities responsible for epidemiological surveillance in the Center-West, Southeast and South, where the epidemic is concentrated, have collected the carcasses of more than 10,000 monkeys found in forests and parks, mainly howler monkeys (Alouatta spp.), marmosets (Callithrix spp.) and capuchins (Sapajus spp). Yellow fever virus was detected in 3,403, according to the Health Ministry (Boletim Epidemiológico de Febre Amarela).

"More than 90% of the dead monkeys are believed to be Alouatta guariba. The species is extremely susceptible to yellow fever," said Ester Sabino, Director of IMT-USP.

"Troops of more than 80 monkeys were entirely destroyed," Cunha said, referring to the deaths of howler monkeys from yellow fever in Horto Florestal, a nature reserve in the north of São Paulo City in late 2017.

Yellow fever is an acute disease caused by a virus transmitted to monkeys and humans through the bites of infected mosquitoes. The symptoms include jaundice, a yellowish or greenish pigmentation of the skin and whites of the eyes due to high bilirubin levels, reflecting liver damage.

In the sylvatic (wild) transmission cycle, the yellow fever virus circulates between mosquitoes of the genera Haemagogus and Sabethes and monkeys. Humans are considered incidental hosts in the sense that people are infected only if they happen to live or work in tropical forests or travel on land through such areas. In the urban transmission cycle, the virus is transmitted to humans (the main host in this case) by the mosquito Aedes aegypti.

Yellow fever was endemic in the South and Southeast of Brazil in the early twentieth century. Urban transmission has been eradicated thanks to vaccination and action against A. aegypti breeding sites.

In the last two decades, transmission to humans has occurred outside the Amazon region, where yellow fever is still endemic. Cases have been reported in humans and monkeys in Bahia, Goiás, Minas Gerais, São Paulo, Paraná and Rio Grande do Sul.

Since late 2016, the disease has spread faster and farther, reaching the Atlantic Rainforest biome, with all its extraordinary biodiversity, which includes many species of monkey. Yellow fever had not occurred in these areas for decades.

In search of the yellow fever virus, Cunha and her group investigated samples of brain, liver and spleen tissue from dead monkeys found by state public health workers and compulsorily sent for analysis to IAL, the state reference laboratory. Samples from 430 dead monkeys were tested between July 2016 and March 2017. Most were Alouatta, Callithrix and Sapajus, but there were some specimens of Black-fronted titi (Callicebus nigrifrons) and Golden lion tamarin (Leontopithecus rosalia), an endangered species.

The yellow fever virus was sought in each of these species. The published study contributes to a better understanding of the biotic pathways involved in the virus's spread from the Amazon to the Southeast. "The study describes the evolution of the virus in different species. The disease is milder in capuchins than in howler monkeys and marmosets," Sabino said.

Not all the dead monkeys sent to IAL died from yellow fever. "Some had been run over and others electrocuted, for example," Cunha said. "The protocol requires analysis by the reference lab of tissue samples from all dead monkeys found, whatever the circumstances."

The presence of the virus was ruled out in most cases, and even in the minority in which it was confirmed, it was not always possible to be sure that death was due to yellow fever. The disease is practically a death sentence for howler monkeys. Marmosets are susceptible but do not always die. Capuchins are considered resistant.

By mid-2017, the epidemic that began in the north of São Paulo State in 2016 had spread to the Campinas region, not far from the state capital. "Yellow fever virus hadn't circulated in Campinas since the early twentieth century," Cunha said.

The first infected monkey was confirmed by IAL in July 2016. It was a capuchin from the Ribeirão Preto region. The species is resistant, so the death was not attributed to yellow fever, although the virus was found in the animal's tissue.

"The animal came into contact with the pathogen via a mosquito bite but died from other causes. We wanted to find out if capuchins were acting as natural reservoirs of the virus precisely because they're resistant," Cunha said.

Yellow fever virus was found in 67 out of the 430 samples analyzed by Cunha and colleagues at IMT-USP; 30 were from howler monkeys, nine from marmosets, and seven from capuchins. The rest were from monkeys of unidentified genera.

"In these 21 cases, the material didn't indicate the genus, but we suspect they were Alouatta owing to the high viral loads in the tissues analyzed," Cunha said.

Forty-year-old strain

The researchers isolated the virus from each of the 67 confirmed samples, sequenced the genomes, and compared the genomes with those (available online) of viruses from the outbreaks of yellow fever that occurred between 1980 and 2015 in Brazil and neighboring countries.

They discovered that the strain responsible for the current epidemic originated in Venezuela and in Roraima State and Pará State in Brazil. This is in line with prior research suggesting that the 2016-17 epidemic began in the North region and spread to the Southeast by means of a long and continuous sylvatic cycle involving mosquitoes and monkeys.

The results of the study reveal an evolutionary journey of sizable proportions in both time and space. In 1980, yellow fever virus was endemic in Pará. In 2000, it reached Mato Grosso do Sul, Goiás and Minas Gerais in the Center-West. By 2004, it had crossed into Venezuela, and by 2009, it had reached Trinidad and Tobago in the Caribbean. In 2010, it was present in Roraima in the far North, while one strain was found in Rio Grande do Sul in the far South. The virus arrived in São Paulo State in 2013.

The molecular analyses performed by Cunha and colleagues showed that the virus was fully disseminated in most Brazilian states and in Suriname by 2017.

Other researchers at IAL and IMT-USP are now conducting similar studies involving the dead monkeys collected in São Paulo State during the second wave of the epidemic, in July 2017-June 2018, and during the third wave, which began in July 2018 and will fizzle out this year with the end of the rainy season and the advent of winter, when the mosquitoes practically stop reproducing.

Depending on the results of these forthcoming studies, it may be possible to determine whether the current epidemic in São Paulo State is on the wane. Alternatively, despite mass vaccination the virus could still be spreading through the monkey population and fresh outbreaks may be in the offing.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Study shows some generics can cost Medicare recipients more than brand-name drugs

Medicare Part D enrollees may pay more out of pocket for high-priced specialty generic drugs than their brand-name counterparts, according to new research by health policy experts at Vanderbilt University Medical Center and the University of North Carolina at Chapel Hill.

Researchers examined differences in brand-name and generic or biosimilar drug prices, formulary coverage and expected out-of-pocket spending across all of the Medicare Part D plans available in the U.S. in the first quarter of 2018.

The study, published in the July issue of Health Affairs, found that current Medicare Part D beneficiaries can have higher out-of-pocket spending for generics than their branded counterparts if they use expensive specialty drugs and if the price differences between brands and generics are not large. This can be common for individuals prescribed specialty drugs typically used to treat rare or complex conditions such as cancer, rheumatoid arthritis or multiple sclerosis.

"Ironically, even if we assume that generic drugs have lower list prices than brands, for Medicare beneficiaries with $20,000 to $80,000 in annual drug spending, using only brand-name drugs could actually save them money," said Stacie Dusetzina, PhD, associate professor of Health Policy and Ingram Associate Professor of Cancer Research at VUMC, the study's lead author.

"This is happening because branded drug manufacturers now pay a discount in the donut hole, which gets counted as out-of-pocket spending," she said. "This helps patients reach catastrophic coverage faster, where they pay 5% of the drug's price instead of 25%. Generic drug makers do not pay these same discounts, so patients have to spend more of their own money to make it to the catastrophic phase of the benefit."

In 2019, this means people using brand-name drugs who reach the donut hole, or coverage gap, have to spend $982 to get to the catastrophic coverage phase. People using generic drugs have to spend $3,730 to reach that point. The study also notes policy changes set to take effect in 2020 will only make the situation worse by increasing patient out-of-pocket spending requirements for the catastrophic phase coverage from $5,100 to $6,350.
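A rough, back-of-the-envelope sketch shows how the accounting rule Dusetzina describes produces such different out-of-pocket totals: for brand-name drugs both the patient's share and the manufacturer discount count toward the catastrophic threshold, while for generics only the patient's own spending counts. The cost shares and the remaining out-of-pocket requirement below are simplified illustrative assumptions, not the study's exact benefit parameters:

    def gap_spending_to_catastrophic(remaining_troop, patient_share, counted_share):
        """Patient's own spending in the coverage gap before reaching the
        catastrophic phase.

        remaining_troop : out-of-pocket credit still needed to reach the
                          catastrophic threshold (dollars)
        patient_share   : fraction of each gap-phase purchase the patient pays
        counted_share   : fraction of each purchase counted toward the threshold
                          (patient share plus any manufacturer discount)
        """
        gross_purchases = remaining_troop / counted_share
        return patient_share * gross_purchases

    # Illustrative 2019-style assumptions:
    # brand:   patient pays 25%, and a 70% manufacturer discount also counts
    # generic: patient pays 25%, and only that 25% counts
    remaining = 3800.0
    print(gap_spending_to_catastrophic(remaining, 0.25, 0.25 + 0.70))  # ~1000
    print(gap_spending_to_catastrophic(remaining, 0.25, 0.25))         # 3800

Under these simplified assumptions the brand-name user spends roughly a quarter of what the generic user spends to reach catastrophic coverage, the same pattern as the $982 and $3,730 figures cited above.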

In response, the Trump administration and the Medicare Payment Advisory Commission (MedPAC) have recommended excluding the manufacturer discount from out-of-pocket spending calculations.

"While this would level the playing field between generic drugs and brands, it would do so by making brand-name drugs more expensive instead of making generic drugs less expensive," said Dusetzina. "Congressional committees have signaled interest in addressing this and other issues in Medicare Part D, including placing a cap on out-of-pocket spending.

"The Part D benefit needs a redesign so that it works for people needing expensive drugs. I hope Congress will take this opportunity to make changes to Part D, including making sure that generic drug users aren't overpaying for these drugs."

Credit: 
Vanderbilt University Medical Center

Rutgers researchers identify the origins of metabolism

image: Life may have arisen near hydrothermal vents rich in iron and sulfur. The earliest cells incorporated these elements into small peptides, which became the first and simplest ferredoxins -- proteins that shuttle electrons within the cell, to support metabolism. As cells evolved, ferredoxins mutated into more complex forms. The ferredoxins in modern bacteria, plant and animal cells are all derived from that simple ancestor.

Image: 
Ian Campbell, Rice University

A Rutgers-led study sheds light on one of the most enduring mysteries of science: How did metabolism - the process by which life powers itself by converting energy from food into movement and growth - begin?

To answer that question, the researchers reverse-engineered a primordial protein and inserted it into a living bacterium, where it successfully powered the cell's metabolism, growth and reproduction, according to the study in Proceedings of the National Academy of Sciences.

"We are closer to understanding the inner workings of the ancient cell that was the ancestor of all life on earth - and, therefore, to understanding how life arose in the first place, and the pathways life might have taken on other worlds," said lead author Andrew Mutter, a post-doctoral associate at Rutgers University's Department of Marine and Coastal Sciences.

The discovery also has implications for the field of synthetic biology, which harnesses the metabolism of microbes to produce industrial chemicals; and bioelectronics, which seeks to apply cells' natural circuitry for energy storage and other functions.

The researchers looked at a class of proteins called ferredoxins, which support metabolism in bacteria, plants and animals by moving electricity through cells. These proteins have different, complex forms in today's living things, but researchers speculate they all arose from a much simpler protein that was present in the ancestor of all life.

Similar to the ways biologists compare modern birds and reptiles to draw conclusions about their shared ancestor, the researchers compared ferredoxin molecules that are present in living things and, using computer models, designed ancestral forms that may have existed at an earlier stage in the evolution of life.

That research led to their creation of a basic version of the protein - a simple ferredoxin that is able to conduct electricity within a cell and that, over eons of evolution, could have given rise to the many types that exist today.
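The team's reconstruction relied on phylogenetic modeling; as a deliberately naive illustration of the general idea -- inferring a plausible 'ancestral' sequence from aligned modern ones -- a per-column consensus can be sketched in Python. The sequence fragments are invented, and real ancestral sequence reconstruction uses explicit evolutionary models rather than a majority vote:

    from collections import Counter

    def consensus_ancestor(aligned_seqs):
        """Return a naive 'ancestral' sequence: the most common residue at
        each column of a multiple sequence alignment."""
        length = len(aligned_seqs[0])
        assert all(len(s) == length for s in aligned_seqs), "sequences must be aligned"
        ancestor = []
        for column in zip(*aligned_seqs):
            residue, _count = Counter(column).most_common(1)[0]
            ancestor.append(residue)
        return "".join(ancestor)

    # Made-up fragments standing in for aligned ferredoxin sequences
    modern = [
        "MACYKVTLVD",
        "MATYKVTLID",
        "MACYKVKLVD",
        "MASYKVTLVD",
    ]
    print(consensus_ancestor(modern))   # -> "MACYKVTLVD"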

Then, to prove their model of the ancient protein could actually support life, they inserted it into a living cell. They took the genome of E. coli bacteria, removed the gene it uses to create ferredoxin in nature, and spliced in a gene for their reverse-engineered protein. The modified E. coli colony survived and grew, although more slowly than normal.

Study co-author Vikas Nanda, a professor at Rutgers Robert Wood Johnson Medical School and Center for Advanced Biotechnology and Medicine, said the discovery's implications for synthetic biology and bioelectronics come from ferredoxins' role in the circuitry of life.

"These proteins channel electricity as part of a cell's internal circuitry. The ferredoxins that appear in modern life are complex - but we've created a stripped-down version that still supports life. Future experiments could build on this simple version for possible industrial applications," Nanda said.

Credit: 
Rutgers University

New initiative improved care for sepsis patients, but black patients saw smaller benefits

PROVIDENCE, R.I. [Brown University] -- The New York Sepsis Initiative was launched in 2014 with the goal of improving the prompt identification and treatment of sepsis. A new study has found that while the program has improved care overall, there were racial and ethnic disparities in the implementation of the best-practice protocols.

Sepsis is a life-threatening condition that occurs when the body's extreme response to an infection triggers a chain reaction, said Dr. Mitchell Levy, a professor of medicine and chief of the division of Pulmonary, Critical Care and Sleep Medicine at Brown University's Warren Alpert Medical School. "Even with the best care, the mortality rate is between 15 and 25 percent."

Early identification and treatment of sepsis is essential for saving lives, and the multi-stage best practices for sepsis identification and treatment were codified in the New York Sepsis Initiative's protocols. The new research, published Monday, July 1 in the July issue of the journal Health Affairs, found that during the first 27 months of the initiative, the percentage of patients who received the complete three-hour best-practice protocol increased from 60.7 percent to 72.1 percent. At the same time, the in-hospital mortality rate for sepsis patients decreased from 25.4 percent to 21.3 percent, which aligned with prior research by Levy, who is also the medical director of the Medical Intensive Care Unit at Rhode Island Hospital.

However, the paper highlights a disparity in sepsis care between black and white patients.

Specifically, during the first 27 months of the initiative, black patients only experienced an increase of 5.3 percentage points in the completion of the best-practice protocol, while white patients experienced an increase of 14 percentage points. Hispanic and Asian patients experienced an increase of 6.7 and 8.4 percentage points respectively.

Being aware of these disparities is critical because the Centers for Medicare and Medicaid Services is considering tying sepsis protocol completion rates to hospital reimbursement, said Dr. Amal Trivedi, senior author on the paper and a professor at Brown's School of Public Health and medical school. "If our study findings extend beyond New York, it raises concerns about the possibility of these quality improvement initiatives for sepsis exacerbating racial disparities in care."

The researchers found that hospitals that serve higher proportions of black patients had smaller improvements in protocol completion. Within the same hospital, white and black patients received similar care, in terms of protocol completion rates, Trivedi said.

Prior research found that minority-serving hospitals tend to have more financial stress, fewer resources and less infrastructure to devote toward quality improvement measures, which is likely the reason why minority-serving hospitals had smaller improvements in sepsis protocol completion, Trivedi said. These hospitals also tend to treat more uninsured patients and those on Medicaid.

After adjusting for risks, such as type of infection, age and other chronic health conditions, the team did not find a statistically significant change in hospital mortality rates between racial and ethnic groups, despite the disparities in care delivery. During the first three months of the initiative, 25.8 percent of white sepsis patients and 25.4 percent of black sepsis patients died while in the hospital. Two years into the initiative, 21.3 percent of white sepsis patients and 23.1 percent of black sepsis patients died while in the hospital.

"Our work highlights the need for state and federal policy makers to anticipate and monitor the effects that quality improvement projects, such as the New York State Sepsis Initiative, have on racial and ethnic minority groups," said Dr. Keith Corl, first author on the paper and an assistant professor of medicine in the division of Pulmonary, Critical Care and Sleep Medicine at Warren Alpert Medical School. "Racial and ethnic minority groups can get left behind. Knowing this, it is our job to better design and monitor these programs to ensure racial and ethnic minority patients realize the same benefits as white patients."

Trivedi added that in order to improve health equity, policymakers may need to devote additional funding to under-resourced hospitals that experience challenges in improving sepsis care so that their performances can match that of other hospitals.

Credit: 
Brown University

The chemical language of plants depends on context

image: A Geocoris nymph is attacking a tiny tobacco hornworm which has just hatched from an egg.

Image: 
Danny Kessler / Max Planck Institute for Chemical Ecology

A team of scientists from the Max Planck Institute for Chemical Ecology in Jena, Germany, studied the ecological function of linalool, a naturally abundant volatile organic compound, in wild Nicotiana attenuata tobacco plants. They identified the gene responsible for linalool synthesis and release, which vary considerably among plants of the same species. Females of the tobacco hawkmoth (Manduca sexta) prefer to lay eggs on plants with higher levels of naturally occurring linalool. At the same time, the more linalool a plant released, the more of its eggs and freshly hatched larvae were preyed on by bugs. Behavioral assays in increasingly complex environments showed that the effects of linalool are quite variable, depending on the natural environment and the genetic makeup of the plant (Proceedings of the National Academy of Sciences of the United States of America, DOI: 10.1073/pnas.1818585116).

Interactions between tobacco plants, tobacco hawkmoths, and predatory bugs

Plants have evolved multiple strategies to defend themselves against herbivorous animals, especially insects. In addition to mechanical defenses, such as thorns and spines, plants also produce chemical defense compounds that hold insects and other herbivores at bay. These substances include volatile organic compounds, often only produced by plants after insect attack. Linalool is such a plant volatile organic compound; it mediates different ecological interactions with insects. Its complex mode of action has already been the subject of previous investigations. It is known that linalool in tobacco plants can attract predatory Geocoris bugs to show them the way to their prey: the eggs or freshly hatched larvae of tobacco hawkmoths. However, as a floral scent component, linalool is also attractive for adult hawkmoths and influences mated female moths in their decision to lay their eggs on a plant. A team of scientists from the Max Planck Institute for Chemical Ecology led by Meredith Schuman and Ian Baldwin has now studied the ecological functions of the monoterpene linalool in wild Nicotiana attenuata plants in more detail.

The genetic analysis of linalool synthesis

The researchers observed a correlation between the amount of linalool produced by individual plants and the rate at which Geocoris bugs preyed on Manduca sexta eggs on those plants. They observed no such correlation between the egg predation rate and five similar organic compounds emitted by tobacco plants. This indicates that linalool, in fact, functions as the plants' chemical cry for help and attracts predatory bugs that attack herbivorous larvae. "Tobacco plants vary a lot in their linalool emission. If linalool mainly has a defensive effect, that is attracting predators and repelling hawkmoths, we would expect there to be less variation. Obviously, linalool emission is not always advantageous for the plant. Therefore we wanted to explore systematically which ecological interactions result from differences in linalool production," Jun He, the first author of the study, explains.

The scientists were able to identify the enzyme that regulates linalool synthesis in Nicotiana attenuata and to determine its genetic basis. To achieve this, they crossed plants from native populations in Arizona which were high in linalool production, with plants from Utah which produced considerably less linalool. This approach, which is called forward genetics, allowed for an identification of genes underlying the natural variation of linalool synthesis.

Mirror images of molecules and their different effects

Linalool occurs in two different forms, so-called enantiomers. The two enantiomers, (R)-(−)-linalool and (S)-(+)-linalool, are chemically almost identical, but their three-dimensional structures are mirror images of each other. Although only (S)-(+)-linalool was found in natural Nicotiana attenuata populations in Utah and Arizona, the researchers also used plants in their experiments that produced its mirror image, (R)-(−)-linalool. Hawkmoths perceive the two enantiomers as different compounds, and they have different effects on the moths' behavior.

Experiments in an increasingly complex context

The scientists tested the effect of these plants in behavioral assays with tobacco hawkmoths. To determine how linalool affects oviposition, they observed the behavior of mated females given a choice between two different experimental plants in a wind tunnel. Surprisingly, egg-laying was only partially influenced by the manipulated production of the two linalool enantiomers. In fact, the genetic background of the plants, that is, whether a Utah or an Arizona plant had been modified to produce more linalool, had a much greater impact on the moths' preferences. "It was surprising to us that experimental context mattered even more than the two different enantiomers," explains Richard Fandino, who designed the wind tunnel experiments. The researchers performed further experiments with moths and different tobacco plants in oviposition chambers and in a large experimental tent where moths could fly around freely. The differences in the moths' responses to linalool emission vanished as the environment became more complex.

The meaning of signals in context

Context is a term from linguistics: words may take on different meanings depending on the situation in which they are used. The same is true for linalool as a component of the plants' "chemical vocabulary." The authors of the study originally expected that a given chemical compound would trigger a particular behavior. "However, our study showed that moths pay attention to many different features of plants when choosing where to feed or oviposit. They then integrate this information in order to choose among the available plants. Thus, differences in other plant properties, as well as the availability of alternative plants and their characteristics, are likely to determine the importance of any individual cue: in this case, linalool," summarizes Meredith Schuman, one of the main authors of the publication.

A better understanding of context-appropriate plant defense against herbivores might help to overcome problems in standardized industrial agriculture, such as the evolution of resistance to commonly used pesticides.

Credit: 
Max Planck Institute for Chemical Ecology

Female bedbugs 'control' their immune systems ahead of mating to prevent against STIs

Female bedbugs are able to cleverly control their immune systems ahead of mating, to boost their defence against mating-related infection

Male bedbugs are more attracted to females who have recently feasted on blood, because these females will lay lots of eggs and their blood-swollen bodies mean they cannot fight back against traumatic insemination

This ability to 'manage' immune systems might be shared by other insects

Findings may pinpoint ways of making female bedbugs more susceptible to natural routes of infection, something that in turn may help us control them

Female bedbugs that are 'full bellied', and therefore more attractive mates for males, are able to boost their immune systems in anticipation of catching sexually transmitted infections, research has found.

Led by the University of Sheffield, the research found that recently fed females are more likely to be inseminated, and therefore more likely to become infected as a result.

To mitigate this, female bedbugs that have just dined on blood, and are therefore full, are able to cleverly manage their simple immune systems in anticipation of mating. By contrast, female bedbugs that do not get regular meals do not mate regularly and therefore do not have the same need to boost their immune systems in defence against infection.

Mating amongst bedbugs involves the male critter inserting its needle-like penis into the female bedbug's abdomen, in a process known as traumatic insemination.

The study found that females who boost their immune system in anticipation of this traumatic insemination also benefit from a longer lifespan - by being better able to resist the effects of infection - and have greater reproductive success. This is despite laying eggs at the same rate as 'hungry' females.

While it was previously known that insects can aid their offspring's defences in a parasite-rich environment, this is the first evidence that an individual bedbug can regulate its immunity in anticipation of infection. The team now believe this ability to 'manage' immune systems might be shared by other insects.

Professor Mike Siva-Jothy, from the University of Sheffield's Department of Animal and Plant Sciences, who led the research, said: "This is a pretty clever skill that bedbugs have developed to protect themselves against infection from what is quite a brutal mating ritual.

"This ability for bedbugs to do complex things with their simple immune systems thanks to some clever management may well also be something other insects have grasped the ability to do.

"Everyone knows bedbugs are some of the most unwanted human bed-mates. We hope the findings might therefore help us pinpoint ways of making females more susceptible to natural routes of infection, something that may help us find new ways of controlling them."

The study looked at hundreds of bedbugs over several years, in experiments that each lasted two to three months.

The experts believe the key reason male bedbugs are attracted to recently fed females is that these females are full of blood and will therefore lay lots of eggs. In addition, full females are swollen from their blood feast and so cannot fight back against traumatic insemination.

Professor Siva-Jothy added: "We now need to understand how this boost to the immune system is switched on and off in reproductive cycles and whether other organisms use similar systems to minimise infection by sexually transmitted diseases."

Credit: 
University of Sheffield

An effort to stop the revolving door for hospital patients may be spinning its wheels

image: Hospital readmissions for patients covered by Medicare who had hip or knee replacement surgery began dropping even before federal penalties for non-surgical patient readmission were announced, and the decline accelerated after that. But the pace of decline has slowed in the years since the penalties for readmission of hip and knee replacement patients were announced.

Image: 
Health Affairs

Every American hospital has two front doors: the real one, and an imaginary revolving door.

Any patient who winds up back in the hospital within a few weeks of getting out travels through that imaginary door. And the more of them there are, the more money their hospital stands to lose from the Medicare system.

This readmission penalty, as it's called, aims to spur hospitals to prevent unnecessary costly care.

But a new study shows that after several years of rapid improvements in readmissions, the readmission penalty program may be spinning its wheels more than it's slowing the spinning of the revolving hospital door.

Writing in the journal Health Affairs, a team from the University of Michigan reports findings from their analysis of data from nearly 2.5 million Medicare patients. They focused on those who had hip or knee replacement surgery before and after penalties affecting these operations were announced.

In fact, the study shows, the readmission rate for these patients had already started dropping by the time the idea of readmission penalties was announced as part of the Affordable Care Act in 2010.

Soon after that, the readmissions rate for these surgical patients started dropping faster--even though the penalties announced in the ACA did not apply to surgical patients.

The rate kept dropping rapidly for several years -- even though hospitals weren't getting penalized yet for hip and knee replacement-related readmissions.

But that improvement started to slow down.

After the government announced in late 2013 that penalties would expand to hip and knee replacement, readmissions for these patients kept dropping, but at nearly half the previous pace.

In other words, improvements in surgical readmissions slowed to the same trend they had before any penalties were announced in 2010.
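The pattern described here -- a pre-existing downward trend, a steeper decline after the 2010 announcement, and a return to roughly the earlier pace after the 2013 expansion -- is the kind of slope-change question usually answered with a segmented ("interrupted time series") regression. The sketch below is only a minimal illustration of that general approach, using invented quarterly readmission rates and announcement points; it is not the study's actual model or data.

    # Minimal sketch of a segmented (interrupted time series) regression.
    # All numbers here are invented for illustration, not from the study.
    import numpy as np

    quarters = np.arange(36)                     # 9 hypothetical years, 2008-2016, by quarter
    rate = 8.0 - 0.03 * quarters                 # baseline downward trend in readmission rate (%)
    rate -= 0.05 * np.clip(quarters - 8, 0, None)    # steeper decline after a 2010-style announcement
    rate += 0.04 * np.clip(quarters - 24, 0, None)   # slowdown after a late-2013-style announcement
    rate += np.random.default_rng(0).normal(0.0, 0.05, size=quarters.size)  # noise

    # Design matrix: intercept, baseline slope, and an extra slope term
    # that "switches on" after each announcement.
    X = np.column_stack([
        np.ones(quarters.size),
        quarters,
        np.clip(quarters - 8, 0, None),
        np.clip(quarters - 24, 0, None),
    ])
    coef, *_ = np.linalg.lstsq(X, rate, rcond=None)
    print("baseline slope per quarter:", coef[1])
    print("slope change after first announcement:", coef[2])
    print("slope change after second announcement:", coef[3])

In a fit like this, a negative slope change after the first announcement and a roughly offsetting positive change after the second would correspond to the "improvement slowed back to its old trend" result the researchers describe.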

"These findings raise the question of whether we're about to reach the floor in our ability to reduce readmissions for these patients," says Karan Chhabra, M.D., M.Sc., the study's lead author.

Trends beyond readmissions

At the same time the readmission rates were changing, the average cost of caring for a Medicare hip or knee replacement patient did too, the new study shows.

In fact, it dropped by more than $3,000 from 2008 to 2016.

And hip and knee patients' chance of heading home from the hospital, rather than to a skilled nursing facility or other setting, has increased over that time, the researchers report. So has the likelihood that they will have home health aide help when they get home.

The same efforts that hospitals may have launched to prevent readmission of medical patients may have extended to these surgical patients, the authors speculate.

These might include care coordination programs and telephone check-ins with recently discharged patients, or better patient education about home care or changes to their medications.

Implications for expansion

The Hospital Readmission Reduction Program, or HRRP, still carries large penalties - up to 3% of what a hospital earns for certain Medicare patients. Not only that, it has expanded to cover more conditions, including heart bypass surgery and additional types of pneumonia, such as cases involving sepsis.

But Chhabra and his colleagues say that adding more conditions to the program is not likely to result in much more readmission prevention or cost savings.

"Based on the experience so far, it's hard to believe that adding on penalties for more conditions will further bend the curve of readmission," says Chhabra, a National Clinician Scholar at the U-M Institute for Healthcare Policy and Innovation who is also a resident in the Department of Surgery at Brigham and Women's Hospital.

Recent research by other groups has suggested that non-surgical patients may actually be harmed by the drive to reduce readmissions, including being more likely to die at home. Safety net hospitals, which take care of poorer and sicker patients, are also penalized more often by the program.

Says Chhabra, "We may be approaching the point for these surgical patients where the unintended consequences of readmissions reduction efforts begin to dominate. When you've squeezed the possible benefits out, all you have left are harms."

Potential alternatives

In the end, some readmissions are inevitable, the authors say, and trying to drive rates lower through penalties may mean some patients who should have been readmitted to deal with an issue won't be.

Instead, the researchers suggest that more use of bundled payments - where Medicare sets a defined amount of money it will pay for the episode of care surrounding a surgical patient's operation - could produce better results.

This is because bundled payments ensure hospitals focus on costs and complications around the entire episode of care, not just one narrow metric like readmissions.

In the meantime, Chhabra says, patients who get hospitalized for surgery or any other reason should make sure they know how to reach their care team at all hours after they leave the hospital.

Patients and the loved ones who will care for them should also make sure they understand the instructions they received at hospital discharge, and know what kinds of symptoms or changes should prompt them to contact their team. Often, their surgical teams can provide instructions or reassurance that can prevent a bounce back to the emergency department.

That kind of open communication can make the difference between an appropriate and an inappropriate rehospitalization.

Credit: 
Michigan Medicine - University of Michigan

'Back to school asthma' linked to tripling in rate of health service appointments

'Back to school asthma'--a seasonal peak in cases associated with the start of the school year in September--is linked to a tripling in the rate of family doctor (GP) appointments across England, reveals research published online in the Journal of Epidemiology & Community Health.

The phenomenon seems to particularly affect the under 5s and boys, national monitoring data show.

Asthma prevalence, associated deaths, and the use of related healthcare services in the UK are thought to be the highest in the world. And 'back to school asthma' accounts for up to a quarter of serious bouts of the condition in many northern hemisphere countries.

But it's not clear where the resulting pressure points on healthcare services might be. In a bid to find out, the researchers analysed routine monitoring data for family doctor consultations about worsening asthma, both within and outside normal practice hours, as well as related visits to hospital emergency care departments between 2012 and 2016.

They looked at time- and sex-specific trends for 0-4 and 5-14 year olds, from the last four weeks of the summer holiday up to the first six to seven weeks of the autumn term.

All three sources of data indicated similar age and sex specific patterns, with use of health services for asthma up to 2.4 times higher among boys in both age groups than among girls.

The highest rate of asthma cases was among 5 to 14 year old boys for each of the three services.

Rates of in-hours GP appointments for asthma fell during school holidays, then increased in the first two to three weeks of each autumn half-term, with the sharpest rise at the start of the school year in September.

Patterns for out-of-hours GP services were similar to those for in-hours consultations. And emergency care department visits for worsening asthma also peaked at the start of the school year for both age groups.

Across the study period, the delay between the start of the school year and the 'back to school' effect varied from year to year, beginning as late as 17 days after the start of term and as early as 7 days before it.

But after taking account of these annual variations and sex differences, the daily in-hours consultation rate for worsening bouts of asthma was more than three times as high in the back to school period as during the summer holidays for children up to the age of 4, and 2.5 times as high for 5-14 year olds.

For GP out of hours consultations, the rates were around twice as high for both age groups.
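As a purely illustrative piece of arithmetic (the counts below are invented, not taken from the study), a 'back to school' rate ratio of this kind is simply the daily consultation rate in the back-to-school window divided by the daily rate during the summer holiday:

    # Illustrative only: how a "back to school" rate ratio is computed.
    # The consultation counts are hypothetical, not the study's data.
    holiday_consultations = 120   # consultations during 28 summer-holiday days
    holiday_days = 28
    term_consultations = 390      # consultations during the first 28 days of term
    term_days = 28

    holiday_rate = holiday_consultations / holiday_days   # consultations per day
    term_rate = term_consultations / term_days
    rate_ratio = term_rate / holiday_rate
    print(f"Back-to-school rate ratio: {rate_ratio:.2f}")  # ~3.25, i.e. "more than three times as high"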

No such obvious peaks were seen for children aged 15 and older.

This is an observational epidemiological study and, as such, can't establish cause. The coding applied to surveillance data may vary in quality, the researchers acknowledge. And it can be difficult to accurately diagnose asthma in young children because of the range of other factors that can produce similar respiratory symptoms.

But the very large number of children studied over several years lends weight to the findings, say the researchers, who suggest that multiple factors are likely to be involved.

These might include changes in the weather, air pollution, the stress of starting a new school year, and seasonal increases in circulating viruses, particularly rhinovirus, which is implicated in worsening childhood asthma.

"The underlying aetiology of ['back to school'] asthma is complex and in addition to the established contribution of respiratory infections, environmental determinants may be involved: the role of fungal spores (which show autumnal seasonality) could be an area for future research to investigate aetiology and thus determine potential future interventions," suggest the researchers.

"These results support the need for further preventable work to reduce the impact of [back to school] asthma in children," they conclude.

Credit: 
BMJ Group