Culture

Study finds black and Hispanic patients face more barriers when making doctor appointments

Discrimination may cause black and Hispanic patients to wait longer for a scheduled primary care appointment, according to a new Tulane University study published in JAMA Network Open.

The research could shine more light on why people who belong to racial and ethnic minority groups experience worse health outcomes than white people in the United States.

"Timeliness of care is really important," said lead author Janna Wisniewski, assistant professor of health policy and management at Tulane University School of Public Health and Tropical Medicine. "Delay in seeing a provider means that the patient spends more time experiencing the illness or injury. They may be anxious or in pain for longer. They may struggle for longer to go to work or take care of their family. Delay also gives the condition more time to worsen, which means that if a health system gives more timely care to one group over another, the health system itself may be contributing to health disparities."

The research team recruited seven female callers who self-identified as non-Hispanic black, non-Hispanic white or Hispanic. Each invented a pseudonym that they felt signaled their gender, racial and ethnic identities and that they felt comfortable using on the calls. The women called more than 800 primary care offices in Texas. Each time, the caller introduced herself by her pseudonym and asked to be scheduled for the next available appointment as a new patient. Callers did not proactively offer any additional information but did answer any questions the scheduler asked using a standardized script.

The study found black and Hispanic callers were more likely to be offered an appointment, but they were asked more frequently about their insurance status. Researchers found black callers were 44% more likely than white callers to be asked about their insurance status during the call. Hispanic callers were 25% more likely than white callers to be asked if they had insurance. The study also found patients belonging to racial/ethnic minority groups received appointments further in the future than white callers.

"Schedulers may have believed that race and ethnicity were associated with insurance status, and those who asked about insurance appeared to be inquiring in response to race and ethnicity signals," Wisniewski said. "Asking about insurance may imply scheduling staff's concern about the caller's ability to pay."

Wisniewski said offices could reduce barriers to care with bias training and other mechanisms, such as automated scheduling systems. She hopes knowing the information will begin a conversation about overcoming bias in healthcare settings.

“This is not something that’s routinely checked in hospitals and clinics, whether they’re inadvertently discriminating based on race or ethnicity,” Wisniewski said. “Starting to even look at that and bring attention to it might be a good first step.”

Credit: 
Tulane University

Infectious disease experts warn of outbreak risks in US border detention centers

BALTIMORE, MD., Jan 29 - Over the past year, at least seven children have died from diseases including influenza while being detained by the U.S. Department of Homeland Security's Customs and Border Protection (CBP) agency. Infectious disease experts at the University of Maryland School of Medicine (UMSOM) called for protections like influenza vaccinations to prevent serious outbreaks.

"Detention centers have become tinderboxes for infectious-disease outbreaks," warned Mark Travassos, MD, Assistant Professor of Pediatrics at UMSOM and a pediatric infectious disease specialist in the UMSOM Center for Vaccine Development and Global Health (CVD). In a commentary published in the New England Journal of Medicine, Dr. Travassos and UMSOM Pediatrics-Internal Medicine Resident Carlo Foppiano Palacios, MD, said that it is not a surprise that thousands of detained migrants and asylum seekers have been quarantined because of influenza, mumps and chickenpox outbreaks.

"Children and adults are being held in crowded conditions without adequate sanitation or medical care," they wrote, adding that the physical and emotional stress and trauma that migrants and asylum seekers experience can also weaken their immune systems, thereby increasing their risk of systemic infection.

The rise of outbreaks and deaths in these detention centers point to the urgent need for mandatory influenza immunization for migrant children and an opt-out vaccine policy for adults in CBP detention centers, Dr. Travassos and Dr. Foppiano Palacios warned.

"The logistics of vaccine administration are relatively straightforward. Influenza vaccine is simple to administer and carries a low risk of adverse effects. In the event that a detainee has previously been immunized, there is no drawback to receiving multiple vaccinations," they said.

They also recommended that employees at these detention centers be held to similar vaccination standards as health care workers at U.S. hospitals during influenza season, citing the Centers for Disease Control and Prevention's Advisory Committee on Immunization Practices' recommendation that all people who work in health care facilities receive annual influenza vaccines. "Mandatory immunization of these workers is critical to limiting the spread of diseases such as influenza," they asserted.

Infectious disease specialists at the CVD highlight the importance of the seasonal influenza vaccine. "We know that influenza spreads easily, particularly among children and in crowded conditions, and we know that vaccines are an important tool to prevent outbreaks," said Kathleen Neuzil, MD, MPH, the Myron M. Levine Professor in Vaccinology, Professor of Medicine and Pediatrics and Director of the CVD.

For more than four decades, experts in the CVD have been developing and testing vaccines to protect against infectious diseases like seasonal influenza, as well as emerging strains of this virus. Physician-scientists are also researching ways to develop a longer-lasting seasonal influenza vaccine.

"Our infectious disease research at the University of Maryland School of Medicine has served as a critical tool in protecting even the most vulnerable populations, the elderly and those with weakened immune systems, from complex and emerging diseases. Vaccines are an important tool in preventing serious illnesses such as influenza, measles, mumps and chickenpox," said UMSOM Dean E. Albert Reece, MD, PhD, MBA, who is also Executive Vice President for Medical Affairs, UM Baltimore, and the John Z. and Akiko K. Bowers Distinguished Professor, University of Maryland School of Medicine

Credit: 
University of Maryland School of Medicine

Prescribed burns benefit bees

image: This parasitic sweat bee was photographed in a longleaf pine forest in North Carolina. A study of these forests finds that bee diversity and abundance are twice as high in forests that are managed with prescribed burns.

Image: 
Clyde Sorenson, NC State University

Freshly burned longleaf pine forests have more than twice as many bees and bee species as similar forests that have not burned in over 50 years, according to new research from North Carolina State University.

For many forests, fire is as essential as rainfall. But while several studies have outlined the benefits of human-controlled prescribed burns on forest ecosystems, little was understood about how prescribed burns, or fires in general, may impact pollinators.

"There is global concern about the decline of insects in general, and pollinators in particular, so it's really important for land managers to understand how prescribed fire affects insect communities," says Elsa Youngsteadt, co-author of a paper on the work and an assistant professor in NC State's Department of Applied Ecology.

"Given the importance of fire in maintaining longleaf pine ecosystems overall, you would expect it to be good for the region's native bees. But it's also easy to imagine small bees and their nests, especially nests in twigs and stems, just getting incinerated. We weren't sure where we would find the most robust pollinator community."

NC State researchers worked with the Walthour-Moss Foundation's longleaf pine savannah reserve, which was established to protect this endangered pine. The reserve regularly burns 90% of its plots in 3-year cycles, while the remaining 10% of plots have not been burned for at least 50 years. This provided an ideal opportunity to compare bee abundance and diversity between unmanaged and managed ecosystems.

"The southeastern U.S. has some of the highest lightning strike rates in the world, which used to contribute to low-intensity fires passing through the longleaf pine savannas every 2 or 3 years," Youngsteadt says. "But agriculture, development, and logging fragmented this landscape and blocked the movement of fire."

For this study, researchers placed bee "traps" at 16 sites: four that had been burned the year of sampling, four that had been burned one year before sampling, four that had been burned two years before sampling, and four unburned control sites.

The researchers found that burned sites supported 2.3 times more total pollinators than plots that had not burned in 50 years. Burned sites also had 2.1 times as many different bee species as unburned sites. Within those burned areas, bee abundance and diversity tended to be greatest at sites that were most recently burned, and this abundance and diversity decreased with time since the last fire.

But why?

Fires maintain openings in the forest canopy, reduce ground cover and release nutrients into soils, creating the perfect environment for large blooms and increasing the flower resources pollinators rely on. The study also found that the low-intensity prescribed burns did not reduce the amount of nesting material for above-ground nesting pollinators, whose abundance was not affected by the fires. Meanwhile, below-ground nesting species appeared to benefit from the increased access to bare soil.

"It's great news that prescribed fire, as currently used in longleaf pine savannas, is helping to support the pollinator community," Youngsteadt says. "But there's still a lot to learn. For example, the fires in this study were set in the winter, but many land managers use summer burns. Knowing the effects of fire in different seasons will be an important next step, as will knowing the optimal area of land to burn at any one time."

Credit: 
North Carolina State University

Designing a puncture-free tire

image: This is an illustration of a non-pneumatic tire structure showing the shear layer.

Image: 
Illinois' Department of Aerospace Engineering

Some golf carts and lawnmowers already use airless tires and at least one major tire company produces a non-pneumatic automotive tire, but we still have a long way to go before they are on every vehicle that comes off the assembly line. The key is finding a design that balances puncture-free strength with the elasticity needed for the comfortable, shock-free ride of conventional pneumatic tires.

To address some of the issues, University of Illinois researchers focused on one component of the tire--the shear layer, which is just beneath the tread.

"The shear layer is where you get the most bang for your buck from a design perspective. It's where you have the most freedom to explore new and unique design configurations," said Kai James, assistant professor in the Department of Aerospace Engineering at U of I.

James, along with U of I graduate student Yeshern Maharaj, used design optimization - a computational design technique - to come up with a variety of structural patterns for the shear layer of a non-pneumatic tire.

They used a computer simulation that modeled the elastic response of the shear layer, calculating the material's ability to stretch and twist.

"We were looking for a high level of shear--that is, how much strain the material can take under pressure--but we want stiffness in the axial direction," James said.

These pressures are not external wear such as aging or weathering; they are internal pressures and stresses--essentially, how much force the material exerts on itself.

"Beyond a certain level of stress, the material is going to fail," James said. "So we incorporate stress constraints, ensuring that whatever the design happens to be, the stress doesn't exceed the limit of the design material.

"There are also buckling constraints. If you have a narrow, slender member, say a strut within the element, that's undergoing compression that could be subject to buckling. We have ways to mathematically predict what force level is going to induce buckling in the structure and modify it accordingly. Depending on how you weight each of the design requirements--buckling, stress, stiffness, shear, and every combination of those--will result in a different design."

The goal is a tire design that can withstand pressure but is also elastic to provide a ride that doesn't feel like you're driving on tires made of steel.

James explained how, as the computer simulation works to find the optimum pattern, it eliminates structural patterns that are not optimal. It begins with a computer-simulated block of the bulk material that the tire will be made from. Because a solid block doesn't have much elasticity, the material is virtually cut away, leaving spaces for flexibility.

"If you carve holes in the material until it is something like a checkerboard pattern, with half of the material, you'd also have half of the original stiffness," he said. "Now, if you do a much more complicated pattern, you can actually tailor the stiffness."

Obviously, on a continuum from a block of material to a thin, lacelike pattern, the number of potential designs is infinite, but it's not realistic to test every design. And, it's important to note that the algorithm doesn't end by spitting out a single, optimal design.

"Search algorithms have clever ways to strategically search the design space so that ultimately you end up having to test as few different designs as possible," James said. "Then, as you test the designs, gradually, each new design is an improvement on the previous one and eventually, a design that is near optimal."

James said computer modeling of a structure like this one, or of any physical system, involves deciding how much complexity to encode into the model: a higher-fidelity model is more accurate but also more costly to run.

"From a computational standpoint, we're generally talking about the time it takes to run the analysis on high-powered computers," James said.

Future analysis will require an industry or research collaborator.

Credit: 
University of Illinois Grainger College of Engineering

Low-calorie sweeteners do not mean low risk for infants

image: A researcher conducts tests on low-calorie sweeteners in the lab.

Image: 
Riley Brandt

Many people turn to artificial or so-called natural sweeteners to cut calories and lose weight. A new study led by Dr. Raylene Reimer, PhD, published in the high-impact journal Gut, found that consumption of low-calorie sweeteners during pregnancy increased body fat in offspring and disrupted their gut microbiota - the trillions of bacteria and other microorganisms that inhabit the intestinal tract and affect our health and risk of numerous diseases.

The findings are significant as they impact the critical early years of life, particularly during pregnancy and breast feeding.

"Low-calorie sweeteners are considered safe to consume during pregnancy and lactation, however evidence is emerging from human studies to suggest they may increase body weight and other cardiovascular risk factors," says Reimer, a University of Calgary professor in the Faculty of Kinesiology, and Department of Biochemistry & Molecular Biology at the Cumming School of Medicine , and member of the Alberta Children's Research Institute.

"Even stevia, which is hailed as a natural alternative to aspartame and other low calorie artificial sweeteners, showed a similar impact on increasing offspring obesity risk in early life."

Aspartame, an artificial sweetener, and stevia, a natural low-calorie sweetener extracted from a plant native to South America, are 200-400 times sweeter than sugar. Stevia, which is gaining popularity, was historically used in Paraguay and Brazil to treat diabetes and is an emerging ingredient in many natural products and protein drinks.

Demand for sweeteners for weight loss

In response to higher obesity rates, the use of low-calorie sweeteners has risen, particularly in women and children. Daily consumption is associated with large babies and early menstruation in young females under 10 years - a known risk factor for chronic diseases. Additionally, the presence of some but not all of these sweeteners has been detected in breastmilk presenting a potential mode of transmission, according to the study.

"Understanding the impact of dietary ingredients on maternal metabolism and gut microbiota may help to define the optimal maternal diet, one which promotes a healthier future for both mother and child," says Reimer.

Altering the gut microbiota of babies

Our understanding of how sweeteners affect weight gain is not complete, but there is reason to believe that alterations in the gut microbiota may play a key role. In this animal study, a fecal transplant was used to show the direct influence of the altered gut microbiota in causing the increased obesity risk. Transplanting fecal matter from the offspring of mothers that consumed the low-calorie sweeteners into sterile, germ-free mice caused the mice to gain more weight and have worse blood glucose control. Even though the offspring had never consumed the sweeteners themselves, the changes to the mother's microbiota and metabolism were sufficient to change the microbiota in their offspring and trigger obesity.

"A healthy pregnancy, including good nutrition, is important for a healthy baby," says Reimer. "Our research will continue to examine what makes an optimal diet and more importantly seek to find ways to correct disruptions to gut microbiota should they occur."

Credit: 
University of Calgary

Antibiotic-resistance in Tanzania is an environmental problem

image: Murugan Subbiah (left) and Beatus Lyimo process fecal samples at the Nelson Mandela African Institution of Science and Technology, Arusha, Tanzania.

Image: 
WSU

Antibiotic-resistant bacteria are prevalent in people, wildlife and the water in northeastern Tanzania, but it's not antibiotic use alone driving resistance. Instead, researchers at Washington State University found transmission of bacteria in the environment is the most important factor.

These conclusions come from a four-year study led by researchers from WSU's Paul G. Allen School for Global Animal Health. The results of the study were just published in Nature Communications.

"We were surprised to find these microbes everywhere," said Douglas Call, a Regents professor and associate director for research at the Allen School, "but it appears that within impoverished communities, there are many opportunities for bacteria to spread between animals and people via contact with waste or through consumption of contaminated food and water."

The study, funded by the National Science Foundation, began in March 2012 and involved visiting 425 households from 13 villages throughout northeastern Tanzania.

At each household, data was collected about people's daily activities, after which researchers collected fecal samples from people, domestic livestock, chickens, dogs, and when present, wildlife. Water was also sampled.

The team's methods differed from those used in most studies, allowing the collection and testing of more than 61,000 bacterial isolates. Depending on the community sampled, over 65% of bacteria from people were resistant to at least one of the nine antibiotics tested.

The prevalence of antibiotic-resistant bacteria was highest for people, but it was also high for other domestic animals even when those animals were never exposed to antibiotics.

For example, in some communities up to 50% of bacteria from dogs were antibiotic-resistant.

"In these communities, no one is treating their dog with antibiotics, and yet they have a high prevalence of resistance," Call said. "It's not an antibiotic use problem; they are coming into contact with antibiotic-resistant bacteria in the environment."

Antibiotic-resistant bacteria were also prevalent in wildlife.

More than 50 percent of wildlife feces contained ampicillin-resistant bacteria, which was higher than the average across people, chickens, livestock, and dogs. The prevalence of resistance to the remaining eight antibiotics was highly correlated with results from domestic samples.

"We've got almost as much resistance on the wildlife side as the domestic side," Call said. "This is one factor that shows how bacterial transmission plays such an important role in this system."

Antibiotic-resistant bacteria were also prevalent in water sources.

The World Health Organization recognizes antibiotic resistance as a threat to global health and estimates it could cause 10 million deaths worldwide by 2050 if no effective interventions are made.

While antibiotic-resistant bacteria are prevalent throughout the study area, these findings are a first step toward addressing the potential health risks.

"Hygiene and sanitation have to figure more prominently in efforts to combat antibiotic resistance," said Mark Caudell, a first-author of this work. Caudell, is a former WSU researcher and now the regional social science coordinator for antimicrobial-resistance at the Food and Agriculture Organization of the United Nations.

"Until hygiene and infrastructure improves, and transmission begins to decline, antibiotic stewardship alone is unlikely to have much impact," he said.

Credit: 
Washington State University

Unhealthy and unhappy -- the mental toll of troubled relationships

Some forms of domestic violence double victims' risk of depression and anxiety disorders later in life, according to University of Queensland research.

The UQ School of Public Health study found many victims of intimate partner violence at 21 showed signs of mental illness at the age of 30, with women more likely to develop depression and men a range of anxiety disorders.

In the study, physical abuse within intimate partner violence included behaviours such as pushing, shoving and smacking.

UQ researcher Emeritus Professor Jake Najman said the team also found equal levels of abuse by men and women.

"The number of men and women who experience intimate partner violence is very similar, leading us to believe couples are more likely to abuse each other," Professor Najman said.

"People generally don't end up in the hospital or a shelter, but there is a serious mental burden from this type of abuse."

The research showed de facto couples and those from lower socio-economic backgrounds were more likely to be involved in these types of abusive relationships.

Emotional abuse involves comments that make the person feel worthless.

Then there is harassment - a constant and distressing nagging that may have long-term consequences for those on the receiving end.

"It also raises the question, to what extent is this type of violent behaviour not just a characteristic of the relationship the couple has with each other, but with other people around them and possibly their children," Professor Najman said.

"There is a range of treatment and counselling programs available for couples and families to try and improve the way they relate to one another."

Credit: 
University of Queensland

NUS Medicine researchers can reprogramme cells to original state for regenerative medicine

Singapore, January 2020 -- Early mammalian development is a highly complex process involving elaborate and highly coordinated biological processes. One such process is zygotic genome activation (ZGA), which occurs following the union of the sperm and egg, marking the beginning of life. The resultant early embryos, termed 'zygotes', are capable of generating the entire organism, a property known as totipotency.

Totipotent cells sit atop the developmental hierarchy and have the greatest potency of all cell types, giving them limitless therapeutic potential. They surpass pluripotent embryonic stem cells, which are only able to differentiate into the cell types within the embryo, yet the totipotent zygote loses its totipotency as it matures towards pluripotency.

Scientists at the National University of Singapore's Yong Loo Lin School of Medicine have now found a way to manipulate pluripotent cells into acquiring the totipotent capacity previously thought to exist only in the zygote. This not only provides key insights into how totipotency is formed and the earliest events in mammalian development, but opens new doors for potential cell therapies that were previously unexplored.

The study identified a totipotency-inducing factor - Negative Elongation Factor A (NELFA) - which is capable of driving pluripotent embryonic stem cells into totipotency in a petri dish. NELFA achieves this feat by causing specific changes in the gene regulatory and metabolic networks of the cell. Specifically, NELFA has the ability to reactivate certain genes that are only active in the zygote but otherwise silent in embryonic stem cells. NELFA is also able to alter the energy-utilisation pathways in the pluripotent stem cells. Together, these changes result in pluripotent stem cells reverting to a totipotent-like state.

Discovering this method of inducing totipotency in cells outside of the embryo also provides a means to engineer cells with maximum cell plasticity for therapeutic purposes. This increases the potential applications of regenerative medicine, especially in cell replacement therapies.

According to Assistant Professor Tee Wee Wei, the lead investigator in this study, the eventual goal of this research is to translate the findings into the development of rapid and efficient cellular reprogramming strategies for clinical application, such as in the treatment of debilitating diseases and developmental disorders.

Credit: 
National University of Singapore, Yong Loo Lin School of Medicine

1 in 4 kids who get antibiotics in children's hospitals are prescribed drugs incorrectly

The overuse of antibiotics poses an increasing threat to children who develop -- or already have -- drug-resistant infections that are difficult or impossible to treat, and can cause extended hospitalization, disability and even death.

At any given time, about 1 in 3 patients in U.S. children's hospitals receive one or more antibiotics. However, for one-quarter of those children, the antibiotic treatments are unnecessary or otherwise "suboptimal," according to research led by Washington University School of Medicine in St. Louis.

The research -- involving nearly 12,000 children at 32 U.S. children's hospitals -- is published online in Clinical Infectious Diseases, a journal of the Infectious Diseases Society of America.

The study also found that nearly half of this inappropriate use of antibiotics would have gone undetected by current antibiotic stewardship programs designed to prevent antibiotic resistance. Such programs involve the routine review of certain patient prescriptions to determine if the correct dose, drug and duration were used. Problems can be flagged and addressed through such reviews.

"Antibiotic resistance is a growing danger to everyone; however, there is limited data on children," said Jason Newland, MD, a Washington University professor of pediatrics in the Division of Pediatric Infectious Diseases and director of the Antimicrobial Stewardship Program at St. Louis Children's Hospital, where he treats patients.

"Data on adults have suggested that 30% to 50% of antibiotics used in hospitalized adults is inappropriate," Newland said. "Our goal was to understand if antibiotics used to treat hospitalized children were suboptimal, meaning doctors shouldn't have prescribed any antibiotics; they could have used a more effective antibiotic; or they could have prescribed a different dose or for a shorter duration. Health-care workers must be vigilant since the inappropriate use of antibiotics is fueling dangerous drug resistance in children."

The multicenter study involved examining the medical records of 11,784 children from birth to age 17 who had been prescribed, in 32 U.S. children's hospitals, one or more antibiotics to treat or prevent infections. The researchers evaluated data collected on six separate days from July 2016 through December 2017.

Researchers found 2,946 (25%) of the patients received at least one antibiotic deemed suboptimal.

Altogether, health-care providers prescribed antibiotics 17,110 times. Of those, 3,593 were considered suboptimal. The most common cases of inappropriate antibiotic use included:

27% due to "bug-drug mismatch," meaning the wrong antibiotic was given for a particular infection.

17% due to prolonged antibiotic use after surgery to prevent surgical-site infections.

11% due to use of antibiotics when they were unnecessary.

11% due to use of broad-spectrum antibiotics, when a drug that targets a specific type of bacteria could have been used.
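The study's figures use two different denominators - patients and individual antibiotic orders - so, as a quick illustration, the short Python sketch below rechecks the arithmetic using only the counts quoted above (reading the category percentages as shares of the 3,593 suboptimal orders; the category labels are shortened here).

```python
# Arithmetic check of the figures quoted above; counts come from the article,
# and the category shares are read as fractions of the suboptimal orders.
patients = 11_784
patients_with_suboptimal = 2_946
orders = 17_110
suboptimal_orders = 3_593

print(f"patients with >=1 suboptimal antibiotic: {patients_with_suboptimal / patients:.0%}")  # ~25%
print(f"suboptimal share of all antibiotic orders: {suboptimal_orders / orders:.0%}")         # ~21%

category_shares = {
    "bug-drug mismatch": 0.27,
    "prolonged surgical prophylaxis": 0.17,
    "unnecessary antibiotic use": 0.11,
    "overly broad-spectrum choice": 0.11,
}
for name, share in category_shares.items():
    print(f"{name}: ~{share * suboptimal_orders:.0f} orders")
```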

The study also found that almost half of the suboptimal prescriptions identified in the study would not have been routinely reviewed by the physicians and pharmacists involved in hospital antimicrobial stewardship programs, which scrutinized only the use of specific drugs.

"Arguably, this is one of the most important findings because it helps us to identify blind spots in antimicrobial stewardship programs," Newland said. "Antibiotics currently not targeted for review still have a significant need for oversight. The obvious solution is to expand routine reviews to include all antibiotics. Unfortunately, this is resource-intensive and may not be feasible at some hospitals."

However, the study also pinpointed medical conditions that would benefit from increased scrutiny. For example, the most common medical condition for antibiotic prescriptions was bacterial lower respiratory tract infection, or pneumonia. It also accounted for the greatest percentage of suboptimal prescriptions -- 18%.

A number of suboptimal prescriptions stemmed from antibiotics used to prevent surgical-site infections, mainly because the drug was prescribed for longer than necessary. "Notably, recently revised guidelines from the Centers for Disease Control and Prevention recommend that some surgeries limit antibiotics to a single preoperative dose," Newland said. "This means that an even greater proportion of the cases in our study now would be considered suboptimal."

The study involved researchers from seven other institutions: University of Michigan in Ann Arbor; Children's Mercy Hospital in Kansas City, Mo.; Novant Health Eastover Pediatrics in Charlotte, N.C.; University of Pennsylvania in Philadelphia; Children's Hospital of Philadelphia; University of Utah in Salt Lake City; and the University of Washington in Seattle. Participating hospitals were recruited by the Sharing Antimicrobial Reports for Pediatric Stewardship Collaborative, which Newland co-founded and currently leads.

"Our study also highlights the need for antimicrobial stewardship programs to expand current practices and efforts," Newland said. "Such evolution is imperative to ensure optimal antibiotic use for all hospitalized children."

Credit: 
Washington University School of Medicine

Smart single mother bees learn from their neighbors

Solitary female bees inspect other nests for signs of danger before making decisions on where to build their own, a new London-based study suggests.

The study, led by researchers at Queen Mary University of London, found the clever bees looked for signs of parasite infection in other species' nests and used this information to select a safe place to bring up their own brood.

The research team set up artificial nests in parks and grasslands across South East England and London from 2016-2018 to study the behaviours of different species of solitary cavity-nesting mason bees.

The scientists also tested the ability of these species to notice other cues of parasite infection in the surrounding environment.

They showed that solitary bees were surprisingly intelligent in their observations and were able to remember geometric symbols found next to parasitised nests, and avoid nests near these symbols in future breeding periods.

Dr Olli Loukola, lead author and postdoctoral fellow at Queen Mary University of London and the University of Oulu, said: "It's amazing that solitary bees are able to use such a complex strategy in their nest-site decisions. It really requires a sophisticated cognitive flexibility and it is fascinating to uncover how much genius is found in these small-brained solitary bees.

"Despite being solitary, these bees live in communities and can learn from each other. As environmental factors such as nesting suitability, predation and parasitism change both spatially and over time, it makes sense for bees to glean information from their neighbours, even if they aren't the same species."

Most research to date has focused on social species, which live in communities led by a queen with several dozen to several thousand workers. However, most wild bee species are in fact solitary, and individuals build their own nest rather than living in a hive as part of a larger community.

Whilst these bees live independently, they often exist in groups, with multiple species residing in close proximity to one another.

Dr Lars Chittka, Professor of Sensory and Behavioural Ecology at Queen Mary University of London, said: "There are around 20,000 solitary bee species in total and about 150 live right here in London. In these species, female individuals - 'single mothers' - build their own nests. The male bees never do any work.

"Our research suggests that despite the fact that these females largely work alone, they're able to use cues in their environment and activities of other animals in their surroundings, to successfully protect their broods."

With reported declines in bee numbers over the last few years, studies that improve our understanding of their behaviour and the environmental pressures they face are important for future conservation efforts to save these pollinators.

Credit: 
Queen Mary University of London

Most innovative cancer drugs facing delays in reaching patients

Analysis finds that highly innovative drugs took longer to reach NHS patients than more conventional treatments

Increasing numbers of cancer drugs are being licensed - but treatments are taking longer to progress through trials and approval

Researchers warn the whole system of drug discovery and development - including researchers, companies and regulators - is too risk averse

Innovative drugs play an essential role in tackling cancer evolution and drug resistance - the biggest challenge in cancer research and treatment today

Cancer patients have had to wait longer for innovative new cancer drugs than for more conventional treatments, suggesting the most exciting new therapies have not been successfully fast tracked, a new analysis reports.

The researchers found that the higher the level of innovation of a cancer drug, the longer it was taking to pass through clinical trials, licensing and appraisal for availability on the NHS.

Their study found that increasing numbers of drugs are being authorised - including rising numbers of highly innovative treatments. But it is taking an increasing amount of time for these new drugs to navigate the strict regulations on clinical trials and licensing, forcing patients to wait even longer for the latest treatments.

A team at The Institute of Cancer Research, London, carried out the analysis as part of preparation for its pioneering new Centre for Cancer Drug Discovery, to help build the case for more streamlined regulations for the most exciting treatments.

Innovative new treatments are designed to attack brand new weaknesses in cancer, and form a key part of The Institute of Cancer Research (ICR)'s strategy to overcome cancer's lethal ability to evolve resistance to treatment.

The study is published today (Wednesday) in the journal Drug Discovery Today, and provides an overview of access to all drugs licensed through the European Medicines Agency (EMA) from 2000 to the end of 2016.

The researchers assigned each drug to one of three categories of innovation - high, medium or low - with the high innovation category including drugs with a new target or mechanism of action, or representing a new class of treatment in an area of unmet need.

The most innovative cancer drugs took 3.2 years longer to go from the filing of the patent through to NHS patients than low-innovation treatments - suggesting various initiatives intended to prioritise the most innovative drugs had yet to be effective.

The researchers are calling for action to be taken to improve patients' access to the most exciting new treatments by streamlining regulations on clinical trials and licensing and more strongly incentivising innovation in drug discovery and development.

Between 2000 and the end of 2016, the most innovative new drugs took 14.3 years to progress from patenting through to availability on the NHS - compared with 13.5 years for medium-innovation drugs and 11.1 years for the least innovative treatments.

Much of the delay seemed to occur in the period from the start of a phase I trial through to EMA authorisation - which lasted an average of 8.9 years for the most innovative drugs compared with 8.7 years for medium-innovation drugs and 6.8 years for the least innovative.

Among the highly innovative drugs to undergo delays, mifamurtide took 20 years to go from patent through trials and licensing to NICE approval for treating osteosarcoma, and it took trabectedin 22 years from patent to NICE approval for advanced soft tissue sarcoma. More recently, the EMA only authorised olaparib for breast cancer in April 2019, 15 months after the US Food and Drug Administration, and NICE has still not released its appraisal.

The number of cancer drugs licensed by the EMA increased from a median of six per year in 2000-08 to 13.5 per year in 2009-16. But the average time from a drug's patent being filed to it being made available on the NHS increased over the study period - from 12.8 years for drugs first licensed between 2000 and 2008, up to 14.0 years for drugs licensed between 2009 and 2016.

The time taken from the filing of the patent to setting up a phase I trial rose from a median of two years for those licensed from 2000 to 2008, up to three years for those licensed from 2009 to 2016.

And as with the delays to approval seen in the most innovative new drugs, there seemed to be hold-ups in the period between the start of the phase I trial and EMA authorisation - which lasted an average of 7.7 years for drugs first licensed from 2000 to 2008, but 9.0 years for those licensed from 2009 to 2016.

The researchers found that NICE had reduced the lag time between EMA authorisation and the start of its technology appraisals, from a mean of 21 months for drugs first licensed between 2000 and 2008, down to 6.5 months for drugs licensed between 2009 and 2016. But it was getting no faster at carrying out its appraisals, which took 16.7 months for drugs licensed in 2000-08 and 16.0 months for those licensed in 2009-16.

There was evidence that NICE had not been prioritising the most innovative treatments, with only 38 per cent of highly innovative cancer drugs having received a positive recommendation at the time of the analysis, compared with 53 per cent of drugs classed as moderately innovative and 40 per cent of low-innovation drugs.

However, this difference seemed to be because NICE had been less likely to start an appraisal for more highly innovative drugs - and it has since committed to appraising all new cancer drugs, which should address the discrepancy.

Overall, the findings suggest that there is a need to build stronger incentives into the whole system of drug discovery and development to encourage companies to take forward innovative new treatments that can deliver major advances for patients.

The study especially raises concerns over the regulation of clinical trials and drug licensing in the UK and Europe - and suggests that the Medicines and Healthcare Products Regulatory Agency (MHRA) and EMA should adapt their approaches to regulation to make it easier to take exciting new treatments to patients.

The study is published as the ICR now needs to raise less than £10 million for its new £75 million Centre for Cancer Drug Discovery, which will house the world's first drug programme to be dedicated to combating cancer evolution and drug resistance.

Study leader Professor Paul Workman, Chief Executive of The Institute of Cancer Research, London, said:

"Our study details the major progress being made against cancer, with the average number of drugs being licensed each year more than doubling over the last decade. But it also makes clear that our regulatory systems are not keeping pace with advances in the science. It is taking longer for new drugs to reach patients and, alarmingly, the delays are longest for the most exciting, innovative treatments, with the greatest potential to transform the lives of patients.

"At the moment the whole ecosystem for drug discovery and development - involving regulators, researchers and companies - is too risk averse. It's crucial that academic researchers and pharmaceutical companies should feel that the regulatory systems for drug development support risk taking and innovation, rather than discouraging it and slowing it down. Our study raises questions in particular for the processes in the UK and Europe for regulating clinical trials and licensing, which need to do better at recognising and rewarding innovation.

"As the ICR begins an exciting new 'Darwinian' programme of drug discovery to combat cancer evolution and drug resistance, we are calling on Government, regulators and industry to together reshape the cancer therapeutics ecosystem, so the best new treatments can progress to patients much more quickly."

Credit: 
Institute of Cancer Research

Drinking alcohol during pregnancy: #DRYMESTER the only safe approach

Drinking alcohol during pregnancy leads to poorer cognitive functioning in children, according to the most comprehensive review on the issue to date. The University of Bristol research, published today [29 January] in the International Journal of Epidemiology, reviewed 23 published studies on the topic and found evidence that drinking in pregnancy could also lead to lower birthweight. The findings reinforce the UK Chief Medical Officers' #DRYMESTER guidelines, which advise abstaining from alcohol in all trimesters.

To study the effects of drinking alcohol during pregnancy, the researchers funded by the National Institute for Health Research (NIHR) and the Medical Research Council (MRC) combined results from very different study designs for the first time. Methods included traditional studies such as randomised controlled trials, alongside alternative strategies such as comparing children in the same families whose mothers cut down or increased their alcohol use between pregnancies, and a genetic marker-based approach, 'Mendelian randomization'.

Previously, research on this topic has relied on 'observational' studies, where participants are already exposed to a risk factor and researchers do not try to change who is or isn't exposed. However, there are limitations with this type of study: it can be impossible to unpick what is caused by alcohol and what is caused by other factors. These factors could include a woman's education or family environment, as well as genetic predisposition, which can affect her child's development and cognition in the long-term.

Alternative study designs use different ways of minimising or removing these 'confounding' factors, so the results are more reliable. If the results of these studies all point in the same direction, they are less likely to be because of errors and biases, and more likely to be true causes of health and disease.

All the studies included in the review tried to compare like with like - groups of people who differed only in their exposure to alcohol during pregnancy. This is as close as it gets to what would be achieved in an experiment.

While the review was comprehensive, it was limited in its ability to establish how much alcohol it takes to cause these negative outcomes. However, the researchers concluded that women should continue to be advised to abstain from alcohol during pregnancy.

Dr Luisa Zuccolo, the study lead and Senior Lecturer in Epidemiology at Bristol Medical School: Population Health Sciences, said: "The body of evidence for the harm that alcohol can do to children before they are born is growing, and our review is the first to look at the full range of studies on the issue. This is unlikely to be a fluke result, as we took into account a variety of approaches and results. Our work confirms the current scientific consensus: that consuming alcohol during pregnancy can affect one's child's cognitive abilities later in life, including their education. It might also lead to lower birthweight.

"Our study reinforces the UK Chief Medical Officers' guideline: DRYMESTER (abstaining in all trimesters) is the only safe approach. This message is more important than ever, given recent research which shows the alcohol industry promoting confusing information about the real health implications of drinking during pregnancy."

Credit: 
University of Bristol

Global dissatisfaction with democracy at record high, new report reveals

Dissatisfaction with democratic politics among citizens of developed countries has increased from a third to half of all individuals over the last quarter of a century, according to the largest international dataset on global attitudes to democracy ever assembled.

In fact, researchers found that across the planet - from Europe to Africa, as well as Asia, Australasia, both Americas and the Middle East - the share of individuals who say they are "dissatisfied" with democracy has jumped significantly since the mid-1990s: from 47.9% to 57.5%.

The research team, from the University of Cambridge's new Centre for the Future of Democracy, say that the year 2019 "represents the highest level of democratic discontent on record". Detailed stocktaking of global political sentiment began in 1995.

The report used a unique dataset of more than 4 million people. It combines over 25 international survey projects covering 154 countries between 1995 and 2020, with some dating back as far as 1973, and includes new cross-country surveys.

The report, along with the new Centre, which will be based at the Bennett Institute for Public Policy, will be launched in Cambridge on Wednesday 29 January.

"Across the globe, democracy is in a state of malaise," said the report's lead author Dr Roberto Foa, from Cambridge's Department of Politics and International Studies (POLIS). "We find that dissatisfaction with democracy has risen over time, and is reaching an all-time global high, in particular in developed countries."

Professor David Runciman, head of the new Centre, said: "We need to move beyond thinking about immediate crises in politics and take a longer view to identify possible trajectories for democracy around the world. This means distinguishing what is essential to democracy, what is contingent and what can be changed.

"The Centre for the Future of Democracy will be looking at the bigger picture to see how democracy could evolve," he said.

The downward trend in satisfaction with democracy has been especially sharp since 2005, which marks the beginning of what some have called a 'global democratic recession'. Just 38.7% of citizens were dissatisfied in that year, but the figure has since risen by almost one-fifth of the population (+18.8 percentage points) to 57.5%.

Many large democracies are now at their highest-ever recorded level for democratic dissatisfaction. These include the United Kingdom, Australia, Brazil, Mexico, as well as the United States - where dissatisfaction has increased by a third since the 1990s. Other countries that remain close to their all-time dissatisfaction highs include Japan, Spain and Greece.

However, researchers uncovered what they call an "island of contentment" in the heart of Europe: Denmark, Switzerland, Norway and the Netherlands are among nations where satisfaction with democracy is reaching all-time highs. "We found a select group of nations, containing just two per cent of the world's democratic citizenry, in which less than a quarter of the public express discontent with their political system," said Foa.

Other regional "bright spots", where levels of civic contentment are significantly higher, include Southeast Asia, and to a lesser extent the democracies in South Asia and Northeast Asia. "For now, much of Asia has avoided the crisis of democratic faith affecting other parts of the world," said Foa.

The research team found that shifts in democratic satisfaction often responded to "objective circumstances and events" such as economic shocks or corruption scandals. "The 2015 refugee crisis and the 2008 financial crisis had an immediately observable effect upon average levels of civic dissatisfaction," said Foa.

Following the onset of the global financial crisis in October 2008, for example, global dissatisfaction with the functioning of democracy jumped by around 6.5 percentage points - an increase that "appears to have been durable", say researchers.

On a more hopeful note, the team also found the opposite: democracies working together to resolve policy crises have a positive effect. After the European Council agreed to a European Stability Mechanism to stem the sovereign debt crisis, dissatisfaction with democracy fell by 10 percentage points in Western Europe.

"Our findings suggest that citizens are rational in their view of political institutions, and update their assessment in response to what they observe," said Foa.

In the UK, the report shows democratic satisfaction rose fairly consistently for thirty years from the 1970s, reaching a high-water mark during the Blair years at the turn of the millennium. The Iraq War and parliamentary expenses scandal caused dips, but satisfaction plunged during the political stalemate following the EU Referendum. By 2019, for the first time since the mid-1970s, a clear majority of UK citizens were dissatisfied with democracy.

The US has seen a "dramatic and unexpected" decline in satisfaction, according to researchers. In 1995, more than three-quarters of US citizens were satisfied with American democracy, a figure that plateaued for the next decade. The first big knock came with the 2008 financial crisis, and deterioration has continued year-on-year ever since. Now, less than half of US citizens are content with their democracy.

"Such levels of democratic dissatisfaction would not be unusual elsewhere," said Foa. "But for the United States it may mark an end of exceptionalism, and a profound shift in America's view of itself."

The report's authors suggest that the 1990s were a better time for democracy, as the West emerged from the Cold War with renewed legitimacy, while multi-party elections spread across Latin America and Sub-Saharan Africa. However, repeated financial and foreign policy failures in established democracies, along with endemic corruption and state fragility in the Global South, have eroded trust in democracy over the last 25 years.

"The rise of populism may be less a cause and more a symptom of democratic malaise," said Foa. "Without this weakening legitimacy, it would be unthinkable for a US presidential candidate to denounce American democracy as rigged, or for the winning presidential candidate in Latin America's largest democracy to openly entertain nostalgia for military rule."

"If confidence in democracy has been slipping, it is because democratic institutions have been seen failing to address some of the major crises of our era, from economic crashes to the threat of global warming. To restore democratic legitimacy, that must change."

Credit: 
University of Cambridge

Protein levels in urine after acute kidney injury predict future loss of kidney function

High levels of protein in a patient's urine shortly after an episode of acute kidney injury are associated with an increased risk of kidney disease progression, providing a valuable tool for predicting those at highest risk of future loss of kidney function.

This finding, published Jan. 27 in JAMA Internal Medicine by Vanderbilt University Medical Center researchers and collaborators from three other centers in North America, suggests that a greater emphasis should be placed on measuring urine protein levels in patients following acute kidney injury to ensure they receive appropriate follow-up.

Although protein levels in urine are known to rise after acute kidney injury, these protein levels are seldom measured, omitting a key piece of a patient's overall kidney health that is important for proper clinical decision making. Failing to measure urine protein levels also hinders the opportunity to reduce high levels of protein through targeted therapies, which could reduce adverse health outcomes following acute kidney injury.

In this study, researchers analyzed data from more than 1,500 hospitalized adults -- half of whom had acute kidney injury -- who completed an outpatient study visit three months after discharge. Patients were tested for proteinuria -- or high levels of protein in their urine -- and were followed for up to five years to assess for progression of kidney disease.

Disease progression was defined as a 50% decrease in the patient's ability to filter blood through the kidneys or by a diagnosis of end-stage renal disease.

The study found that high levels of protein in urine were a strong discriminator for future kidney disease progression, and that when combined with other known risk factors for future loss of kidney function, risk could be better discriminated and predicted.

"Rates of acute kidney injury are growing, along with the number of people who leave the hospital after experiencing it. Some of these people will be at higher risk for future loss of kidney function," said Edward Siew, MD, MSCI, associate professor of Medicine in the Division of Nephrology at VUMC and senior author of the paper.

"As most will be seen by their primary care physicians, these findings inform the broader medical community that including a simple measurement of urine protein levels in the assessment of kidney function in the aftermath of an acute kidney injury episode can help identify those at highest risk, suggest potential strategies to reduce these risks and guide appropriate referrals."

Credit: 
Vanderbilt University Medical Center

'Curious and curiouser!' The space history in a meteorite

image: Olga Pravdivtseva, research associate professor of physics in Arts & Sciences at Washington University in St. Louis, uses noble gas isotopes to study the formation and evolution of the early solar system. Pravdivtseva, a member of the McDonnell Center for the Space Sciences, is pictured in her laboratory in Compton Hall.

Image: 
Whitney Curtis/Washington University

An unusual chunk in a meteorite may contain a surprising bit of space history, based on new research from Washington University in St. Louis.

Presolar grains -- tiny bits of solid interstellar material formed before the sun was born -- are sometimes found in primitive meteorites. But a new analysis reveals evidence of presolar grains in part of a meteorite where they are not expected to be found.

"What is surprising is the fact that presolar grains are present," said Olga Pravdivtseva, research associate professor of physics in Arts & Sciences and lead author of a new paper in Nature Astronomy. "Following our current understanding of solar system formation, presolar grains could not survive in the environment where these inclusions are formed."

Curious Marie is a notable example of an "inclusion," or a chunk within a meteorite, called a calcium-aluminum-rich inclusion (CAI). These objects, some of the first to have condensed in the solar nebula, help cosmochemists define the age of the solar system. This particular chunk of meteorite -- from the collection of the Robert A. Pritzker Center for Meteoritics and Polar Studies at the Chicago Field Museum -- was in the news once before, when scientists from the University of Chicago gave it its name to honor chemist Marie Curie.

For the new work, Pravdivtseva and her co-authors, including Sachiko Amari, research professor of physics at Washington University, used noble gas isotopic signatures to show that presolar silicon carbide (SiC) grains are present in Curious Marie.

That's important because presolar grains are generally thought to be too fragile to have endured the high-temperature conditions that existed near the birth of our sun.

But not all CAIs were formed in quite the same way.

"The fact that SiC is present in refractory inclusions tells us about the environment in the solar nebula at the condensation of the first solid materials," said Pravdivtseva, who is part of Washington University's McDonnell Center for the Space Sciences. "The fact that SiC was not completely destroyed in Curious Marie can help us to understand this environment a little bit better.

"Many refractory inclusions were melted and lost all textural evidence of their condensation. But not all."

Like solving a mystery

Pravdivtseva and her collaborators used two mass spectrometers built in-house at Washington University to make their observations. The university has a long history of noble gas work and is home to one of the best-equipped noble gas laboratories in the world. Still, this work was uniquely challenging.

The researchers had 20 mg of Curious Marie to work with, which is a relatively large sample from a cosmochemistry perspective. They heated it up incrementally, increasing temperature and measuring the composition of four different noble gases released at each of 17 temperature steps.

"Experimentally, it is an elegant work," Pravdivtseva said. "And then we had a puzzle of noble gas isotopic signatures to untangle. For me, it is like solving a mystery."

Others have looked for evidence of SiC in such calcium-aluminum-rich inclusions in meteorites using noble gases before, but Pravdivtseva's team is the first to find it.

"It was beautiful when all noble gases pointed to the same source of the anomalies -- SiC," she said.

"Not only do we see SiC in the fine-grained CAIs, we see a population of small grains that formed at special conditions," Pravdivtseva said. "This finding forces us to revise how we see the conditions in the early solar nebula."

Credit: 
Washington University in St. Louis