
Freestanding emergency departments may increase out-of-pocket spending for patients

image: Murtaza Akhter, M.D., Assistant Professor in the Department of Emergency Medicine at the UArizona College of Medicine – Phoenix.

Image: The University of Arizona Health Sciences

PHOENIX - With emergency department visits spiking around the holidays, patients may be inclined to seek care at freestanding emergency departments in an urgent situation.

However, a new study suggests the best and most cost-effective option may be to skip the freestanding emergency department and head to an urgent care center.

A collaborative study led by researchers at the University of Arizona College of Medicine - Phoenix and Rice University found that freestanding emergency departments can increase out-of-pocket spending, health-care utilization and price per visit. Freestanding emergency departments deliver emergency care similar to a hospital ER. However, unlike an ER, these freestanding clinics are not physically attached to an acute-care hospital.

"Recent years have seen a significant increase in emergency department visits, and also a boom in openings of freestanding emergency departments," said Murtaza Akhter, MD, co-author of the study and assistant professor in the Department of Emergency Medicine at the College of Medicine - Phoenix. "Our study showed that an increase in freestanding emergency departments was associated with an increase in spending, per Blue Cross Blue Shield enrollees."

Dr. Akhter collaborated on the study with Vivian Ho, PhD, and Yingying Xu, PhD, both from Rice University. Researchers analyzed Blue Cross Blue Shield insurance claims data from Texas, Arizona, Florida and North Carolina, as well as data from the U.S. Census Bureau American Community Survey. The results were published Oct. 23 in Academic Emergency Medicine.

The researchers found that entry of an additional freestanding emergency department was associated with an increase in emergency department utilization in Texas, Florida and Arizona, but not in North Carolina.

The implied increases in utilization varied between roughly 3 and 5 percent. The estimated out-of-pocket payments for emergency care increased 3.6 percent with the entry of a freestanding emergency department in Texas, Florida and Arizona, but declined 15.3 percent in North Carolina.

Additionally, the researchers discovered that freestanding emergency departments in a local market were associated with a 3.6-percent increase in emergency provider reimbursement per insured beneficiary in every state except Arizona. (Health-care reimbursement is the payment that your health-care provider receives for providing a medical service.)

"Opening freestanding emergency departments can have various effects, and that typically leads to increased health-care costs," Dr. Akhter said. "This often is due to increased utilization, thereby making freestanding emergency departments supplements to hospital-based emergency departments, rather than substitutes."

Dr. Akhter said freestanding emergency departments provide round-the-clock care for a variety of conditions, from common colds to heart attacks. However, they bill just like hospital emergency departments, regardless of the health ailment.

"If you have an urgent complaint, go to urgent care or your primary-care provider, unless you want a huge bill for your sprained ankle," he said.

A freestanding emergency department is not the same as an urgent care center. A freestanding clinic can treat the same conditions as a hospital-based emergency room. An urgent care center only treats minor injuries and illnesses.

In an urgent medical situation, Dr. Akhter advises patients to go to an urgent care center. An urgent situation can include sprains and strains, moderate flu symptoms or small cuts that may require stitches. If the situation is emergent, he recommends going to an emergency department or freestanding clinic. Emergent situations include chest pain, severe cuts, serious burns or slurred speech, among other conditions.

Dr. Akhter said more research is needed to consider how regulations in various states affect freestanding emergency departments and health-care costs.

Credit: University of Arizona Health Sciences

Could every country have a Green New Deal? Stanford report charts paths for 143 countries

image: This figure shows a projected timeline for transitioning to 100% wind, water, and solar energy.

Image: Jacobson et al. / One Earth

Ten years after the publication of their first plan for powering the world with wind, water, and solar, researchers offer an updated vision of the steps that 143 countries around the world can take to attain 100% clean, renewable energy by the year 2050. The new roadmaps, publishing December 20 in the journal One Earth, follow up on previous work that formed the basis for the energy portion of the U.S. Green New Deal and other state, city, and business commitments to 100% clean, renewable energy around the globe--and use the latest energy data available in each country to offer more precise guidance on how to reach those commitments.

In this update, Mark Z. Jacobson (@mzjacobson) of Stanford University and his team find low-cost, stable grid solutions in 24 world regions encompassing the 143 countries. They project that transitioning to clean, renewable energy could reduce worldwide energy needs by 57%, create 28.6 million more jobs than are lost, and reduce energy, health, and climate costs by 91% compared with a business-as-usual analysis. The new paper makes use of updated data about how each country's energy use is changing, acknowledges lower costs and greater availability of renewable energy and storage technology, includes new countries in its analysis, and accounts for recently built clean, renewable infrastructure in some countries.

"There are a lot of countries that have committed to doing something to counteract the growing impacts of global warming, but they still don't know exactly what to do," says Jacobson, a professor of civil and environmental engineering at Stanford and the co-founder of the Solutions Project, a U.S. non-profit educating the public and policymakers about a transition to 100% clean, renewable energy. "How would it work? How would it keep the lights on? To be honest, many of the policymakers and advocates supporting and promoting the Green New Deal don't have a good idea of the details of what the actual system looks like or what the impact of a transition is. It's more an abstract concept. So, we're trying to quantify it and to pin down what one possible system might look like. This work can help fill that void and give countries guidance."

The roadmaps call for the electrification of all energy sectors, for increased energy efficiency leading to reduced energy use, and for the development of wind, water, and solar infrastructure that can supply 80% of all power by 2030 and 100% of all power by 2050. These sectors include electricity; transportation; building heating and cooling; industry; agriculture, forestry, and fishing; and the military. The researchers' modeling suggests that the efficiency of electric and hydrogen fuel cell vehicles over fossil fuel vehicles, of electrified industry over fossil industry, and of electric heat pumps over fossil heating and cooling, along with the elimination of energy needed for mining, transporting, and refining fossil fuels, could substantially decrease overall energy use.

The transition to wind, water, and solar would require an initial investment of $73 trillion worldwide, but this would pay for itself over time through energy sales. In addition, clean, renewable energy is cheaper to generate over time than fossil fuels, so the investment reduces annual energy costs significantly. It also reduces air pollution and its health impacts, and requires only 0.17% of the 143 countries' total land area for new infrastructure and 0.48% of their total land area for spacing purposes, such as between wind turbines.

"We find that by electrifying everything with clean, renewable energy, we reduce power demand by about 57%," Jacobson says. "So even if the cost per unit of energy is similar, the cost that people pay in the aggregate for energy is 61% less. And that's before we account for the social cost, which includes the costs we will save by mitigating health and climate damage. That's why the Green New Deal is such a good deal. You're reducing energy costs by 60% and social costs by 91%."

In the U.S., this roadmap--which corresponds to the energy portion of the Green New Deal and would eliminate the use of all fossil fuels for energy in the U.S.--requires an upfront investment of $7.8 trillion. It calls for the construction of 288,000 new large (5 megawatt) wind turbines and 16,000 large (100 megawatt) solar farms on just 1.08% of U.S. land, with over 85% of that land used for spacing between wind turbines. The spacing land can double, for instance, as farmland. The plan creates 3.1 million more U.S. jobs than the business-as-usual case and saves 63,000 lives from air pollution per year. It reduces energy, health, and climate costs by $1.3 trillion, $0.7 trillion, and $3.1 trillion per year, respectively, compared with the current fossil fuel energy infrastructure.
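As a back-of-envelope check of the construction figures quoted above (a simple product of the press release's numbers, not the study's own capacity accounting), the implied nameplate capacity can be totaled directly:

```python
# Totals implied by the U.S. roadmap figures quoted in the text.
turbines = 288_000       # large wind turbines
turbine_mw = 5           # megawatts each
solar_farms = 16_000     # large solar farms
farm_mw = 100            # megawatts each

wind_gw = turbines * turbine_mw / 1_000    # 1,440 GW of wind
solar_gw = solar_farms * farm_mw / 1_000   # 1,600 GW of utility solar

print(f"Wind: {wind_gw:,.0f} GW; solar farms: {solar_gw:,.0f} GW")
```

That is roughly 3,000 gigawatts of new nameplate capacity, several times current total U.S. generating capacity, which gives a sense of the construction effort behind the $7.8 trillion figure.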

And the transition is already underway. "We have 11 states, in addition to the District of Columbia, Puerto Rico, and a number of major U.S. cities that have committed to 100% or effectively 100% renewable electric," Jacobson says. "That means that every time they need new electricity because a coal plant or gas plant retires, they will only select among renewable sources to replace them."

He believes that individuals, businesses, and lawmakers all have an important role to play in achieving this transition. "If I just wrote this paper and published it and it didn't have a support network of people who wanted to use this information," he says, "it would just get lost in the dusty literature. If you want a law passed, you really need the public to be supportive."

Like any model, this one comes with uncertainties. There are inconsistencies between datasets on energy supply and demand, and the findings depend on the ability to model future energy consumption. The model also assumes the perfect transmission of energy from where it's plentiful to where it's needed, with no bottlenecking and no loss of energy along power lines. While this is never the case, many of the assessments were done on countries with small enough grids that the difference is negligible, and Jacobson argues that larger countries like the U.S. can be broken down into smaller grids to make perfect transmission less of a concern. The researchers addressed additional uncertainties by modeling scenarios with high, mean, and low costs of energy, air pollution damage, and climate damage.

The work deliberately focuses only on wind, water, and solar power and excludes nuclear power, "clean coal," and biofuels. Nuclear power is excluded because it requires 10-19 years between planning and operation and has high costs and acknowledged meltdown, weapons proliferation, mining, and waste risks. "Clean coal" and biofuels are not included because they both cause heavy air pollution and still emit over 50 times more carbon per unit of energy than wind, water, or solar power.

One concern often discussed with wind and solar power is that they may not be able to reliably match energy supplies to the demands of the grid, as they are dependent on weather conditions and time of year. This issue is addressed squarely in the present study in 24 world regions. The study finds that demand can be met by intermittent supply and storage throughout the world. Jacobson and his team found that electrifying all energy sectors actually creates more flexible demand for energy. Flexible demand is demand that does not need to be met immediately. For example, an electric car battery can be charged any time of day or night or an electric heat pump water heater can heat water any time of day or night. Because electrification of all energy sectors creates more flexible demand, matching demand with supply and storage becomes easier in a clean, renewable energy world.

Jacobson also notes that the roadmaps this study offers are not the only possible ones and points to work done by 11 other groups that also found feasible paths to 100% clean, renewable energy. "We're just trying to lay out one scenario for 143 countries to give people in these and other countries the confidence that yes, this is possible. But there are many solutions and many scenarios that could work. You're probably not going to predict exactly what's going to happen, but it's not like you need to find the needle in the haystack. There are lots of needles in this haystack."

Credit: Cell Press

High carbon footprint families identified by sweets and restaurant food, not higher meat consumption

Families with high carbon footprints consume two to three times more sweets and alcohol than those with low footprints

Study by experts in Sheffield and Kyoto, Japan, found meat consumption explained less than 10 per cent of difference in carbon footprints

Researchers recommend carbon taxes on sweets and alcohol

Families with higher carbon footprints are likely to consume more confectionery, alcohol and restaurant food, according to a new study published in One Earth.

Considering the spectrum of traditional to urban lifestyles across Japan, researchers at the University of Sheffield and the Research Institute for Humanity and Nature in Kyoto, Japan, analysed the carbon footprints of the diets of 60,000 households across Japan's 47 regions. Using a life-cycle approach which details food supply chains around the country, they found that meat consumption was relatively constant per household - but carbon footprints were not.

The study shows that meat consumption could explain less than 10 per cent of the difference seen in carbon footprints between Japanese families. Instead, households with higher carbon footprints tended to consume more food from restaurants, as well as more vegetables and fish. However, it was the level of consumption of sweets and alcohol - two to three times higher than families with low carbon footprints - that really stood out.

Meat has earned a reputation as an environmentally damaging food, with beef production emitting 20 times more greenhouse gases than bean production for the same amount of protein.

However, the researchers caution against a one-size-fits-all policy after finding that the consumption of sweets, alcohol and restaurant food adds to families' footprints in a larger capacity than other items. Eating out was found to contribute on average 770 kg of greenhouse gases per year for those households with a higher footprint, whereas meat contributed just 280kg.
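To put those per-household figures in proportion (a simple ratio of the numbers quoted above, not an analysis from the paper itself):

```python
# Annual greenhouse-gas contribution per high-footprint household,
# in kg CO2-equivalent per year, as quoted in the text.
eating_out_kg = 770
meat_kg = 280

ratio = eating_out_kg / meat_kg
print(f"Eating out contributes {ratio:.1f}x as much as meat")  # 2.8x
```

By this measure, restaurant food alone accounts for nearly three times the emissions of a high-footprint household's meat consumption.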

Associate Professor Keiichiro Kanemoto of the Research Institute for Humanity and Nature, Kyoto, Japan - who led the research - said: "If we think of a carbon tax, it might be wiser to target sweets and alcohol if we want a progressive system.

"If we are serious about reducing our carbon footprints, then our diets must change. Our findings suggest that high carbon footprints are not only a problem for a small number of meat lovers in Japan. It might be better to target less nutritious foods that are excessively consumed in some populations."

Kanemoto does, however, recommend eating less meat to reduce a household's environmental impact. "Meat is a high carbon footprint food. Replacing red meat consumption with white meat and vegetables will lower a family's carbon footprint," he said.

Japan's population is one of the oldest in the world, a trend that many industrial countries are following. This suggests that successful policies for dietary change and energy efficiency in Japan could act as models for many countries in the coming decades. The Japanese also have a relatively healthy diet, which is frequently credited for the country having the world's longest life expectancy.

Dr Christian Reynolds from the Institute of Sustainable Food at the University of Sheffield, one of the study's co-authors, said: "Due to wealth, culture, and farming practices, different regions in a country consume food differently. Japan alone has some prefectures with more than 10 million people and others with fewer than one million. These regional and income differences in food consumption are also found in the UK, Europe, Australia and the US.

"All countries are facing challenges in how to shift diets to be healthier and more sustainable. This evidence from Japan demonstrates that research can help us to identify what to focus on. The same patterns of dietary change in terms of sugar, alcohol and dining out need to be considered in the UK, Australia, the US and Europe."

Credit: University of Sheffield

Thyroid cancer rates in US

Bottom Line: An analysis suggests rates of thyroid cancer in the U.S. appear to have plateaued in recent years after decades on the rise. That earlier increase was mostly attributed to more screening and imaging over the last three decades, which detected many small thyroid cancers. Researchers in this observational study used cancer surveillance registry data to examine changes in rates of new cases of thyroid cancer in the U.S. from 1992 to 2016. The authors report the rate increased from 5.7 to 13.8 per 100,000 between 1992 and 2009, with the greatest annual percentage change (6.6%) occurring from 1998 to 2009. The rate of increase slowed from 2009 to 2014 (13.8 to 14.7 per 100,000), and the rate has been stable since 2014 (from 14.7 to 14.1 per 100,000). The plateau could reflect a true decline in the occurrence of thyroid cancer, but because it coincided with a greater understanding of overdiagnosis and with practice guidelines recommending a less intensive workup of thyroid nodules, changes in diagnostic practice are the more likely explanation. Limitations of the study include that observational analyses cannot determine causality and that the results may not be generalizable to areas of the U.S. beyond the regions included in the registry data.
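The incidence figures above imply roughly the following average annual growth rates. This is a plain compound-growth calculation on the quoted endpoint rates; the study itself likely used joinpoint regression on the full series, so these numbers are only illustrative:

```python
def annual_change(rate_start, rate_end, years):
    """Average annual percent change implied by two incidence rates."""
    return (rate_end / rate_start) ** (1 / years) - 1

# Rates are new cases per 100,000, from the figures quoted above.
rise = annual_change(5.7, 13.8, 2009 - 1992)      # ~ +5.3% per year
plateau = annual_change(14.7, 14.1, 2016 - 2014)  # ~ -2.1% per year

print(f"1992-2009: {rise:+.1%} per year; 2014-2016: {plateau:+.1%} per year")
```

The implied average rise over 1992-2009 (about 5.3% per year) is consistent with the reported peak annual percentage change of 6.6% during the steeper 1998-2009 sub-period, while the slight decline since 2014 is what the authors describe as a plateau.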

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Jennifer L. Marti, M.D., Weill Cornell Medicine, New York, and coauthors.

(doi:10.1001/jama.2019.18528)

Editor's Note: The article includes conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: JAMA Network

Body cells spy out bacteria

image: Pseudomonas bacteria under the electron microscope.

Image: MPIIB / Volker Brinkmann

Bacterial infection does not automatically lead to illness; many germs only become dangerous when they occur in large numbers. Researchers at the Max Planck Institute for Infection Biology in Berlin have discovered that the body has a receptor that doesn't recognize the bacteria themselves but instead spies on their communication. The body uses this to register when so many bacteria are present that they secrete illness-inducing substances known as virulence factors.

In the case of opportunistic pathogens in our environment, this critical threshold for an infection is particularly high: only when they occur in very high numbers and/or form illness-inducing substances can they overwhelm a person.

Pseudomonas aeruginosa is one such germ. Everyone regularly comes into contact with it, as it is found predominantly in water pipes, wash basins and other similar places. However, large quantities of pseudomonads can cause serious illness. They do this by forming disease-inducing substances which enable them to gain a foothold in the host and cause damage. This germ can easily cause pneumonia, wound infections or bacteraemia and blood poisoning, particularly in hospital patients. These diseases are extraordinarily difficult to treat, as the bacteria are highly resistant to antibiotics.

How do germs decide when the time is right for an attack? They communicate with one another via small molecules known as 'quorum sensing molecules'. Only when they have reached sufficient density do pseudomonads produce illness-inducing substances and mucous molecules, which defend them against antibiotics and the body's own immune system. This makes sense for the germs, because as long as mucus and virulence factors are not needed, their production only means unnecessary energy consumption. On the other hand, the energy expenditure is worthwhile during an actual attack, because only then can they successfully infect the host and use it as a 'breeding ground'.

Spotting communication among bacteria

Stefan Kaufmann and his team of researchers at the Max Planck Institute for Infection Biology have discovered that our body cells are able to spot communication amongst bacteria with the help of a receptor known as the aryl hydrocarbon receptor. This receptor detects the quorum sensing molecules, enabling body cells to detect when the bacteria are preparing for an attack. "Thanks to this spying, the body can activate the immune system in times of need to fend off an attack from these germs," explains the study's lead author, Pedro Moura-Alves, currently a group leader at the Ludwig Institute for Cancer Research, Oxford University.

In fact, the receptor eavesdrops on the bacteria before they have even reached their quorum: detecting the low, early-stage concentrations of quorum sensing molecules inhibits the aryl hydrocarbon receptor, blocking a premature mobilization of the immune defences. "This is effective for the host, as it saves energy to leave a small number of bacteria alone, provided they aren't causing any damage. Only when they've reached a critical mass is the energy required for defence mustered," says Stefan Kaufmann. This also helps prevent collateral damage caused by the immune system's response.

So our body not only recognizes whether germs are present or not. These recent results show that it also registers the way their numbers are growing, in order to react to differing stages of an infection.

Credit: Max-Planck-Gesellschaft

Brain biomarkers for detecting Alzheimer's disease are located

image: Two subnetworks are altered during the evolution of AD.

Image: Gerd Altman

From the detection of functional brain changes that occur during Alzheimer's Disease (AD), a research team from the Complutense University of Madrid (UCM) has located a set of biomarkers that could predict which patients with Mild Cognitive Impairment (MCI) have a higher risk of developing dementia.

"In this work we find that two subnets, in theta and beta frequency bands, which involve fronto-temporal and fronto-occipital regions, are altered during the evolution of Alzheimer's Disease," says Mª Eugenia López García, researcher from the Cognitive Neuroscience group of the UCM and one of the authors of the study.

The research, published in Brain, takes hypersynchronization (increased connectivity between brain regions) as the main axis of change and magnetoencephalography (MEG) as the tool to detect it.

Thus, it has been shown that patients with MCI who subsequently develop AD show an increase in synchrony but, as dementia develops, this synchronization decreases as a symptom of cortical network dysfunction.

In addition to the UCM Faculty of Psychology, the San Carlos Clinic hospital, the Laboratory of Cognitive and Computational Neuroscience (UCM-UPM) of the Centre for Biomedical Technology, the University of the Balearic Islands and the University of La Laguna (Tenerife) also participate in the project.

Model X

Mild Cognitive Impairment is an intermediate phase between what is considered normal aging and dementia, with about 15% of patients per year progressing from this stage to Alzheimer's.

Therefore, the early diagnosis of MCI, and especially the distinction of patients with MCI who will end up suffering from AD, "would help to diagnose and establish a prognosis and make a system for evaluating new pharmacological or non-pharmacological interventions available," says López García.

To carry out this study, the researchers recruited 54 patients with MCI (from an initial sample of 145), who were monitored every six months for three years. Magnetoencephalography recordings, which measure the magnetic fields the brain generates, were taken both at the beginning (pre phase) and at the end of the study (post phase).

Techniques for estimating brain synchronization, which determine the way in which different brain regions communicate with each other, were applied to these recordings. At the end of the study, 27 patients remained as MCI (stable MCI, MCIs) and the other 27 had converted to Alzheimer's (MCI converters, MCIc). The model developed in this study had a success rate of 96.2%.

"The synchronization results obtained in the pre and post phases have allowed us to elaborate an explanatory model, which we have called "model X", which shows that hypersynchronization predicts the conversion in the MCIc group in the pre phase, demonstrating a decrease in synchronization in this group in the post phase. On the contrary, the MCIs group shows an inverse synchronization pattern in both stages," explains the UCM expert.

The World Health Organization (WHO) estimates that 35.6 million people currently live with dementia, understood as any neurodegenerative disease associated with age. Predictions suggest that the number of people affected will double by 2030 and triple by 2050, with AD being the most prevalent.

"Having this early and non-invasive biomarker can help people to understand this terrible disease. We as researchers have the objective of being useful and being able to make discoveries that can help improve our knowledge of this complex disease," concludes López García. Along with her, the investigators Sandra Pusil, María Eugenia López, Pablo Cuesta, Ricardo Bruña, Ernesto Pereda and Fernando Maestú were part of this research team.

Credit: Universidad Complutense de Madrid

HKU plant scientists identify new strategy to enhance rice grain yield

image: OsACBP2-OE rice grains possess higher oil content.

Image: © The University of Hong Kong

Rice provides a daily subsistence for about three billion people worldwide and its output must keep pace with a growing global population. In light of this, the identification of genes that enhance grain yield and composition is much desired. Findings from a research project led by Professor Mee-Len Chye, Wilson and Amelia Wong Professor in Plant Biotechnology from the School of Biological Sciences of The University of Hong Kong (HKU), with postdoctoral fellows Dr Guo Zehua and Dr Shiu-Cheung Lung, in collaboration with researchers from the University of Calgary and Rothamsted Research (UK), have provided a new strategy to enhance grain yield in rice by increasing grain size and weight. The research results have been published in The Plant Journal and an international patent has been filed (Patent Application No. WO 2019/104509).

In this technology, the research group led by Professor Chye has identified a protein, ACYL-COA-BINDING PROTEIN2 (OsACBP2) from rice (Oryza sativa), that when overexpressed in transgenic rice, will enhance grain size and weight by 10% and elevate grain yield (Image 1). The biomass of the OsACBP2-overexpressing transgenic rice grains exceeded the control by over 10%. OsACBP2 is a lipid-binding protein that binds lipids such as acyl-CoA esters, the major precursors in seed oil production. Oil was observed to accumulate in the transgenic rice grains (Image 2). OsACBP2 is promising not only in enhancing grain size and weight, but also in improving nutritional value with a 10% increase in lipid content of rice bran and whole seeds (Image 3).

As OsACBP2 contributes to boosting oil content as well as size and weight in transgenic rice grains, an application of this technology in rice is expected to benefit agriculture by increasing grain yield and composition to satisfy the need for more food. Professor Chye said: "Increasing grain size and yield, besides rice bran and seed lipid content, in crops such as rice is an important research area that aligns with the aspirations of Dr Wilson and Mrs Amelia Wong on the use of plant biotechnology for a sustainable future. Furthermore, as rice bran oil is considered highly valuable because it contains bioactive components that have been reported to lower serum cholesterol and possess anti-oxidation, anti-carcinogenic and anti-allergic inflammation activities, this technology, if applied to other food crops, would not only help address food security but also elevate nutritional properties in grains."

Credit: The University of Hong Kong

Strong change of course for muscle research

image: The patient's muscle cell population shows a very strong expression of the surface protein CLEC14A (green).

Image: 
Spuler lab, MDC

Anyone who climbs the 285 steps to the viewing platform of Berlin's Siegessäule, or Victory Column, will probably have quite a few sore muscles the next day. Out-of-the-ordinary activities such as climbing lots of steps or even normal exercise can put significant strain on muscles. Such activities cause tiny tears in the muscle fibers, which the body then repairs on its own.

Even when injuries occur, the muscles activate an endogenous regeneration program: A reserve supply of muscle stem cells, known as satellite cells, resides around the muscle fibers and is essential for the repair of damaged muscle cells. These satellite cells produce new muscle fibers in a process that results in muscle regeneration. People maintain this ability well into old age. Researchers are particularly interested in these cells since they could provide targets for new therapeutic approaches for people with muscle diseases.

An overrated protein

Researchers previously assumed that a certain protein - the transcription factor PAX7 - plays a key role in muscle regeneration. "Cells from which new muscles arise have enormous potential for developing gene therapies to treat muscle atrophy. And PAX7 is actually considered a characteristic property of muscle-building satellite cells," says Prof. Simone Spuler.

The scientist and physician is a research group leader at the Experimental and Clinical Research Center (ECRC), a joint institution of the Max Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC) and Charité - Universitätsmedizin Berlin, and heads the Myology Group at the MDC. Her team has now reported in the journal Nature Communications that it's possible for muscles to grow and regenerate without PAX7. The study characterized a previously unknown subtype of satellite cells that could play an important role in the future development of gene therapies from muscle stem cells.

"The findings will certainly surprise many researchers in the field," says Dr. Andreas Marg, a senior scientist in Spuler's lab and the lead author of the study. He himself was initially guided by the assumption that the transcription factor was crucial for muscle growth. "I previously focused my research on PAX7-positive cells. Our findings lead us down a new path."

New muscles despite a mutation

The research team owes the discovery to a young girl: Lavin has suffered from a genetic form of muscular dystrophy since birth and is the protagonist in the study. Lavin has all the muscles of a healthy person, but each of her muscles is very small. The musculature along her spine is particularly affected by the disease. Lavin's arms and legs are strong, but she suffers from breathing problems and has difficulty bending forward and holding her head up.

Gene analysis shows that the gene for PAX7 is damaged in Lavin; her cells can't produce this protein. The University Hospital Munich discovered this in 2017. Soon thereafter, Spuler and Marg learned of this extremely rare mutation - one that had not been described before. Lavin traveled with her parents to the Berlin-Buch campus, where the scientists took a sample of her muscle tissue. Marg used a new procedure to filter out Lavin's satellite cells and then implanted them in mice. He observed that new muscle fibers grew in the mice from Lavin's cells - despite the absence of PAX7.

Spuler presumes that PAX7 is not equally important for every cell. This would explain why Lavin can walk and climb relatively well, but has hardly any strength in her diaphragm, which causes the breathing problems. "We could perhaps develop a gene therapy for Lavin using the CRISPR-Cas9 gene-editing tool," says Spuler. "However, to repair the defective gene, CRISPR-Cas9 would have to specifically target the cells of the axial musculature, and that is not yet possible." But Spuler's lab is working intensively to figure out how to repair defective genes in muscle cells. For Lavin and her family, this research offers a small glimmer of hope that a suitable therapy will be found.

A new subtype of muscle stem cells

Marg and Spuler collaborated on the study with many colleagues at the MDC and with scientists from institutions abroad. Prof. Nikolaus Rajewsky's research group at the Berlin Institute for Medical Systems Biology (BIMSB) compared Lavin's cells with those donated by healthy people. Single-cell analysis, which looks at the activity of each cell individually, revealed a previously unknown cell population. In around 20 percent of the donors, the majority of the activated satellite cells also don't produce any PAX7, even though the genetic information is present in the cells. The team instead discovered something else in those cells in which the transcription factor was missing: CLEC14A, a protein that is found in many blood vessel cells. This very protein was highly expressed in Lavin's muscle stem cells.
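The marker-based screen described here can be illustrated with a toy single-cell expression matrix: cells that show no PAX7 transcript but high CLEC14A are flagged as candidates for the new subtype. The genes, counts and threshold below are invented for illustration and are not the study's actual data or analysis pipeline:

```python
import numpy as np

# Toy single-cell expression matrix: rows = cells, columns = genes.
# All values and the CLEC14A threshold are illustrative assumptions.
genes = ["PAX7", "CLEC14A", "MYF5"]
counts = np.array([
    [5, 0, 3],   # classic PAX7-positive satellite cell
    [0, 8, 2],   # PAX7-negative, CLEC14A-high cell
    [0, 7, 1],   # another candidate of the putative new subtype
    [4, 1, 2],
])

pax7 = counts[:, genes.index("PAX7")]
clec14a = counts[:, genes.index("CLEC14A")]

# Flag the putative new subtype: no PAX7 transcript, high CLEC14A.
new_subtype = (pax7 == 0) & (clec14a >= 5)
print(new_subtype.tolist())  # [False, True, True, False]
```

In a real single-cell workflow the same logic would be applied after normalization and clustering, but the boolean mask captures the core idea of defining a subpopulation by marker expression.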

The new study describes a previously unknown subtype of satellite cells. First, the researchers identified these cells in the stem cell niche, which is where the satellite cells reside. Second, PAX7 is not present in these cells. Third, other characteristic proteins such as CLEC14A are present instead. And fourth, new muscle fibers can be derived from this cell population.

Up to now, only cells with PAX7 have been considered as targets for gene therapy research involving satellite cells. The new study shows that the newly discovered subtype should also play a role in therapeutic development.

Credit: 
Max Delbrück Center for Molecular Medicine in the Helmholtz Association

Bark beetles control pathogenic fungi

image: These are three female ambrosia beetles in their nest.

Image: 
Gernot Kunz

Bark beetles control

Ants and honeybees share nests of hundreds or thousands of individuals in a very small space, so the risk that infectious diseases will spread rapidly is high. To reduce this risk, the animals have developed special social behaviours referred to as "social immune defence". This achievement is generally assumed to have evolved only in the eusocial insects, including ants, bees and wasps. The finding that the more primitively social ambrosia beetles also remove pathogens by cleaning each other indicates that social immunity may have evolved much earlier. This was reported in the British science journal Proceedings of the Royal Society B by Jon A. Nuotclà and Michael Taborsky from the University of Bern (Switzerland), in collaboration with Peter Biedermann from the Julius-Maximilians-Universität (JMU) Würzburg in Bavaria, Germany.

Beetles help to raise siblings

"Ambrosia beetles live in galleries dug out of wood, and the roles of group members are not as strictly defined as in the colonies of bees and ants", says Jon Nuotclà, a PhD student at the Institute of Ecology and Evolution of the University of Bern and first author of the study. Workers can decide on their own whether to help their mother care for the brood and the fungus plantation, or to emigrate and establish a nest of their own. "In the evolution of social behaviour, ambrosia beetles are an intermediate stage between solitary and social insects," explains Peter Biedermann, a researcher at the JMU Biocentre. But when it comes to disease prevention, they apparently behave like social insects.

Fungal spores trigger mutual cleaning behaviour

"Our experiments indicate that the defence against pathogens may be an important factor in the evolution of social behaviour," says Michael Taborsky, who supervised the study. When the scientists sprayed spores of the pathogenic fungus Aspergillus into the beetle nests, the workers showed enhanced cleaning of their nestmates. "In fungus-laden nests, the beetles were also more inclined to serve the community: they then stayed longer in the nest to help raise their sisters", Taborsky explains. As a next step, the researchers plan to investigate whether the saliva of the ambrosia beetles might contain antibiotic substances that kill the spores of Aspergillus fungi. It also remains to be studied how the beetles prevent the development of resistance in pathogenic fungi.

A beetle performing agriculture

Ambrosia beetles belong to the bark beetles, a group generally unpopular with the forestry industry because of the economic damage it can cause. Their several thousand species are distributed worldwide. Ambrosia beetles infest dying or freshly dead trees, attracted by the alcohol these trees exude, and perform agriculture in the heartwood: they drill galleries into the stems and create ambrosia fungus plantations. These fungi serve as food for the beetles and their larvae.

Credit: 
University of Bern

First step taken to find causes of muscle wasting disease

Researchers have gained new insight into the mechanisms by which skeletal muscles lose mass and strength as people age, a condition called sarcopenia.

Sarcopenia is common in older people and is an important contributor to frailty. It affects balance, the way a person moves and their overall ability to perform daily tasks. With an aging population, sarcopenia is a serious global public health problem.

In the first ever study to compare muscle tissue from groups of older people with sarcopenia across different geographies, researchers identified changes in the cells and molecules within muscle, which may explain why some people develop sarcopenia and some people do not.

The MEMOSA study (Multi-Ethnic MOlecular determinants of human SArcopenia), published in Nature Communications, was undertaken by the EpiGen Global Research Consortium in partnership with Nestlé Research. The study involved participants from the UK, Singapore and Jamaica.

It found that muscle from individuals with sarcopenia had reduced activity of the key energy-producing pathway, with decreased activity across components of all five complexes of that pathway, which is critical to maintaining muscle strength and function.

These changes were found in men from the Singapore cohort of the study and were replicated in cohorts from the UK (Hertfordshire Cohort Study) and Jamaica.

Moreover, results showed that sarcopenia was also associated with reduced levels of enzymes involved in the recycling of NAD+, which acts as a metabolic sensor in the cell and regulates energy production pathways.

The MEMOSA team now plans to explore why these changes in the energy-producing pathway occur and is looking at genetic and nutritional factors.

Karen Lillycrop, Professor of Epigenetics at the University of Southampton and one of the lead authors, said: "Most studies to date have compared muscle tissue from young people to older people but we wanted to understand why there is variability in the loss of muscle mass and strength between elderly individuals."

"This is a really novel study, using advanced sequencing techniques for the first time, which has allowed us to identify the molecular basis of why some people develop sarcopenia and others do not in old age."

Professor Keith Godfrey, a co-author at the University of Southampton added: "Sarcopenia is becoming a major health care challenge for all countries, so much so that it was recently recognised as a medical condition. By identifying these differences in activity in key pathways within muscle cells we can now start to develop therapeutic interventions that will hopefully help a lot of people to remain active and healthy in later life."

Credit: 
University of Southampton

New study shows how patients' health values can impact vital pelvic floor treatment

image: Incontinence can be a distressing consequence of pelvic floor problems.

Image: 
Swansea University

Researchers and health professionals in Swansea have revealed that the value women place on their own health can have a direct effect on the success of medical treatment for pelvic floor problems.

Pelvic floor dysfunction affects more than a quarter of all women in the UK. It can involve incontinence and prolapse, and can be treated by physiotherapy.

However, according to the research carried out by Swansea University and Swansea Bay University Health Board, many women do not put their own health first. As a result, they do not benefit from the treatment and end up requiring surgical intervention instead.

Professor Phil Reed and health professionals carried out a study of 218 women who had been referred for physiotherapy to treat their pelvic floor dysfunction. The researchers discovered that the strength of the women's health-related values predicted their attendance, but only those patients who valued health for themselves - rather than for what it allowed them to do for others - showed improvement.

Professor Reed said: "The fact that holding strong health values is an important predictor of treatment attendance is no surprise, but the data show that many ladies place this aspect of their life lower than many other areas - and we need to help empower them to value their own health."

Health values influence outcomes in several healthcare contexts, but the impact of these values on physiotherapy was previously unknown.

The team's motivation in conducting the study was to gain a better understanding of the views of patients and the kinds of things that they regard as important, in order to develop appropriate support for women undergoing pelvic floor physiotherapy, and to enhance treatment attendance and outcomes.

Pelvic floor dysfunction causes substantial reductions in women's quality of life and impacts their ability to work, as well as imposing substantial costs on the NHS. Pelvic floor muscle training (PFMT) is effective, safe and cost-efficient relative to alternative treatments such as surgery. However, many psychological factors are associated with its outcome and with patient adherence to treatment.

One implication of these new findings is that supporting patients to develop the sorts of health values that aid better outcomes might enhance their attendance at PFMT sessions and help them recover their pelvic floor function without the need for operations.

Professor Reed added: "Physiotherapy treatment for this very common problem can be so effective and safe for the patients, and it is really important that the ladies who attend have their needs fully recognised and supported."

"If we do that, then we will enhance attendance and outcomes for these patients, and stop them having to go for operations, which will also have the benefit of saving the NHS much-needed money that can then be used to help other patients."

The team has previously shown that supporting the motivation of women to attend PFMT, through short group-based sessions, improves attendance by around 60%, and it says the current findings will help to tailor this support to the needs of the women even more closely.

Credit: 
Swansea University

Corpus luteum cells of cats successfully cultivated and comprehensively characterized

image: Pictured here are lynx cubs.

Image: 
Iberian Lynx Ex-situ Conservation Programme

The reproduction of lynxes is highly mysterious. Unlike other wild cats, most lynxes are receptive for only a few days once a year. As scientists from the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW) have shown in the past, this is a consequence of the long life of the corpora lutea in the ovaries, which prevents further ovulation during the course of the year. The Berlin team has now achieved another breakthrough in solving the puzzle: they were able to isolate several cell types of the corpus luteum from domestic cat tissue and characterise their function in detail with the help of cell cultures. The new method can also be applied to endangered felids such as the Iberian lynx and could advance our understanding of the causes and mechanisms of the longevity of corpora lutea in lynxes. The ultimate goal in practical terms is to induce ovulation with the help of corpus luteum hormones. This would enhance support for the reproduction of the highly endangered Iberian lynx in breeding programmes.

When it comes to reproduction, the felids are usually quite unanimous: most wild cat species go through several sexual cycles per year and so can become pregnant several times a year. Unlike its relatives, however, the genus Lynx mainly follows a mono-oestrous reproduction strategy. Three of the four lynx species can become pregnant only during a short window once per year. This is a burden for endangered species such as the Iberian lynx (Lynx pardinus): if they do not succeed in producing offspring within this time, they have to wait until the next year. Artificial insemination has also failed so far, probably because of the lack of knowledge about how to induce ovulation. It is therefore indispensable for the success of the lynx conservation breeding programme to learn more about the mysterious physiology of their reproduction.

In 2014, the reproduction team of the Leibniz-IZW was able to present the first important partial solution of the puzzle. Together with colleagues from several zoological gardens they discovered that the corpus luteum of lynx is continuously active for several years and thus responsible for their unusual reproduction pattern. The corpus luteum is a glandular tissue in the ovaries of mammals that, among other things, produces progesterone - the hormone that supports pregnancy and prevents further ovulation. If the egg is not fertilised, the corpus luteum normally degrades quite quickly and thereby enables a new cycle.

"In lynxes, a mechanism has developed that maintains the corpus luteum for several years. This means that the genus Lynx has the longest known lifespan of functionally active corpora lutea among mammals," says Beate Braun, scientist in the Department of Reproduction Biology at the Leibniz-IZW. "It is astonishing that lynxes become receptive in a new season despite the presence of corpora lutea. The activity of the corpus luteum is apparently shut down for a short time, which triggers ovulation. Progesterone production is then resumed and kept high beyond pregnancy. In this way, the persistent corpus luteum is likely to prevent further ovulations in the same year."

It is still unclear how exactly the longevity of the corpus luteum is maintained. However, the scientists from Berlin have now come one step closer to solving the mystery. "We succeeded in isolating and cultivating different cell types from the corpus luteum of domestic cats," explains Michal Hryciuk, PhD student in the Department of Reproduction Biology at the Leibniz-IZW. "The cells originate from tissue taken from domestic cats in animal clinics during castration. Tissues from lynxes or other wild cat species are very rarely available - for example when dead animals are found or animals in zoos are castrated for medical reasons. It was therefore important to us to set up a functioning cultivation system first and then apply it to valuable samples, and that is exactly the system that we have now."

The scientists not only succeeded in cultivating several cell types but also characterised large and small cells of corpora lutea under controlled laboratory conditions. They were able to determine the amount of progesterone and other hormones produced and track the changing activity of genes over time. With the cultivation technique now developed, scientific research has the urgently needed instruments at its disposal to solve the riddle of the long-lived corpus luteum. "Our results will help to identify the hormonal control mechanisms that regulate the growth, maintenance and degradation of the corpus luteum," says Katarina Jewgenow, Head of the Department of Reproduction Biology at the Leibniz-IZW. "This opens up completely new possibilities to improve conception rates in endangered lynxes and other wild cat species in order to support conservation breeding programmes."

Credit: 
Forschungsverbund Berlin

Genes as early warning systems: Stroke research

Estimates based on genomic data predict stroke risk with an accuracy similar to, or greater than, that of estimates based on clinical risk factors. This result implies that persons at high risk might benefit from more rigorous preventive measures.

Strokes are the second most common cause of death worldwide, and the leading cause of physical disability in adults. Approximately 80% of all strokes are ischemic, i.e. the result of an acute lack of oxygen owing to the obstruction of blood flow through a cerebral artery. The individual level of risk for ischemic stroke is determined by a combination of genetic factors and pre-existing disorders, such as high blood pressure and diabetes. In collaboration with researchers at Cambridge University and the Baker Heart and Diabetes Institute in Melbourne, Professor Martin Dichgans of the Ludwig-Maximilians-Universitaet (LMU) Munich Medical Center has now shown that genetic data obtained from a single sample of blood or saliva can identify persons who have a three-fold higher risk than the population average of suffering an ischemic stroke.

Moreover, this genetically based estimate of risk is as reliable, or even more so, than that based on the assessment of conventionally recognized clinical risk factors. Based on these results, the authors of the study conclude that individuals with a high genetic risk may need more meticulous monitoring and more intensive preventive interventions than current guidelines suggest. The new findings appear today in the online journal Nature Communications.

The collaboration first used a machine learning approach to analyze a large body of genetic data obtained by a variety of research groups in genome-wide association studies, which identified gene variants that correlate with increased risk of stroke. On the basis of this analysis, they assigned an individual genetic risk score to each combination of predisposing variants. They then tested the predictive value of these risk scores against data from a long-term prospective study that has collected health-related information, including genomic data, from 420,000 individuals; these data are now archived in the UK Biobank. The comparative analysis demonstrated that the new genetic estimator of stroke risk is more precise than those employed up to now, and comparable in reliability to estimates based on known behavioral or physiological factors such as cigarette smoking or body mass index (BMI). Moreover, the genetic risk score is a significantly more successful predictor of future episodes of ischemic stroke than an evaluation of the medical histories of a subject's family. Indeed, the new risk score is accurate enough to predict whether or not a given individual belongs to the 0.25% of patients who have a three-fold higher risk of stroke than the population average.
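At its core, a genetic (polygenic) risk score of this kind is a weighted sum: each variant contributes its effect size from the association studies, multiplied by the number of risk alleles (0, 1 or 2) the person carries. A minimal sketch of that idea - the variants, weights and genotypes below are invented for illustration and are not the study's actual data:

```python
import numpy as np

# Per-variant effect sizes (e.g. log-odds weights from association
# studies); all numbers here are illustrative assumptions.
effect_sizes = np.array([0.12, -0.05, 0.30, 0.08])

# Risk-allele counts (0, 1 or 2) at each variant, one row per person.
genotypes = np.array([
    [2, 1, 0, 1],   # person A
    [0, 2, 2, 2],   # person B
])

# The risk score is the dosage-weighted sum of effect sizes.
scores = genotypes @ effect_sizes
print(scores)   # person A: 0.27, person B: 0.66
```

In practice such scores are built from thousands to millions of variants and then calibrated against a reference population, but the arithmetic per person remains this weighted sum.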

The genomic risk assessment is derived from the genomic DNA, the sequence of which is unique to each individual, and has significant advantages over evaluations based on established risk factors, because it can be used from birth to estimate the risk of stroke. This makes it possible to initiate preventive strategies before patients develop any of the conditions that now serve as conventional risk factors for stroke, such as high blood pressure or high levels of fats in the bloodstream, said Martin Dichgans, Professor of Neurology and Director of the Institute for Stroke and Dementia Research (ISD) at the LMU Medical Center, and one of the co-leaders of the new study.

The authors underline the fact that people with a high genetic risk for ischemic stroke can still reduce this risk by minimizing their exposure to conventional risk factors - for example, by taking steps to reduce blood pressure and BMI and giving up smoking.

"The sequencing of the human genome has revealed many insights. For common diseases, such as stroke, it is clear that genetics is not destiny; however, each person does have their own innate risk for any particular disease. The challenge is now how we best incorporate this risk information into clinical practice so that the public can live healthier and longer," says Dr. Michael Inouye of the Baker Heart and Diabetes Institute and the University of Cambridge, another leader of the study.

The new genomic risk assessment also detects significant differences in risk levels between individuals whom currently recommended guidelines assign to the same risk category. The researchers therefore suggest that the clinical guidelines now in place may be inadequate for persons with a high genomic risk score for stroke, since such individuals might well need more intensive interventions to counteract the increased risk. The new method of risk assessment can help to pinpoint the modifiable risk factors that need to be mitigated in order to reduce this risk to an acceptable level, as well as enabling more effective early interventions for persons at high risk of stroke and other cardiovascular diseases.

Credit: 
Ludwig-Maximilians-Universität München

Hitting HIT: Heparin therapy

Heparin is widely used as an anticoagulant, but evokes in some patients a potentially life-threatening condition called HIT. Clinical scientists at Ludwig-Maximilians-Universitaet (LMU) in Munich have now shown that inhibition of a single enzyme may markedly reduce this risk.

Heparin is frequently employed to mitigate the risk of clot formation in the leg veins following surgery, to treat venous thromboses and to prevent thrombotic occlusions of arteries in patients at risk for heart attacks. However, the agent can also lead to a reduction in the number of thrombocytes (otherwise known as platelets) in the bloodstream - an effect known as heparin-induced thrombocytopenia (HIT). Up to 3% of patients who are treated with heparin develop an immunologically triggered subtype of HIT (called type II HIT), which is, surprisingly, associated with an increased incidence of potentially life-threatening thromboses (clots) in the blood vessels. The incorporation of platelets into these clots explains the accompanying thrombocytopenia. A new study directed by Wolfgang Siess, Professor of Cardiovascular Pathobiochemistry at LMU's Institute for Prophylaxis and Epidemiology of Cardiovascular Diseases (IPEK), has now demonstrated that this hazardous side-effect can be effectively prevented in vitro by the inhibition of a specific enzyme. The discovery provides a new option for the treatment of type II HIT. The study was carried out in collaboration with research groups led by Professor Michael Spannagl (Department of Transfusion Medicine, Cell Therapeutics and Hemostaseology, LMU Medical Center), Professor Christian Weber and Dr. Philipp von Hundelshausen (both IPEK). The new findings appear in the journal Blood Advances.

Heparin is a large, negatively charged linear polysaccharide made up of a variety of repeat units. Type II HIT is caused by the formation of antibodies directed against a molecular complex of heparin and a protein called PF4, which is secreted by platelets. The so-called Fc domain of these antibodies binds to the corresponding Fc receptor on the platelet surface, which activates the cells. The activated platelets form aggregates and secrete proteins that mediate the binding of platelets (thrombocytes) to the inner surface of the blood vessels (the endothelium) and to certain types of white blood cells (monocytes and neutrophils). This leads to the formation of thrombin and fibrin and, together with platelet activation, triggers clot formation. Hence heparin actually provokes the very effect it is intended to inhibit. "The complex pathogenesis of type II HIT is quite well understood, but the standard therapy of the disorder is suboptimal," says Siess.

A previous study had shown that activation of the Fc receptor on the thrombocyte membrane induces activation of an enzyme known as Bruton's tyrosine kinase (Btk). "We therefore asked whether inhibitors of Btk might be able to reduce the downstream activation of thrombocytes triggered by stimulation of the Fc receptor. We found that each of the six Btk inhibitors we tested was indeed able not only to reduce but to completely block Fc-receptor-induced platelet aggregation and secretion in blood. In addition, they all blocked the interaction of thrombocytes with neutrophils - thus preventing the formation of potentially harmful thromboses," Siess explains. "This was a very pleasant surprise for us, as earlier investigations had shown that Btk inhibitors could only partially block platelet activation induced by stimulation of a different receptor."

Inhibitors of Btk therefore offer a promising new therapeutic option for type II HIT, as they target an early and crucial step in the process that gives rise to the disorder. The inhibitors employed in the study have the additional advantage that they have either already been approved for the treatment of other diseases or have been successfully tested in clinical trials.

Credit: 
Ludwig-Maximilians-Universität München

Artificial intelligence tracks down leukemia

Artificial intelligence can detect one of the most common forms of blood cancer - acute myeloid leukemia (AML) - with high reliability. Researchers at the German Center for Neurodegenerative Diseases (DZNE) and the University of Bonn have now shown this in a proof-of-concept study. Their approach is based on the analysis of the gene activity of cells found in the blood. Used in practice, this approach could support conventional diagnostics and possibly accelerate the beginning of therapy. The research results have been published in the journal "iScience".

Artificial intelligence is a much-discussed topic in medicine, especially in the field of diagnostics. "We aimed to investigate the potential on the basis of a specific example," explains Prof. Joachim Schultze, a research group leader at the DZNE and head of the Department for Genomics and Immunoregulation at the LIMES Institute of the University of Bonn. "Because this requires large amounts of data, we evaluated data on the gene activity of blood cells. Numerous studies have been carried out on this topic and the results are available through databases. Thus, there is an enormous data pool. We have collected virtually everything that is currently available."

Fingerprint of Gene Activity

Schultze and his colleagues focused on the "transcriptome", which is a kind of fingerprint of gene activity. In each and every cell, depending on its condition, only certain genes are actually "switched on", which is reflected in their profiles of gene activity. Exactly such data - derived from cells in blood samples and spanning many thousands of genes - were analysed in the current study. "The transcriptome holds important information about the condition of cells. However, classical diagnostics is based on different data. We therefore wanted to find out what an analysis of the transcriptome can achieve using artificial intelligence, that is to say trainable algorithms," said Schultze, who is member of the Bonn-based "ImmunoSensation" cluster of excellence. "In the long term, we intend to apply this approach to further topics, in particular in the field of dementia."

The current study focused on AML. Without adequate treatment, this form of leukemia leads to death within weeks. AML is associated with the proliferation of pathologically altered bone marrow cells, which can ultimately enter the bloodstream, so that both healthy cells and tumor cells circulate in the blood. All these cells exhibit typical gene activity patterns, which were considered in the analysis. Data from more than 12,000 blood samples - drawn from 105 different studies - were taken into account: the largest dataset to date for a metastudy on AML. Approximately 4,100 of these blood samples came from individuals diagnosed with AML; the remaining ones had been taken from individuals with other diseases or from healthy individuals.

High Hit Rate

The scientists fed their algorithms parts of this data set. The input included information about whether a sample came from an AML patient or not. "The algorithms then searched the transcriptome for disease-specific patterns. This is a largely automated process. It's called machine learning," said Schultze. Based on this pattern recognition, further data was analysed and classified by the algorithms, i.e. categorized into samples with AML and without AML. "Of course, we knew the classification as it was listed in the original data, but the software did not. We then checked the hit rate. It was above 99 percent for some of the applied methods. In fact, we tested various methods from the repertoire of machine learning and artificial intelligence. There was actually one algorithm that was particularly good, but the others were close behind."
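The workflow Schultze describes - learn disease-specific patterns from labeled samples, then classify held-out samples and check the hit rate - can be sketched with synthetic data. The sketch below uses a simple nearest-centroid rule in place of the study's actual algorithms, and every number in it is invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, n_genes = 300, 100, 50

def make_samples(n):
    """Synthetic 'gene activity' profiles with a planted disease signature."""
    X = rng.normal(size=(n, n_genes))
    y = rng.integers(0, 2, size=n)      # 1 = "AML", 0 = other sample
    X[y == 1, :5] += 2.0                # disease-specific activity pattern
    return X, y

X_train, y_train = make_samples(n_train)
X_test, y_test = make_samples(n_test)

# "Training": learn the mean activity profile of each class.
centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])

# Classification: assign each held-out sample to the nearer class profile.
dists = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
predictions = dists.argmin(axis=1)

hit_rate = (predictions == y_test).mean()
print(f"hit rate: {hit_rate:.2f}")
```

Because the planted signature is strong, even this very simple rule classifies almost every test sample correctly; the study's point is that, with real transcriptomes and more capable models, a comparably high hit rate is achievable.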

Application in Practice?

Put into application, this method could support conventional diagnostics and help save costs, said Schultze. "In principle, a blood sample taken by the family doctor and sent to a laboratory for analysis could suffice. I guess that the cost would be less than 50 euros." Classical AML diagnostics includes a variety of methods. Some of these cost a few hundred euros per run, Schultze noted. "However, we have not yet developed a workable test. We have only shown that the approach works in principle. So we have laid the groundwork for developing a test."

Schultze emphasised that the diagnosis of AML will continue to require specialised physicians in the future. "The aim is to provide the experts with a tool that supports them in their diagnosis. In addition, many patients go through a real odyssey until they finally end up with a specialist and get a diagnosis." This is because in the early stages the symptoms of AML can resemble those of a bad cold. However, AML is a life-threatening disease that should be treated as quickly as possible. "With a blood test, which seems possible on the basis of our study, it is conceivable that the family doctor could already clarify a suspicion of AML. And when the suspicion is confirmed, the patient could be referred to a specialist. The diagnosis would then possibly happen earlier than it does now, and therapy could start earlier."

Credit: 
DZNE - German Center for Neurodegenerative Diseases