
New treatment option shown for heart failure fluid overload

image: Shweta Bansal, M.D., a nephrologist at UT Health San Antonio, coauthored a study showing high-dose spironolactone effectively and safely treated fluid overload in heart failure patients who were not responding to other diuretics.

Image: 
UT Health San Antonio

Higher doses of spironolactone, a diuretic (water pill), can prevent the need for dialysis in selected heart failure patients, a UT Health San Antonio study found. The aggressive approach relieved fluid overload safely and effectively in patients who were not responding to conventional diuretics.

The findings appear in the journal Annals of Internal Medicine. UT Health San Antonio kidney and cardiovascular specialists conducted a pilot study in 19 patients to show that higher doses of spironolactone, which prevents reabsorption of excess salt in the kidneys while maintaining potassium levels, could be used safely in these very ill patients, who are admitted to the hospital with heart failure exacerbation and do not respond to conventional diuretics.

Spironolactone is usually given to these patients in doses of 25-50 milligrams. In this study, the dose was increased to 100 milligrams and, for some administrations, even 200 milligrams.

Symptoms

"Heart failure patients come into our care with excessive fluid (salt plus water) on their body, making them short of breath, unable to walk and unable to lie flat," said Shweta Bansal, M.D., associate professor in the Division of Nephrology at UT Health San Antonio. "They are miserable because of shortness of breath and distension in their abdomen and legs."

Generally the treatment is a low-salt diet and diuretics. Furosemide (brand name Lasix) is one of the frequently used medications. When patients are admitted to the hospital, they are monitored on this regimen and usually improve.

But about 15% to 20% of patients do not get better, Dr. Bansal said. They continue to have fluid overload.

"The reason is they get resistant to the commonly used loop diuretics, and a very high aldosterone level is one of the main reasons for this resistance," she said.

Targeting a different mechanism

Kidneys are made up of millions of tiny tubules called nephrons. Nephrons consist of four main segments, including a part called the loop of Henle, where 20% to 25% of salt reabsorption happens. Loop diuretics target this section.

Spironolactone inhibits the action of aldosterone, a hormone that makes the kidney excrete too much potassium and retain salt in the distal segment, another part of the nephron.

'Significant improvement'

Study participants who didn't respond to standard therapy were given high-dose spironolactone and monitored for urine output and breathing. "Most of them had a dramatic increase in their urine output and significant improvement in their shortness of breath," Dr. Bansal said. "We think some patients could avoid needing dialysis if treated in this manner."

Credit: 
University of Texas Health Science Center at San Antonio

Manufacture of light-activated proteins

image: Raziye Karapinar (left) and Stefan Herlitze

Image: 
RUB, Kramer

A new strategy for designing light-sensitive proteins has been developed by researchers at Ruhr-Universität Bochum (RUB). Such proteins, also known as optogenetic tools, can be switched on and off through light impulses, thus triggering specific cellular processes. They can, for example, be used to analyse and control how signals are transmitted by nerve cells. So far, researchers developing optogenetic tools have been pretty much forced to resort to trial-and-error. A combination of computer-aided and experimental methods has now paved the way for a more targeted approach.

In collaboration with a colleague from Münster, the team headed by Professor Stefan Herlitze, Department of General Zoology and Neurobiology at RUB, and Professor Klaus Gerwert, Department of Biophysics at RUB, has published an article on the method in the journal "Chembiochem", where it was featured as the cover story in the edition from 15 July 2019.

Switching proteins on and off with light in different colours

An example of an optogenetic tool is the protein melanopsin. It can be switched on and off by two light signals in different colours. "Often, more than just one optogenetic tool is required, for example if two different processes have to be controlled in a cell independently of each other," explains Raziye Karapinar from the Department of General Zoology and Neurobiology. "We must therefore ensure that the colour signals for both tools do not overlap," adds Dr. Till Rudack, biophysicist from Bochum.

Klaus Gerwert's and Stefan Herlitze's research team has developed a hybrid strategy for targeted protein engineering of melanopsin and other optogenetic tools. To this end, the researchers combined computer-aided calculating methods with electrophysiological measurements.

Computer simulation determines the activating light colour

Using quantum chemistry computer simulations, they calculated the specific light colour required to activate a protein. In this way, they determined how individual protein building blocks, or the exchange of individual building blocks, affect the light colour. The simulations generated a list of protein variants that qualify as potential optogenetic tools. The researchers then used electrophysiological measurements to analyse the promising candidates with regard to their optogenetic potential: light sensitivity, i.e. how much light is needed to switch the protein on and off, as well as the speed and selectivity with which cellular processes are triggered or terminated once the switch is activated. A good optogenetic tool can be switched on and off in quick succession at low light intensity.
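
The screening logic described above can be illustrated with a short, hypothetical sketch: predicted activation wavelengths from the simulations are used to shortlist variants whose colour is well separated from an existing tool before they are passed on to electrophysiology. The variant names, wavelengths and the 60 nm separation threshold below are invented for illustration and are not values from the study.

```python
# Hypothetical pre-screening of simulated protein variants (illustrative values only).
# Variants whose predicted activation wavelength is well separated from an existing
# optogenetic tool are shortlisted for electrophysiological testing.

EXISTING_TOOL_NM = 470      # assumed activation colour of the partner tool (a blue-light tool)
MIN_SEPARATION_NM = 60      # assumed minimum spectral separation to avoid cross-activation

# Predicted activation wavelengths (nm) from the simulations -- invented numbers.
predicted = {
    "variant_A": 478,
    "variant_B": 530,
    "variant_C": 555,
    "variant_D": 605,
}

shortlist = [
    (name, wavelength)
    for name, wavelength in predicted.items()
    if abs(wavelength - EXISTING_TOOL_NM) >= MIN_SEPARATION_NM
]

for name, wavelength in sorted(shortlist, key=lambda item: item[1]):
    print(f"{name}: predicted activation at {wavelength} nm -> send to electrophysiology")
```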

Validation with well-researched optogenetic tool

Using the well-researched optogenetic tool Channelrhodopsin-2, the team validated the new hybrid strategy. For this protein, the researchers used computer simulations to predict how an exchange of protein building blocks would affect the activating light colour. The predictions matched the values measured in experiments. "This match shows how reliable our strategy is, and it also validates its application for proteins about which we don't know much, such as melanopsin," says biophysicist Dr. Stefan Tennigkeit.

New melanopsin variants

With their strategy, the group exchanged specific protein building blocks in melanopsin, thus manipulating the light colour for molecule activation, without impairing the protein function. The light colour that activates the usual melanopsin version overlaps with that of many other optogenetic tools - which is why they cannot be used in combination. "I'm convinced that it will be possible to combine this new melanopsin variant with other optogenetic tools in future, in order to control complex cellular processes," says Stefan Herlitze.

"Unlike traditional protein engineering methods based on trial-and-error, our approach saves a lot of time thanks to automated computer-aided prognoses that can be calculated on several computers at the same time," concludes Klaus Gerwert.

Credit: 
Ruhr-University Bochum

Newly discovered Labrador fossils give clues about ancient climate

image: Fossilized tree leaves like this one are the first of their kind to have been found in the area. Alexandre Demers-Potvin used the samples he collected to establish that Eastern Canada would have had a warm temperate, fully humid climate during the middle of the Cretaceous period.

Image: 
Alexandre Demers-Potvin

The discovery of fossilized plants in Labrador, Canada, by a McGill-led team of paleontologists provides the first quantitative estimate of the area's climate during the Cretaceous period, a time when the Earth was dominated by dinosaurs.

The specimens were found in the Redmond No. 1 mine, in a remote area of Labrador near Schefferville, in August 2018. Together with specimens collected on previous expeditions, they are now at the core of a recent study published in Palaeontology.

Some of the specimens are the first of their kind to have been found in the area. Alexandre Demers-Potvin, a graduate student under the supervision of Professor Hans Larsson, Canada Research Chair in Vertebrate Palaeontology at McGill University, used the samples he collected to establish that Eastern Canada would have had a warm temperate, fully humid climate during the middle of the Cretaceous period.

Fossilized leaves and insects very similar to communities that today live farther south had been found at the Redmond No. 1 mine in the late 1950s, leading paleontologists to hypothesize that the Cretaceous climate of Quebec and Labrador was far warmer than it is today.

With the new samples they found, Demers-Potvin and his colleagues were able to confirm this using the Climate Leaf Analysis Multivariate Program (CLAMP). This tool predicts a variety of climate statistics for a given fossil flora, such as temperature and precipitation variables, based on the shape and size of its tree leaves. Their findings put the area's mean annual temperature at around 15°C. Summers were hot, with temperatures of over 20°C, and year-round precipitation was relatively high.
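
Tools of this kind are, in essence, multivariate regressions calibrated on modern floras: leaf-shape scores for a fossil assemblage are fed into fitted equations that return temperature and precipitation estimates. The sketch below illustrates that idea with made-up coefficients and leaf scores; it is not the actual CLAMP calibration or the values used in the study.

```python
# Conceptual sketch of a leaf-physiognomy climate estimate (NOT the real CLAMP calibration).
# Each predictor is the fraction of species in the fossil flora showing a given leaf trait.
import numpy as np

# Illustrative, made-up regression coefficients:
# intercept, entire-margined leaves, large leaves, elongated (high length:width) leaves.
MAT_COEFFS = np.array([2.0, 25.0, 3.0, -1.5])             # mean annual temperature model (deg C)
PRECIP_COEFFS = np.array([400.0, 500.0, 900.0, -200.0])   # annual precipitation model (mm)

def estimate_climate(entire_margin_frac, large_leaf_frac, elongated_frac):
    """Return (mean annual temperature in deg C, annual precipitation in mm) from leaf-trait fractions."""
    x = np.array([1.0, entire_margin_frac, large_leaf_frac, elongated_frac])
    return float(MAT_COEFFS @ x), float(PRECIP_COEFFS @ x)

# Hypothetical trait scores for a fossil flora (invented for illustration).
mat, precip = estimate_climate(entire_margin_frac=0.45, large_leaf_frac=0.30, elongated_frac=0.25)
print(f"Estimated mean annual temperature: {mat:.1f} C, annual precipitation: {precip:.0f} mm")
```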

Alexandre Demers-Potvin, who is also the study's first author, said the new work provides insight into how the climate of Eastern Canada evolved over time, useful information to study today's changing climate.

"The fossils from the Redmond mine show that an area that is now covered by boreal forest and tundra used to be covered in warm temperate forests in the middle of the Cretaceous, one of our planet's 'hothouse' episodes, Demers-Potvin said. These are new pieces of evidence that can help improve projections of the global average temperature against global CO2 levels throughout the Earth's history."

Alexandre Demers-Potvin and his collaborators are now undertaking a description of the new fossilized insects discovered at the Redmond site. Demers-Potvin will return to Schefferville in the hopes of finding more insect specimens and fossilized vertebrates that could be hiding in the rubble of the abandoned mine.

Credit: 
McGill University

Eleven new species of rain frogs discovered in the tropical Andes

image: Rain frogs comprise a unique group lacking a tadpole stage of development. Their eggs are laid on land and hatch as tiny froglets.

Image: 
BIOWEB-PUCE

Eleven species of rain frogs new to science are described by two scientists from the Museum of Zoology of the Pontifical Catholic University of Ecuador in the open-access journal ZooKeys. Discovered in the Ecuadorian Andes, the species are characterized in detail on the basis of genetic, morphological, bioacoustic, and ecological features.

On the one hand, the publication is remarkable for the sheer number of new frog species. For vertebrate animals, most studies describe only between one and five species new to science, because of the difficulty of collecting them and the copious amount of work involved in describing each. To put it into perspective, the last time a single article dealt with a similar number of newly discovered frogs from the western hemisphere was in 2007, when Spanish scientist Ignacio de la Riva described twelve species from Bolivia.

On the other hand, the new paper by Nadia Paez and Dr Santiago Ron is astounding because it stems from the undergraduate thesis of Nadia Paez, a former Biology student at the Pontifical Catholic University, where she was supervised by Professor Santiago Ron. Normally, such a publication would be the result of the efforts of a large team of senior scientists. Nadia Paez is currently a PhD student in the Department of Zoology at the University of British Columbia in Canada.

Unfortunately, a finding of concern is that most of the newly described frog species are listed as either Data Deficient or Threatened with extinction, according to the criteria of the International Union for Conservation of Nature (IUCN). All of the studied amphibians appear to have very restricted geographic ranges, spanning less than 2,500 km². To make matters worse, their habitats are being destroyed by human activities, especially cattle raising, agriculture, and mining.

Amongst the newly described species, there is the peculiar Multicolored Rain Frog, where the name refers to its outstanding color variation. Individuals vary from bright yellow to dark brown. Initially, the studied specimens were assumed to belong to at least two separate species. However, genetic data demonstrated that they represented a single, even if highly variable, species.

The rest of the previously unknown frogs were either named after scientists who have made significant contributions to their fields, or given the names of the places where they were discovered, in order to highlight areas of conservation priority.

Credit: 
Pensoft Publishers

Prior Zika virus or dengue virus infection does not affect secondary infections in monkeys

image: Gabrielle Barry, a research specialist at the AIDS Vaccine Research Labs at the University of Wisconsin-Madison tests body fluids from rhesus macaque monkeys infected with the Zika virus searching for evidence of the virus on June 28, 2016 in Madison, Wisconsin.

Image: 
Breitbach et al. 2019

Previous infection with either Zika virus or dengue virus has no apparent effect on the clinical course of subsequent infection with the other virus, according to a study published August 1 in the open-access journal PLOS Pathogens by David O'Connor of the University of Wisconsin-Madison, and colleagues. This work is timely, given recent efforts to develop an effective vaccine for Zika virus, as well as the introduction of dengue virus vaccines in areas where both viruses are now co-endemic.

Zika and dengue viruses are related flaviviruses that now co-circulate in much of the tropical and subtropical world. The rapid emergence of Zika virus in the Americas in 2015 and 2016, and its recent associations with a neurological disorder called Guillain-Barré syndrome, birth defects, and fetal loss have led to the hypothesis that dengue virus infection induces cross-reactive antibodies that influence the severity of secondary Zika virus infections. It has also been proposed that pre-existing Zika virus immunity could affect the severity of secondary dengue virus infections. Data from in vitro experiments and mouse models suggest that pre-existing immunity to one virus could either enhance or protect against infection with the other. These somewhat contradictory findings highlight the need for immune-competent animal models to understand the role of cross-reactive antibodies in flavivirus infections.

In the new study, O'Connor and colleagues examined secondary Zika virus or dengue virus infections in rhesus and cynomolgus macaques that had previously been infected with the other virus. They assessed the outcomes of secondary Zika virus or dengue virus infections by quantifying viral RNA loads, clinical and laboratory parameters, body temperature, and weight for each cohort of animals and compared them with control animals. These comparisons demonstrated that within one year of primary infection, secondary infections with either Zika virus or dengue virus were similar to primary infections and were not associated with enhanced or reduced disease severity. All animals had asymptomatic infections and, when compared to controls, did not have significantly perturbed blood parameters. Although additional studies are needed, the findings suggest that vaccination against either dengue virus or Zika virus may not influence the severity of disease upon secondary infection with the other virus.

"Within a year of primary DENV or ZIKV infection, we did not observe any significant enhancement of, nor protection from, secondary infection with the opposite virus in non-pregnant macaques," the authors add. "Our results differ from those shown in tissue culture and immunocompromised mouse models and aligns more closely with current human epidemiological data where enhanced secondary Zika infection has not been reported after primary dengue infections."

Credit: 
PLOS

Researchers remove the need for anti-rejection drugs in transplant recipients

MINNEAPOLIS, MN- August 2, 2019 - For decades, immunologists have been trying to train the transplant recipient's immune system to accept transplanted cells and organs without the long-term use of anti-rejection drugs. New University of Minnesota preclinical research shows that this is now possible.

In a study published in Nature Communications, researchers at the University of Minnesota Medical School's Department of Surgery and Schulze Diabetes Institute, collaborating with colleagues at Northwestern University, have maintained long-term survival and function of pancreatic islet transplants despite complete discontinuation of all anti-rejection drugs on day 21 after the transplant. This study was performed in a stringent preclinical transplant setting in nonhuman primates, one step away from humans.

For many patients with end-stage organ failure, transplantation is the only remaining effective treatment option. To prevent transplant rejection, recipients must take medications long-term that suppress the body's immune system. These immunosuppressive drugs are effective at preventing rejection over the short term; however, because anti-rejection drugs suppress all of the immune system nonspecifically, people taking these drugs face the risk of serious infections and even cancer. Additionally, non-immunological side effects of immunosuppression, such as hypertension, kidney toxicity, diarrhea, and diabetes, diminish the benefits of transplantation. Finally, immunosuppressive drugs are much less effective at preventing transplant rejection over a long period of time, thereby leading to graft loss in many recipients.

Because a growing population of chronically immunosuppressed transplant recipients face that impasse, which might adversely affect their survival, generations of immunologists have pursued immune tolerance as the primary goal in the field of transplantation medicine. Inducing tolerance to transplants would eliminate the need for chronic immunosuppression and enhance transplant and patient survival. Proof that immune tolerance of transplants can be achieved was first demonstrated in mice by Peter Medawar in his Nobel Prize-winning Nature article more than 65 years ago. Yet, despite its immense significance, transplant tolerance has been achieved in only a very few patients.

This new study capitalizes on the unique attributes of modified donor white blood cells, which were infused into transplant recipients one week before and one day after the transplant, thereby recapitulating nature's formula for maintaining the body's tolerance of its own tissues and organs. Without the need for long-term antirejection drugs, islet cell transplants could become the treatment option of choice, and possibly a cure, for many people burdened by type 1 diabetes.

"Our study is the first that reliably and safely induces lasting immune tolerance of transplants in nonhuman primates," said senior author Bernhard Hering, MD, Professor and Vice Chair of Translational Medicine in the Department of Surgery at the University of Minnesota, who also holds the Jeffrey Dobbs and David Sutherland, MD, PhD, Chair in Diabetes Research. "The consistency with which we were able to induce and maintain tolerance to transplants in nonhuman primates makes us very hopeful that our findings can be confirmed for the benefit of patients in planned clinical trials in pancreatic islet and living-donor kidney transplantation - it would open an entirely new era in transplantation medicine."

Credit: 
University of Minnesota Medical School

Study suggests economic growth benefits wildlife but growing human populations do not

image: Wildebeest on the Serengeti.

Image: 
ZSL

In a world first, researchers at ZSL and UCL compared changes in bird and mammal populations with socio-economic trends in low- and lower-middle income countries over the past 20 years. Their results suggest that national-level economic growth and more gender-balanced governments enhance wildlife populations and provide support for linking the UN's human development and conservation targets.

In 2015, the 2030 Agenda for Sustainable Development was formally adopted by all United Nations Member States to provide "a shared blueprint for peace and prosperity for people and the planet, now and into the future." At its core are the 17 Sustainable Development Goals (SDGs) which call for world-wide collaboration to reduce inequality, improve human health and education, promote economic growth, tackle climate change and conserve biodiversity.

This blend of demographic and environmental development is complex, and the SDGs are not the only agenda the international community is working to. Evidence of continuing biodiversity loss has led to a succession of conservation-focussed policies too, chief of which are the Convention on Biological Diversity (CBD) Aichi targets, set for 2020. With potentially competing priorities, the team at ZSL's Institute of Zoology and UCL's Centre for Biodiversity and Environment Research, wanted to understand whether progress towards socio-economic targets might limit the likelihood of meeting conservation ones.

To explore these links, researchers cross-referenced data from the Living Planet Index on 298 bird and mammal populations - recorded outside protected reserves - with indicators of social, economic and political progress towards the SDGs in 33 low- and lower-middle income countries obtained from the World Bank. Their analysis, published today in the journal People and Nature, found consistently positive relationships between economic growth and wildlife abundance - so the richer the people, the safer the biodiversity. Similar relationships were found for more gender-equal societies, lower levels of government corruption and longer human lifespans too.

Lead author Judith Ament, PhD researcher at ZSL and UCL, said: "Our study suggests that at a national level, it is possible to work towards conservation and economic development at the same time and underlines the need for further integration of sustainable development strategies. We think this might be because as standards of living rise, people become less dependent on local natural resources for income and food, and environmental regulation becomes tighter. We are concerned, however, that this could lead to more importing, the impact of which would fall on wildlife elsewhere. This certainly merits further research."

Researchers also found that denser and faster-growing human populations reduced wildlife numbers and that there is evidence for national-level environmental benefits of urbanisation.

Dr Chris Carbone, Senior Research Fellow in ZSL's Institute of Zoology, said: "This is consistent with other studies that have shown how humans compete with animals for space and resources, and that if more people are concentrated in one place, more areas are left open for wildlife. It wasn't all good news, though, and we did find that aspects of human development had a negative impact on some species. Numbers of water-birds, for example, fell as wider water sanitation and treatment processes were implemented. It's only by understanding these relationships that we can mitigate them and put forward policies that are good for people and for the natural world. This paper provides the first empirical evidence that simultaneous progress for both international development and conservation is possible and, should further research uphold our findings, could revolutionise UN target setting in the future."

ZSL's research was essential to establishing major global monitoring and prioritisation programmes, such as the IUCN Red List and Living Planet Index, as well as setting and evaluating the UN's biodiversity targets. The next CBD targets, to replace the Aichi targets, will be set at the convention in Beijing, October 2020, where ZSL data on global biodiversity trends will again be key.

Credit: 
Zoological Society of London

Sudden hearing loss: Update to guideline to improve implementation and awareness

ALEXANDRIA, VA --The American Academy of Otolaryngology-Head and Neck Surgery Foundation published the Clinical Practice Guideline: Sudden Hearing Loss (Update) today in Otolaryngology-Head and Neck Surgery. Sudden sensorineural hearing loss (SSNHL) affects five to 27 per 100,000 people annually, with about 66,000 new cases per year in the United States.

"Sudden hearing loss is a frightening symptom for patients that can dramatically decrease their quality of life. Prompt recognition and management of sudden sensorineural hearing loss may improve hearing recovery and quality of life. That is the overarching objective and purpose of this guideline update," said Seth R. Schwartz, MD, MPH, who served as the methodologist for both the 2012 guideline and the 2019 guideline update.

SHL is defined as a rapid-onset subjective sensation of hearing impairment in one or both ears. The hearing loss in SHL may be a conductive hearing loss (CHL), sensorineural hearing loss (SNHL), or mixed hearing loss, defined as both CHL and SNHL occurring in the same ear. CHL and the conductive component of mixed hearing loss may be due to an abnormality in the ear canal, tympanic membrane (''ear drum''), or middle ear.

For most patients with SHL, their medical journey often starts at an emergency room, urgent care clinic or primary care physician's office, with dizziness present in 30 to 60 percent of cases. The initial recommendations of this guideline update address distinguishing SSNHL from CHL at the initial presentation with hearing loss. They also clarify the need to identify rare, nonidiopathic SSNHL to help separate those patients from those with idiopathic sensorineural hearing loss (ISSNHL), who are the target population for the therapeutic interventions that make up the bulk of the guideline update. By focusing on opportunities for quality improvement, this guideline should improve diagnostic accuracy, facilitate prompt intervention, decrease variations in management, reduce unnecessary tests and imaging procedures, and improve hearing and rehabilitative outcomes for impacted patients.

"While the original guideline was a big step, this update provides an opportunity to improve diagnostic accuracy, facilitate prompt intervention, reduce unnecessary tests, and improve hearing and rehabilitative outcomes for patients," said Dr. Schwartz. "Sudden sensorineural hearing loss, particularly when accompanied by tinnitus and dizziness, can cause a great reduction of quality of life. Patients may experience fear and frustration at the inability to identify a cause for their hearing loss. The impact of this condition on a patient's function and well-being underlies the importance of an updated guideline to optimize care of patients with this debilitating condition."

Credit: 
American Academy of Otolaryngology - Head and Neck Surgery

'Wildling' mice could help translate results in animal models to results in humans

image: Researchers harnessed natural microbiota and pathogens to address shortcomings of current mouse models.

Image: 
Reprinted with permission from Rosshart et al., Science 365:eaaw4361 (2019)

Researchers at the National Institutes of Health developed a new mouse model that could improve the translation of research in mice into advances in human health. The mouse model, which the scientists called "wildling," acquired the microbes and pathogens of wild mice, while maintaining the laboratory mice's genetics that make them more useful for research. In two preclinical studies, wildlings mirrored human immune responses where lab mice failed to do so. Led by scientists at the NIH's National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), the study was published online in Science.

"We wanted to create a mouse model that better resembles a mouse you'd find in the wild," said Barbara Rehermann, M.D., chief of the Immunology Section in NIDDK's Liver Diseases Branch and senior author on the study. "Our rationale was that the immune responses and microbiota of wild mice and humans are likely shaped in a similar way--through contact with diverse microbes out in the real world."

Microbiota refers to the trillions of tiny microbes, such as bacteria, fungi, and viruses, that live in and on the bodies of people and animals and play a critical role in keeping immune systems healthy. Unlike squeaky clean lab mice raised in artificial settings, wild mice have developed symbiotic relationships with microbes they have encountered in the outside world--just as people have done.

Rehermann and Stephan Rosshart, M.D., the study's lead author and NIDDK postdoctoral fellow, have long sought to improve animal models of complex diseases in humans. In 2017, they led research showing that transferring wild mice gut microbiota into lab mice helped the mice survive an otherwise lethal flu virus infection and fight colorectal cancer.

In the current study, they transplanted embryos of the most commonly used strain of laboratory mice for immune system research into female wild mice, which then gave birth to and raised wildlings. The researchers and their collaborators compared the microbiota of the wildlings, wild mice and lab mice. They found that the wildlings acquired the microbes and pathogens of wild mice and closely resembled them in the bacterial communities present in the gut, on the skin, and in the vagina, as well as in the number and kinds of fungi and viruses present.

"A healthy microbiome is important not only for the immune system, but also for digestion, metabolism, even the brain," said Rosshart, who recently completed his fellowship in NIDDK and will open a new lab in Germany. "The wildling model could help us better understand what causes diseases, and what can protect us from them, thus benefitting many areas of biomedical research."

The researchers also tested the stability and resilience of the wildlings' microbiota and found the microbiota was stable across five generations and resilient to environmental challenges. For example, when the mice were given antibiotics for seven days, the lab mice's gut microbiota changed and did not recover, while the wildlings' microbiota fully recovered. Further, when the mice were fed a 10-week high-fat diet, the microbiota of the lab mice changed significantly and never returned to baseline. The wildlings' microbiota changed only mildly and recovered shortly after the diet ended. The authors suggest that the stability and resilience of wildlings, if the model is used widely, could improve the validity and reproducibility of biomedical studies.

Finally, the researchers tested how well the wildlings could predict human immune responses. To do so, they drew from two studies in which drugs targeting immune responses were successful in treating lab mice in preclinical trials but subsequently failed to have therapeutic effects in humans. In the current study, the researchers treated wildlings and lab mice with the same drugs. The wildlings, but not the lab mice, mimicked the human responses seen in clinical trials.

"We always strive for effective ways to shorten the gap between early lab findings and health advances in people, and the wildling model has the potential to do just that," said NIDDK Director Griffin P. Rodgers, M.D. "By helping to predict immune responses of humans, the wildling model could lead to important discoveries to help treat and prevent disease, and ultimately, improve human health."

Credit: 
NIH/National Institute of Diabetes and Digestive and Kidney Diseases

Mapping the Milky Way in three dimensions

By measuring the distance from our Sun to thousands of individual pulsating stars scattered across the Milky Way, researchers have charted our Galaxy on a larger scale than ever before. Their new three-dimensional map, which provides a broad view of our place among the stars, reveals the S-like structure of the Milky Way's warped stellar disc. "Our map shows the Milky Way disk is not flat. It is warped and twisted," says co-author Przemek Mroz in a related video. "This is the first time we can use individual objects to show this in three dimensions."

Much of the current understanding of the spiral shape and structure of our Galaxy is built upon indirect measurements of distances to celestial landmarks and inferences based on other galaxies in the Universe. However, the Galactic map drafted from those limited observations remains incomplete. Like so many lighthouses on distant foggy shores, classical Cepheid variable stars - massive young stellar bodies that burn hundreds, if not thousands, of times brighter than our own Sun - pulsate at regular intervals and are visible through the vast clouds of interstellar dust that often obscure dimmer stellar bodies. Using the periodic variations in their brightness, the distances to these stars can be precisely determined.

Dorota Skowron and colleagues charted the distance to more than 2,400 Cepheids throughout the Milky Way, most of which were identified by the Optical Gravitational Lensing Experiment (OGLE) - a project that more than doubled the number of known Galactic classical Cepheids. By determining the 3D coordinates of each distant pulsing star relative to our Sun, Skowron et al. built a large-scale 3D model of the Milky Way galaxy. The new map illustrates and helps constrain the previously observed shape of the Galaxy's warped stellar disc.
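
The distance step rests on two standard relations: the Leavitt (period-luminosity) law, which gives a Cepheid's absolute magnitude from its pulsation period, and the distance modulus, which converts the gap between absolute and apparent magnitude into a distance. The sketch below is a minimal illustration using rounded, illustrative Leavitt-law coefficients and a made-up example star; it is not the calibration, photometry or code used by Skowron and colleagues.

```python
# Illustrative Cepheid distance and 3D-position calculation (not the study's actual calibration).
# Real analyses also correct for interstellar extinction, which is omitted here.
import math

def cepheid_distance_pc(period_days, apparent_mag, alpha=-2.43, beta=-4.05):
    """Distance in parsecs via the Leavitt law M = alpha*(log10(P) - 1) + beta
    (rounded, illustrative coefficients) and the distance modulus m - M = 5*log10(d) - 5."""
    absolute_mag = alpha * (math.log10(period_days) - 1.0) + beta
    return 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

def galactic_to_cartesian(distance_pc, lon_deg, lat_deg):
    """Convert Galactic longitude/latitude and distance to Sun-centred x, y, z (parsecs)."""
    l, b = math.radians(lon_deg), math.radians(lat_deg)
    return (distance_pc * math.cos(b) * math.cos(l),
            distance_pc * math.cos(b) * math.sin(l),
            distance_pc * math.sin(b))

# Made-up example: a 10-day Cepheid observed at apparent magnitude 12.0.
d = cepheid_distance_pc(period_days=10.0, apparent_mag=12.0)
print(f"distance ~ {d:.0f} pc, position (x, y, z):", galactic_to_cartesian(d, lon_deg=305.0, lat_deg=-2.0))
```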

Credit: 
American Association for the Advancement of Science (AAAS)

Study assesses outcomes for meth users with burn injuries

image: This is Dr. Kathleen Romanowski, assistant professor with UC Davis Health's Division of Burn Surgery.

Image: 
UC Davis Health

UC Davis Health researchers were surprised to find that methamphetamine use is not linked with worse health outcomes among burn patients. However, meth use was associated with significantly worse conditions for those patients after their release from the hospital.

Meth-positive patients commonly sustain large total body surface area (TBSA) burn injuries. They are often a result of drug-related accidents or explosions during meth production, according to the study authors.

“At first, we expected that matched meth-positive patients would have worse outcomes than meth-negative patients,” said senior author Kathleen Romanowski, assistant professor with UC Davis Health’s Division of Burn Surgery. “We were surprised to find that they did not have higher mortality or require more procedures, ventilation days, operating room visits or ICU days.”

Meth use and burn injuries

Using a database of burn-injured patients admitted to the Firefighters Burn Institute Regional Burn Center at UC Davis Medical Center over four years, the researchers examined all the burn-injury cases.

“This is the largest study to date to investigate methamphetamine use in burn-injured patients, with 264 meth-positive adult cases,” Romanowski said.

Out of the 264 meth-positive cases, they matched 193 patients with meth-negative patients based on their age and the nature of their injuries. The researchers looked at how measures of injury severity, burn management and socioeconomic data varied among matched and unmatched meth-positive and meth-negative patients.
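
A matched-cohort design of this kind is typically built by pairing each exposed patient with the most similar unexposed patient on the chosen covariates. The sketch below shows one simple way to do that, greedy nearest-neighbour matching on age and burn size; it is an assumed illustration, not the authors' actual matching procedure, and the patient records and weights are invented.

```python
# Illustrative greedy nearest-neighbour matching on age and burn size (% TBSA).
# Assumed sketch, not the study's actual matching procedure; records are invented.

meth_positive = [{"id": "P1", "age": 35, "tbsa": 20.0},
                 {"id": "P2", "age": 52, "tbsa": 8.5}]
meth_negative = [{"id": "N1", "age": 33, "tbsa": 18.0},
                 {"id": "N2", "age": 50, "tbsa": 9.0},
                 {"id": "N3", "age": 61, "tbsa": 30.0}]

def distance(a, b, age_scale=10.0, tbsa_scale=5.0):
    """Normalised distance: a 10-year age gap counts like a 5-point TBSA gap (assumed weights)."""
    return abs(a["age"] - b["age"]) / age_scale + abs(a["tbsa"] - b["tbsa"]) / tbsa_scale

pairs, available = [], list(meth_negative)
for case in meth_positive:
    if not available:
        break                      # cases without a suitable control go unmatched
    control = min(available, key=lambda c: distance(case, c))
    available.remove(control)      # match without replacement
    pairs.append((case["id"], control["id"]))

print(pairs)  # e.g. [('P1', 'N1'), ('P2', 'N2')]
```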

Hospital stay and post-discharge care for meth-positive patients

The outcomes showed that meth-positive patients suffered worse injuries and stayed longer in hospital than meth-negative patients. Yet, more meth-positive patients left the hospital against medical advice. Fewer meth-positive patients had access to support such as skilled nursing facilities.

Meth-positive patients did worse on every measure of socioeconomic status, based on their ZIP codes. The early discharges could be linked to insurance status: only 9% of meth-positive patients had private insurance, compared with nearly a quarter of meth-negative patients.

“Meth-positive patients are not receiving the same level of post-discharge care as their meth-negative counterparts. This is possibly due to lack of resources, addiction or perceived stigma,” Romanowski said. Patients who use meth may have unique and more complex issues that need to be addressed in addition to their burn injuries, said Romanowski. They represent a special group who needs additional resources to ensure they successfully recover.

“Providers need to consider how they can support these patients with adequate inpatient and follow-up care. This might include addiction counseling, social services or follow-up care facilitation,” Romanowski said. “Given these patients are being discharged with less support and access to resources, there is a need for further research on the long-term mortality and other health outcomes of these patients.”

Compared with meth-negative patients, meth-positive patients were on average younger (42 vs. 46 years) and more likely to be male (81.5% vs. 72.7%), smokers (54% vs. 29%) and drug dependent (81% vs. 16%). They also were less likely to have health issues such as congestive heart failure, high blood pressure requiring medication, obesity, diabetes and wheelchair dependency.

This research was supported by the National Center for Advancing Translational Sciences, National Institutes of Health through grant number UL1 TR001860. Co-authors of this study were Eve Solomon, David Greenhalgh, Soman Sen and Tina Palmieri, all of UC Davis Health. The study is published in the Journal of Burn Care and Research and available online. More information about UC Davis Health, including its burn center of excellence, is at health.ucdavis.edu.

Journal

Journal of Burn Care & Research

DOI

10.1093/jbcr/irz102

Credit: 
University of California - Davis Health

Three concepts from complexity could play a big role in social animal research

From bees to birds to humans, the animal kingdom is full of organisms that have evolved complex social structures to solve specific problems they encounter. Explaining why some species evolved more complex societies than others has been a fundamental challenge in the field of social animal research, and might be best approached with tools from complex systems, according to a team of researchers from the Santa Fe Institute.

Some complexity science concepts are already part of the lexicon of biology. For instance, evolution and adaptation are foundational to both fields. In a recent paper in Animal Behaviour, four current and past Santa Fe Institute postdoctoral fellows propose three additional concepts from complex systems science -- scales of organization, compression, and emergence -- that could be particularly useful to researchers studying complexity in animal sociality.

"None of these three concepts are, by themselves, diagnostic of a complex system, but all three are often part of one," says Elizabeth Hobson (University of Cincinnati), a former SFI complexity postdoc and lead author on the paper. "These concepts could lead us in totally new directions and offer new insights into animal social complexity."

The four authors on the paper come from wildly different perspectives, says mathematician Joshua Garland (Santa Fe Institute). Garland and Hobson, together with SFI Postdoctoral Fellow Artemy Kolchinsky and former SFI Omidyar Fellow Vanessa Ferdinand (University of Melbourne), span fields including information theory and neurology, cultural evolution, mathematics, and animal behavior. "The range of fields made it challenging to agree on just three concepts, but the diversity of perspectives is an asset to this paper," says Garland.

The first concept -- social scales -- is important to consider when measuring complexity in animal societies, because the level of complexity may vary across scales. The interactions of two individual honeybees, for example, may be quite simple, while the organizational structure of the hive can be highly complex.

The second concept, compression, describes how systems encode information. Animal researchers could use compression to better compare different animal systems to one another or to describe the possible cognitive processes that allow social animals to remember relationships and group structures. "It could help us understand how animals reduce the overall cognitive load while functioning in their societies," says Hobson.

The final concept, emergence, is when a new pattern appears, often at a higher level of social organization, from lower-level interactions. A classic example is the wave-like behavior of a large flock of birds - something that can't exist at the individual level. Other social behaviors, like dominance hierarchies, culturally learned behaviors, or leadership within groups can also exhibit emergent properties.

Hobson and her co-authors suggest researchers consider these tools when exploring animal social complexity measures. "Taken together, we hope these three concepts from complex systems can help us better tackle longstanding questions about animal social structure and help better compare sociality across species," says Hobson.

"These are three big concepts that are both important and immediately applicable," says Garland. "but they just scratch the surface of complex systems ideas that could be useful for animal sociality research."

Credit: 
Santa Fe Institute

New computational method could advance precision medicine

Scientists have devised a new computational method that reveals genetic patterns in the massive jumble of individual cells in the body.

The discovery, published in the journal eLife, will be useful in discerning patterns of gene expression across many kinds of disease, including cancer. Scientists worked out the formulation by testing tissue taken from the testes of mice. Results in hand, they're already applying the same analysis to biopsies taken from men with unexplained infertility.

"There have been very few studies that attempt to find the cause of any disease by comparing single-cell expression measurements from a patient to those of a healthy control. We wanted to demonstrate that we could make sense of this kind of data and pinpoint a patient's specific defects in unexplained infertility," said co-senior author Donald Conrad, Ph.D., associate professor and chief of the Division of Genetics in the Oregon National Primate Research Center at Oregon Health & Science University.

Simon Myers, Ph.D., of the University of Oxford, also is a senior co-author.

Conrad said he expects the new method will advance the field of precision medicine, where individualized treatment can be applied to the specific nuance of each patient's genetic readout.

The scientists made the breakthrough by applying a method recently developed at the University of Oxford to gene expression data from the massive trove of individual cells comprising even minuscule tissue biopsies. The method is known as sparse decomposition of arrays, or SDA.

"Rather than clustering groups of cells, SDA identifies components comprising groups of genes that co-vary in expression," the authors write.

The new study applied the method to 57,600 individual cells taken from the testes of five lines of mice: four that carry known genetic mutations causing defects in sperm production and one with no sign of genetic infertility. Researchers wanted to see whether it was possible to sort this massive dataset based on the variation in physiological traits resulting from differences in the genes expressed, as measured from the RNA, or ribonucleic acid, of individual cells.

Researchers found they were able to cut through the statistical noise and sort many thousands of cells into 46 genetic groups.

"It's a data-reduction method that allows us to identify sets of genes whose activity goes up and down over subsets of cells," Conrad said. "What we're really doing is building a dictionary that describes how genes change at a single-cell level."

The work will immediately apply to male infertility.

Infertility affects an estimated 0.5% to 1% of the male population worldwide. Current measures to treat male infertility focus on managing defects in the sperm itself, including through in vitro fertilization. However, those techniques don't work in all cases.

"We're talking about the problem where you don't make sperm to begin with," Conrad said.

This new technique could open new opportunities to diagnose a specific genetic defect and then potentially rectify it with new gene-editing tools such as CRISPR. Identification of a specific cause would be a vast improvement over the current state of the art in diagnosing male infertility, which amounts to a descriptive analysis of testicular tissue biopsies.

"The opportunity provided by CRISPR, coupled to this kind of diagnosis, is really a match made in heaven," Conrad said.

Credit: 
Oregon Health & Science University

Fear of more dangerous second Zika, dengue infections unfounded in monkeys

MADISON -- An initial infection with dengue virus did not prime monkeys for an especially virulent infection of Zika virus, according to a study at the University of Wisconsin-Madison. Nor did a bout with Zika make a follow-on dengue infection more dangerous.

As outbreaks on Pacific islands and in the Americas in recent years made Zika virus a pressing public health concern, the Zika virus's close similarity to dengue presented the possibility that one infection may exacerbate the other.

Dengue virus infections are infamous for being bad the first time around. But when infection with one of the four variants (called serotypes) of dengue is followed by infection with a different serotype, the already dangerous symptoms -- high temperature, fatigue and pain -- can be amplified, making dengue fever even more life-threatening.

"When that second dengue virus occurs, antibodies kind of recognize it, but not in a way that allows them to take the virus out of the system and neutralize it like normal," says Dawn Dudley, a scientist in the University of Wisconsin-Madison's Department of Pathology and Laboratory Medicine and one of the authors of the new Zika study. "Instead, they have kind of a secondary effect, where by binding to the virus loosely they actually enhance the ability of the virus to get into other cells in the body and replicate more."

Studies of back-to-back Zika and dengue infections in tissue cultures and mice suggested that the two viruses -- members of the genus Flavivirus, which also includes West Nile virus and yellow fever virus -- could interact to enhance each other. Data collected from human infections since the UW-Madison group began its work in 2017 appeared to contradict those tissue culture and mouse findings.

The study of 21 Wisconsin National Primate Research Center macaque monkeys, in which animals infected with one virus were challenged with another within nine to 12 months, supports the human epidemiological results.

"Whether it was a primary infection with one of the dengue serotypes followed by a Zika infection, or Zika first with a later dengue infection, we didn't see anything unusual in those secondary infections," says UW-Madison pathology research specialist Meghan Breitbach, also an author of the study.

Monkey weights, body temperatures, red and white blood cell counts, liver function and markers of cell damage did not stray significantly from typical infection levels.

"Because we've done several prior studies of Zika virus infections, we have a lot of historical data on what a typical infection looks like in these animals," says Christina Newman, study author and UW-Madison scientist in Pathology and Laboratory Medicine. "For the animals that were experiencing a secondary Zika virus infection after primary dengue infection, their viral loads were almost indistinguishable from animals that were only ever infected with Zika."

That news, which was published today in the journal PLOS Pathogens, is a positive development. But it comes with a caveat important to Zika: none of the study's monkeys were pregnant. Zika's most visible and troubling results are neurological problems in babies whose mothers were infected during pregnancy, though those complications vary widely.

"The immune system is different in pregnancy," Dudley says. "Previous dengue immunity may still be one of the reasons that some women have severe congenital Zika syndrome outcome in their infant while another woman with a known Zika infection doesn't."

A UW-Madison study of pregnant monkeys encountering both viruses could soon help describe whether back-to-back infections are more dangerous for the monkeys and their offspring.

The newly published study, which was supported by the National Institutes of Health, also represents a snapshot of monkeys encountering infections roughly one year apart, Newman says.

Dengue fever is enhanced by an earlier dengue infection only under certain conditions, depending on the serotypes of dengue involved, whether the immune memory produced by the initial infection was relatively strong or weak, and how much the antibodies created may have faded over months or years. These complicating factors have led to caution in the development of Zika and dengue vaccines for fear of sparking more severe infections later.

"Our study suggests that that is unlikely," Newman says. "But as we learn more about people whose infections come two or three years apart, we may see we need to combine a Zika vaccine with a good vaccine against all four serotypes of dengue virus to prevent enhancement of either virus."

Credit: 
University of Wisconsin-Madison

Deep learning AI may identify atrial fibrillation from a normal rhythm ECG

An artificial intelligence (AI) model has been found to identify patients with intermittent atrial fibrillation from a quick, non-invasive 10-second test performed during normal rhythm, whereas current tests can take weeks to years. Although the work is at an early stage and requires further research before implementation, the findings could aid doctors investigating unexplained strokes or heart failure, enabling appropriate treatment.

Researchers have trained an artificial intelligence model to detect the signature of atrial fibrillation in 10-second electrocardiograms (ECG) taken from patients in normal rhythm. The study, involving almost 181,000 patients and published in The Lancet, is the first to use deep learning to identify patients with potentially undetected atrial fibrillation and had an overall accuracy of 83%. The technology finds signals in the ECG that might be invisible to the human eye, but contain important information about the presence of atrial fibrillation.

Atrial fibrillation is estimated to affect 2.7-6.1 million people in the United States [1], and is associated with increased risk of stroke, heart failure and mortality. It is difficult to detect on a single ECG because patients' hearts can go in and out of this abnormal rhythm, so atrial fibrillation often goes undiagnosed.

Dr Paul Friedman, Chair of the Department of Cardiovascular Medicine, Mayo Clinic, USA says: "Applying an AI model to the ECG permits detection of atrial fibrillation even if not present at the time the ECG is recorded. It is like looking at the ocean now and being able to tell that there were big waves yesterday." [2]

He notes: "Currently, the AI has been trained using ECGs in people who needed clinical investigations, but not people with unexplained stroke nor the overall population, and so we are not yet sure how it would perform at diagnosing these groups. However, the ability to test quickly and inexpensively with a non-invasive and widely available test might one day help identify undiagnosed atrial fibrillation and guide important treatment, preventing stroke and other serious illness." [2]

After an unexplained stroke, it is important to accurately detect atrial fibrillation so that patients with it are given anticoagulation treatment to reduce the risk of recurring stroke, and other patients (who may be harmed by this treatment) are not. Currently, detection in this situation requires monitoring for weeks to years, sometimes with an implanted device, potentially leaving patients at risk of recurrent stroke as current methods do not always accurately detect atrial fibrillation, or take too long.

Hearts with atrial fibrillation develop structural changes, such as chamber enlargement. Before those changes are visible to standard imaging techniques such as echocardiograms, there is likely fibrosis (scarring) of the heart associated with atrial fibrillation. Additionally, the presence of atrial fibrillation may temporarily modify the electrical properties of the heart muscle, even after it has ended.

The researchers set out to train a neural network -- a class of deep learning AI -- to recognise subtle differences in a standard ECG that are presumed to be due to these changes, although neural networks are "black boxes" and the specific features that drive their outputs are not known. The authors used ECGs acquired from almost 181,000 patients [3] (around 650,000 ECG scans) between December 1993 and July 2017, dividing the data into patients who were either positive or negative for atrial fibrillation.

ECG data was assigned into three groups: training, internal validation and testing datasets with 70% in the training group, 10% in validation and optimisation, and 20% in the testing group (454,789 ECGs from 126,526 patients in the training dataset, 64,340 ECGs from 18,116 patients in the validation dataset and 130,802 ECGs from 36,280 patients in the testing dataset).
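
In outline, that setup corresponds to a patient-level split (so no individual's ECGs appear in more than one partition) followed by training a convolutional network on fixed-length ECG traces. The sketch below is a minimal, assumed illustration using Keras, with an invented input shape of 5,000 samples by 8 leads standing in for a 10-second recording; it is not the architecture, data pipeline or code used by the Mayo Clinic team.

```python
# Minimal, assumed sketch of a patient-level 70/10/20 split and a small 1D CNN for
# binary ECG classification (not the architecture used in the study).
import numpy as np
import tensorflow as tf

def split_patients(patient_ids, seed=0):
    """Shuffle unique patient IDs and split them 70/10/20 so no patient spans two sets."""
    ids = np.unique(patient_ids)
    rng = np.random.default_rng(seed)
    rng.shuffle(ids)
    n_train, n_val = int(0.7 * len(ids)), int(0.1 * len(ids))
    return ids[:n_train], ids[n_train:n_train + n_val], ids[n_train + n_val:]

# Toy data: 100 ECGs of 10 s, assuming 500 Hz sampling and 8 leads, from 30 patients.
ecgs = np.random.randn(100, 5000, 8).astype("float32")
labels = np.random.randint(0, 2, size=100)
patients = np.random.randint(0, 30, size=100)

train_ids, val_ids, test_ids = split_patients(patients)
train_mask = np.isin(patients, train_ids)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(5000, 8)),
    tf.keras.layers.Conv1D(16, kernel_size=7, strides=2, activation="relu"),
    tf.keras.layers.Conv1D(32, kernel_size=5, strides=2, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # probability of (prior) atrial fibrillation
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(ecgs[train_mask], labels[train_mask], epochs=1, batch_size=16, verbose=0)
```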

The AI performed well at identifying the presence of atrial fibrillation: testing on the first cardiac ECG output from each patient, the accuracy was 79% (for a single scan), and when using multiple ECGs for the same patient the accuracy improved to 83%. Further research is needed to confirm the performance on specific populations, such as patients with unexplained stroke (embolic stroke of undetermined source - ESUS), or heart failure.

The authors of the study also speculate that it may one day be possible to use this technology as a point-of-care diagnostic test in the doctor's surgery to screen high-risk groups. Screening people with hypertension, diabetes, or age over 65 years for atrial fibrillation could help avoid ill health, however, current detection methods are costly and identify few patients. In addition, this screening currently requires wearing a large and uncomfortable heart monitor for days or weeks.

Dr Xiaoxi Yao, a study co-investigator from Mayo Clinic, USA, says: "It is possible that our algorithm could be used on low-cost, widely available technologies, including smartphones, however, this will require more research before widespread application." [2]

The authors note several limitations and the need for further research before their work reaches clinics. The population studied may have a higher prevalence of atrial fibrillation than the general population. The AI has therefore been trained to retrospectively classify clinically indicated ECGs rather than to make predictions in healthy patients or those with unexplained stroke, and may need calibration before widespread application to screening of a broader, healthy population.

Patients were considered negative for atrial fibrillation if they did not have a verified diagnosis, but some patients were likely undiagnosed and labelled erroneously, so the AI may have identified what previous testing had not. On the other hand, some of the false-positive patients identified by the AI as having a history of atrial fibrillation (despite being classified as negative by a human) might actually have had undiagnosed atrial fibrillation. Since the AI is only as good as the data it is trained on, there could be errors in interpretation when the test is applied to other populations, such as individuals without a clinically indicated ECG.

In a linked Comment, Dr Jeroen Hendriks of the University of Adelaide and Royal Adelaide Hospital, Adelaide, Australia, says: "In summary, Attia and colleagues are to be congratulated for their innovative approach and the thorough development and local validation of the AI-enabled ECG. Given that AI algorithms have recently reached cardiologist level in diagnostic performance, this AI-ECG interpretation is ground-breaking in creating an algorithm to reveal the likelihood of atrial fibrillation in ECGs showing sinus rhythm."

Credit: 
The Lancet