Culture

Eleven new species of rain frogs discovered in the tropical Andes

image: Rain frogs comprise a unique group that lacks a tadpole stage of development. Their eggs are laid on land and hatch as tiny froglets.

Image: 
BIOWEB-PUCE

Eleven species of rain frogs new to science are described by two scientists from the Museum of Zoology of the Pontifical Catholic University of Ecuador in the open-access journal ZooKeys. Discovered in the Ecuadorian Andes, the species are characterized in detail on the basis of genetic, morphological, bioacoustic, and ecological features.

On the one hand, the publication is remarkable for the sheer number of new frog species it contains. For vertebrates, most studies describe only one to five species new to science, because of the difficulty of collecting specimens and the copious work involved in describing each. To put it into perspective, the last time a single article dealt with a similar number of newly discovered frogs from the western hemisphere was in 2007, when Spanish scientist Ignacio de la Riva described twelve species from Bolivia.

On the other hand, the new paper is remarkable because it grew out of the undergraduate thesis of its first author, Nadia Paez, a former Biology student at the Pontifical Catholic University, where she was supervised by Professor Santiago Ron. Normally, such a publication would be the result of the efforts of a large team of senior scientists. Nadia Paez is currently a PhD student in the Department of Zoology at the University of British Columbia in Canada.

Unfortunately, amongst the findings of concern is that most of the newly described frog species are listed as either Data Deficient or Threatened with extinction, according to the criteria of the International Union for Conservation of Nature (IUCN). All of the studied amphibians appear to have very restricted geographic ranges, each spanning less than 2,500 km². To make matters worse, their habitats are being destroyed by human activities, especially cattle raising, agriculture, and mining.

Amongst the newly described species is the peculiar Multicolored Rain Frog, whose name refers to its outstanding color variation: individuals vary from bright yellow to dark brown. Initially, the studied specimens were assumed to belong to at least two separate species, but genetic data demonstrated that they represent a single, if highly variable, species.

The rest of the previously unknown frogs were either named after scientists who have made significant contributions to their fields, or given the names of the places where they were discovered, in order to highlight areas of conservation priority.

Credit: 
Pensoft Publishers

Prior Zika virus or dengue virus infection does not affect secondary infections in monkeys

image: Gabrielle Barry, a research specialist at the AIDS Vaccine Research Labs at the University of Wisconsin-Madison, tests body fluids from rhesus macaque monkeys infected with the Zika virus, searching for evidence of the virus, on June 28, 2016, in Madison, Wisconsin.

Image: 
Breitbach et al. 2019

Previous infection with either Zika virus or dengue virus has no apparent effect on the clinical course of subsequent infection with the other virus, according to a study published August 1 in the open-access journal PLOS Pathogens by David O'Connor of the University of Wisconsin-Madison, and colleagues. This work is timely, given recent efforts to develop an effective vaccine for Zika virus, as well as the introduction of dengue virus vaccines in areas where both viruses are now co-endemic.

Zika and dengue viruses are related flaviviruses that now co-circulate in much of the tropical and subtropical world. The rapid emergence of Zika virus in the Americas in 2015 and 2016, and its recent associations with a neurological disorder called Guillain-Barré syndrome, birth defects, and fetal loss have led to the hypothesis that dengue virus infection induces cross-reactive antibodies that influence the severity of secondary Zika virus infections. It has also been proposed that pre-existing Zika virus immunity could affect the severity of secondary dengue virus infections. Data from in vitro experiments and mouse models suggest that pre-existing immunity to one virus could either enhance or protect against infection with the other. These somewhat contradictory findings highlight the need for immune-competent animal models to understand the role of cross-reactive antibodies in flavivirus infections.

In the new study, O'Connor and colleagues examined secondary Zika virus or dengue virus infections in rhesus and cynomolgus macaques that had previously been infected with the other virus. They assessed the outcomes of secondary Zika virus or dengue virus infections by quantifying viral RNA loads, clinical and laboratory parameters, body temperature, and weight for each cohort of animals and compared them with control animals. These comparisons demonstrated that within one year of primary infection, secondary infections with either Zika virus or dengue virus were similar to primary infections and were not associated with enhanced or reduced disease severity. All animals had asymptomatic infections and, when compared to controls, did not have significantly perturbed blood parameters. Although additional studies are needed, the findings suggest that vaccination against either dengue virus or Zika virus may not influence the severity of disease upon secondary infection with the other virus.

"Within a year of primary DENV or ZIKV infection, we did not observe any significant enhancement of, nor protection from, secondary infection with the opposite virus in non-pregnant macaques," the authors add. "Our results differ from those shown in tissue culture and immunocompromised mouse models and align more closely with current human epidemiological data, where enhanced secondary Zika infection has not been reported after primary dengue infections."

Credit: 
PLOS

Researchers remove the need for anti-rejection drugs in transplant recipients

MINNEAPOLIS, MN - August 2, 2019 - For decades, immunologists have been trying to train the transplant recipient's immune system to accept transplanted cells and organs without the long-term use of anti-rejection drugs. New University of Minnesota preclinical research shows that this is now possible.

In a study published in Nature Communications, researchers at the University of Minnesota Medical School's Department of Surgery and Schulze Diabetes Institute, collaborating with colleagues at Northwestern University, have maintained long-term survival and function of pancreatic islet transplants despite complete discontinuation of all anti-rejection drugs on day 21 after the transplant. This study was performed in a stringent preclinical transplant setting in nonhuman primates, one step away from humans.

For many patients with end-stage organ failure, transplantation is the only effective remaining treatment option. To prevent transplant rejection, recipients must take medications long-term that suppress the body's immune system. These immunosuppressive drugs are effective at preventing rejection over the short term; however, because they suppress the entire immune system nonspecifically, people taking them face the risk of serious infections and even cancer. Additionally, non-immunological side effects of immunosuppression, such as hypertension, kidney toxicity, diarrhea, and diabetes, diminish the benefits of transplantation. Finally, immunosuppressive drugs are much less effective at preventing transplant rejection over the long term, leading to graft loss in many recipients.

Because a growing population of chronically immunosuppressed transplant recipients faces this impasse, which might adversely affect their survival, generations of immunologists have pursued immune tolerance as the primary goal of transplantation medicine. Inducing tolerance to transplants would eliminate the need for chronic immunosuppression and enhance transplant and patient survival. Proof that immune tolerance of transplants can be achieved was first demonstrated in mice by Peter Medawar in his Nobel Prize-winning Nature article more than 65 years ago. Yet, despite its immense significance, transplant tolerance has been achieved in only a few patients.

This new study capitalizes on the unique attributes of modified donor white blood cells, which were infused into transplant recipients one week before and one day after the transplant, thereby recapitulating nature's formula for maintaining the body's tolerance of its own tissues and organs. Without the need for long-term anti-rejection drugs, islet cell transplants could become the treatment option of choice, and possibly a cure, for many people burdened by type 1 diabetes.

"Our study is the first that reliably and safely induces lasting immune tolerance of transplants in nonhuman primates," said senior author Bernhard Hering, MD, Professor and Vice Chair of Translational Medicine in the Department of Surgery at the University of Minnesota, who also holds the Jeffrey Dobbs and David Sutherland, MD, PhD, Chair in Diabetes Research. "The consistency with which we were able to induce and maintain tolerance to transplants in nonhuman primates makes us very hopeful that our findings can be confirmed for the benefit of patients in planned clinical trials in pancreatic islet and living-donor kidney transplantation - it would open an entirely new era in transplantation medicine."

Credit: 
University of Minnesota Medical School

Study suggests economic growth benefits wildlife but growing human populations do not

image: Wildebeest on the Serengeti.

Image: 
ZSL

In a world first, researchers at ZSL and UCL compared changes in bird and mammal populations with socio-economic trends in low- and lower-middle income countries over the past 20 years. Their results suggest that national-level economic growth and more gender-balanced governments enhance wildlife populations and provide support for linking the UN's human development and conservation targets.

In 2015, the 2030 Agenda for Sustainable Development was formally adopted by all United Nations Member States to provide "a shared blueprint for peace and prosperity for people and the planet, now and into the future." At its core are the 17 Sustainable Development Goals (SDGs) which call for world-wide collaboration to reduce inequality, improve human health and education, promote economic growth, tackle climate change and conserve biodiversity.

This blend of demographic and environmental development is complex, and the SDGs are not the only agenda the international community is working to. Evidence of continuing biodiversity loss has led to a succession of conservation-focussed policies too, chief among which are the Convention on Biological Diversity (CBD) Aichi targets, set for 2020. With potentially competing priorities, the team at ZSL's Institute of Zoology and UCL's Centre for Biodiversity and Environment Research wanted to understand whether progress towards socio-economic targets might limit the likelihood of meeting conservation ones.

To explore these links, researchers cross-referenced data from the Living Planet Index on 298 bird and mammal populations - recorded outside protected reserves - with indicators of social, economic and political progress towards the SDGs in 33 low- and lower-middle income countries obtained from the World Bank. Their analysis, published today in the journal People and Nature, found consistently positive relationships between economic growth and wildlife abundance - so the richer the people, the safer the biodiversity. Similar relationships were found for more gender-equal societies, lower levels of government corruption and longer human lifespans too.

Lead author Judith Ament, PhD researcher at ZSL and UCL, said: "Our study suggests that at a national level, it is possible to work towards conservation and economic development at the same time, and underlines the need for further integration of sustainable development strategies. We think this might be because as standards of living rise, people become less dependent on local natural resources for income and food, and environmental regulation becomes tighter. We are concerned that this could lead to more importing, however, the impact of which would fall on wildlife elsewhere. This certainly merits further research."

Researchers also found that denser and faster-growing human populations reduced wildlife numbers, and that there is evidence of national-level environmental benefits of urbanisation.

Dr Chris Carbone, Senior Research Fellow in ZSL's Institute of Zoology, said: "This is consistent with other studies that have shown how humans compete with animals for space and resources and if more people are concentrated in one place, more areas are open to wildlife. It wasn't all good news though and we did find that aspects of human development had a negative impact on some species. Numbers of water-birds, for example, fell as wider water sanitation and treatment processes were implemented. It's only by understanding these relationships that we can mitigate them and put forward policies that are good for people and for the natural world. This paper provides the first empirical evidence that simultaneous progress for both international development and conservation is possible and, should further research uphold our findings, could revolutionise UN target setting in the future."

ZSL's research was essential to establishing major global monitoring and prioritisation programmes, such as the IUCN Red List and Living Planet Index, as well as setting and evaluating the UN's biodiversity targets. The next CBD targets, to replace the Aichi targets, will be set at the convention in Beijing, October 2020, where ZSL data on global biodiversity trends will again be key.

Credit: 
Zoological Society of London

Sudden hearing loss: Update to guideline to improve implementation and awareness

ALEXANDRIA, VA --The American Academy of Otolaryngology-Head and Neck Surgery Foundation published the Clinical Practice Guideline: Sudden Hearing Loss (Update) today in Otolaryngology-Head and Neck Surgery. Sudden sensorineural hearing loss (SSNHL) affects five to 27 per 100,000 people annually, with about 66,000 new cases per year in the United States.
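As a quick sanity check, the quoted annual case count is consistent with the quoted incidence range; the US population figure below is an assumption for illustration, not from the guideline:

```python
# Cross-check: ~66,000 new US cases/year against the quoted 5-27 per 100,000.
us_population = 330_000_000   # assumed approximate US population
cases_per_year = 66_000       # figure quoted in the guideline update
rate_per_100k = cases_per_year / us_population * 100_000
# rate_per_100k comes out to 20, inside the reported 5-27 per 100,000 range
```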

"Sudden hearing loss is a frightening symptom for patients that can dramatically decrease their quality of life. Prompt recognition and management of sudden sensorineural hearing loss may improve hearing recovery and quality of life. That is the overarching objective and purpose of this guideline update," said Seth R. Schwartz, MD, MPH, who served as the methodologist for both the 2012 guideline and the 2019 guideline update.

SHL is defined as a rapid-onset subjective sensation of hearing impairment in one or both ears. The hearing loss in SHL may be a conductive hearing loss (CHL), sensorineural hearing loss (SNHL), or mixed hearing loss, defined as both CHL and SNHL occurring in the same ear. CHL and the conductive component of mixed hearing loss may be due to an abnormality in the ear canal, tympanic membrane (''ear drum''), or middle ear.

For most patients with SHL, the medical journey often starts at an emergency room, urgent care clinic, or primary care physician's office, with dizziness present in 30 to 60 percent of cases. The initial recommendations of this guideline update address distinguishing SSNHL from CHL at the initial presentation with hearing loss. They also clarify the need to identify rare, nonidiopathic SSNHL to help separate those patients from those with idiopathic sudden sensorineural hearing loss (ISSNHL), who are the target population for the therapeutic interventions that make up the bulk of the guideline update. By focusing on opportunities for quality improvement, this guideline should improve diagnostic accuracy, facilitate prompt intervention, decrease variations in management, reduce unnecessary tests and imaging procedures, and improve hearing and rehabilitative outcomes for affected patients.

"While the original guideline was a big step, this update provides an opportunity to improve diagnostic accuracy, facilitate prompt intervention, reduce unnecessary tests, and improve hearing and rehabilitative outcomes for patients," said Dr. Schwartz. "Sudden sensorineural hearing loss, particularly when accompanied by tinnitus and dizziness, can cause a great reduction of quality of life. Patients may experience fear and frustration at the inability to identify a cause for their hearing loss. The impact of this condition on a patient's function and well-being underlies the importance of an updated guideline to optimize care of patients with this debilitating condition."

Credit: 
American Academy of Otolaryngology - Head and Neck Surgery

'Wildling' mice could help translate results in animal models to results in humans

image: Researchers harnessed natural microbiota and pathogens to address shortcomings of current mouse models.

Image: 
Reprinted with permission from Rosshart et al., <i>Science</i> 365:eaaw4361 (2019)

Researchers at the National Institutes of Health developed a new mouse model that could improve the translation of research in mice into advances in human health. The mouse model, which the scientists called "wildling," acquired the microbes and pathogens of wild mice while maintaining the laboratory mouse genetics that make lab mice useful for research. In two preclinical studies, wildlings mirrored human immune responses where lab mice failed to do so. Led by scientists at NIH's National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), the study was published online in Science.

"We wanted to create a mouse model that better resembles a mouse you'd find in the wild," said Barbara Rehermann, M.D., chief of the Immunology Section in NIDDK's Liver Diseases Branch and senior author on the study. "Our rationale was that the immune responses and microbiota of wild mice and humans are likely shaped in a similar way--through contact with diverse microbes out in the real world."

Microbiota refers to the trillions of tiny microbes, such as bacteria, fungi, and viruses, that live in and on the bodies of people and animals and play a critical role in keeping immune systems healthy. Unlike squeaky clean lab mice raised in artificial settings, wild mice have developed symbiotic relationships with microbes they have encountered in the outside world--just as people have done.

Rehermann and Stephan Rosshart, M.D., the study's lead author and an NIDDK postdoctoral fellow, have long sought to improve animal models of complex human diseases. In 2017, they led research showing that transferring the gut microbiota of wild mice into lab mice helped the lab mice survive an otherwise lethal flu virus infection and fight colorectal cancer.

In the current study, they transplanted embryos of the strain of laboratory mice most commonly used for immune system research into female wild mice, who then gave birth to and raised wildlings. The researchers and their collaborators compared the microbiota of the wildlings, wild mice, and lab mice. They found that the wildlings acquired the microbes and pathogens of wild mice and closely resembled wild mice in the bacteria present in the gut, on the skin, and in the vagina, as well as in the number and kinds of fungi and viruses present.

"A healthy microbiome is important not only for the immune system, but also for digestion, metabolism, even the brain," said Rosshart, who recently completed his fellowship in NIDDK and will open a new lab in Germany. "The wildling model could help us better understand what causes diseases, and what can protect us from them, thus benefitting many areas of biomedical research."

The researchers also tested the stability and resilience of the wildlings' microbiota and found the microbiota was stable across five generations and resilient to environmental challenges. For example, when the mice were given antibiotics for seven days, the lab mice's gut microbiota changed and did not recover, while the wildlings' microbiota fully recovered. Further, when the mice were fed a 10-week high-fat diet, the microbiota of the lab mice changed significantly and never returned to baseline. The wildlings' microbiota changed only mildly and recovered shortly after the diet ended. The authors suggest that the stability and resilience of wildlings, if the model is used widely, could improve the validity and reproducibility of biomedical studies.

Finally, the researchers tested how well the wildlings could predict human immune responses. To do so, they drew on two studies in which drugs targeting immune responses were successful in treating lab mice in preclinical trials but subsequently failed to have therapeutic effects in humans. In the current study, the researchers treated wildlings and lab mice with the same drugs. The wildlings, but not the lab mice, mimicked the human responses seen in clinical trials.

"We always strive for effective ways to shorten the gap between early lab findings and health advances in people, and the wildling model has the potential to do just that," said NIDDK Director Griffin P. Rodgers, M.D. "By helping to predict immune responses of humans, the wildling model could lead to important discoveries to help treat and prevent disease, and ultimately, improve human health."

Credit: 
NIH/National Institute of Diabetes and Digestive and Kidney Diseases

Mapping the Milky Way in three dimensions

By measuring the distance from our Sun to thousands of individual pulsating stars scattered across the Milky Way, researchers have charted our Galaxy on a larger scale than ever before. Their new three-dimensional map, which provides a broad view of our place among the stars, reveals the S-like structure of the Milky Way's warped stellar disc. "Our map shows the Milky Way disk is not flat. It is warped and twisted," says co-author Przemek Mroz in a related video. "This is the first time we can use individual objects to show this in three dimensions."

Much of the current understanding of the spiral shape and structure of our Galaxy is built upon indirect measurements of celestial landmarks and inferences based on other galaxies in the Universe. The Galactic map drafted from those limited observations remains incomplete. Like lighthouses on distant foggy shores, classical Cepheid variable stars - massive young stars that burn hundreds, if not thousands, of times brighter than our own Sun - pulsate at regular intervals and are visible through the vast clouds of interstellar dust that often obscure dimmer stars. Because a Cepheid's pulsation period sets its intrinsic brightness, the distances to these stars can be precisely determined from the periodic variations in their observed brightness.

Dorota Skowron and colleagues charted the distance to more than 2,400 Cepheids throughout the Milky Way, most of which were identified by the Optical Gravitational Lensing Experiment (OGLE) - a project that more than doubled the number of known Galactic classical Cepheids. By determining the 3D coordinates of each distant pulsing star relative to our Sun, Skowron et al. built a large-scale 3D model of the Milky Way. The new map illustrates and helps constrain the previously observed warp of the Galaxy's stellar disc.
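The distance calculation behind such a map can be sketched as follows. The period-luminosity (Leavitt law) coefficients below are representative textbook-style values chosen for illustration, not the calibration used by the OGLE team:

```python
import math

def cepheid_distance_pc(period_days, apparent_mag, extinction_mag=0.0):
    """Estimate the distance to a classical Cepheid, in parsecs.

    Step 1: infer absolute magnitude M from the pulsation period via an
    illustrative period-luminosity relation:
        M = -2.43 * (log10(P) - 1) - 4.05
    Step 2: apply the distance modulus  m - M = 5 * log10(d / 10 pc),
    optionally correcting the apparent magnitude m for dust extinction.
    """
    abs_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    distance_modulus = apparent_mag - extinction_mag - abs_mag
    return 10 ** (distance_modulus / 5.0 + 1.0)

# A 10-day Cepheid observed at magnitude 10.95 sits at a distance
# modulus of 15, i.e. about 10 kiloparsecs away.
d = cepheid_distance_pc(10.0, 10.95)
```

The key point is that the period is measurable regardless of distance, which is what turns these stars into the "lighthouses" of the press release.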

Credit: 
American Association for the Advancement of Science (AAAS)

Study assesses outcomes for meth users with burn injuries

image: This is Dr. Kathleen Romanowski, assistant professor with UC Davis Health's Division of Burn Surgery.

Image: 
UC Davis Health

UC Davis Health researchers were surprised to find that methamphetamine use is not linked with worse health outcomes among burn patients. However, meth use was associated with significantly worse conditions for those patients after their release from the hospital.

Meth-positive patients commonly sustain large total body surface area (TBSA) burn injuries, often the result of drug-related accidents or explosions during meth production, according to the study authors.

“At first, we expected that matched meth-positive patients would have worse outcomes than meth-negative patients,” said senior author Kathleen Romanowski, assistant professor with UC Davis Health’s Division of Burn Surgery. “We were surprised to find that they did not have higher mortality or require more procedures, ventilation days, operating room visits or ICU days.”

Meth use and burn injuries

Using a database of burn-injured patients admitted to the Firefighters Burn Institute Regional Burn Center at UC Davis Medical Center over four years, the researchers examined all the burn-injury cases.

“This is the largest study to date to investigate methamphetamine use in burn-injured patients, with 264 meth-positive adult cases,” Romanowski said.

Out of the 264 meth-positive cases, the researchers matched 193 patients with meth-negative patients based on age and the nature of their injuries. They then looked at how measures of injury severity, burn management, and socioeconomic data varied among matched and unmatched meth-positive and meth-negative patients.
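The matching step described above can be sketched as a greedy nearest-age pairing. This is a hypothetical illustration only: the study also matched on injury characteristics, and its actual matching algorithm is not described here.

```python
def greedy_match(cases, controls, max_age_diff=5):
    """Pair each case with the closest-age unused control.

    cases, controls: lists of dicts with 'id' and 'age' keys.
    Returns (case_id, control_id) pairs; cases with no control
    within max_age_diff years remain unmatched.
    """
    matched, used = [], set()
    for case in sorted(cases, key=lambda c: c['age']):
        best, best_diff = None, max_age_diff + 1
        for ctrl in controls:
            if ctrl['id'] in used:
                continue  # each control may be used only once (1:1 matching)
            diff = abs(ctrl['age'] - case['age'])
            if diff < best_diff:
                best, best_diff = ctrl, diff
        if best is not None and best_diff <= max_age_diff:
            used.add(best['id'])
            matched.append((case['id'], best['id']))
    return matched

cases = [{'id': 'p1', 'age': 42}, {'id': 'p2', 'age': 70}]
controls = [{'id': 'c1', 'age': 44}, {'id': 'c2', 'age': 60}, {'id': 'c3', 'age': 71}]
pairs = greedy_match(cases, controls)  # [('p1', 'c1'), ('p2', 'c3')]
```

Matching of this kind is what allows the comparison of outcomes between meth-positive and meth-negative patients of similar age and injury severity.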

Hospital stay and post-discharge care for meth-positive patients

The outcomes showed that meth-positive patients suffered worse injuries and stayed longer in the hospital than meth-negative patients. Yet more meth-positive patients left the hospital against medical advice, and fewer had access to post-discharge support such as skilled nursing facilities.

Meth-positive patients also did worse on every measure of socioeconomic status, based on their ZIP codes. The early discharges could be linked to insurance status: only 9% of meth-positive patients had private insurance, compared with nearly a quarter of meth-negative patients.

“Meth-positive patients are not receiving the same level of post-discharge care as their meth-negative counterparts. This is possibly due to lack of resources, addiction or perceived stigma,” Romanowski said. Patients who use meth may have unique and more complex issues that need to be addressed in addition to their burn injuries, said Romanowski. They represent a special group who needs additional resources to ensure they successfully recover.

“Providers need to consider how they can support these patients with adequate inpatient and follow-up care. This might include addiction counseling, social services or follow-up care facilitation,” Romanowski said. “Given these patients are being discharged with less support and access to resources, there is a need for further research on the long-term mortality and other health outcomes of these patients.”

On average, meth-positive patients were more likely than meth-negative patients to be younger (42 vs. 46 years), male (81.5% vs. 72.7%), smokers (54% vs. 29%), and drug dependent (81% vs. 16%). They also were less likely to have health issues such as congestive heart failure, high blood pressure requiring medication, obesity, diabetes, and wheelchair dependency.

This research was supported by the National Center for Advancing Translational Sciences, National Institutes of Health through grant number UL1 TR001860. Co-authors of this study were Eve Solomon, David Greenhalgh, Soman Sen and Tina Palmieri, all of UC Davis Health. The study is published in the Journal of Burn Care and Research and available online. More information about UC Davis Health, including its burn center of excellence, is at health.ucdavis.edu.

Journal

Journal of Burn Care & Research

DOI

10.1093/jbcr/irz102

Credit: 
University of California - Davis Health

Three concepts from complexity could play a big role in social animal research

From bees to birds to humans, the animal kingdom is full of organisms that have evolved complex social structures to solve specific problems they encounter. Explaining why some species evolved more complex societies than others has been a fundamental challenge in the field of social animal research, and might be best approached with tools from complex systems, according to a team of researchers from the Santa Fe Institute.

Some complexity science concepts are already part of the lexicon of biology. For instance, evolution and adaptation are foundational to both fields. In a recent paper in Animal Behaviour, four current and past Santa Fe Institute postdoctoral fellows propose three additional concepts from complex systems science -- scales of organization, compression, and emergence -- that could be particularly useful to researchers studying complexity in animal sociality.

"None of these three concepts are, by themselves, diagnostic of a complex system, but all three are often part of one," says Elizabeth Hobson (University of Cincinnati), a former SFI complexity postdoc and lead author on the paper. "These concepts could lead us in totally new directions and offer new insights into animal social complexity."

The four authors on the paper come from wildly different perspectives, says mathematician Joshua Garland (Santa Fe Institute). Garland and Hobson, together with SFI Postdoctoral Fellow Artemy Kolchinsky and former SFI Omidyar Fellow Vanessa Ferdinand (University of Melbourne), span fields including information theory and neurology, cultural evolution, mathematics, and animal behavior. "The range of fields made it challenging to agree on just three concepts, but the diversity of perspectives is an asset to this paper," says Garland.

The first concept -- social scales -- is important to consider when measuring complexity in animal societies, because the level of complexity may vary across scales. The interactions of two individual honeybees, for example, may be quite simple, while the organizational structure of the hive can be highly complex.

The second concept, compression, describes how systems encode information. Animal researchers could use compression to better compare different animal systems to one another, or to describe the possible cognitive processes that allow social animals to remember relationships and group structures. "It could help us understand how animals reduce the overall cognitive load while functioning in their societies," says Hobson.

The final concept, emergence, describes how a new pattern arises from lower-level interactions, often at a higher level of social organization. A classic example is the wave-like behavior of a large flock of birds - something that cannot exist at the individual level. Other social behaviors, like dominance hierarchies, culturally learned behaviors, or leadership within groups, can also exhibit emergent properties.

Hobson and her co-authors suggest researchers consider these tools when exploring animal social complexity measures. "Taken together, we hope these three concepts from complex systems can help us better tackle longstanding questions about animal social structure and help better compare sociality across species," says Hobson.

"These are three big concepts that are both important and immediately applicable," says Garland, "but they just scratch the surface of complex systems ideas that could be useful for animal sociality research."

Credit: 
Santa Fe Institute

New computational method could advance precision medicine

Scientists have devised a new computational method that reveals genetic patterns in the massive jumble of individual cells in the body.

The discovery, published in the journal eLife, will be useful in discerning patterns of gene expression across many kinds of disease, including cancer. Scientists worked out the formulation by testing tissue taken from the testes of mice. Results in hand, they're already applying the same analysis to biopsies taken from men with unexplained infertility.

"There have been very few studies that attempt to find the cause of any disease by comparing single-cell expression measurements from a patient to those of a healthy control. We wanted to demonstrate that we could make sense of this kind of data and pinpoint a patient's specific defects in unexplained infertility," said co-senior author Donald Conrad, Ph.D., associate professor and chief of the Division of Genetics in the Oregon National Primate Research Center at Oregon Health & Science University.

Simon Myers, Ph.D., of the University of Oxford, also is a senior co-author.

Conrad said he expects the new method will advance the field of precision medicine, where individualized treatment can be applied to the specific nuance of each patient's genetic readout.

The scientists made the breakthrough by applying a method recently developed at the University of Oxford to gene expression data from the massive trove of individual cells comprising even minuscule tissue biopsies. The method is known as sparse decomposition of arrays, or SDA.

"Rather than clustering groups of cells, SDA identifies components comprising groups of genes that co-vary in expression," the authors write.
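The distinction between clustering cells and finding components of co-varying genes can be illustrated with a toy factorization. This is not the authors' SDA implementation (a Bayesian sparse decomposition); the sketch below uses a plain SVD on simulated data purely to show what a "component over genes" looks like, and all numbers are invented:

```python
import numpy as np

# Toy illustration only: the paper's SDA is a Bayesian sparse decomposition,
# not the plain SVD used here, and the data below are simulated. The point is
# the shape of the output: components are weight vectors over *genes*,
# capturing groups of genes whose expression rises and falls together across
# cells, rather than hard clusters of cells.

rng = np.random.default_rng(0)
n_cells, n_genes = 200, 10

# Two hidden "activity programs", each driving a disjoint gene module per cell.
program_a = 2.0 * rng.random(n_cells)   # drives genes 0-4 (stronger signal)
program_b = rng.random(n_cells)         # drives genes 5-9

X = np.zeros((n_cells, n_genes))
X[:, :5] = np.outer(program_a, rng.uniform(1, 2, 5))  # module A
X[:, 5:] = np.outer(program_b, rng.uniform(1, 2, 5))  # module B
X += 0.05 * rng.standard_normal(X.shape)              # measurement noise

# Center each gene, then factorize: rows of Vt are components over genes.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# The first component loads almost entirely on module A, the second on
# module B: co-varying gene groups are recovered without clustering cells.
weights = [float(np.sum(Vt[i, :5] ** 2)) for i in range(2)]
print([round(w, 2) for w in weights])   # near [1.0, 0.0]
```

Each component's per-cell activations (columns of U) then play the role of the "dictionary" entries Conrad describes, rising and falling over subsets of cells.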

The new study applied the method to 57,600 individual cells taken from the testes of five lines of mice: four that carry known genetic mutations causing defects in sperm production, and one with no sign of genetic infertility. Researchers wanted to see whether it was possible to sort this massive dataset based on the variation in physiological traits resulting from differences in the genes expressed in the RNA, or ribonucleic acid, of individual cells.

Researchers found they were able to cut through the statistical noise and sort many thousands of cells into 46 genetic groups.

"It's a data-reduction method that allows us to identify sets of genes whose activity goes up and down over subsets of cells," Conrad said. "What we're really doing is building a dictionary that describes how genes change at a single-cell level."

The work has immediate application to male infertility.

Infertility affects an estimated 0.5% to 1% of the male population worldwide. Current treatments for male infertility focus on managing defects in the sperm itself, including through in vitro fertilization. However, those techniques don't work in all cases.

"We're talking about the problem where you don't make sperm to begin with," Conrad said.

This new technique could open new opportunities to diagnose a specific genetic defect and then potentially rectify it with new gene-editing tools such as CRISPR. Identification of a specific cause would be a vast improvement over the current state of the art in diagnosing male infertility, which amounts to a descriptive analysis of testicular tissue biopsies.

"The opportunity provided by CRISPR, coupled to this kind of diagnosis, is really a match made in heaven," Conrad said.

Credit: 
Oregon Health & Science University

Fear of more dangerous second Zika, dengue infections unfounded in monkeys

MADISON -- An initial infection with dengue virus did not prime monkeys for an especially virulent infection of Zika virus, according to a study at the University of Wisconsin-Madison. Nor did a bout with Zika make a follow-on dengue infection more dangerous.

As outbreaks on Pacific islands and in the Americas in recent years made Zika virus a pressing public health concern, its close similarity to dengue presented the possibility that one infection may exacerbate the other.

Dengue virus infections are infamous for being bad the first time around. But an infection with one of the four variants (called serotypes) of dengue followed by infection with a different serotype can amplify the already dangerous symptoms -- high temperature, fatigue and pain -- and make dengue fever even more life-threatening.

"When that second dengue virus occurs, antibodies kind of recognize it, but not in a way that allows them to take the virus out of the system and neutralize it like normal," says Dawn Dudley, a scientist in the University of Wisconsin-Madison's Department of Pathology and Laboratory Medicine and one of the authors of the new Zika study. "Instead, they have kind of a secondary effect, where by binding to the virus loosely they actually enhance the ability of the virus to get into other cells in the body and replicate more."

Studies of back-to-back Zika and dengue infections in tissue cultures and mice suggested that the two viruses -- members of the genus Flavivirus, which also includes West Nile virus and yellow fever virus -- could interact to enhance each other. Data collected from human infections since the UW-Madison group began its work in 2017 appeared to contradict those tissue culture and mouse findings.

The study of 21 Wisconsin National Primate Research Center macaque monkeys, in which animals infected with one virus were challenged with another within nine to 12 months, supports the human epidemiological results.

"Whether it was a primary infection with one of the dengue serotypes followed by a Zika infection, or Zika first with a later dengue infection, we didn't see anything unusual in those secondary infections," says UW-Madison pathology research specialist Meghan Breitbach, also an author of the study.

Monkey weights, body temperatures, red and white blood cell counts, liver function and markers of cell damage did not stray significantly from typical infection levels.

"Because we've done several prior studies of Zika virus infections, we have a lot of historical data on what a typical infection looks like in these animals," says Christina Newman, study author and UW-Madison scientist in Pathology and Laboratory Medicine. "For the animals that were experiencing a secondary Zika virus infection after primary dengue infection, their viral loads were almost indistinguishable from animals that were only ever infected with Zika."

That news, which was published today in the journal PLOS Pathogens, is a positive development. But it comes with a caveat important to Zika: none of the study's monkeys were pregnant. Zika's most visible and troubling results are neurological problems in babies whose mothers were infected during pregnancy, though those complications vary widely.

"The immune system is different in pregnancy," Dudley says. "Previous dengue immunity may still be one of the reasons that some women have severe congenital Zika syndrome outcome in their infant while another woman with a known Zika infection doesn't."

A UW-Madison study of pregnant monkeys encountering both viruses could soon help describe whether back-to-back infections are more dangerous for the monkeys and their offspring.

The newly published study, which was supported by the National Institutes of Health, also represents a snapshot of monkeys encountering infections roughly one year apart, Newman says.

Dengue fever is enhanced by an earlier dengue infection only during certain conditions dependent on the serotypes of dengue involved, whether the immune memory produced by the initial infection was relatively strong or weak, and how much the antibodies created may have faded over months or years. The complicating factors have led to caution in development of Zika and dengue vaccines for fear of sparking more severe infections later.

"Our study suggests that that is unlikely," Newman says. "But as we learn more about people whose infections come two or three years apart, we may see we need to combine a Zika vaccine with a good vaccine against all four serotypes of dengue virus to prevent enhancement of either virus."

Credit: 
University of Wisconsin-Madison

Deep learning AI may identify atrial fibrillation from a normal rhythm ECG

An artificial intelligence (AI) model has been found to identify patients with intermittent atrial fibrillation using a quick, non-invasive 10-second test, even when the recording is taken during normal rhythm; current tests can take weeks to years. Although the work is at an early stage and requires further research before implementation, the findings could aid doctors investigating unexplained strokes or heart failure, enabling appropriate treatment.

Researchers have trained an artificial intelligence model to detect the signature of atrial fibrillation in 10-second electrocardiograms (ECG) taken from patients in normal rhythm. The study, involving almost 181,000 patients and published in The Lancet, is the first to use deep learning to identify patients with potentially undetected atrial fibrillation and had an overall accuracy of 83%. The technology finds signals in the ECG that might be invisible to the human eye, but contain important information about the presence of atrial fibrillation.

Atrial fibrillation is estimated to affect 2.7-6.1 million people in the United States [1], and is associated with increased risk of stroke, heart failure and mortality. It is difficult to detect on a single ECG because patients' hearts can go in and out of this abnormal rhythm, so atrial fibrillation often goes undiagnosed.

Dr Paul Friedman, Chair of the Department of Cardiovascular Medicine, Mayo Clinic, USA says: "Applying an AI model to the ECG permits detection of atrial fibrillation even if not present at the time the ECG is recorded. It is like looking at the ocean now and being able to tell that there were big waves yesterday." [2]

He notes: "Currently, the AI has been trained using ECGs in people who needed clinical investigations, but not people with unexplained stroke nor the overall population, and so we are not yet sure how it would perform at diagnosing these groups. However, the ability to test quickly and inexpensively with a non-invasive and widely available test might one day help identify undiagnosed atrial fibrillation and guide important treatment, preventing stroke and other serious illness." [2]

After an unexplained stroke, it is important to accurately detect atrial fibrillation so that patients with it are given anticoagulation treatment to reduce the risk of recurring stroke, and other patients (who may be harmed by this treatment) are not. Currently, detection in this situation requires monitoring for weeks to years, sometimes with an implanted device, potentially leaving patients at risk of recurrent stroke as current methods do not always accurately detect atrial fibrillation, or take too long.

Hearts with atrial fibrillation develop structural changes, such as chamber enlargement. Before those changes are visible to standard imaging techniques such as echocardiograms, there is likely fibrosis (scarring) of the heart associated with atrial fibrillation. Additionally, the presence of atrial fibrillation may temporarily modify the electrical properties of the heart muscle, even after it has ended.

The researchers set out to train a neural network -- a class of deep learning AI -- to recognise subtle differences in a standard ECG that are presumed to be due to these changes, although neural networks are "black boxes" and the specific findings that drive their observations are not known. The authors used ECGs acquired from almost 181,000 patients [3] (around 650,000 ECG scans) between December 1993 and July 2017, dividing the data into patients who were either positive or negative for atrial fibrillation.

ECG data were assigned to three groups -- training, internal validation, and testing -- with 70% of patients in the training group, 10% in validation and optimisation, and 20% in the testing group (454,789 ECGs from 126,526 patients in the training dataset, 64,340 ECGs from 18,116 patients in the validation dataset, and 130,802 ECGs from 36,280 patients in the testing dataset).
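One detail implicit in these numbers is that the split is done over patients rather than individual ECGs, so that all scans from one person land in a single group. A minimal hypothetical sketch of such a patient-level split (the function name and rounding behaviour are assumptions, not the study's code):

```python
import random

# Hypothetical sketch (not the study's code) of the key design choice implied
# by the numbers above: the 70/10/20 split is done over *patients*, so every
# ECG from a given patient lands in exactly one of train/validation/test and
# no patient's scans leak across groups.

def split_patients(patient_ids, seed=42):
    ids = sorted(patient_ids)
    random.Random(seed).shuffle(ids)   # reproducible shuffle
    n = len(ids)
    n_train, n_val = int(0.7 * n), int(0.1 * n)
    return (set(ids[:n_train]),
            set(ids[n_train:n_train + n_val]),
            set(ids[n_train + n_val:]))

# Each ECG record then follows its patient into that patient's group.
ecgs = [("p1", "ecg_a"), ("p1", "ecg_b"), ("p2", "ecg_c"), ("p3", "ecg_d")]
train_ids, val_ids, test_ids = split_patients({p for p, _ in ecgs})
train_ecgs = [e for p, e in ecgs if p in train_ids]
assert train_ids.isdisjoint(val_ids) and train_ids.isdisjoint(test_ids)
```

Splitting by patient matters because two ECGs from the same heart are highly correlated; letting them straddle training and testing would inflate the reported accuracy.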

The AI performed well at identifying the presence of atrial fibrillation: testing on the first ECG from each patient, the accuracy was 79% (for a single scan), and when using multiple ECGs for the same patient the accuracy improved to 83%. Further research is needed to confirm the performance in specific populations, such as patients with unexplained stroke (embolic stroke of undetermined source, or ESUS) or heart failure.

The authors of the study also speculate that it may one day be possible to use this technology as a point-of-care diagnostic test in the doctor's surgery to screen high-risk groups. Screening people with hypertension, diabetes, or age over 65 years for atrial fibrillation could help avoid ill health; however, current detection methods are costly and identify few patients. In addition, this screening currently requires wearing a large and uncomfortable heart monitor for days or weeks.

Dr Xiaoxi Yao, a study co-investigator from Mayo Clinic, USA, says: "It is possible that our algorithm could be used on low-cost, widely available technologies, including smartphones, however, this will require more research before widespread application." [2]

The authors note several limitations and the need for further research before their work reaches clinics. The population studied may have a higher prevalence of atrial fibrillation than the general population. The AI has therefore been trained to retrospectively classify clinically indicated ECGs, rather than for prediction in healthy patients or those with unexplained stroke, and may need calibration before widespread application to screening of a broader, healthy population.

Patients were considered negative for atrial fibrillation if they did not have a verified diagnosis, but there were likely some patients who had been undiagnosed and labelled erroneously, so the AI may have identified what previous testing had not. On the other hand, some of the false-positive patients identified by the AI as having a history of atrial fibrillation (despite being classified as negative by a human) might actually have had undiagnosed atrial fibrillation. Since the AI is only as good as the data it is trained on, there could be errors in interpretation when the test is applied to other populations, such as individuals without an indicated ECG.

In a linked Comment, Dr Jeroen Hendriks of the University of Adelaide and Royal Adelaide Hospital, Adelaide, Australia, says: "In summary, Attia and colleagues are to be congratulated for their innovative approach and the thorough development and local validation of the AI-enabled ECG. Given that AI algorithms have recently reached cardiologist level in diagnostic performance, this AI-ECG interpretation is ground-breaking in creating an algorithm to reveal the likelihood of atrial fibrillation in ECGs showing sinus rhythm."

Credit: 
The Lancet

NYU physicist receives US Department of Energy Early Career Award

image: This is NYU physicist Jiehang Zhang.

Image: 
Image courtesy of New York University

New York University physicist Jiehang Zhang has received an Early Career Award from the U.S. Department of Energy.

Zhang, an assistant professor in NYU's Department of Physics, is a pioneer in the study of quantum systems--the behavior of large numbers of interacting ions. His five-year grant totals approximately $750,000.

"Supporting our nation's most talented and creative researchers in their early career years is crucial to building America's scientific workforce and sustaining America's culture of innovation," said Secretary of Energy Rick Perry in announcing this year's Early Career Award recipients. "We congratulate these young researchers on their significant accomplishments to date and look forward to their achievements in the years ahead."

Zhang's work with ions--atoms with electrical charges--is significant because quantum systems, his area of focus, can solve problems of great scientific interest that cannot be addressed with conventional digital computers. This is because trapped ions evolve according to the rules of quantum mechanics, enabling new types of information processing, known as quantum computing and quantum simulations.

The Early Career Awards, now in their 10th year, are "designed to bolster the nation's scientific workforce by providing support to exceptional researchers during the crucial early career years, when many scientists do their most formative work," the DOE said in its announcement.

To be eligible for the DOE award, a researcher must be an untenured, tenure-track assistant or associate professor at a U.S. academic institution or a full-time employee at a DOE national laboratory, who received a Ph.D. within the past 10 years.

A list of the 73 awardees, their institutions, and titles of research projects is available on the Early Career Research Program webpage https://science.osti.gov/early-career.

Credit: 
New York University

Veterans with traumatic brain injuries have higher suicide risk

AURORA, Colo. (Aug. 1, 2019) - Military veterans with a history of traumatic brain injury (TBI) are more than twice as likely to die by suicide compared with veterans without such a diagnosis, according to a newly published study by researchers led by faculty from the CU School of Medicine.

The researchers reviewed electronic medical records of more than 1.4 million military veterans who received care from the Veterans Health Administration (VHA) between Oct. 1, 2005 and Sept. 30, 2015. Combining these records with National Death Index data, they evaluated the severity of TBI and diagnoses of psychiatric and other medical conditions. Among those who died by suicide, the method used was also analyzed.

After adjusting for psychiatric diagnoses, such as depression, the researchers found that those with moderate or severe TBI were 2.45 times as likely to die by suicide as those without a TBI diagnosis. Additionally, among suicide decedents, they report that the odds of using firearms as a means of suicide were significantly increased for those with moderate or severe TBI compared to those without a history of TBI.

"Together, these findings underscore the importance of understanding Veterans' lifetime history of TBI to prevent future deaths by suicide, and support the implementation of screening initiatives for lifetime history of TBI among all individuals utilizing the VHA," the researchers write in the Journal of Head Trauma Rehabilitation. Moreover, the findings support research regarding lethal means safety among those with moderate to severe TBI.

The corresponding author of the article, which was published online this week, is Lisa A. Brenner, PhD, professor of physical medicine and rehabilitation at the CU School of Medicine and director of the Veterans Health Administration Rocky Mountain Mental Illness Research Education and Clinical Center, which supported the project.

During the period of time they studied, the rate of suicide was 86 per 100,000 person years for those with TBI compared with 37 per 100,000 person years for those without TBI. Of those in the sample who died by suicide, 68 percent used firearms. Veterans with moderate or severe TBIs had the highest proportion of suicides by firearms at 78 percent.
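For readers unfamiliar with the unit, a rate quoted "per 100,000 person-years" divides the number of deaths by the total years of follow-up contributed by everyone in the group, then rescales. A toy calculation with invented counts (not the study's raw numbers, though they happen to reproduce the 86 figure):

```python
# Arithmetic behind a rate quoted "per 100,000 person-years": divide deaths
# by the total years of follow-up contributed by the whole group, then scale.
# The counts below are invented for illustration; they are not the study's
# raw numbers.

def rate_per_100k_person_years(deaths, person_years):
    return deaths * 100_000 / person_years

# e.g. 430 deaths observed over 500,000 person-years of follow-up:
print(rate_per_100k_person_years(430, 500_000))  # 86.0
```

Person-years, rather than a simple head count, account for the fact that veterans entered and left VHA care at different times during the ten-year window.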

Credit: 
University of Colorado Anschutz Medical Campus

How little we know: Experts document the lack of research on youth firearm injury

A century ago, if a child or teenager died, an infectious disease was the most likely cause.

A half century ago, if a child or teenager died, the most common reason was injuries from an automobile crash.

Today, if an American child or teenager dies, firearm-related injuries and automobile crashes are almost equally likely to be to blame.

Research on everything from vaccines to seatbelts has changed the odds for children and adolescents, and the rate of crash-related deaths continues to drop. But research on firearm injuries has been lacking.

Now, experts have released the largest-ever examination of the state of research on all aspects of youth firearm injury - whether intentional, unintentional or self-inflicted.

The bottom-line conclusion: Far more research, and better research, is needed on children, teens and the prevention and aftermath of firearm injuries and deaths. If translated into action, such new knowledge could help reduce death and injury rates, and other effects.

The papers that led to this conclusion make up the entire new issue of the Journal of Behavioral Medicine. Nearly all the papers in it come from researchers from across the U.S. who make up the Firearm Safety Among Children and Teens (FACTS) consortium funded by the National Institutes of Health.

Each of the papers in the issue focuses on firearm injury and violence as a public health issue - just as infectious diseases and automobile safety have come to be seen in decades past.

Rebecca Cunningham, M.D., co-directs FACTS and is interim vice president for research and an emergency medicine physician at the University of Michigan. "Firearm injuries and deaths have the potential to be understood much the same way as many other public health threats have been understood, using scientific methods," she says. "Much more work is clearly necessary."

Cunningham and her U-M colleagues Patrick Carter, M.D., and Marc Zimmerman, Ph.D., who co-direct FACTS, wrote an overview of FACTS for the new special issue.

The research reviews done by FACTS members cover five key areas. Here are some of their findings and recommendations:

Factors that put young people at higher risk of firearm injury, or protect them from risk

The researchers studying this issue found only 28 studies published between 1985 and 2018 on this topic, and only five that were rigorous studies that looked at data from a large population over time. Most of the studies looked at individual-level risk factors, not ones at the school or population level, or protective factors at either level.

Use of marijuana, and problematic use of alcohol, emerged as two of the most specific risk factors that were associated with higher risk of suffering a firearm injury or dying from one. So were holding attitudes favoring retaliation, and prior involvement in violent events, including as a victim. There is also some evidence that having post-traumatic stress disorder as a youth is a risk factor for firearm injury, as is living in a community where violence or economic disadvantage are higher than normal.

Risk factors for youth suicide by firearm have been studied, but the evidence is mixed about the impact of prior mental health issues and earlier suicide attempts on that risk. Living in a home where there is access to a gun, especially one that is unlocked or kept in plain sight, has been found by several studies to be associated with increased risk of youth suicide.

The authors call for more research on larger and broader samples, and more attention to school- and family-level risk factors, and protective factors.

Prevention of injuries from firearms among children and teens

The review identified only 46 studies that focused on preventive measures to reduce the risk of firearm injury and death among youth between 1985 and 2018. These included studies of "safe storage" efforts to encourage use of trigger locks and lock boxes or safes by gun owners with children in the home, and other initiatives to prevent children and teens from handling or carrying firearms or using them without supervision.

Parent education during a health care visit appears to increase safe gun storage when combined with the distribution of free gun locks, both among general groups of parents and among parents of teens who have survived a suicide attempt. Community efforts to give away gun locks have shown an impact on safe gun storage, but the impact on actual firearm injury rates among children is unknown.

Education programs designed to prevent unsupervised firearm handling by young children show some promise when they include children rehearsing behavioral skills, but effects outside of the classroom are unclear. Among adolescents, education programs based on "scare tactics" did not have evidence behind them, the researchers find. A recommended next step is to adapt effective youth violence prevention programs to focus on preventing firearm injuries among adolescents.

In general, they conclude, much more research is needed to build evidence around different preventive strategies. They recommend that firearm owners should be partners in the development of such strategies, and the research to evaluate their effects - similar to what was done when children's car seats were being developed and refined through research. In the meantime, counseling about safe storage during a health care visit, and provision of locking devices, are identified as a next step to reduce firearm injuries and death among youth.

Association of firearm laws with outcomes in children and teens

The authors found only 20 studies done between 1985 and mid-2018 that looked at the relationship between firearm laws and outcomes such as firearm injury among children and teens. Most looked at the impacts of laws designed to prevent children from accessing guns by threatening to charge adults with a crime if children were able to take or use their guns.

The authors conclude there is strong evidence that child access prevention laws are associated with reductions in unintentional firearm mortality among children and teens. There is also evidence that these laws are associated with reductions in suicide.

Meanwhile, there are so few high-quality studies of the impacts of other firearm-related laws on outcomes in children and teens, it is not possible to draw conclusions about the laws' effects. The lack of a national surveillance system that systematically tracks nonfatal firearm injuries or firearm crimes, and researchers' reliance instead on hospital inpatient records and incomplete law enforcement data, make high-quality research more difficult.

In the end, the authors make a call for more, high quality research and surveillance systems to better determine the impact of firearm laws on childhood firearm injury and death, writing, "policymakers and other stakeholders often lack the evidence they need to craft, evaluate, and make informed decisions regarding firearm policies. This should not be interpreted as the policies not having an effect, but rather that the research is often too sparse to measure impact."

Long-term consequences of firearm incidents involving children and teens

For this review, the researchers found only 31 published studies in English from 1985 to 2018 that documented the long-term impact of firearm injury on the mental and physical health of children and teens. Much of the existing literature was published using datasets from European countries. Existing literature is clear that a child's exposure to firearm injury increases the long-term risk of post-traumatic stress and future injury - regardless of the type of injury that a child is exposed to.

Of the 14 studies examining the effects of mass shootings on youth, six were written about a single event in Norway, and the only articles from the United States were written before the year 2000. No manuscripts examined the effect of school shootings or community gun violence on parents or teachers. No manuscripts described social media responses to mass shootings.

The authors note that more than 90% of youth firearm deaths are due to assault or suicide. Literature in these areas was particularly lacking. Although youth firearm suicide is common, very few studies examined mental health or suicide contagion among witnesses and survivors.

Most importantly, this review revealed that -- despite the obvious physical and mental effects of firearm injury on youth -- there is almost no evidence on how to prevent future mental, physical, and behavioral issues in youth who experience or witness injury.

Patterns of gun-carrying behavior by older children and teens

This review found 53 studies of all aspects of teen firearm carriage, with most studies focused on urban African-American youth or youth involved in the criminal justice system. The authors rated one-third of these studies as high-quality.

In general, the studies show that among the 5% to 10% of teens who carry firearms during adolescence, most do so intermittently throughout their teenage years, and primarily for reasons of self-defense or protection. Researchers found only one study focused on rural teens; it demonstrated rates of firearm carriage similar to those in studies of urban adolescent populations. Youth who carried for a longer period of time during adolescence, and those who carried consistently throughout adolescence, had more exposure to and involvement in violence, substance use, and the criminal justice system.

Based on their findings, the study authors called for more research on this issue, especially research dedicated to adolescent populations, including studies that examine firearm carriage among different gender and racial/ethnic groups, as well as within a variety of urban, rural, and suburban contexts.

The authors emphasize the need for more rigorous studies that evaluate firearm carriage separate from the carriage of other weapons such as knives. They also call for researchers to study what motivates teens to start carrying a firearm or continue carrying one, and what factors at a population level are most associated with firearm carriage to inform public health interventions to decrease risky firearm behaviors among adolescent populations.

Credit: 
Michigan Medicine - University of Michigan