Crows consciously control their calls

image: Carrion Crow vocalizing

Image: 
Tobias Machts

Crows can voluntarily control the release and onset of their calls, suggesting that songbird vocalizations are under cognitive control, according to a study published August 27 in the open-access journal PLOS Biology by Katharina Brecht of the University of Tübingen, and colleagues.

Songbirds are renowned for their acoustically elaborate songs; these show a degree of flexibility, potentially indicating that they are under conscious control. However, the observed variability in vocalizations might simply be driven by involuntary mechanisms, and need not be based on cognitive control. In the new study, Brecht and colleagues directly tested the idea that songbirds deliberately control their calls, in the sense that they can be emitted or inhibited at will, as opposed to being knee-jerk responses to food, mates, or predators.

The findings show that trained carrion crows (Corvus corone), songbirds of the corvid family, can exert control over their calls in a goal-directed manner. In a detection task, three male carrion crows rapidly learned to emit calls in response to a visual cue (colored squares) with no inherent meaning ("go-trials"), and to withhold calls in response to another cue. Two of these crows were then trained on a task with the cue colors reversed, in addition to being rewarded for withholding vocalizations to yet another cue ("nogo-trials").
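
The logic of such a go/nogo detection task can be summarized in a few lines of code. The sketch below is purely illustrative -- the cue identities, reward rules, and the toy session are stand-ins, not the study's actual protocol or data:

```python
# Minimal sketch of go/nogo trial scoring; cue identities and reward
# rules are illustrative stand-ins, not the study's actual parameters.

def trial_correct(cue: str, crow_called: bool) -> bool:
    """Return True if the crow's response to the cue earns a reward."""
    if cue == "go":              # e.g. a colored square with no inherent meaning
        return crow_called       # rewarded only if a call is emitted
    if cue == "nogo":            # a different cue signals "withhold the call"
        return not crow_called   # rewarded only if the call is withheld
    raise ValueError(f"unknown cue: {cue}")

session = [("go", True), ("go", True), ("nogo", False), ("nogo", True)]
accuracy = sum(trial_correct(c, r) for c, r in session) / len(session)
print(f"session accuracy: {accuracy:.0%}")  # -> 75% for this toy session
```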

Vocalizations in response to the detection of the go-cue were precisely timed and highly reliable in all three crows. The crows also quickly learned to withhold calls in nogo-trials, showing that the vocalizations were not merely produced in anticipation of a food reward on correct trials. According to the authors, further work is needed to evaluate the neurobiological basis of such cognitive vocal control in birds.

"Our study shows that crows can be thaught to control their vocalizations, just like primates can, and that their vocalizations are not just a reflexive response. This finding not only demonstrates once again the cognitive sophistication of the birds of the crow family. It also advances our understanding of the evolution of vocal control."

Credit: 
PLOS

Prehistoric puma poo reveals oldest parasite DNA ever recorded

image: The oldest parasite DNA ever recorded has been found in the ancient, desiccated faeces of a puma.

Image: 
Ancient parasitic DNA reveals Toxascaris leonina presence in Final Pleistocene of South America. Romina S. Petrigh, Jorge G. Martínez, Mariana Mondini and Martín H. Fugassa. Parasitology.

A team of Argentinian scientists from the National Council of Scientific and Technical Research (CONICET) made the discovery after studying a coprolite taken from a rock-shelter in the country's mountainous Catamarca Province, where the remains of now extinct megafauna have previously been recovered in stratigraphic excavations.

Radiocarbon dating revealed that the coprolite, and thus the parasitic roundworm eggs preserved inside it, dated back to between 16,570 and 17,000 years ago, towards the end of the last Ice Age.

At that time, the area around the shelter at Peñas de las Trampas in the southern Andean Puna is thought to have been wetter than it is today, making it a suitable habitat for megafauna like giant ground sloths, as well as smaller herbivores like American horses and South American camelids, which the pumas may have preyed on.

Ancient mitochondrial DNA analysis was used to confirm that the coprolite came from a puma (Puma concolor) and that the eggs belonged to Toxascaris leonina, a species of roundworm still commonly found in the digestive systems of modern-day cats, dogs and foxes.

The study, published in the journal Parasitology, explains that the extremely dry, cold and salty conditions that have prevailed at the Peñas de las Trampas site since the onset of the Holocene would have helped to slow the breakdown of the DNA, allowing it to be preserved.

Led by Romina Petrigh and Martín Fugassa, the study was carried out by an interdisciplinary team including archaeologists and biologists and is part of a project that views ancient faeces as important paleobiological reservoirs.

Dr Petrigh, from the National University of Mar del Plata and CONICET, said: "While we have found evidence of parasites in coprolites before, those remains were much more recent, dating back only a few thousand years. The latest find shows that these roundworms were infecting the fauna of South America before the arrival of the first humans in the area around 11,000 years ago."

She added: "I was very happy when I discovered how old this DNA was. It's difficult to recover DNA of such an old age as it usually suffers damage over time. Our working conditions had to be extremely controlled to avoid contamination with modern DNA, so we used special decontaminated reagents and disposable supplies. Several experiments were performed to authenticate the DNA sequences obtained and the efforts of the team of researchers who participated was essential."

The discovery marks a number of firsts: it represents the oldest record of an ancient DNA sequence for a gastrointestinal nematode parasite of wild mammals, the oldest molecular parasite record worldwide, and also a new maximum age for the recovery of old DNA of this origin.

For Dr Petrigh, the findings also cast light on both the past and the present. She said: "This work confirms the presence of T. leonina in prehistoric times, presumably even before that of humans in the region, and it represents the oldest record in the world. The common interpretation is that the presence of T. leonina in American wild carnivores today is a consequence of their contact with domestic dogs or cats, but that should no longer be assumed as the only possible explanation.

"Our aDNA studies have also confirmed the presence of pumas in the southern Puna at the end of the Pleistocene. This has significant implications for the natural history of the region, as well as for inferring the ecological context immediately before - as far as is known - the first human explorers ventured into the area."

She added: "The large number of eggs of T. leonina and its larva state in the puma coprolite analysed here indicate the high infective capacity of this parasite, involving a high risk for carnivores and for humans."

Credit: 
Cambridge University Press

Clinical trial shows alternate-day fasting a safe alternative to caloric restriction

image: This visual abstract reflects the findings of Stekovic et al., who show in the clinic that alternate-day fasting (ADF) is a simple alternative to calorie restriction and provokes similar improvements in cardiovascular parameters and body composition. ADF was shown to be safe and beneficial in healthy, non-obese humans, not impairing immune function or bone health.

Image: 
Stekovic et al./Cell Metabolism

In recent years there has been a surge in studies looking at the biological effects of different kinds of fasting diets in both animal models and humans. These diets include continuous calorie restriction, intermittent fasting, and alternate-day fasting (ADF). Now the largest study of its kind to look at the effects of strict ADF in healthy people has shown a number of health benefits. The participants alternated 36 hours of zero-calorie intake with 12 hours of unlimited eating. The findings are reported August 27 in the journal Cell Metabolism.

"Strict ADF is one of the most extreme diet interventions, and it has not been sufficiently investigated within randomized controlled trials," says Frank Madeo, a professor of the Institute of Molecular Biosciences at Karl-Franzens University of Graz in Austria. "In this study, we aimed to explore a broad range of parameters, from physiological to molecular measures. If ADF and other dietary interventions differ in their physiological and molecular effects, complex studies are needed in humans that compare different diets."

In this randomized controlled trial, 60 participants were enrolled for four weeks and randomized to either an ADF or an ad libitum control group, the latter of which could eat as much as they wanted. Participants in both groups were all of normal weight and were healthy. To ensure that the people in the ADF group did not take in any calories during fast days, they underwent continuous glucose monitoring. They were also asked to fill in diaries documenting their fasting days. Periodically, the participants had to go to a research facility, where they were instructed on whether to follow ADF or their usual diet, but other than that they lived their normal, everyday lives.

Additionally, the researchers studied a group of 30 people who had already practiced more than six months of strict ADF prior to study enrollment. They compared them to normal, healthy controls who had no fasting experience. For this ADF cohort, the main focus was to examine the long-term safety of the intervention.

"We found that on average, during the 12 hours when they could eat normally, the participants in the ADF group compensated for some of the calories lost from the fasting, but not all," says Harald Sourij, a professor at the Medical University of Graz. "Overall, they reached a mean calorie restriction of about 35% and lost an average of 3.5 kg [7.7 lb] during four weeks of ADF."

The investigators found several biological effects in the ADF group:

* The participants had fluctuating downregulation of amino acids, in particular the amino acid methionine. Amino acid restriction has been shown to cause lifespan extension in rodents.

* They had continuous upregulation of ketone bodies, even on nonfasting days. This has been shown to promote health in various contexts.

* They had reduced levels of sICAM-1, a marker linked to age-associated disease and inflammation.

* They had lowered levels of triiodothyronine without impaired thyroid gland function. Previously, lowered levels of this hormone have been linked to longevity in humans.

* They had lowered levels of cholesterol.

* They had a reduction of lipotoxic android trunk fat mass--commonly known as belly fat.

"Why exactly calorie restriction and fasting induce so many beneficial effects is not fully clear yet," says Thomas Pieber, head of endocrinology at the Medical University of Graz. "The elegant thing about strict ADF is that it doesn't require participants to count their meals and calories: they just don't eat anything for one day."

The investigators point to other benefits that ADF may have, compared with continuous calorie restriction. Previous studies have suggested calorie-restrictive diets can result in malnutrition and a decrease in immune function. In contrast, even after six months of ADF, the immune function in the participants appeared to be stable.

"The reason might be due to evolutionary biology," Madeo explains. "Our physiology is familiar with periods of starvation followed by food excesses. It might also be that continuous low-calorie intake hinders the induction of the age-protective autophagy program, which is switched on during fasting breaks."

Despite the benefits, the researchers say they do not recommend ADF as a general nutrition scheme for everybody. "We feel that it is a good regime for some months for obese people to cut weight, or it might even be a useful clinical intervention in diseases driven by inflammation," Madeo says. "However, further research is needed before it can be applied in daily practice. Additionally, we advise people not to fast if they have a viral infection, because the immune system probably requires immediate energy to fight viruses. Hence, it is important to consult a doctor before any harsh dietary regime is undertaken."

In the future, the researchers plan to study the effects of strict ADF in different groups of people including people with obesity and diabetes. They also plan to compare ADF to other dietary interventions and to further explore the molecular mechanisms in animal models.

Credit: 
Cell Press

These albino lizards are the world's first gene-edited reptiles

video: This video shows the gene-editing procedure used to create albino lizards.

Image: 
Rasys et al./Cell Reports

Meet the world's first gene-edited reptiles: albino lizards roughly the size of your index finger. Researchers used CRISPR-Cas9 to make the lizards, providing a technique for gene editing outside of major animal models. In their study, publishing August 27 in the journal Cell Reports, the researchers also show that the lizards can successfully pass gene-edited alleles for albinism to their offspring.

"For quite some time we've been wrestling with how to modify reptile genomes and manipulate genes in reptiles, but we've been stuck in the mode of how gene editing is being done in the major model systems," says corresponding author Doug Menke, an associate professor at the University of Georgia. "We wanted to explore anole lizards to study the evolution of gene regulation, since they've experienced a series of speciation events on Caribbean islands, much like Darwin's finches of the Galapagos."

The way gene editing is performed in most model systems is to inject CRISPR-Cas9 gene-editing reagents into freshly fertilized eggs or single-cell zygotes. But this technique cannot be used in reptiles, Menke says, because lizards have internal fertilization and the time of fertilization cannot be predicted. An isolated single-cell embryo from a female lizard also cannot be easily transferred, making it almost impossible to manipulate outside of the lizard.

But Menke and his research team noticed that the transparent membrane over the ovary allowed them to see all of the developing eggs, including which eggs were going to be ovulated and fertilized next. They decided to inject the CRISPR reagents into the unfertilized eggs within the ovaries and see if the CRISPR would still work.

"Because we are injecting unfertilized eggs, we thought that we would only be able to perform gene editing on the alleles inherited from the mother. Paternal DNA isn't in these unfertilized oocytes," Menke says. "We had to wait three months for the lizards to hatch, so it's a bit like slow-motion gene editing. But it turns out that when we did this procedure, about half of the mutant lizards that we generated had gene-editing events on the maternal allele and the paternal allele."

This suggests that the CRISPR components remain active for several days, or even weeks, within the unfertilized eggs. After screening the offspring, the researchers found that about 6% to 9% of the oocytes, depending on their size, produced offspring with gene-editing events.

"Relative to the very established model systems that can have efficiencies up to 80% or higher, 6% seems low, but no one has been able to do these sorts of manipulations in any reptile before," Menke says. "There's not a large community of developmental geneticists that are studying reptiles, so we're hoping to tap into exciting functional biology that has been unexplored."

Menke says that his team had two reasons for making the lizards albino, as opposed to editing other traits. First, when the tyrosinase gene, whose loss underlies this form of albinism, is knocked out, the result is a loss of pigmentation that is not lethal to the animal. Second, since humans with albinism often have vision problems, the researchers hope to use the lizards as a model to study how the loss of this gene impacts retina development.

"Humans and other primates have a feature in the eye called the fovea, which is a pit-like structure in the retina that's critical for high-acuity vision. The fovea is absent in major model systems, but is present in anole lizards, as they rely on high-acuity vision to prey on insects," Menke says.

Studying gene functions in reptiles offers new opportunities for exploring aspects of development that are best studied in non-established animal models, Menke says. And ultimately, this gene-editing technique could be translated for use in other animals.

"We never know where the next major insights are going to come from, and if we can't even study how genes work in a huge group of animals, then there's no way to know if we've explored everything there is to explore in the realm of gene function in animals," Menke says. "Each species undoubtedly has things to tell us, if we take the time to develop the methods to perform gene editing."

Credit: 
Cell Press

Possible treatment on the horizon for severe dengue disease

image: Live imaging demonstrates vascular leakage over time from the blood vessels of DENV-infected preclinical models (lane 2). The leakage was reversed when treated with the tryptase inhibitor nafamostat mesylate (lane 3).

Image: 
Duke-NUS Medical School

SINGAPORE, 27 August 2019 - Researchers led by Duke-NUS Medical School have discovered that tryptase, an enzyme in human cells that acts like scissors to cut up nearby proteins, is responsible for blood vessel leakage in severe dengue haemorrhagic fever. The finding suggests a possible new treatment strategy using the tryptase inhibitor, nafamostat mesylate, for severe dengue disease - a potentially fatal condition for which no targeted treatment is currently available.

The dengue virus infects about 390 million people globally each year, causing substantial morbidity and mortality. While most patients experience dengue fever or a mild form of the disease, a small percentage develops dengue haemorrhagic fever (DHF), the more severe occurrence of dengue wherein blood 'leaks' from ruptured blood vessels. This can lead to dengue shock syndrome (DSS) - the final stage of DHF - where the circulatory system fails, sending the body into bleeding and shock, which is fatal without prompt treatment. How dengue patients go on to develop these severe conditions had not been clearly understood and, as a result, no targeted treatments have been developed to prevent hemorrhaging or reverse shock in infected patients.

"We discovered that, in severe cases, a particular enzyme called tryptase cuts the proteins that act as seals between blood vessel cells, resulting in blood vessel leakage and shock during dengue infection," said Assistant Professor Ashley St. John, from Duke-NUS' Emerging Infectious Diseases Programme, corresponding author of the study.

Based on this finding, the team wanted to know if a drug specific to inhibiting tryptase could be used to treat the hemorrhaging. Nafamostat mesylate, a clinically-approved tryptase inhibitor with a good safety profile, was tested using preclinical models. They found that administration of this drug, which is already used for the treatment of certain bleeding complications in some countries, prevented vascular leakage in the dengue model. Even delayed treatment with the drug was significantly effective in reducing dengue vascular leakage in a preclinical model of severe disease. The team also observed tryptase levels were very high in the blood of severe dengue patients who experienced DHF/DSS, but low in patients who easily recovered, affirming the link between high levels of the enzyme and severe dengue disease.

"Currently, only supportive care is available to patients suffering from severe dengue disease, with no targeted treatment for this potentially fatal condition. We believe our findings raise the possibility of developing new targeted treatments for dengue and, specifically, one that might be able to prevent shock," added Asst Prof St. John.

"We are currently experiencing a surge in dengue cases in Singapore," said Professor Patrick Casey, Senior Vice Dean for Research at Duke-NUS. "This timely study by our researchers not only holds out hope for a promising new strategy to treat severe dengue disease, but could also have broader implications for the treatment of other haemorrhagic diseases."

The study authors say the next step is to conduct clinical trials to test if tryptase inhibitors can reverse dengue vascular leakage and prevent shock in humans.

Credit: 
Duke-NUS Medical School

Retina-on-a-chip provides powerful tool for studying eye disease

image: This is the retina-on-a-chip technology.

Image: 
Fraunhofer IGB

The development of a retina-on-a-chip, which combines living human cells with an artificial tissue-like system, has been described today in the open-access journal eLife.

This cutting-edge tool may provide a useful alternative to existing models for studying eye disease and allow scientists to test the effects of drugs on the retina more efficiently.

Many diseases that cause blindness harm the retina, a thin layer of tissue at the back of the eye that helps collect light and relay visual information to the brain. The retina is also vulnerable to harmful side effects of drugs used to treat other diseases such as cancer. Currently, scientists often rely on animals or retina organoids, tiny retina-like structures grown from human stem cells, to study eye diseases and drug side effects. But results from studies in both models often fail to describe disease and drug effects in people accurately. As a result, a team of scientists have tried to recreate a retina for testing purposes using engineering techniques.

"It is extremely challenging, if not almost impossible, to recapitulate the complex tissue architecture of the human retina solely using engineering approaches," explains Christopher Probst, Postdoctoral Researcher at the Fraunhofer Institute for Interfacial Engineering and Biotechnology in Stuttgart, Germany, and co-lead author of the current study.

To overcome these challenges, the scientists coaxed human pluripotent stem cells to develop into several different types of retina cells on artificial tissue. This tissue recreates the environment that cells would experience in the body and delivers nutrients and drugs to the cells through a system that mimics human blood vessels.

"This combination of approaches enabled us to successfully create a complex multi-layer structure that includes all cell types and layers present in retinal organoids, connected to a retinal pigment epithelium layer," says co-lead author Kevin Achberger, Postdoctoral Researcher at the Department of Neuroanatomy & Developmental Biology at the Eberhard Karls University of Tu?bingen, Germany. "It is the first demonstration of a 3D retinal model that recreates many of the structural characteristics of the human retina and behaves in a similar way."

The team treated their retina-on-a-chip with the anti-malaria drug chloroquine and the antibiotic gentamicin, which are toxic to the retina. They found that the drugs had a toxic effect on the retinal cells in the model, suggesting that it could be a useful tool for testing for harmful drug effects.

"One advantage of this tiny model is that it could be used as part of an automated system to test hundreds of drugs for harmful effects on the retina very quickly," Achberger says. "Also, it may enable scientists to take stem cells from a specific patient and study both the disease and potential treatments in that individual's own cells."

"This new approach combines two promising technologies - organoids and organ-on-a-chip - and has the potential to revolutionise drug development and usher in a new era of personalised medicine," concludes senior author Peter Loskill, Assistant Professor for Experimental Regenerative Medicine at the Eberhard Karls University of Tu?bingen, and head of the Fraunhofer Attract group Organ-on-a-Chip at the Fraunhofer Institute for Interfacial Engineering and Biotechnology. His laboratory, which spans the two universities, is already developing similar organ-on-a-chip technology for the heart, fat, pancreas and more.

Credit: 
eLife

How worms snare their hosts

image: Parasite-infected shrimp give themselves away with an orange spot, which is visible with the naked eye through the transparent shell.

Image: 
© Nicole Bersau/Uni Bonn

Acanthocephala are parasitic worms that reproduce in the intestines of various animals, including fish. However, only certain species of fish are suitable as hosts. A study by the University of Bonn now shows how the parasites succeed in preferably infecting these types. The results will be published in the journal Behaviour, but are already available online.

The parasitic worm Pomphorhynchus laevis does not have an easy life: In order to reproduce, the parasite must first hope that its eggs will be eaten by a freshwater shrimp. The larvae hatching from the eggs then need a change of scenery: They can only develop into adult worms if they are swallowed by a fish. However, not every fish species is suitable as a final host. Some species have defense mechanisms that kill the parasite before it can mate and release new eggs into the water through the fish intestine.

In order to improve their chances of reproduction, the worms have developed several sophisticated strategies in the course of evolution. "For example, parasite-infected shrimp change their behavior," explains Dr. Timo Thünken from the Institute for Evolutionary Biology and Ecology at the University of Bonn. "They no longer avoid certain fish species and are therefore eaten more frequently." Another hypothesis has so far been controversial: Freshwater shrimp are beige-brownish; their body shell is also relatively transparent. They therefore barely stand out from their surroundings. Pomphorhynchus laevis larvae, on the other hand, are bright orange. It is therefore possible to see with the naked eye whether a shrimp is infected: Its parasitic cargo is marked by an orange spot.

Infected shrimp attract more attention

It may be that the shrimp are less well camouflaged as a result and are therefore more frequently eaten by fish. Study director Prof. Dr. Theo Bakker investigated this hypothesis several years ago. He was indeed able to determine that shrimp with an orange mark ended up in the stomachs of sticklebacks more frequently. Yet this finding was not confirmed in studies with brown trout.

However, the brown trout, in contrast to the stickleback, is not a suitable final host for Pomphorhynchus laevis. Its immune system usually prevents a successful infection with the parasite. "It is therefore possible that the orange coloring attracts particularly those fish that are especially suitable for the parasite's further reproduction," Thünken suspects. "We have now conducted experiments to put this hypothesis to the test."

The biologists marked the shrimp with an orange dot in order to simulate larval infestation. Then they tested how often the marked shrimp were eaten by different fish in comparison to unmarked individuals. The mark did in fact increase the risk of being eaten - but only for some types of fish: Barbels and sticklebacks were particularly interested in the marked freshwater shrimp; the dot made no difference to brown trout.

In another experiment, the researchers fed their fish exclusively with larvae-infested shrimp. "We were able to show that this diet often led to infection in barbels and sticklebacks, but very rarely in brown trout," explains Thünken. Evidently, their conspicuous coloring ensures that the larvae end up mainly in the stomach of suitable final hosts. However, it is unclear whether they have acquired their orange hue in the course of evolution in order to reach precisely these hosts. "Perhaps over time they have simply adapted better to the digestive tract of those fish that responded particularly strongly to the orange color," says Thünken.

The study also shows that the striking coloring of the larvae is not without disadvantages. The scientists also tested a different population of sticklebacks in their experiments. In contrast to their counterparts, this group avoided the marked shrimp: In the course of evolution they may have learned to interpret the orange coloring as a warning signal.

Credit: 
University of Bonn

New drug combination shows promising activity in non-small cell lung cancer patients

TAMPA, Fla. - Patients with non-small cell lung cancer (NSCLC) now have improved treatment options beyond the standard of care, thanks to the addition of several new agents called immune-checkpoint inhibitors (ICI). Despite these advances, many patients still develop progressive disease after ICI treatment. In a new study published in Clinical Cancer Research, Moffitt Cancer Center researchers describe promising results from an early clinical trial that may offer patients who progress after ICI an additional treatment option.

Several ICI agents have been approved in recent years to treat NSCLC, including nivolumab, atezolizumab and pembrolizumab. ICIs function by stimulating the immune system to target cancer cells for destruction. However, patients may become resistant to ICIs and develop progressive disease. Many of these patients who have poor responses to ICIs have low numbers of T cells in their tumor environment. Therefore, the cancer community has been trying to find drugs that work in conjunction with ICIs to increase the number and activity of T cells in a tumor.

Several preclinical studies, including studies conducted at Moffitt, have shown that drugs called histone deacetylase inhibitors (HDAC inhibitors) are capable of stimulating the immune system and enhancing the response to ICIs. "In our preclinical studies, we reported that HDAC inhibitors improve response to PD-1 blockade in mouse models of lung cancer by increasing T cell trafficking to tumors and enhancing T cell function," explained Amer Beg, Ph.D., senior member of the Department of Immunology at Moffitt.

Given the positive results of preclinical studies with ICIs and HDAC inhibitors, Moffitt researchers wanted to assess the potential benefit of these agents in patients with NSCLC. They conducted a phase 1/1b study of pembrolizumab plus the HDAC inhibitor vorinostat in 33 patients with advanced or metastatic NSCLC.

"To our knowledge, this represents the first publication of the clinical trial combination of ICI with an HDAC inhibitor in lung cancer. We found that this combination was well tolerated and demonstrated preliminary anti-tumor activity in patients who were refractory to prior ICI treatment," said Jhanelle Gray, M.D., assistant member of the Department of Thoracic Oncology at Moffitt.

The researchers report that the most common adverse events during treatment were fatigue (33%), nausea (27%) and vomiting (27%). Immune-activating drugs are often associated with adverse events related to increased immune system activity. The immune-related adverse events experienced by patients during the study were similar to those reported previously for pembrolizumab, with the most common event being hypothyroidism in 15% of patients. The combination of pembrolizumab and vorinostat also demonstrated preliminary anti-tumor activity. Of 30 patients who were evaluable for efficacy, 4 had a partial response to treatment and 16 developed stable disease, for a disease control rate of 67%.
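
The disease control rate simply pools responders and stable disease among evaluable patients; a quick check of the arithmetic:

```python
# Disease control rate = (partial responses + stable disease) / evaluable patients
partial_responses, stable_disease, evaluable = 4, 16, 30
dcr = (partial_responses + stable_disease) / evaluable
print(f"disease control rate: {dcr:.0%}")  # (4 + 16) / 30 -> 67%
```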

During the study, the researchers conducted a correlative analysis to determine if any blood or tissue biomarkers were associated with better patient outcomes to the combination treatment. They discovered that patients who had higher levels of T cells within the stromal environment before treatment had improved outcomes to therapy. According to the researchers, these observations suggest that vorinostat may sensitize tumors to ICIs by causing the T cells to migrate from the stroma to the tumor bed.

The researchers will further investigate this hypothesis and the activity of pembrolizumab and vorinostat in an ongoing phase 2 trial in patients with advanced/metastatic NSCLC who did not previously receive ICI treatment. "We believe our results lay the groundwork for future trials to assess the impact of epigenetic agents on ICI response, and for the discovery of biomarkers to assess the dynamic nature of the immune response early in a patient's treatment course," said Beg.

Credit: 
H. Lee Moffitt Cancer Center & Research Institute

The dark sides of extrasolar planets share surprisingly similar temperatures

image: Nightsides of hot Jupiters share clouds made of minerals.

Image: 
McGill University/Dylan Keating

A new study by McGill University astronomers has found that the temperature on the nightsides of different hot Jupiters -- planets that are similar in size to Jupiter but orbit other stars -- is surprisingly uniform, suggesting the dark sides of these massive gaseous planets have clouds made of minerals and rocks.

Using data from the Spitzer and Hubble space telescopes, the researchers from the McGill Space Institute found that the nightside temperature of the 12 hot Jupiters they studied was about 800°C.

Unlike our familiar planet Jupiter, so-called hot Jupiters circle very close to their host star -- so close that it typically takes fewer than three days to complete an orbit. As a result, hot Jupiters have daysides that permanently face their host stars and nightsides that always face the darkness of space, similarly to how the same side of the Moon always faces the Earth. The tight orbit also means these planets receive more light from their star, which is what makes them extremely hot on the dayside. But scientists had previously measured significant amounts of heat on the nightside of hot Jupiters, as well, suggesting some kind of energy transfer from one side to the other.
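
For context, the link between orbital distance and temperature follows from a standard textbook energy-balance relation (not taken from the study): a planet with Bond albedo $A_B$ orbiting at distance $a$ from a star of radius $R_\star$ and temperature $T_\star$ has an equilibrium temperature, assuming uniform heat redistribution, of

$$T_{\rm eq} = T_\star \sqrt{\frac{R_\star}{2a}}\,(1 - A_B)^{1/4},$$

so halving the orbital distance raises the temperature by a factor of $\sqrt{2}$, which is why such tight orbits make hot Jupiters so hot on the dayside.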

"Atmospheric circulation models predicted that the nightside temperatures should vary much more than they do," said Dylan Keating, a Physics PhD student under the supervision of McGill professor Nicolas Cowan. "This is surprising because the planets we studied all receive different amounts of irradiation from their host stars and the dayside temperatures among them varies by almost 1700°C."

Keating, the first author of a new Nature Astronomy study describing the findings, said the nightside temperatures are probably the result of condensation of vaporized rock in these very hot atmospheres.

"The uniformity of the nightside temperatures suggests that clouds on this side of the planets are likely similar to one another in composition. Our analysis suggests that these clouds are likely made of minerals such as manganese sulfide or silicates: in other words, rocks," Keating explained.

According to Cowan, because the basic physics of cloud formation are universal, the study of the nightside clouds on hot Jupiters could give insight into cloud formation elsewhere in the Universe, including on Earth. Keating said that future space telescope missions - such as the James Webb Space Telescope and the European Space Agency's ARIEL mission - could be used to further characterize the dominant cloud composition on hot Jupiter nightsides, as well as to improve models of atmospheric circulation and cloud formation of these planets.

"Observing hot Jupiters at both shorter and longer wavelengths will help us determine what types of clouds are on the nightsides of these planets," Keating explained.

Credit: 
McGill University

Machine learning increases resolution of eye imaging technology

video: As the tissue sample on the left rotates under a traditional OCT scan, computational imaging gradually builds the OCRT image on the right until the resolution has peaked in all directions.

Image: 
Kevin Zhou, Duke University

DURHAM, N.C. -- Biomedical engineers at Duke University have devised a method for increasing the resolution of optical coherence tomography (OCT) down to a single micrometer in all directions, even in a living patient. The new technique, called optical coherence refraction tomography (OCRT), could improve medical images obtained in the multibillion-dollar OCT industry for medical fields ranging from cardiology to oncology.

The results appear in a paper published online on August 19 in the journal Nature Photonics.

"An historic issue with OCT is that the depth resolution is typically several times better than the lateral resolution," said Joseph Izatt, the Michael J. Fitzpatrick Professor of Engineering at Duke. "If the layers of imaged tissues happen to be horizontal, then they're well defined in the scan. But to extend the full power of OCT for live imaging of tissues throughout the body, a method for overcoming the tradeoff between lateral resolution and depth of imaging was needed."

OCT is an imaging technology analogous to ultrasound that uses light rather than sound waves. A probe shoots a beam of light into a tissue and, based on the delays of the light waves as they bounce back, determines the boundaries of the features within. To get a full picture of these structures, the process is repeated at many horizontal positions over the surface of the tissue being scanned.

Because OCT provides much better resolution of depth than lateral direction, it works best when these features contain mostly flat layers. When objects within the tissue have irregular shapes, the features become blurred and the light refracts in different directions, reducing the image quality.

Previous attempts at creating OCT images with high lateral resolution have relied on holography -- painstakingly measuring the complex electromagnetic field reflected back from the object. While this has been demonstrated, the approach requires the sample and imaging apparatus to remain perfectly still down to the nanometer scale during the entire measurement.

"This has been achieved in a laboratory setting," said Izatt, who also holds an appointment in ophthalmology at the Duke University School of Medicine. "But it is very difficult to achieve in living tissues because they live, breathe, flow and change."

In the new paper, Izatt and his doctoral student, Kevin Zhou, take a different approach. Rather than relying on holography, the researchers combine OCT images acquired from multiple angles to extend the depth resolution to the lateral dimension. Each individual OCT image, however, becomes distorted by the light's refraction through irregularities in the cells and other tissue components. To compensate for these altered paths when compiling the final images, the researchers needed to accurately model how the light is bent as it passes through the sample.

To accomplish this computational feat, Izatt and Zhou turned to their colleague Sina Farsiu, the Paul Ruffin Scarborough Associate Professor of Engineering at Duke, who has a long history of using machine learning tools to create better images for health care applications.

Working with Farsiu, Zhou developed a method using "gradient-based optimization" to infer the refractive index within the different areas of tissue based on the multi-angle images. This approach determines the direction in which the given property -- in this case the refractive index -- needs to be adjusted to create a better image. After many iterations, the algorithm creates a map of the tissue's refractive index that best compensates for the light's distortions. The method was implemented using TensorFlow, a popular software library created by Google for deep learning applications.
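
As a schematic of what such a gradient-based optimization loop looks like in TensorFlow, consider the minimal sketch below. It is written under toy assumptions -- the forward model is a simple stand-in for the real refraction physics, the 16x16 map and optimizer settings are arbitrary, and none of this is the authors' code:

```python
# Minimal sketch of gradient-based optimization in the spirit of OCRT
# (not the authors' code). A toy "refractive index" map is adjusted so a
# simple differentiable forward model matches multi-angle measurements.
import tensorflow as tf

tf.random.set_seed(0)
true_index = tf.random.uniform((16, 16), 1.33, 1.40)  # hypothetical ground truth

def forward(index_map):
    """Toy forward model: per-angle projections (stand-in for refraction physics)."""
    return tf.stack([tf.reduce_sum(index_map, axis=0),   # "0 degree" projection
                     tf.reduce_sum(index_map, axis=1)])  # "90 degree" projection

measured = forward(true_index)                  # simulated measurements
estimate = tf.Variable(tf.fill((16, 16), 1.36)) # initial guess for the index map
opt = tf.keras.optimizers.Adam(learning_rate=0.01)

for step in range(500):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean((forward(estimate) - measured) ** 2)
    grads = tape.gradient(loss, [estimate])
    opt.apply_gradients(zip(grads, [estimate]))  # gradient-based update

print(f"final mismatch: {loss.numpy():.3e}")
```

In the actual method, the forward model would trace refracted rays through the index map for each acquisition angle, but the structure of the loop is the same: differentiate a mismatch loss with respect to the refractive-index map and iterate until the map best compensates for the light's distortions.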

"One of the many reasons why I find this work exciting is that we were able to borrow tools from the machine learning community and apply them not only to post-process OCT images, but also to combine them in a novel way and extract new information," said Zhou. "I think there are many applications of these deep learning libraries such as TensorFlow and PyTorch, outside of the standard tasks such as image classification and segmentation."

For these proof-of-concept experiments, Zhou took tissue samples such as the bladder or trachea of a mouse, placed them in a tube, and rotated the samples 360 degrees beneath an OCT scanner. The algorithm successfully created a map of each sample's refractive index, increasing the lateral resolution of the scan by more than 300 percent while reducing the background noise in the final image. While the study used samples already removed from the body, the researchers believe OCRT can be adapted to work in a living organism.

"Rather than rotating the tissue, a scanning probe developed for this technique could rotate the angle of the beam on the tissue surface," said Zhou.

Zhou is already investigating how much a corneal scan could be improved by the technology with less than a 180-degree sweep, and the results appear promising. If successful, the technique could be a boon to many medical imaging needs.

"Capturing high-resolution images of the conventional outflow tissues in the eye is a long sought-after goal in ophthalmology," said Farsiu, referring to the eye's aqueous humor drainage system. "Having an OCT scanner with this type of lateral resolution would be very important for early diagnosis and finding new therapeutic targets for glaucoma."

"OCT has already revolutionized ophthalmic diagnostics by advancing noninvasive microscopic imaging of the living human retina," said Izatt. "We believe that with further advances such as OCRT, the high impact of this technology may be extended not only to additional ophthalmic diagnostics, but to imaging of pathologies in tissues accessible by endoscopes, catheters, and bronchoscopes throughout the body."

Credit: 
Duke University

Would a carbon tax spur innovation in more-efficient energy use?

Washington, DC-- Taxing carbon emissions would drive innovation and lead to improved energy efficiency, according to a new paper published in Joule from Carnegie's Rong Wang (now at Fudan University), Harry Saunders, and Ken Caldeira, along with Juan Moreno-Cruz of the University of Waterloo.

Despite advances in solar, wind, and other renewable energy sources, fossil fuels remain the primary source of climate-change-causing carbon emissions. In order to halt global warming at the 2 degrees Celsius limit set by the Paris Agreement, we must reduce and eventually stop or completely offset the carbon released into the atmosphere by the burning of oil, coal, and gas.

"It has long been theorized that raising carbon prices would provide an incentive to reduce emissions through energy efficiency improvements," explained lead author Rong. "So, we looked to history to determine how cost increases have affected energy use efficiency in the past."

The researchers developed their own version of the productivity model created by Nobel Prize-winning economist Robert Solow.

They found that historically, in various countries, when the cost of energy comprised a larger fraction of the cost of production, those countries found new ways to reduce energy use or to use it more efficiently. Rong and his colleagues asked what would happen if these historical relationships between energy costs and efficiency improvements continued into the future. When this dynamic was continuously in play, according to their model, by 2100 energy usage would be reduced by up to 30 percent relative to simulations where this dynamic was not considered.
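
One simple way to formalize this dynamic -- our illustrative assumption, not necessarily the authors' exact specification -- is a Cobb-Douglas production function in which effective energy input is scaled by an efficiency factor that improves faster when energy's cost share is larger:

$$Y = A\,K^{\alpha}L^{\beta}(\tau E)^{\gamma}, \qquad \frac{\dot{\tau}}{\tau} \propto \frac{p_E E}{\text{total production cost}},$$

where $Y$ is output, $K$ capital, $L$ labor, $E$ energy use, $\tau$ energy efficiency, and $p_E$ the energy price. A carbon tax raises $p_E$, increasing energy's cost share and thereby the rate of efficiency improvement.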

"Other studies have examined how taxing carbon emission would drive innovation in renewables," explained Caldeira. "But we show that it would also lead to more-efficient consumption of energy--not just by getting people to use better existing technology, but also by motivating people to innovate better ways to use energy. This means that solving the climate problem, while still hard, is a little easier than previously believed."

Credit: 
Carnegie Institution for Science

Researchers develop affordable, less intensive methane detection protocol

image: Tao Wen (right), a postdoctoral scholar in the Earth and Environmental Systems Institute at Penn State, examines a dead vegetation zone on a farm in the Gregs Run watershed, Lycoming County.

Image: 
Josh Woda, Penn State

A new testing protocol that uses existing, affordable water chemistry tests can help scientists and regulators detect sites showing evidence of new methane gas leaks caused by oil and gas drilling, according to Penn State researchers.

The researchers took a testing protocol they had described in a paper last year in the Proceedings of the National Academy of Sciences and applied it to a much larger dataset of domestic water wells in three regions of Pennsylvania impacted by the fossil fuel industry. They looked for certain chemical constituents in the test results to determine if methane may have impacted the sites when the samples were collected. They published their findings in the journal Environmental Science & Technology and made the datasets public for the first time.

The scientists wanted to see what percentage of the water wells showed certain chemical changes that could indicate new methane contamination, like that which can occur during drilling and extraction of fossil fuels, and not pre-existing methane that is commonly found in Pennsylvania water.

"We expected to see few sites, less than 1%, showing evidence of new methane," said Tao Wen, a postdoctoral scholar in the Earth and Environmental Systems Institute at Penn State. "We found 17 out of 20,751 samples, or about 0.08 %, that showed possible signs of methane contamination when those samples were collected."

Unconventional shale gas wells dominate northeast Pennsylvania, whereas conventional oil and gas wells, including the first commercial oil well in the United States, dominate the northwest. The southwest has both conventional and unconventional oil and gas wells and a significant coal mining history.

The researchers divided the water samples into five types. The two types that the scientists defined as samples most likely impacted by new methane contained high methane and sulfate levels and either low or high iron levels.

"It's not uncommon to see methane in groundwater in the Marcellus shale and other shale plays," Wen said. "Also, if methane had been in the groundwater for a long time, bacteria would have reduced the iron and sulfate. The reduced forms would have precipitated as iron sulfide, or pyrite."

The researchers classified low-methane samples, where methane measured less than 10 parts per million, as low priority. The other two types not impacted by new methane contained high amounts of methane and either high salts, indicating naturally occurring methane not caused by energy extraction, or fresh water and low sulfate levels, meaning that the methane had been present for some time.
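
Read as a screening rule, the five sample types amount to a short decision procedure. The sketch below captures one simplified reading of that logic; only the 10 ppm cutoff comes from the text, the "high sulfate" and "salty" flags are placeholders a real implementation would derive from lab values, and the iron level (which may be high or low in the two flagged types) does not change the screening outcome:

```python
# Minimal sketch of the screening logic described above. Only the 10 ppm
# methane cutoff comes from the study; the boolean inputs are placeholders.
METHANE_CUTOFF_PPM = 10

def screen_sample(methane_ppm: float, sulfate_high: bool, salty: bool) -> str:
    """Assign a domestic water-well sample to a screening category."""
    if methane_ppm < METHANE_CUTOFF_PPM:
        return "low priority (little methane)"
    if salty:
        return "naturally occurring methane (not from energy extraction)"
    if sulfate_high:
        # Unreduced sulfate alongside high methane suggests recent arrival:
        # bacteria have not yet had time to reduce the iron and sulfate.
        return "possible new methane contamination"
    return "pre-existing methane (fresh water, low sulfate)"

print(screen_sample(50.0, sulfate_high=True, salty=False))
```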

Of the 17 samples that came back positive for new methane, 13 came from the northeast. None came from sites within 2,500 feet of known problematic gas wells. State law holds oil and gas companies responsible for methane leaks that affect wells within that 2,500-foot area. The researchers' findings suggest that methane may migrate farther than previously thought if the new methane was derived from these known problematic gas wells. Only intensive field investigations could show whether this happened.

The testing protocol can act as an effective screening tool for methane contamination and narrow the pool of samples needing a more in-depth analysis, such as analyses using stable carbon or noble gas isotopes, according to Wen.

"We focus on the Marcellus shale, but this testing protocol has the potential to be applied to other shale plays in the United States and other countries," he said. "It can benefit the global community."

Credit: 
Penn State

Corruption among India's factory inspectors makes labour regulation costly

New research shows that 'extortionary' corruption on the part of factory inspectors in India is helping to drive up the cost of the country's labour regulations to business.

University of Kent economist Dr Amrit Amirapu, along with Dr Michael Gechter of the Pennsylvania State University, US, conducted the research, which found that a particular set of regulations - including mandatory benefits, workplace safety provisions and reporting requirements - increase firms' per-worker labour costs by 35% on average across India.

The study also found that the effective cost of the regulations varies widely and is much higher than the 35% average figure in regions and industries that are more exposed to corruption by public officials.

The study also shed light on a puzzle: some of the most significant problems faced by developing countries - including low labour force participation rates and low levels of employment in the formal sector - are often blamed on restrictive labour regulations, even though these regulations are usually quite similar to those found in rich countries, at least on paper.

The results suggest that corruption by inspectors, often involving bribery, adds to the cost burden faced by many Indian firms. For example, businesses located in Indian states that had reformed their inspector-related regulations in a positive way faced lower effective regulatory costs.

Dr Amirapu, of the University's School of Economics, said: 'Our results suggest a mechanism that may explain why these regulations are so costly in a developing country context: high de facto regulatory costs appear to be driven by extortionary corruption on the part of inspectors.'

The study concludes that the size of regulatory costs in practice has more to do with the way regulations are implemented than with the content of the specific laws themselves.

Credit: 
University of Kent

The genealogy of an important broiler ancestor revealed

image: Adult females of the HWS and LWS body weight selected White Plymouth Rock lines.

Image: 
Christa F. Honaker

A new study examines the historical and genetic origins of the White Plymouth Rock chicken, an important contributor to today's meat chickens (broilers). Researchers at Uppsala University in Sweden, The Livestock Conservancy and Virginia Tech in the USA have used genomics to study breed formation and the roots of modern broilers.

The mid-19th century was an era of excitement among poultry breeders. Newly imported chickens from Asia were crossed with American landrace chickens and specialty breeds from Europe to establish new breeds and varieties that were standardized by the American Poultry Association beginning in 1873. With contributions by multiple breeders using different strategies, histories of these American breeds are sometimes unclear or inconsistent.

Two well-known lines of chickens developed at Virginia Tech represented the White Plymouth Rock. The HWS and LWS lines have been selected since 1957 for high and low body weights, respectively, and are considered representative of the White Plymouth Rock breed as of the mid-20th century in the USA. The research team sequenced DNA from HWS, LWS, and the eight breeds generally considered to have been used to develop the White Plymouth Rock. They then ascertained the percentage of genetic contribution made by each of the founding breeds. Furthermore, by measuring each breed's contribution to individual chromosomes, they were able to determine contribution to specific traits on those chromosomes. Contributions to the male and female chromosomes shed further light on the breed history.

The results confirmed that the Dominique, a very old American breed, was the major contributor to the Plymouth Rock. Dominique, Black Java, and Cochin breeds contributed to the maternal ancestry, while contributions on the male ancestry included Black Java, Cochin, Langshan, Light Brahma, and Black Minorca. Perhaps surprisingly, the proportional contribution of each of the founders is consistent with early breed history and records, despite selection in the 19th century for white feathers, clean legs, single comb, and yellow skin and selection in the early 20th century for increased body size and egg production.

Differences in the overall ancestral contributions to the HWS and LWS lines were minor, despite more than 60 years of selection for 8-week body weight. Contributions to individual chromosomes were more apparent, and subsequent analyses may provide more insights into the relationship between ancestry in specific chromosome regions and long-term selection for body weight differences. Such analyses may have implications for genetic contributions to today's broilers.

The livestock and poultry breeds of today are the result of foundation, isolation (genetic drift), and selection, both natural and intentional.

"Genomic analysis has proven to be a good tool for understanding genetic contributions to breed development. Through additional study of founder contribution to chromosomes and genes, such analyses may also reveal more about the importance of drift and selection in closed populations. Such work also highlights the importance of conserving pure breeds and selected lines of chickens, "says Örjan Carlborg, Professor at the Department of Medical Biochemistry and Microbiology, Uppsala University, and lead author of the study.

Credit: 
Uppsala University

New information on the regulation of the sense of smell, with the help of nematodes

image: Expression of fluorescently tagged PIM kinase in the nervous system and intestine of C. elegans.

Image: 
Karunambigai Kalichamy

PIM kinases are enzymes that are evolutionarily well conserved in both humans and nematodes. A research group led by Dr Päivi Koskinen from the Department of Biology of the University of Turku in Finland has previously shown that PIM kinases promote the motility and survival of cancer cells, but now the group has shown that these enzymes also regulate the sense of smell.

The novel results were inspired by new research tools. Koskinen's research group had assessed the efficacy of PIM-inhibitory compounds in cancer cells and decided to test whether they affect normal tissues, too.

"We especially wanted to study the regulation of olfactory neurons, since we had noticed that PIM kinases are expressed in the olfactory epithelium in mice. Instead of mice, however, we chose the invertebrate Caenorhabditis elegans nematode as our experimental organism. These little creatures have only 302 neurons, but can still efficiently distinguish between attractive or repulsive olfactory and gustatory cues in their environment, and thereby find food and avoid danger," tells Koskinen.

The C. elegans cultures in Turku were set up with the expert help of Dr Carina Holmberg from the University of Helsinki. For the actual experiments, Dr Karunambigai Kalichamy from India was recruited to provide further expertise on the use and behavioural analysis of these small and modest test animals.

In her chemotactic experiments, Kalichamy measured the motility of the nematodes towards or away from volatile or soluble compounds and compared untreated control animals with those treated with PIM inhibitors.

"Untreated nematodes immediately started to crawl towards attractive agents or run away from unpleasant compounds. PIM inhibitors did not affect any gustatory sensations, but significantly interfered with olfaction, so that the drug-treated animals were not able to respond to olfactory cues, but randomly moved to different directions," tells Kalichamy.

These results were not simply due to off-target effects of the drugs, as similar data were obtained also when PIM expression was abrogated by targeted mutations.

"Next it would be interesting to dissect the molecular effects of the PIM kinases in more detail and find out, whether they regulate olfaction also in humans and other mammals. The results could be interesting not only to evolution biologists, but to pharma companies, too. In case the PIM inhibitors rapidly, but luckily reversibly, reduce human olfactory responses in cancer patients, this phenomenon could even be used as a biomarker to assess the efficacy of the PIM-targeted therapies that are currently being developed against several types of cancer," states Koskinen.

Credit: 
University of Turku