
These albino lizards are the world's first gene-edited reptiles

video: This video shows the gene-editing procedure used to create albino lizards.

Image: 
Rasys et al./Cell Reports

Meet the world's first gene-edited reptiles: albino lizards roughly the size of your index finger. Researchers used CRISPR-Cas9 to make the lizards, providing a technique for gene editing outside of major animal models. In their study, publishing August 27 in the journal Cell Reports, the researchers also show that the lizards can successfully pass gene-edited alleles for albinism to their offspring.

"For quite some time we've been wrestling with how to modify reptile genomes and manipulate genes in reptiles, but we've been stuck in the mode of how gene editing is being done in the major model systems," says corresponding author Doug Menke, an associate professor at the University of Georgia. "We wanted to explore anole lizards to study the evolution of gene regulation, since they've experienced a series of speciation events on Caribbean islands, much like Darwin's finches of the Galapagos."

The way gene editing is performed in most model systems is to inject CRISPR-Cas9 gene-editing reagents into freshly fertilized eggs or single-cell zygotes. But this technique cannot be used in reptiles, Menke says, because lizards have internal fertilization and the time of fertilization cannot be predicted. An isolated single-cell embryo from a female lizard also cannot be easily transferred, making it almost impossible to manipulate outside of the lizard.

But Menke and his research team noticed that the transparent membrane over the ovary allowed them to see all of the developing eggs, including which eggs were going to be ovulated and fertilized next. They decided to inject the CRISPR reagents into the unfertilized eggs within the ovaries and see if the CRISPR would still work.

"Because we are injecting unfertilized eggs, we thought that we would only be able to perform gene editing on the alleles inherited from the mother. Paternal DNA isn't in these unfertilized oocytes," Menke says. "We had to wait three months for the lizards to hatch, so it's a bit like slow-motion gene editing. But it turns out that when we did this procedure, about half of the mutant lizards that we generated had gene-editing events on the maternal allele and the paternal allele."

This suggests that the CRISPR components remain active for several days, or even weeks, within the unfertilized eggs. After screening the offspring, the researchers found that about 6% to 9% of the oocytes, depending on their size, produced offspring with gene-editing events.

"Relative to the very established model systems that can have efficiencies up to 80% or higher, 6% seems low, but no one has been able to do these sorts of manipulations in any reptile before," Menke says. "There's not a large community of developmental geneticists that are studying reptiles, so we're hoping to tap into exciting functional biology that has been unexplored."

Menke says that his team had two reasons for making the lizards albino, as opposed to editing other traits. First, when the tyrosinase albinism gene is knocked out, it results in a loss of pigmentation without being lethal to the animal. Second, since humans with albinism often have vision problems, the researchers hope to use the lizards as a model to study how the loss of this gene impacts retina development.

"Humans and other primates have a feature in the eye called the fovea, which is a pit-like structure in the retina that's critical for high-acuity vision. The fovea is absent in major model systems, but is present in anole lizards, as they rely on high-acuity vision to prey on insects," Menke says.

Studying gene functions in reptiles offers new opportunities for exploring aspects of development that are best studied in non-established animal models, Menke says. And ultimately, this gene-editing technique could be translated for use in other animals.

"We never know where the next major insights are going to come from, and if we can't even study how genes work in a huge group of animals, then there's no way to know if we've explored everything there is to explore in the realm of gene function in animals," Menke says. "Each species undoubtedly has things to tell us, if we take the time to develop the methods to perform gene editing."

Credit: 
Cell Press

Possible treatment on the horizon for severe dengue disease

image: Live imaging demonstrates vascular leakage over time from the blood vessels of DENV-infected preclinical models (lane 2). The leakage was reversed when treated with the tryptase inhibitor nafamostat mesylate (lane 3).

Image: 
Duke-NUS Medical School

SINGAPORE, 27 August 2019 - Researchers led by Duke-NUS Medical School have discovered that tryptase, an enzyme in human cells that acts like scissors to cut up nearby proteins, is responsible for blood vessel leakage in severe dengue haemorrhagic fever. The finding suggests a possible new treatment strategy using the tryptase inhibitor, nafamostat mesylate, for severe dengue disease - a potentially fatal condition for which no targeted treatment is currently available.

The dengue virus infects about 390 million people globally each year, causing substantial morbidity and mortality. While most patients experience dengue fever or a mild form of the disease, a small percentage develops dengue haemorrhagic fever (DHF), a more severe form of dengue in which blood 'leaks' from ruptured blood vessels. This can lead to dengue shock syndrome (DSS) - the final stage of DHF - in which the circulatory system fails, sending the body into bleeding and shock that is fatal without prompt treatment. How dengue patients go on to develop these severe conditions has not been clearly understood and, as a result, no targeted treatments have been developed to prevent haemorrhaging or reverse shock in infected patients.

"We discovered that, in severe cases, a particular enzyme called tryptase cuts the proteins that act as seals between blood vessel cells, resulting in blood vessel leakage and shock during dengue infection," said Assistant Professor Ashley St. John, from Duke-NUS' Emerging Infectious Diseases Programme, corresponding author of the study.

Based on this finding, the team wanted to know whether a drug that specifically inhibits tryptase could be used to treat the haemorrhaging. Nafamostat mesylate, a clinically approved tryptase inhibitor with a good safety profile, was tested in preclinical models. The researchers found that administration of this drug, which is already used to treat certain bleeding complications in some countries, prevented vascular leakage in the dengue model. Even delayed treatment with the drug significantly reduced dengue vascular leakage in a preclinical model of severe disease. The team also observed that tryptase levels were very high in the blood of severe dengue patients who experienced DHF/DSS, but low in patients who recovered easily, affirming the link between high levels of the enzyme and severe dengue disease.

"Currently, only supportive care is available to patients suffering from severe dengue disease, with no targeted treatment for this potentially fatal condition. We believe our findings raise the possibility of developing new targeted treatments for dengue and, specifically, one that might be able to prevent shock," added Asst Prof St. John.

"We are currently experiencing a surge in dengue cases in Singapore," said Professor Patrick Casey, Senior Vice Dean for Research at Duke-NUS. "This timely study by our researchers not only holds out hope for a promising new strategy to treat severe dengue disease, but could also have broader implications for the treatment of other haemorrhagic diseases."

The study authors say the next step is to conduct clinical trials to test if tryptase inhibitors can reverse dengue vascular leakage and prevent shock in humans.

Credit: 
Duke-NUS Medical School

Retina-on-a-chip provides powerful tool for studying eye disease

image: This is the retina-on-a-chip technology.

Image: 
Fraunhofer IGB

The development of a retina-on-a-chip, which combines living human cells with an artificial tissue-like system, has been described today in the open-access journal eLife.

This cutting-edge tool may provide a useful alternative to existing models for studying eye disease and allow scientists to test the effects of drugs on the retina more efficiently.

Many diseases that cause blindness harm the retina, a thin layer of tissue at the back of the eye that helps collect light and relay visual information to the brain. The retina is also vulnerable to harmful side effects of drugs used to treat other diseases such as cancer. Currently, scientists often rely on animals or retina organoids, tiny retina-like structures grown from human stem cells, to study eye diseases and drug side effects. But results from studies in both models often fail to describe disease and drug effects in people accurately. As a result, a team of scientists have tried to recreate a retina for testing purposes using engineering techniques.

"It is extremely challenging, if not almost impossible, to recapitulate the complex tissue architecture of the human retina solely using engineering approaches," explains Christopher Probst, Postdoctoral Researcher at the Fraunhofer Institute for Interfacial Engineering and Biotechnology in Stuttgart, Germany, and co-lead author of the current study.

To overcome these challenges, the scientists coaxed human pluripotent stem cells to develop into several different types of retina cells on artificial tissue. This tissue recreates the environment that cells would experience in the body and delivers nutrients and drugs to the cells through a system that mimics human blood vessels.

"This combination of approaches enabled us to successfully create a complex multi-layer structure that includes all cell types and layers present in retinal organoids, connected to a retinal pigment epithelium layer," says co-lead author Kevin Achberger, Postdoctoral Researcher at the Department of Neuroanatomy & Developmental Biology at the Eberhard Karls University of Tu?bingen, Germany. "It is the first demonstration of a 3D retinal model that recreates many of the structural characteristics of the human retina and behaves in a similar way."

The team treated their retina-on-a-chip with the anti-malaria drug chloroquine and the antibiotic gentamicin, which are toxic to the retina. They found that the drugs had a toxic effect on the retinal cells in the model, suggesting that it could be a useful tool for testing for harmful drug effects.

"One advantage of this tiny model is that it could be used as part of an automated system to test hundreds of drugs for harmful effects on the retina very quickly," Achberger says. "Also, it may enable scientists to take stem cells from a specific patient and study both the disease and potential treatments in that individual's own cells."

"This new approach combines two promising technologies - organoids and organ-on-a-chip - and has the potential to revolutionise drug development and usher in a new era of personalised medicine," concludes senior author Peter Loskill, Assistant Professor for Experimental Regenerative Medicine at the Eberhard Karls University of Tu?bingen, and head of the Fraunhofer Attract group Organ-on-a-Chip at the Fraunhofer Institute for Interfacial Engineering and Biotechnology. His laboratory, which spans the two universities, is already developing similar organ-on-a-chip technology for the heart, fat, pancreas and more.

Credit: 
eLife

How worms snare their hosts

image: Parasite-infected shrimp give themselves away with an orange spot, which is visible with the naked eye through the transparent shell.

Image: 
© Nicole Bersau/Uni Bonn

Acanthocephala are parasitic worms that reproduce in the intestines of various animals, including fish. However, only certain species of fish are suitable as hosts. A study by the University of Bonn now shows how the parasites succeed in preferably infecting these types. The results will be published in the journal Behaviour, but are already available online.

The parasitic worm Pomphorhynchus laevis does not have an easy life: In order to reproduce, the parasite must first hope that its eggs will be eaten by a freshwater shrimp. The larvae hatching from the eggs then need a change of scenery: They can only develop into adult worms if they are swallowed by a fish. However, not every fish species is suitable as a final host. Some species have defense mechanisms that kill the parasite before it can mate and release new eggs into the water through the fish intestine.

In order to improve their chances of reproduction, the worms have developed several sophisticated strategies in the course of evolution. "For example, parasite-infected shrimp change their behavior," explains Dr. Timo Thünken from the Institute for Evolutionary Biology and Ecology at the University of Bonn. "They no longer avoid certain fish species and are therefore eaten more frequently." Another hypothesis, however, has so far been controversial: Freshwater shrimp are beige-brownish; their body shell is also relatively transparent. They therefore barely stand out from their surroundings. Pomphorhynchus laevis larvae, on the other hand, are bright orange. It is therefore possible to see with the naked eye whether a shrimp is infected: Its parasitic cargo is marked by an orange spot.

Infected shrimp attract more attention

It may be that the shrimp are less well camouflaged as a result and are more frequently eaten by fish. Study director Prof. Dr. Theo Bakker investigated this hypothesis several years ago. He was indeed able to determine that shrimp with an orange mark more frequently ended up in the stomachs of sticklebacks. Yet this finding was not confirmed in studies with brown trout.

However, the brown trout, in contrast to the stickleback, is not a suitable final host for Pomphorhynchus laevis. Its immune system usually prevents a successful infection with Pomphorhynchus laevis. "It is therefore possible that the orange coloring attracts particularly those fish that are especially suitable for the parasite's further reproduction," Thünken suspects. "We have now conducted experiments to put this hypothesis to the test."

The biologists marked the shrimp with an orange dot in order to simulate larval infestation. Then they tested how often the marked shrimp were eaten by different fish compared to unmarked shrimp. The mark did in fact increase the risk of being eaten - but only for some fish species: Barbels and sticklebacks were particularly interested in the marked freshwater shrimp; the dot made no difference to brown trout.

In another experiment, the researchers fed their fish exclusively with larvae-infested shrimp. "We were able to show that this diet often led to infection in barbels and sticklebacks, but very rarely in brown trout," explains Thünken. Evidently, their conspicuous coloring ensures that the larvae end up mainly in the stomach of suitable final hosts. However, it is unclear whether they have acquired their orange hue in the course of evolution in order to reach precisely these hosts. "Perhaps over time they have simply adapted better to the digestive tract of those fish that responded particularly strongly to the orange color," says Thünken.

The study also shows that the striking coloring of the larvae is not without disadvantages. The scientists used a different population of sticklebacks in their experiments. In contrast to their counterparts, this group avoided the marked shrimp: In the course of evolution they may have learned to interpret orange coloring as a warning signal.

Credit: 
University of Bonn

New drug combination shows promising activity in non-small cell lung cancer patients

TAMPA, Fla. - Patients with non-small cell lung cancer (NSCLC) now have improved treatment options beyond the standard of care with the addition of several new agents called immune-checkpoint inhibitors (ICI). Despite these advances, many patients still develop progressive disease after ICI treatment. In a new study published in Clinical Cancer Research, Moffitt Cancer Center researchers describe promising results from an early clinical trial that may offer patients who progress after ICI an additional treatment option.

Several ICI agents have been approved in recent years to treat NSCLC, including nivolumab, atezolizumab and pembrolizumab. ICIs function by stimulating the immune system to target cancer cells for destruction. However, patients may become resistant to ICIs and develop progressive disease. Many of these patients who have poor responses to ICIs have low numbers of T cells in their tumor environment. Therefore, the cancer community has been trying to find drugs that work in conjunction with ICIs to increase the number and activity of T cells in a tumor.

Several preclinical studies, including studies conducted at Moffitt, have shown that drugs called histone deacetylase inhibitors (HDAC inhibitors) are capable of stimulating the immune system and enhancing the response to ICIs. "In our preclinical studies, we reported that HDAC inhibitors improve response to PD-1 blockade in mouse models of lung cancer by increasing T cell trafficking to tumors and enhancing T cell function," explained Amer Beg, Ph.D., senior member of the Department of Immunology at Moffitt.

Given the positive results of preclinical studies with ICIs and HDAC inhibitors, Moffitt researchers wanted to assess the potential benefit of these agents in patients with NSCLC. They conducted a phase 1/1b study of pembrolizumab plus the HDAC inhibitor vorinostat in 33 patients with advanced or metastatic NSCLC.

"To our knowledge, this represents the first publication of the clinical trial combination of ICI with an HDAC inhibitor in lung cancer. We found that this combination was well tolerated and demonstrated preliminary anti-tumor activity in patients who were refractory to prior ICI treatment," said Jhanelle Gray, M.D., assistant member of the Department of Thoracic Oncology at Moffitt.

The researchers report that the most common adverse events during treatment were fatigue (33%), nausea (27%) and vomiting (27%). Immune-activating drugs are often associated with adverse events related to increased immune system activity. The immune-related adverse events experienced by patients during the study were similar to those reported previously for pembrolizumab, with the most common event being hypothyroidism in 15% of patients. The combination of pembrolizumab and vorinostat also demonstrated preliminary anti-tumor activity. Of 30 patients who were evaluable for efficacy, 4 had a partial response to treatment and 16 developed stable disease, for a disease control rate of 67%.
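The disease control rate quoted here is simply responders plus stable-disease patients over the number of patients evaluable for efficacy; the arithmetic can be checked in a few lines:

```python
# Disease control rate (DCR) as reported in the trial: partial responses
# plus stable disease, divided by patients evaluable for efficacy.
partial_responses = 4
stable_disease = 16
evaluable_patients = 30

dcr = (partial_responses + stable_disease) / evaluable_patients
print(f"Disease control rate: {dcr:.0%}")  # 20/30 -> 67%
```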

During the study, the researchers conducted a correlative analysis to determine if any blood or tissue biomarkers were associated with better patient outcomes to the combination treatment. They discovered that patients who had higher levels of T cells within the stromal environment before treatment had improved outcomes to therapy. According to the researchers, these observations suggest that vorinostat may sensitize tumors to ICIs by causing the T cells to migrate from the stroma to the tumor bed.

The researchers will further investigate this hypothesis and the activity of pembrolizumab and vorinostat in the ongoing phase 2 trial in patients with advanced/metastatic NSCLC who did not previously receive ICI treatment. "We believe our results lay the groundwork for future trials to assess the impact of epigenetic agents on ICI response, and for the discovery of biomarkers to assess the dynamic nature of the immune response early in a patient's treatment course," said Beg.

Credit: 
H. Lee Moffitt Cancer Center & Research Institute

The dark sides of extrasolar planets share surprisingly similar temperatures

image: Nightsides of hot Jupiters share clouds made of minerals.

Image: 
McGill University/Dylan Keating

A new study by McGill University astronomers has found that the temperature on the nightsides of different hot Jupiters -- planets that are similar in size to Jupiter but orbit other stars -- is surprisingly uniform, suggesting the dark sides of these massive gaseous planets have clouds made of minerals and rocks.

Using data from the Spitzer and Hubble space telescopes, the researchers from the McGill Space Institute found that the nightside temperature of the 12 hot Jupiters they studied was about 800°C.

Unlike our familiar planet Jupiter, so-called hot Jupiters circle very close to their host star -- so close that it typically takes fewer than three days to complete an orbit. As a result, hot Jupiters have daysides that permanently face their host stars and nightsides that always face the darkness of space, similarly to how the same side of the Moon always faces the Earth. The tight orbit also means these planets receive more light from their star, which is what makes them extremely hot on the dayside. But scientists had previously measured significant amounts of heat on the nightside of hot Jupiters, as well, suggesting some kind of energy transfer from one side to the other.

"Atmospheric circulation models predicted that the nightside temperatures should vary much more than they do," said Dylan Keating, a Physics PhD student under the supervision of McGill professor Nicolas Cowan. "This is surprising because the planets we studied all receive different amounts of irradiation from their host stars and the dayside temperatures among them varies by almost 1700°C."

Keating, the first author of a new Nature Astronomy study describing the findings, said the nightside temperatures are probably the result of condensation of vaporized rock in these very hot atmospheres.

"The uniformity of the nightside temperatures suggests that clouds on this side of the planets are likely similar to one another in composition. Our analysis suggests that these clouds are likely made of minerals such as manganese sulfide or silicates: in other words, rocks," Keating explained.

According to Cowan, because the basic physics of cloud formation are universal, the study of the nightside clouds on hot Jupiters could give insight into cloud formation elsewhere in the Universe, including on Earth. Keating said that future space telescope missions - such as the James Webb Space Telescope and the European Space Agency's ARIEL mission - could be used to further characterize the dominant cloud composition on hot Jupiter nightsides, as well as to improve models of atmospheric circulation and cloud formation of these planets.

"Observing hot Jupiters at both shorter and longer wavelengths will help us determine what types of clouds are on the nightsides of these planets," Keating explained.

Credit: 
McGill University

Machine learning increases resolution of eye imaging technology

video: As the tissue sample on the left rotates under a traditional OCT scan, computational imaging gradually builds the OCRT image on the right until the resolution has peaked in all directions.

Image: 
Kevin Zhou, Duke University

DURHAM, N.C. -- Biomedical engineers at Duke University have devised a method for increasing the resolution of optical coherence tomography (OCT) down to a single micrometer in all directions, even in a living patient. The new technique, called optical coherence refraction tomography (OCRT), could improve medical images obtained in the multibillion-dollar OCT industry for medical fields ranging from cardiology to oncology.

The results appear in a paper published online on August 19 in the journal Nature Photonics.

"An historic issue with OCT is that the depth resolution is typically several times better than the lateral resolution," said Joseph Izatt, the Michael J. Fitzpatrick Professor of Engineering at Duke. "If the layers of imaged tissues happen to be horizontal, then they're well defined in the scan. But to extend the full power of OCT for live imaging of tissues throughout the body, a method for overcoming the tradeoff between lateral resolution and depth of imaging was needed."

OCT is an imaging technology analogous to ultrasound that uses light rather than sound waves. A probe shoots a beam of light into a tissue and, based on the delays of the light waves as they bounce back, determines the boundaries of the features within. To get a full picture of these structures, the process is repeated at many horizontal positions over the surface of the tissue being scanned.

Because OCT provides much better resolution of depth than lateral direction, it works best when these features contain mostly flat layers. When objects within the tissue have irregular shapes, the features become blurred and the light refracts in different directions, reducing the image quality.

Previous attempts at creating OCT images with high lateral resolution have relied on holography -- painstakingly measuring the complex electromagnetic field reflected back from the object. While this has been demonstrated, the approach requires the sample and imaging apparatus to remain perfectly still down to the nanometer scale during the entire measurement.

"This has been achieved in a laboratory setting," said Izatt, who also holds an appointment in ophthalmology at the Duke University School of Medicine. "But it is very difficult to achieve in living tissues because they live, breathe, flow and change."

In the new paper, Izatt and his doctoral student, Kevin Zhou, take a different approach. Rather than relying on holography, the researchers combine OCT images acquired from multiple angles to extend the depth resolution to the lateral dimension. Each individual OCT image, however, becomes distorted by the light's refraction through irregularities in the cells and other tissue components. To compensate for these altered paths when compiling the final images, the researchers needed to accurately model how the light is bent as it passes through the sample.

To accomplish this computational feat, Izatt and Zhou turned to their colleague Sina Farsiu, the Paul Ruffin Scarborough Associate Professor of Engineering at Duke, who has a long history of using machine learning tools to create better images for health care applications.

Working with Farsiu, Zhou developed a method using "gradient-based optimization" to infer the refractive index within the different areas of tissue based on the multi-angle images. This approach determines the direction in which the given property -- in this case the refractive index -- needs to be adjusted to create a better image. After many iterations, the algorithm creates a map of the tissue's refractive index that best compensates for the light's distortions. The method was implemented using TensorFlow, a popular software library created by Google for deep learning applications.
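The gradient-based optimization step can be illustrated with a toy example. This is a sketch, not the authors' TensorFlow code: it recovers a single refractive index from simulated optical path delays by descending the gradient of a squared-error loss, whereas OCRT fits a full spatial refractive-index map from multi-angle images. The path lengths and index value below are hypothetical.

```python
# Toy gradient-based optimization: recover a refractive index n from
# simulated optical path delays d_i = n * L_i over several geometric
# path lengths L_i, by gradient descent on the squared-error loss.
path_lengths = [1.0, 2.0, 3.0]                # hypothetical path lengths (mm)
true_n = 1.38                                 # hypothetical tissue index
delays = [true_n * L for L in path_lengths]   # "observed" optical delays

n_est = 1.0                                   # initial guess
learning_rate = 0.01
for _ in range(500):
    # Gradient of sum_i (n*L_i - d_i)^2 with respect to n.
    grad = sum(2 * L * (n_est * L - d) for L, d in zip(path_lengths, delays))
    n_est -= learning_rate * grad             # step along the negative gradient

print(f"estimated refractive index: {n_est:.3f}")  # converges to ~1.380
```

In the paper's setting, a deep learning library such as TensorFlow computes these gradients automatically over a whole 2D index map rather than a single scalar.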

"One of the many reasons why I find this work exciting is that we were able to borrow tools from the machine learning community and apply them not only to post-process OCT images, but also to combine them in a novel way and extract new information," said Zhou. "I think there are many applications of these deep learning libraries such as TensorFlow and PyTorch, outside of the standard tasks such as image classification and segmentation."

For these proof-of-concept experiments, Zhou took tissue samples such as the bladder or trachea of a mouse, placed them in a tube, and rotated the samples 360 degrees beneath an OCT scanner. The algorithm successfully created a map of each sample's refractive index, increasing the lateral resolution of the scan by more than 300 percent while reducing the background noise in the final image. While the study used samples already removed from the body, the researchers believe OCRT can be adapted to work in a living organism.

"Rather than rotating the tissue, a scanning probe developed for this technique could rotate the angle of the beam on the tissue surface," said Zhou.

Zhou is already investigating how much a corneal scan could be improved by the technology with less than a 180-degree sweep, and the results appear promising. If successful, the technique could be a boon to many medical imaging needs.

"Capturing high-resolution images of the conventional outflow tissues in the eye is a long sought-after goal in ophthalmology," said Farsiu, referring to the eye's aqueous humor drainage system. "Having an OCT scanner with this type of lateral resolution would be very important for early diagnosis and finding new therapeutic targets for glaucoma."

"OCT has already revolutionized ophthalmic diagnostics by advancing noninvasive microscopic imaging of the living human retina," said Izatt. "We believe that with further advances such as OCRT, the high impact of this technology may be extended not only to additional ophthalmic diagnostics, but to imaging of pathologies in tissues accessible by endoscopes, catheters, and bronchoscopes throughout the body."

Credit: 
Duke University

Would a carbon tax help drive innovation in more-efficient energy use?

Washington, DC-- Taxing carbon emissions would drive innovation and lead to improved energy efficiency, according to a new paper published in Joule from Carnegie's Rong Wang (now at Fudan University), Harry Saunders, and Ken Caldeira, along with Juan Moreno-Cruz of the University of Waterloo.

Despite advances in solar, wind, and other renewable energy sources, fossil fuels remain the primary source of climate-change-causing carbon emissions. In order to halt global warming at the 2 degrees Celsius limit set by the Paris Agreement, we must reduce and eventually stop or completely offset the carbon released into the atmosphere by the burning of oil, coal, and gas.

"It has long been theorized that raising carbon prices would provide an incentive to reduce emissions through energy efficiency improvements," explained lead author Rong. "So, we looked to history to determine how cost increases have affected energy use efficiency in the past."

The researchers developed their own version of the productivity model created by Nobel Prize-winning economist Robert Solow.

They found that historically, in various countries, when the cost of energy comprised a larger fraction of the cost of production, those countries found new ways to reduce energy use or to use it more efficiently. Rong and his colleagues asked what would happen if these historical relationships between energy costs and efficiency improvements continued into the future. When this dynamic was continuously in play, according to their model, by 2100 energy usage would be reduced by up to 30 percent relative to simulations where this dynamic was not considered.

"Other studies have examined how taxing carbon emission would drive innovation in renewables," explained Caldeira. "But we show that it would also lead to more-efficient consumption of energy--not just by getting people to use better existing technology, but also by motivating people to innovate better ways to use energy. This means that solving the climate problem, while still hard, is a little easier than previously believed."

Credit: 
Carnegie Institution for Science

Researchers develop affordable, less intensive methane detection protocol

image: Tao Wen (right), a postdoctoral scholar in the Earth and Environmental Systems Institute at Penn State, examines a dead vegetation zone on a farm in the Gregs Run watershed, Lycoming County.

Image: 
Josh Woda, Penn State

A new testing protocol that uses existing, affordable water chemistry tests can help scientists and regulators detect sites showing evidence of new methane gas leaks caused by oil and gas drilling, according to Penn State researchers.

The researchers took a testing protocol they had described in a paper last year in the Proceedings of the National Academy of Sciences and applied it to a much larger dataset of domestic water wells in three regions of Pennsylvania impacted by the fossil fuel industry. They looked for certain chemical constituents in the test results to determine if methane may have impacted the sites when the samples were collected. They published their findings in the journal Environmental Science & Technology and for the first time made public the datasets.

The scientists wanted to see what percentage of the water wells showed certain chemical changes that could indicate new methane contamination, like that which can occur during drilling and extraction of fossil fuels, and not pre-existing methane that is commonly found in Pennsylvania water.

"We expected to see few sites, less than 1%, showing evidence of new methane," said Tao Wen, a postdoctoral scholar in the Earth and Environmental Systems Institute at Penn State. "We found 17 out of 20,751 samples, or about 0.08 %, that showed possible signs of methane contamination when those samples were collected."

Unconventional shale gas wells dominate northeast Pennsylvania, whereas conventional oil and gas wells, including the first commercial oil well in the United States, dominate the northwest. The southwest has both conventional and unconventional oil and gas wells and a significant coal mining history.

The researchers divided the water samples into five types. The two types that the scientists defined as samples most likely impacted by new methane contained high methane and sulfate levels and either low or high iron levels.

"It's not uncommon to see methane in groundwater in the Marcellus shale and other shale plays," Wen said. "Also, if methane had been in the groundwater for a long time, bacteria would have reduced the iron and sulfate. The reduced forms would have precipitated as iron sulfide, or pyrite."

The researchers classified low-methane samples, where methane measured less than 10 parts per million, as low priority. The other two types not impacted by new methane contained high amounts of methane and either high salts, indicating naturally occurring methane not caused by energy extraction, or fresh water with low sulfate levels, meaning the methane had been in the groundwater for a long time.
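
The five-way triage described above can be sketched as a simple decision rule. Only the 10 parts-per-million methane cutoff comes from the article; the sulfate, salinity, and iron thresholds below are hypothetical placeholders, since the paper's actual values are not given here.

```python
# Sketch of the five-way water-sample triage described in the article.
# Only the 10 ppm methane cutoff is from the article; the other
# thresholds are hypothetical placeholders.

METHANE_LOW_PPM = 10.0   # from the article
SULFATE_HIGH = 20.0      # hypothetical, mg/L
CHLORIDE_SALTY = 250.0   # hypothetical salinity proxy, mg/L
IRON_SPLIT = 0.3         # hypothetical, mg/L

def classify_sample(methane_ppm, sulfate, chloride, iron):
    """Return one of the five sample types described in the study."""
    if methane_ppm < METHANE_LOW_PPM:
        return "low priority (little methane)"
    if chloride >= CHLORIDE_SALTY:
        return "natural methane (salty water)"
    if sulfate < SULFATE_HIGH:
        return "long-resident methane (fresh, low sulfate)"
    # High methane with sulfate still present: candidate new leak,
    # split by iron level into the study's two flagged types.
    if iron < IRON_SPLIT:
        return "possible new methane (low iron)"
    return "possible new methane (high iron)"
```

Applied to a large dataset, a rule like this is only a screen: the flagged samples would still need the follow-up isotopic analyses the researchers mention.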

Of the 17 samples that came back positive for new methane, 13 came from the northeast. None came from sites within 2,500 feet of known problematic gas wells. State law holds oil and gas companies responsible for methane leaks that affect wells within that 2,500-foot area. The researchers' findings suggest that methane may migrate farther than previously thought if the new methane was derived from these known problematic gas wells. Only intensive field investigations could show whether this happened.

The testing protocol can act as an effective screening tool for methane contamination, narrowing down the sites that warrant more in-depth analysis, such as carbon-stable or noble gas isotope testing, according to Wen.

"We focus on the Marcellus shale, but this testing protocol has the potential to be applied to other shale plays in the United States and other countries," he said. "It can benefit the global community."

Credit: 
Penn State

Corruption among India's factory inspectors makes labour regulation costly

New research shows that 'extortionary' corruption on the part of factory inspectors in India is helping to drive up the cost of the country's labour regulations to business.

University of Kent economist Dr Amrit Amirapu, along with Dr Michael Gechter of the Pennsylvania State University, US, conducted the research, which found that a particular set of regulations - including mandatory benefits, workplace safety provisions and reporting requirements - increase firms' per-worker labour costs by 35% on average across India.

The study also found that the effective cost of the regulations varies widely and is much higher than the 35% average figure in regions and industries that are more exposed to corruption by public officials.
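
As a back-of-the-envelope reading of the 35% figure: the base wage below is invented for illustration, and "effective cost" here simply means the wage plus the estimated regulatory burden.

```python
# Worked example of the 35% average regulatory cost markup.
# The base wage is a hypothetical figure chosen for illustration.
base_wage = 100_000        # hypothetical annual per-worker cost
regulatory_markup = 0.35   # average effective cost of the regulations
effective_cost = base_wage * (1 + regulatory_markup)
```

In regions and industries more exposed to corruption, the study found the markup to be substantially higher than this average.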

The study also shed light on the puzzle of why some of the most significant problems faced by developing countries - including low labour force participation rates and low levels of employment in the formal sector - are often blamed on restrictive labour regulations when these regulations are usually quite similar to those found in rich countries - at least on paper.

The results suggest that corruption by inspectors, often involving bribery, adds to the cost burden faced by many Indian firms. For example, businesses located in Indian states that had reformed their inspector-related regulations in a positive way faced lower effective regulatory costs.

Dr Amirapu, of the University's School of Economics, said: 'Our results suggest a mechanism that may explain why these regulations are so costly in a developing country context: high de facto regulatory costs appear to be driven by extortionary corruption on the part of inspectors.'

The study concludes that the size of regulatory costs in practice has more to do with the way regulations are implemented than with the content of the specific laws themselves.

Credit: 
University of Kent

The genealogy of important broiler ancestor revealed

image: Adult females of the HWS and LWS body weight selected White Plymouth Rock lines.

Image: 
Christa F. Honaker

A new study examines the historical and genetic origins of the White Plymouth Rock chicken, an important contributor to today's meat chickens (broilers). Researchers at Uppsala University in Sweden, The Livestock Conservancy and Virginia Tech in the USA have used genomics to study breed formation and the roots of modern broilers.

The mid-19th century was an era of excitement among poultry breeders. Newly imported chickens from Asia were crossed with American landrace chickens and specialty breeds from Europe to establish new breeds and varieties that were standardized by the American Poultry Association beginning in 1873. With contributions by multiple breeders using different strategies, histories of these American breeds are sometimes unclear or inconsistent.

Two well-known lines of chickens developed at Virginia Tech represent the White Plymouth Rock: the HWS and LWS lines have been selected since 1957 for high and low body weights, respectively, and are considered representative of the breed as of the mid-20th century in the USA. The research team sequenced DNA from HWS, LWS, and the eight breeds generally considered to have been used to develop the White Plymouth Rock, then ascertained the percentage of genetic contribution made by each founding breed. By measuring each breed's contribution to individual chromosomes, they could also link ancestry to specific traits on those chromosomes, and contributions to the male and female sex chromosomes shed further light on the breed's history.
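
Estimating what fraction of a line's genome each founder breed contributed can be sketched, in a deliberately simplified form, as a constrained least-squares fit of the line's allele frequencies against the founders'. The data below are random placeholders; real studies use formal admixture or local-ancestry models rather than this shortcut.

```python
import numpy as np

# Simplified sketch: estimate founder-breed contributions to a line by
# fitting the line's allele frequencies as a mixture of the founders'
# frequencies. Placeholder data, not the study's.

rng = np.random.default_rng(0)
n_snps, n_founders = 500, 8
founder_freq = rng.uniform(0.05, 0.95, size=(n_snps, n_founders))

# Simulate a line that is an exact mixture of the eight founders.
true_w = np.array([0.40, 0.20, 0.15, 0.10, 0.06, 0.04, 0.03, 0.02])
line_freq = founder_freq @ true_w

# Least-squares fit, then project onto valid proportions.
w, *_ = np.linalg.lstsq(founder_freq, line_freq, rcond=None)
w = np.clip(w, 0, None)
w /= w.sum()   # non-negative contributions summing to 1
```

Running the same fit per chromosome, as the study does genome-wide, would give the chromosome-level contribution estimates discussed below.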

The results confirmed that the Dominique, a very old American breed, was the major contributor to the Plymouth Rock. Dominique, Black Java, and Cochin breeds contributed to the maternal ancestry, while contributions on the male ancestry included Black Java, Cochin, Langshan, Light Brahma, and Black Minorca. Perhaps surprisingly, the proportional contribution of each of the founders is consistent with early breed history and records, despite selection in the 19th century for white feathers, clean legs, single comb, and yellow skin and selection in the early 20th century for increased body size and egg production.

Differences in the overall ancestral contributions to the HWS and LWS lines were minor, despite more than 60 years of selection for 8-week body weight. Contributions to individual chromosomes were more apparent, and subsequent analyses may provide more insights into the relationship between ancestry in specific chromosome regions and long-term selection for body weight differences. Such analyses may have implications for genetic contributions to today's broilers.

The livestock and poultry breeds of today are the result of foundation, isolation (genetic drift), and selection, both natural and intentional.

"Genomic analysis has proven to be a good tool for understanding genetic contributions to breed development. Through additional study of founder contribution to chromosomes and genes, such analyses may also reveal more about the importance of drift and selection in closed populations. Such work also highlights the importance of conserving pure breeds and selected lines of chickens, "says Örjan Carlborg, Professor at the Department of Medical Biochemistry and Microbiology, Uppsala University, and lead author of the study.

Credit: 
Uppsala University

New information on regulation of sense of smell with the help of nematodes

image: This is an expression of fluorescently tagged PIM kinase in the nervous system and intestine of C. elegans.

Image: 
Karunambigai Kalichamy

PIM kinases are enzymes that are evolutionarily well conserved in both humans and nematodes. A research group led by Dr Päivi Koskinen at the Department of Biology of the University of Turku in Finland had previously shown that PIM kinases promote the motility and survival of cancer cells; now the group has shown that these enzymes also regulate the sense of smell.

The novel results were inspired by new research tools. Koskinen's research group had assessed the efficacy of PIM-inhibitory compounds in cancer cells and decided to test whether they affect normal tissues, too.

"We especially wanted to study the regulation of olfactory neurons, since we had noticed that PIM kinases are expressed in the olfactory epithelium in mice. Instead of mice, however, we chose the invertebrate Caenorhabditis elegans nematode as our experimental organism. These little creatures have only 302 neurons, but can still efficiently distinguish between attractive or repulsive olfactory and gustatory cues in their environment, and thereby find food and avoid danger," tells Koskinen.

The C. elegans cultures in Turku were set up with expert help from Dr Carina Holmberg of the University of Helsinki. For the actual experiments, Dr Karunambigai Kalichamy was recruited from India to provide further expertise in the handling and behavioural analysis of these small test animals.

In her chemotactic experiments, Kalichamy measured the motility of the nematodes towards or away from volatile or soluble compounds and compared untreated control animals with those treated with PIM inhibitors.

"Untreated nematodes immediately started to crawl towards attractive agents or run away from unpleasant compounds. PIM inhibitors did not affect any gustatory sensations, but significantly interfered with olfaction, so that the drug-treated animals were not able to respond to olfactory cues, but randomly moved to different directions," tells Kalichamy.

These results were not simply due to off-target effects of the drugs, as similar data were obtained also when PIM expression was abrogated by targeted mutations.

"Next it would be interesting to dissect the molecular effects of the PIM kinases in more detail and find out, whether they regulate olfaction also in humans and other mammals. The results could be interesting not only to evolution biologists, but to pharma companies, too. In case the PIM inhibitors rapidly, but luckily reversibly, reduce human olfactory responses in cancer patients, this phenomenon could even be used as a biomarker to assess the efficacy of the PIM-targeted therapies that are currently being developed against several types of cancer," states Koskinen.

Credit: 
University of Turku

Cardiology compensation continues to rise; first heart failure compensation data reported

MedAxiom, an American College of Cardiology Company, the nation's leading cardiovascular health care performance community and top cardiovascular-specific consulting firm, has released its seventh annual Cardiovascular Provider Compensation and Production Survey. The report reveals trends that help cardiovascular organizations navigate an evolving health care industry and ultimately advance heart health.

Key findings include:

Overall total cardiology compensation increased 3 percent year over year, reaching the second highest total since 2012

Electrophysiologists are once again the top earners

Heart failure cardiologist compensation ($441,845 per FTE), reported for the first time, is 10 percent lower than general non-invasive compensation

The income gap between physicians in private vs. integrated groups narrowed to 3.5 percent

There are more cardiologists age 61 and over than ever before (about 1 in 4)
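
Reading "10 percent lower" as heart failure pay being 90 percent of the general non-invasive figure, the reported number implies a general non-invasive benchmark of roughly $491,000 per FTE:

```python
# Back out the implied general non-invasive figure from the report's
# heart failure number, assuming "10 percent lower" means
# HF = 0.9 x general non-invasive.
hf_comp = 441_845                      # heart failure $/FTE, from report
implied_noninvasive = hf_comp / 0.90   # implied general non-invasive $/FTE
```

The report itself does not state the general non-invasive figure in this excerpt, so this value is inferred from the stated gap, not quoted.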

"MedAxiom continues to evolve the data collected for this annual report to keep up with the ever-changing health care landscape. With the addition of heart failure data, this year's report is more robust than ever," said Joel Sauer, MedAxiom executive vice president of consulting and author of the report. "Looking at compensation and production together helps us understand where we stand today and where we need to head as we continue to advance cardiovascular health care."

The comprehensive report provides data and expert analysis on compensation and production trends by subspecialty, geographic region, ownership model and more. It looks at the diverse set of data points and factors including compensation per work relative value unit, key cardiology volumes and ratios, diagnostic testing trends, and the roles of advanced practice providers, part-time physicians and non-clinical roles.

"As the health care delivery system evolves, it's crucial that we understand how economics are affected," said Gerald Blackwell, MD, MBA, FACC, MedAxiom president. "Having a pulse on cardiovascular compensation and production data and trends enables cardiovascular organizations to make informed practice decisions and ensures they are on par with peer organizations."

At the beginning of each year, MedAxiom surveys its membership, which represents more than one-third of all cardiology and cardiovascular groups in the country. The data collected contain financial, staffing, productivity and compensation metrics, and a number of demographic measures. Data for the 2019 report were collected over the 2008-2018 timeframe and include 184 groups, representing 2,267 full-time physicians.

Email Sam Roth to receive a copy of the full 2019 Provider Compensation and Production Survey Report or an infographic with key report findings.

Credit: 
American College of Cardiology

Defective sheath

Schwann cells form a protective sheath around nerve fibres and ensure that nerve impulses are transmitted rapidly. If these cells are missing or damaged, severe neurological diseases may occur as a result. Researchers at Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) have succeeded in demonstrating a complex interaction within Schwann cells which plays an important role for correct cell maturation.

Insulation and nutrients

For years, Dr. Franziska Fröb and Dr. Michael Wegner have devoted most of their time to researching one particular type of cell in the human body: Schwann cells. Like the insulation on electric cables, these cells form a sheath around the nerve fibres in the peripheral nervous system, which connect nerve cells to muscle cells and the surrounding area and conduct impulses. If this protective layer, known as the myelin sheath, is damaged, the exchange of information becomes slower, incorrect, or ceases entirely. Nerve fibres and the corresponding nerve cells may die completely, as the Schwann cells also provide them with nutrients. The consequences for patients are pain, numbness, muscular atrophy or problems with moving hands and feet correctly.

Improved understanding of networks

Researchers Dr. Fröb and Prof. Wegner, Chair of Biochemistry and Pathobiochemistry at FAU, hope that they will one day be able to help people suffering from diseases such as diabetic neuropathy, Charcot-Marie-Tooth disease or Guillain-Barré syndrome. However, there is still a long way to go. After more than 25 years of research on this type of cell, most of the proteins and protein complexes which have a role to play in the development and maturation of Schwann cells have been identified. However, the proteins interact with each other as well, and research into how the various components within this regulatory network interact is only just beginning. 'We will only be able to consider possible therapies once we have gained a better understanding of the networks,' says Prof. Wegner, describing the current state of research.

The FAU working group has now succeeded in deciphering one of these complex links. Their research focused on a protein named Ep400, which the team only recently discovered in Schwann cells. Together with other proteins, Ep400 ensures that DNA in Schwann cells is packaged correctly and marked accordingly. Packaging is immensely important for fitting genetic information into the cell nucleus as compactly as possible, while marking allows the required information to be found and read. In their experiments, the scientists deleted the protein from Schwann cells. The cell-formation programme then failed to shut down properly and overlapped with the maturation programme that would normally follow: certain proteins that were no longer required continued to be produced without restriction, while other proteins that were needed were no longer produced in sufficient quantities. As a result, the myelin sheaths of the Schwann cells were deformed: too thin and too short, leaving the protective sheath of the nerve fibres defective. When the scientists additionally deleted a protein named Tfap2a, which is normally regulated by Ep400 and which had continued to be produced, the defects were markedly reduced.

'We are, of course, pleased that we have made such a significant step forward in understanding the complex interdependencies within the Schwann cells,' explains Prof. Wegner. 'Our results indicate that DNA structural changes induced by proteins such as Ep400 are extremely important and may also be useful in future to develop therapies for peripheral neuropathy.'

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg

Changing partners doesn't change relationship dynamics, study shows

Think your new romance will be much different from your last one? According to new University of Alberta research, it's not likely.

An eight-year study of 554 people in Germany showed that eventually, they had the same dynamics in new partnerships as in past broken relationships, after the glow of the honeymoon phase had faded.

"Although some relationship dynamics may change, you are still the same person, so you likely recreate many of the same patterns with the next partner," said Matthew Johnson, a U of A relationship researcher and lead author on the study.

"New love is great, but relationships continue past that point."

In the study, which is among the first of its kind to explore the issue long-term, researchers surveyed people at four points: a year before their first intimate relationship ended and again in the final year, then within the first year of the new relationship and again a year after that.

Seven relationship aspects were reviewed, including satisfaction, frequency of sex, ability to open up to a partner, how often they expressed appreciation for the other person and confidence in whether the relationship would last.

All but two aspects were stable across the past and present relationships. The exceptions were frequency of sex and expressing admiration for your partner--both increased in the second relationship, which would be expected, according to Johnson.

"These aspects are directly dependent on a partner's behaviour, so we are more likely to see changes in these areas," he said.

However, Johnson noted the level of sexual satisfaction tended to stay the same as in the prior relationship even though sexual frequency increased.

People may feel that a new relationship is different, but that's because of how past partnerships end, the study showed.

"Things get worse as a relationship ends, and when we start a new one, everything is wonderful at first, because we're not involving our partner in everyday life like housework and child care. The relationship exists outside of those things," Johnson said.

But most relationship dynamics during the middle phase of the prior relationship, when things were going well, were similar to those of the second relationship, after the initial honeymoon phase had passed.

"There's a lot of change in between, but more broadly, we do have stability in how we are in relationships," Johnson noted.

That could be both good and bad.

"It's good in a sense that we as individuals can bring ourselves and our experiences into relationships; we aren't totally trying to change who we are, and that continuity shows we stay true to ourselves," Johnson explained.

In fact, relationships end for a lot of reasons and breaking up shouldn't necessarily be seen as a failure, he added.

"It could be the best possible outcome for the people involved."

The downside of bringing the same dynamic to new relationships is that people may not be learning from their mistakes.

"Just starting a new partnership doesn't mean things are going to be different. This research shows that chances are, you are going to fall into the same patterns in many aspects of the relationship. Even if things are different, they're not guaranteed to be better," Johnson said.

The study also showed that people who tended to experience a lot of negative emotions fared worse in their second relationships--they tended to have lower relationship and sexual satisfaction, less frequent sex, fewer expressions of admiration and more conflict.

"Who you are matters, and addressing personal issues is going to be very impactful on whether you'll be successful in your relationship or not," Johnson said.

It's important to have an honest view of our past romances as we move into new ones, Johnson advised.

"Because of how badly a relationship ends, that colours our view of the whole thing. But having a more balanced view of the negatives and positives gives us realistic expectations for the new relationship."

Credit: 
University of Alberta