Culture

Sex, lies and crustaceans: New study highlights peculiar reproductive strategies of Daphnia

image: This graphic shows the unusual reproductive strategy used by the organism. In this system, known as cyclic parthenogenesis, Daphnia can alternate between sexual and parthenogenetic (asexual) reproduction.

Image: 
Shireen Dooling

Flourishing in spectacular numbers in lakes and ponds around the world, tiny creatures known as Daphnia play an essential role in freshwater ecology. Daphnia, a type of planktonic crustacean, are the primary consumers of algae and are an important food source for fish and other aquatic life.

Daphnia are ubiquitous in freshwater sources, but their mode of reproduction, known as cyclic parthenogenesis--which involves alternating phases of sexual and asexual reproduction--is an evolutionary puzzle. A further mystery surrounding Daphnia is that in some populations, up to 50% of females do not produce male offspring.

Now, Arizona State University's Michael Lynch and his colleagues, including lead author Zhiqiang Ye, believe they have identified a variation in a single gene (unique to Daphnia) that accounts for the non-male-producing (NMP) trait. Isolation of this dominant NMP allele sheds new light on the behavior of this important model organism as well as on broader genetic mechanisms of male offspring production. Eventually, new insights into such mechanisms could spur innovations in fertility treatment or, alternatively, in pest management.

"What we discovered is that there are members of some Daphnia populations that are incapable of producing boys," Lynch says. "We have a chemical that we use called methyl farnesoate that we squirt in the water to induce male production and they don't even respond to that. So there are two flavors of individual in the population--one that produces daughters and sons and one that can only produce daughters."

Michael Lynch directs the Biodesign Center for Mechanisms of Evolution and is a professor at ASU's School of Life Sciences. The group's research on Daphnia appears in the current issue of the journal PNAS.

Beneath the surface

Daphnia, while just visible to the naked eye, are best observed under the microscope, where their translucent forms allow a clear view of internal structure. A heart that beats at a brisk 300 pulses per minute can clearly be seen along with the creature's five pairs of legs, which Daphnia use to sweep algae and organic detritus into their digestive tract. (Daphnia's cleanly visible structure coupled with their essential role in freshwater ecosystems have made them a vital organism for the study of ecology and physiology for over 100 years.)

Daphnia measure roughly 1-3 millimeters in length and feature prominent compound eyes sensitive to sunlight striking pond and lake surfaces, a durable carapace, antennae and a pair of hair-like abdominal structures known as setae.

Also visible in Daphnia is a brood chamber, where offspring resulting from sexual and asexual or parthenogenetic reproduction develop before birth. The movements of immature young can often be seen in the brood chamber.

Daphnia are prolific egg producers and it is estimated their numbers would quickly overpopulate a pond, were they not perpetually eaten by other aquatic life. Their rapid, abbreviated movements in the water are somewhat flea-like and they are often referred to as water fleas, though they are microcrustaceans, not insects.

While there are over 200 species in the Daphnia genus, the most common (and the subject of the current study) is D. pulex. (Precise taxonomy of Daphnia remains controversial, partly due to hybridization, phenotypic plasticity and other factors.)

Innovations in sex

Daphnia use a combined strategy of asexual and sexual reproduction during their life cycle. During most of the growth season, females reproduce asexually, producing diploid eggs (bearing two sets of chromosomes and yielding genetically identical offspring). These eggs typically hatch two to three days after being transferred to the Daphnia's brood pouch.

Late in the season, as environmental conditions change, the reproductive lifestyle of Daphnia shifts. Females begin producing males and what are known as resting eggs (or winter eggs). These are highly durable and capable of surviving intact--through extreme cold, drought or periods of scarce nutrients--for years or even decades, until environmental conditions become favorable for their hatching and development. Resting eggs are fertilized internally by male Daphnia before being deposited in the brood capsule.

Searching for clues

The new study examines a subset of clones of Daphnia pulex with a peculiar attribute: they only produce female offspring. Careful examination of these non-male-producing (NMP) Daphnia identified some 3500 genomic markers found only in NMP individuals. The researchers successively narrowed the field of candidate markers, comparing them with other NMP populations and looking for overlap, finally identifying just five protein-coding variants exclusively associated with NMP Daphnia, all of which belong to a single gene.

Although some Daphnia populations have no members carrying the NMP variant, Lynch and his colleagues got lucky on their first try, sequencing 100 individuals, roughly half of which were non-male producers. Further, the NMP markers were all found in a confined region of the Daphnia genome, near the tip of chromosome 1. The variant that appears to be responsible for the NMP trait occurs on a gene labelled 8960, about which many questions remain. The gene is not present in other arthropods and appears unique to Daphnia.
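The narrowing step described above is, at its core, a set-intersection problem: keep only the markers shared by every independently sampled NMP group, then restrict the result to protein-coding variants. The short Python sketch below is purely illustrative of that logic; the marker identifiers, population labels and counts are hypothetical and are not taken from the study.

```python
# Illustrative narrowing of candidate NMP markers by intersection.
# Marker IDs and population labels are hypothetical placeholders.

# Markers found only in NMP individuals, per independently sampled population
nmp_markers_by_population = {
    "pond_A": {"chr1_1201", "chr1_1544", "chr3_0077", "chr7_5120"},
    "pond_B": {"chr1_1201", "chr1_1544", "chr2_9001"},
    "pond_C": {"chr1_1201", "chr1_1544", "chr5_3333"},
}

# Markers that fall inside protein-coding regions (hypothetical)
protein_coding = {"chr1_1201", "chr1_1544"}

# Step 1: keep only markers present in every NMP population
shared = set.intersection(*nmp_markers_by_population.values())

# Step 2: restrict to protein-coding variants
candidates = shared & protein_coding

print(sorted(candidates))  # ['chr1_1201', 'chr1_1544']
```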

"We have a huge Daphnia genome project," Lynch says. "We've sequenced about 2000 Daphnia genomes and we're continuing to do so. It's one of the biggest genome projects--maybe the biggest, except for the human genome project."

Zhiqiang Ye, a postdoc in Lynch's lab and lead author of the new study, carried out much of the laboratory work. He examined gene expression in the male-producing and non-male-producing alleles, and demonstrated that the NMP allele was largely non-responsive to methyl farnesoate, a critical hormone responsible for conferring male traits in Daphnia offspring.

Reproductive quandary

In evolutionary terms, the rationale for creating NMP Daphnia remains speculative. Intriguingly, the NMP trait appears to have originally been imported from a sister species of D. pulex, known as D. pulicaria. The NMP allele may act to limit the deleterious effects of excessive inbreeding in Daphnia populations or simply help maintain an equilibrium of male production. The study, Lynch points out, may offer new clues concerning the origins and lifeways of obligate asexual organisms (i.e., those that rely exclusively on asexual reproduction), a major conundrum in evolutionary biology.

The NMP phenotype bears similarity to a reproductive strategy found in so-called gynodioecious plants. "In a gynodioecious system, you have individuals who produce both male and female gametes and some that are only female," Lynch says. "In a superficial sense, it's very much like the Daphnia system."

In future research, Lynch and colleagues will apply the CRISPR-Cas9 gene editing methodology to the study of Daphnia, opening a new avenue of research into the genetics and genomics of this model organism. One proposed experiment will use the technique to knock out the gene responsible for male production or edit the NMP allele, to definitively establish gene 8960 as the culprit for the NMP trait in Daphnia.

Credit: 
Arizona State University

Helping transplanted stem cells stick around and do their jobs

Bone marrow transplants of hematopoietic stem cells have become standard treatment for a host of conditions including cancers of the blood and lymphatic systems, sickle cell anemia, inherited metabolic disorders, and radiation damage. Unfortunately, many bone marrow transplants fail due to rejection by the patient's immune system or graft-versus-host disease (in which the transplanted marrow cells attack the patient's healthy cells), both of which can be fatal. Mesenchymal stromal cells (MSCs) are known to secrete compounds that modulate the immune system and have shown promise in mitigating these problems in animal trials. However, clinical results with MSCs have been disappointing thus far, as they are rapidly cleared from the body and can draw attack from patients' immune systems, and efforts to encapsulate MSCs in protective biomaterials have resulted in large, bulky hydrogels that cannot be given intravenously and compromise the cells' functions.

Today, in a scientific first, researchers from the Wyss Institute for Biologically Inspired Engineering, Harvard's John A. Paulson School of Engineering and Applied Sciences (SEAS), and the Harvard Stem Cell Initiative (HSCI) demonstrate a single-cell encapsulation technology that effectively protects transplanted MSCs from clearance and immune attack and improves the success of bone marrow transplants in mice. The work is published in PNAS.

"To our knowledge, this is the first example of single-cell encapsulation being used to improve cell therapies, which are becoming more widespread as treatments for a number of diseases," said first author Angelo Mao, Ph.D., a former graduate student in the lab of Wyss Core Faculty member and lead of the Wyss Immuno-Materials Platform David Mooney, Ph.D. who is now a postdoc with Wyss Core Faculty member James Collins, Ph.D. "And, our encapsulated cells can be frozen and thawed with minimal impact on the cells' performance, which is critical in the context of hospitals and other treatment centers."

This advance builds on a method the team previously developed that uses a microfluidic device to coat individual living cells with a thin layer of an alginate-based hydrogel, creating what they term "microgels." The process encapsulates cells with 90% efficiency, and the resulting microgels are small enough that they can be delivered intravenously, unlike the bulky hydrogels created by other methods. When injected into mice, MSCs encapsulated using this technique remained in the animals' lungs ten times longer than "bare" MSCs, and remained viable for up to three days.

Because a large amount of MSCs' clinical appeal lies in their secretion of compounds that modulate the body's immune system, the researchers needed to test how microgel encapsulation affects MSCs' ability to function and resist immune attack. They modified their original alginate microgel by adding another compound that cross-links to the alginate and makes the microgel stiffer and better able to resist the body's immune system and clearance mechanisms. They also cultured the MSCs after encapsulation to encourage them to divide and produce more cells. When these new microgels were injected into mice, their persistence increased five-fold over the previous microgel design and an order of magnitude over bare MSCs.

To induce an immune response against the MSCs, the team incubated encapsulated cells in a medium containing fetal bovine serum, which is recognized by the body as foreign, before introducing them into mice. While the clearance rate of the encapsulated MSCs was higher than that observed without immune activation, it was still five times lower than that of bare MSCs. The microgels also outperformed bare MSCs when injected into mice that had a preexisting immune memory response against MSCs, which mimics human patients who are given multiple infusions of stem cells.

MSCs exposed to inflammatory cytokines respond by increasing their expression of immune-modulating genes and proteins, so the researchers next tested whether encapsulation in their new microgels impacted this response. They found that bare and encapsulated MSCs had comparable levels of gene expression when exposed to the same cytokines, demonstrating that the microgels did not impair MSC performance.

For their pièce de résistance, the team injected their MSC-containing microgels into mice along with transplanted bone marrow, half of which was immune-compatible with the recipient mouse and half of which was allogeneic, or an immune mismatch. Mice that received encapsulated MSCs had more than double the fraction of allogeneic bone marrow cells in their marrow and blood after nine days compared with mice that did not receive MSCs. Encapsulated MSCs also led to a greater degree of engraftment of the allogeneic cells into the host bone marrow compared to bare MSCs.

"One of the strong points of this work is that it uses a completely non-genetic approach to dramatically increase cell survival in transplant contexts, where it's sorely needed," said Mooney, who is also the Robert P. Pinkas Family Professor of Bioengineering at SEAS. "This technology nicely complements genetic engineering approaches, and in fact could be more efficient than attempting to directly modify immune cells themselves."

The Wyss Institute's Validation Project Program is supporting advancement of this approach as a possible treatment for ischemia (narrowing of blood vessels) in human patients, and hopes to demonstrate clinical viability in the near future. Validation Projects are technologies with potential high-impact applications that have successfully progressed through significant concept refinement and meet predefined technical, product development, and intellectual property criteria.

"This technology simultaneously resolves multiple issues with bone marrow transplants and stem cell therapies using an elegant, biomaterials-based approach that represents the kind of cross-disciplinary thinking that we value so greatly at the Wyss Institute," said Wyss Founding Director Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at HMS and the Vascular Biology Program at Boston Children's Hospital, as well as Professor of Bioengineering at SEAS. "We are excited to support this project as it moves toward clinical validation, and we look forward to other potential applications of microencapsulation to address drug and cell delivery problems."

Credit: 
Wyss Institute for Biologically Inspired Engineering at Harvard

Researchers describe new ALS biomarkers, potential new drug targets

Amyotrophic lateral sclerosis, or ALS, is an adult-onset neurodegenerative disease that causes paralysis and ultimately death when the nerves innervating the lungs cease to carry the signals needed for breathing. The disease has what is called a "focal onset," where paralysis starts with an arm or a leg and spreads throughout the body as motor neurons in the spinal cord and brain die off.

Early diagnosis of the disease has not been possible because of a lack of known biomarkers indicative of ALS, but scientists believe that cellular changes within spinal neurons occur before symptoms are detectable, and these changes could serve as useful biomarkers that can aid in earlier diagnosis.

Now, researchers at the University of Illinois at Chicago have described unique populations of neurons and associated cells in the spinal cords of patients who died of ALS.

When symptoms of ALS begin with the paralysis of an arm or a leg, it means that the disease has affected the motor neurons that innervate that arm or leg, which originate in a specific region along the spinal cord. For example, neurons that innervate the arm originate in the upper part of the spinal cord. When ALS symptoms first appear in the arm, the motor neurons in that upper spinal region die off. Motor neurons above and below that region begin to die off next as the disease spreads up and down the spine, causing paralysis of other parts of the body.

The researchers, led by Fei Song, associate professor of neurology and rehabilitation in the UIC College of Medicine, found that patients with focal-onset ALS had different types of neurons in areas of the spinal cord that are less affected by the disease compared with patients without neurological disorders. They also found that spinal neurons in less affected regions of patients who died of focal-onset ALS were associated with cells called microglia and astrocytes. They report their findings in the journal Neurobiology of Disease.

"Since there must be cellular changes occurring in spinal cord regions adjacent to areas where the disease has clearly affected motor neurons in the spine, we wanted to look at neurons from these adjacent areas to determine if they are different from healthy tissue," Song said. "The debilitating disease has no effective treatment to stop the disease progression and there are only two medications that can prolong patient survival by a few months. So, new drug targets, especially ones that could be given in the earlier stages of the disease, are very much needed."

Song had worked with Dr. John Ravits, professor of clinical neurosciences at the University of California, San Diego, on ALS research in the past. Ravits sees patients with ALS and runs a biorepository that includes nerve tissue from patients who died of ALS and consented to have their nerve tissue collected after death.

Ravits published a paper in 2010 analyzing the expression of genes in spinal motor neurons taken from 12 patients whose ALS started focally compared with nerves of patients without neurological disorders. Motor neurons were collected from less affected regions of the spinal cord where the tissue was assumed to be in the earlier stages of disease.

While significant differences were found, Song wanted to reanalyze the genetic data using a new technique that could shed light on different cell types that may have been present in the samples collected by Ravits. Song and colleague Fabien Dachet, a research specialist with bioinformatics expertise in the UIC department of neurology and rehabilitation, applied a novel bioinformatics analysis to the genetic data.

"When we examined the data, it was clear that the mixture of cells from the ALS patients was very different from patients with no neurodegenerative disease," Song said.

They found that in samples from patients with focal-onset ALS, there were different types of motor neurons compared with control samples from patients without neurological disease. They also saw other cells, called microglia and infiltrated macrophages, associated with motor neurons from the ALS patients, whereas these cells were absent in similar samples from patients without neurological disease.

"We found a novel and unique subtype of motor neurons in these patients never before reported," Song said. "Now that we have identified new subtypes of motor neurons and microglia present in ALS patients, we can begin to further study their roles in contributing to disease progression."

Credit: 
University of Illinois Chicago

New biomarker-guided strategy has potential for liver cancer treatment

image: Li-Chuan Chan, Ph.D.,

Image: 
MD Anderson Cancer Center

A study at The University of Texas MD Anderson Cancer Center discovered a cellular pathway tied to cancer that may be beneficial in reducing side effects and extending duration of immunotherapy in some patients with hepatocellular carcinoma, the most common form of liver cancer.

Researchers looked at a cellular pathway formed when a protein known as interleukin-6 (IL-6) activates an enzyme called Janus kinase 1 (JAK1), and the potential for anti-IL-6 antibodies and anti-T-cell immunoglobulin mucin-3 (anti-Tim-3) in augmenting immunotherapy. Study results were published in the July 15 online issue of the Journal of Clinical Investigation.

The IL-6/JAK1 pathway is often observed in tumors and may play a role in cancer immune evasion by regulating a crucial cellular function of programmed death ligand 1 (PD-L1), a type of protein known to suppress the immune system.

"Our results demonstrated that anti-IL-6 antibodies, when combined with anti-Tim-3 antibodies, boosted T-cell killing effects in mouse models," said Li-Chuan Chan, Ph.D., a postdoctoral fellow in the Department of Molecular and Cellular Oncology "We identified a mechanism regulating PD-L1 glycosylation initiation, suggesting that a combination of anti-IL-6 and anti-Tim-3 as an effective marker-guided therapeutic strategy."

The researchers looked at the correlation between IL-6 and PD-L1 expression in tumor samples from 183 liver cancer patients and found that patients with high IL-6 expression also had elevated PD-L1 expression. Previous MD Anderson studies reported that high IL-6 levels are associated with poorer prognosis in liver cancer patients. The team's new findings suggested that IL-6 is "physiologically significant and clinically relevant" to PD-L1 expression in liver cancer.

"We also found that the IL-6/JAK1 pathway contributed to PD-L1 phosphorylation, which appeared to be the dominant driver of cancer immune evasion in a liver cancer mouse model," said Chan. "Together, these findings may provide a potential mechanism on how activated JAK1 translocates to other cellular compartments and warrant further investigation in the future."

The study also pointed to a potential benefit for lessening immunotherapy side effects, which can sometimes shorten the amount of time patients are able to stay on treatment. Immune checkpoint inhibitors have been shown to stimulate the production of serum IL-6, which can cause arthritis, Crohn's disease and psoriasiform dermatitis.

"Therefore, blocking the IL-6 pathway may resolve these side effects and extend the duration of immunotherapy," said Chan.

Credit: 
University of Texas M. D. Anderson Cancer Center

Insurance companies: Want to steal your competitors' customers?

Key Takeaways:

Drivers who switch insurance companies bring higher risk than "equivalent own" customers.

Only 50 percent of the risk gap between switchers and non-switchers can be accounted for by differences in observable driver characteristics.

Even drivers with excellent driving records who switch insurers carry higher risk.

CATONSVILLE, MD, July 15, 2019 - Researchers from the United States have published new research in the INFORMS journal Marketing Science that sheds light on just how much it may take for companies to profitably "steal" customers from their competitors. Managers frequently focus on customer acquisition cost when deciding whether to poach customers from a competitor, and in doing so may downplay other factors, such as the future cost to serve of the poached customers. The researchers demonstrate that switchers can generate as much as 21 percent higher cost to serve than equivalent existing customers, an important caveat when designing marketing strategy.

The study to be published in the July edition of the INFORMS journal Marketing Science is titled "Skimming from the Bottom: Empirical Evidence of Adverse Selection When Poaching Customers," and is authored by Przemyslaw Jeziorski from the University of California at Berkeley, Elena Krasnokutskaya from Johns Hopkins University, and Olivia Ceccarini.

The study authors arrived at their conclusions after analyzing the Portuguese car insurance industry, which is an established, multi-billion-dollar market. They selected the insurance industry because they knew that it would present measurable factors that are similar to several other service industries, such as credit markets and retail.

They were able to use individual-level data on insurance claims from a leading Portuguese auto insurer which showed that in the insurance industry, the average customer who switches from one insurer to another generates a 32 percent higher volume of liability claims than the average customer who does not switch.

Further, they found that commonly employed actuarial screening mechanisms can only partially alleviate this problem.

"Screening based on factors we could observe and a detailed driving history accounts for than 50 percent of the adverse selection," said Jeziorski. "We found that the average customer who switches insurance companies is approximately 20 percent more risky than the 'nonswitcher.' This suggests that drivers exhibit a large degree of unobservable patterns of riskiness and that higher risk is often correlated with customers who switch insurers."

The researchers found that current insurance company pricing does not reflect this risk gap.

To address the risk gap and determine optimal pricing strategies for customers who switch insurance companies, the researchers decided to focus on patterns they could observe, which included how long those customers were with a particular insurance company.

They found that higher-risk drivers tended to have been with an insurance company for less than two years. They concluded that customers who have been with an insurer for less than two years are "significantly more risky than the otherwise equivalent customers with three or more years of tenure."

"We found that 20 percent of clients churn within one year," said Jeziorski. "Some of those customers frequently switch insurance companies, and 35 percent of customers that incur a claim do not renew their contract."

Jeziorski added, "Such selective attrition can explain the relationship between tenure and riskiness. We also found that switchers with bad driving histories generate a 100 percent larger volume of claims than customers who do not switch but have the same driving histories. Even among drivers with excellent driving histories, customers who exhibit a pattern of switching insurance companies generate 38 percent larger claims."
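The percentages above translate directly into expected costs, which is why acquisition cost alone can be misleading. The back-of-the-envelope sketch below uses the study's reported 32 percent higher claim volume for switchers; the premium, baseline claim cost and acquisition cost are hypothetical placeholders, not figures from the paper.

```python
# Back-of-the-envelope cost-to-serve comparison for a poached customer.
# Only the 32% claim uplift comes from the study; all other numbers are hypothetical.

base_annual_claims = 400.0   # expected annual claims of an existing customer (hypothetical)
switcher_uplift    = 0.32    # switchers generate ~32% more liability claims (from the study)
annual_premium     = 500.0   # hypothetical premium charged to both groups
acquisition_cost   = 80.0    # hypothetical one-off cost of poaching a customer

own_margin      = annual_premium - base_annual_claims
switcher_margin = annual_premium - base_annual_claims * (1 + switcher_uplift) - acquisition_cost

print(f"margin on existing customer:           {own_margin:.0f}")       # 100
print(f"first-year margin on poached customer: {switcher_margin:.0f}")  # -108
```

Ignoring the higher cost to serve would overstate the poached customer's first-year margin by 128 in this example (400 times 0.32).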

Credit: 
Institute for Operations Research and the Management Sciences

Out of Africa and into an archaic human melting pot

Genetic analysis has revealed that the ancestors of modern humans interbred with at least five different archaic human groups as they moved out of Africa and across Eurasia.

While two of the archaic groups are currently known - the Neandertals and their sister group the Denisovans from Asia - the others remain unnamed and have only been detected as traces of DNA surviving in different modern populations. Island Southeast Asia appears to have been a particular hotbed of diversity.

Published in the Proceedings of the National Academy of Sciences (PNAS), researchers from the University of Adelaide's Australian Centre for Ancient DNA (ACAD) have mapped the location of past "mixing events" (analysed from existing scientific literature) by contrasting the levels of archaic ancestry in the genomes of present-day populations around the world.

"Each of us carry within ourselves the genetic traces of these past mixing events," says first author Dr João Teixeira, Australian Research Council Research Associate, ACAD, at the University of Adelaide. "These archaic groups were widespread and genetically diverse, and they survive in each of us. Their story is an integral part of how we came to be.

"For example, all present-day populations show about 2% of Neandertal ancestry which means that Neandertal mixing with the ancestors of modern humans occurred soon after they left Africa, probably around 50,000 to 55,000 years ago somewhere in the Middle East."

But as the ancestors of modern humans travelled further east they met and mixed with at least four other groups of archaic humans.

"Island Southeast Asia was already a crowded place when what we call modern humans first reached the region just before 50,000 years ago," says Dr Teixeira. "At least three other archaic human groups appear to have occupied the area, and the ancestors of modern humans mixed with them before the archaic humans became extinct."

Using additional information from reconstructed migration routes and fossil vegetation records, the researchers have proposed there was a mixing event in the vicinity of southern Asia between the modern humans and a group they have named "Extinct Hominin 1".

Other interbreeding occurred with groups in East Asia, in the Philippines, the Sunda shelf (the continental shelf that used to connect Java, Borneo and Sumatra to mainland East Asia), and possibly near Flores in Indonesia, with another group they have named "Extinct Hominin 2".

"We knew the story out of Africa wasn't a simple one, but it seems to be far more complex than we have contemplated," says Dr Teixeira. "The Island Southeast Asia region was clearly occupied by several archaic human groups, probably living in relative isolation from each other for hundreds of thousands of years before the ancestors of modern humans arrived.

"The timing also makes it look like the arrival of modern humans was followed quickly by the demise of the archaic human groups in each area."

Credit: 
University of Adelaide

New UCI-led study uncovers weakness in C. diff toxin

image: On the left is an illustration of the deadly C. diff bacteria as seen under a microscope. The image on the right, resembling a question mark, is the first three-dimensional crystal structure of TcdB holotoxin, secreted by C. diff. TcdB is illustrated as a ribbon model superimposed with a transparent molecular surface. This structure provides a blueprint to develop new therapeutics and vaccines for the treatment of CDI.

Image: 
UCI School of Medicine

A new study, led by researchers from the University of California, Irvine (UCI), uncovers the long-sought-after, three-dimensional structure of a toxin primarily responsible for devastating Clostridium difficile infection (CDI).

Published today in Nature Structural & Molecular Biology, the study titled, "Structure of the full-length Clostridium difficile toxin B," sheds light on the weaknesses of TcdB, one of the toxins secreted by the Clostridium difficile (C. diff) bacteria and the main cause of CDI.

"This is the first time we could directly see the 3D structure of the gigantic TcdB holotoxin at a near atomic resolution," said Rongsheng Jin, PhD, a professor in the Department of Physiology & Biophysics at UCI's School of Medicine and the senior author in the study. "Interestingly, this toxin shapes like a question mark when viewed from a certain angle, and it has been a major question for us as we seek ways to fight the toxin and CDI."

The team also demonstrated how three antibodies can neutralize TcdB, revealing intrinsic vulnerabilities of the toxin that could be exploited to develop new therapeutics and vaccines for the treatment of CDI.

C. diff is an opportunistic pathogen that establishes itself in the colon when the gut microbiota are disrupted, often in severely ill or elderly patients in hospitals or long-term care facilities. CDI has become the most common cause of antibiotic-associated diarrhea and gastroenteritis-associated death in developed countries, accounting for half a million cases and 29,000 deaths annually in the US. It is classified as one of the top three "urgent threats" by the CDC. The current standard of care for CDI involves broad-spectrum antibiotics that reduce the level of C. diff bacteria but also kill the good bacteria in the gut and disrupt the normal gut microbiome. This approach often leads to disease recurrence (in up to 35% of cases).

Recently, the Food and Drug Administration (FDA) issued a warning about an investigational fecal microbiota transplantation (FMT) procedure for CDI treatment following the death of a patient in a clinical trial. In another action, the FDA approved bezlotoxumab, a TcdB-neutralizing human monoclonal antibody, for the prevention of recurrent infection.

"There remains a desperate need for more potent and cost-effective therapies for CDI," said Jin. "The good news is, the 3D structure of TcdB we have identified literally provides a blueprint for the development of next-generation vaccines and therapeutics that have enhanced potency and broad-reactivity across different C. diff strains."

Already the UCI team is working on a novel vaccine based on the new structure. Early studies show promising results, which Jin hopes to publish soon. In the meantime, The Regents of the University of California has filed a patent on their work.

Credit: 
University of California - Irvine

An inflammatory diet correlates with colorectal cancer risk

image: Dr. Victor Moreno (left) and Dr. Mireia Obón (next to him).

Image: 
Gemma Castaño-Vinyals

Researchers from the Molecular Mechanisms and Experimental Therapy in Oncology program (Oncobell) of the Bellvitge Biomedical Research Institute (IDIBELL) and the Catalan Institute of Oncology (ICO), together with the Biodonostia Health Research Institute (IIS Biodonostia), among others, have published in Nutrients the results of a multicenter study that unveils a correlation between inflammatory and antioxidant diets and the risk of developing colorectal and breast cancer. Dr. Mireia Obón-Santacana from IDIBELL-ICO is the first author of a research which was led by Dr. Pilar Amiano, principal investigator at IIS Biodonostia, and Dr. Víctor Moreno, head of the colorectal cancer research group at IDIBELL-ICO. Part of the study has been possible thanks to the funding provided by the Spanish Association Against Cancer (AECC).

"We have observed an association between the risk of developing colorectal cancer and the inflammatory potential of the diet. That is, the participants who followed an inflammatory diet had almost twice the risk of developing colorectal cancer, which is the 4th most frequent cancer worldwide", explains Dr. Mireia Obón. "On the other hand, we have not appreciated a significant increase in breast cancer risk. That is why we need to carry out more studies to check if there is really any correlation with other factors", she adds.

An inflammatory diet is usually characterized by the consumption of refined carbohydrates, red and processed meat, and saturated or trans fats. In an antioxidant diet, the consumption of vegetables, legumes, fruits and nuts predominates. "In this study we have focused on the role of diet, and specifically on its inflammatory and antioxidant capacity, as there is evidence that both chronic inflammation and oxidative stress influence the development of these two types of cancer", says Dr. Víctor Moreno.

"Following a pro-inflammatory and pro-oxidant diet is a very important risk factor for colon cancer. The positive part is that this is a modifiable factor and, therefore, it can be changed", underlines Dr. Mireia Obón. "Therefore, in order to prevent such cancers, it is very important to follow the recommendations of official agencies and international agencies. We should reorient our eating habits towards a Mediterranean diet, rich in fruits and vegetables, nuts, whole grains and healthy oils, such as olive oil and move away from a more pro-inflammatory diet", she argues.

What the IDIBELL-ICO researcher suggests is to "implement education strategies created by nutrition and health professionals, so that the general population can follow dietary recommendations and change their habits".

In this new study, scientists specifically analysed the Spanish population through the Dietary Inflammatory Index (DII) and the Non-Enzymatic Antioxidant Capacity (NEAC), two useful and validated tools for estimating the inflammatory and antioxidant potential of the diet. The study included 1852 cases of colorectal cancer and 1567 cases of breast cancer, together with 3447 and 1487 controls, respectively, drawing on data from 12 Spanish provinces.
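Case-control studies of this kind are typically summarized with odds ratios computed from a table of exposure (for example, high versus low DII) against case status. The sketch below shows the calculation; the totals match the sample sizes quoted above, but the split by DII level is invented for illustration and is not the study's data.

```python
# Odds-ratio sketch for a case-control comparison of inflammatory diet and
# colorectal cancer. The high/low DII split is a hypothetical placeholder.
cases_high_dii, cases_low_dii       = 700, 1152   # 1852 colorectal cancer cases in total
controls_high_dii, controls_low_dii = 900, 2547   # 3447 controls in total

odds_cases    = cases_high_dii / cases_low_dii
odds_controls = controls_high_dii / controls_low_dii
odds_ratio    = odds_cases / odds_controls

print(f"odds ratio = {odds_ratio:.2f}")  # a value near 2 would mean roughly double the odds
```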

Credit: 
IDIBELL-Bellvitge Biomedical Research Institute

NIST's quantum logic clock returns to top performance

image: Illustration of the ion trap that forms the heart of NIST's quantum logic clock. The trap is the gold structure with the cross-shaped cutout. The inset shows the aluminum ion (blue), the source of the clock's "ticks," and the partner magnesium ion (yellow).

Image: 
S. Burrows/JILA

The quantum logic clock--perhaps best known for showing you age faster if you stand on a stool--has climbed back to the leading performance echelons of the world's experimental atomic clocks.

Physicists at the National Institute of Standards and Technology (NIST) have been quietly upgrading their quantum logic clock design for the past eight years, mainly to reduce errors from unwanted motion of the single aluminum ion (electrically charged atom) that provides the clock "ticks."

As described in Physical Review Letters, the quantum logic clock's systematic uncertainty (how closely the clock represents the ion's natural vibrations, or frequency) is 9.4 × 10⁻¹⁹, the best of any clock worldwide. This means the logic clock would now neither gain nor lose one second in 33 billion years, which is about two-and-a-half times the estimated age of the universe.
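The conversion from a fractional frequency uncertainty of 9.4 × 10⁻¹⁹ to "one second in 33 billion years" is a one-line calculation: the time needed to accumulate one second of error is simply the reciprocal of the fractional uncertainty. A quick check:

```python
# Convert a fractional frequency uncertainty into "one second of error per X years".
fractional_uncertainty = 9.4e-19
seconds_per_year = 365.25 * 24 * 3600            # ~3.16e7 seconds

seconds_until_one_second_error = 1 / fractional_uncertainty
years = seconds_until_one_second_error / seconds_per_year

print(f"{years / 1e9:.1f} billion years")        # ~33.7, consistent with the quoted 33 billion
```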

In this metric, it now outpaces both NIST clocks using neutral atoms trapped in lattices of laser beams, the ytterbium lattice clock and the strontium lattice clock.

"The logic clock's performance is not surprising to me," project leader David Leibrandt said. "Ion clocks are naturally better isolated from the environment--which is the source of inaccuracy for atomic clocks--than lattice clocks are. It's important to distinguish between precision and stability on this point. People expect that lattice clocks should perform the best in stability, and they currently do. Our newest quantum logic clock is the world leader in precision but not stability."

The logic clock's stability (how long it takes to measure the time) is 1.2 × 10⁻¹⁵ for a 1-second measurement, which is near the best achieved by a single-ion clock but about 10 times worse than both NIST lattice clocks.

The quantum logic clock got its nickname because it borrows logical decision-making techniques from experimental quantum computing. Aluminum is an exceptionally stable source of clock ticks, vibrating between two energy levels over a million billion times per second, but its properties are not easily manipulated or detected with lasers. So, logic operations with a partner magnesium ion are used to cool the aluminum and to signal its ticks.

Back in 2010, NIST's quantum logic clock had the best performance of any experimental atomic clock. The clock also attracted attention for 2010 demonstrations of "time dilation" aspects of Einstein's theories of relativity: that time passes faster at higher elevations but more slowly when you move faster.

Since then, NIST's lattice clocks have been continually leapfrogging each other in performance, giving the impression of a race to identify a single winner. In fact, all the clocks are useful for research purposes and are possible contenders for future time standards or other applications.

The international definition of the second (in the International System of Units, or SI) has been based on the cesium atom since 1967, so cesium remains the "ruler" for official timekeeping. The logic clock is one contender for a future time standard to be selected by the international scientific community. NIST scientists are working on several different types of experimental clocks, each based on different atoms and offering its own advantages. All these experimental clocks are based on optical frequencies, which are higher than the microwave frequencies used in today's timekeeping standards based on cesium.

Several technical advances enabled the improved performance of the logic clock, including a new ion trap design that reduced heat-induced ion motion, enabling operation near the desirable ground state, or lowest motional energy level. In addition, a lower frequency was used to operate the ion trap, reducing unwanted ion motion caused by the electric field used to trap the ions. Finally, improved quantum control has reduced the uncertainty of measurements of frequency shifts due to ion motion.

The clock's precision was determined by measuring and adding up the frequency shifts caused by nine different effects. Stability was measured by comparison to NIST's ytterbium lattice clock.

Additional improvements in trap design and other features are planned to further improve performance. Already, NIST's three experimental clocks can be compared to improve measurements of possible changes in some of the fundamental "constants" of nature, a line of inquiry that has important implications for cosmology and tests of the laws of physics such as Einstein's theories of special and general relativity.

Credit: 
National Institute of Standards and Technology (NIST)

Persistent HIV in central nervous system linked to cognitive impairment

WHAT:

Many people with HIV on antiretroviral therapy (ART) have viral genetic material in the cells of their cerebrospinal fluid (CSF), and these individuals are more likely to experience memory and concentration problems, according to new data published online today in the Journal of Clinical Investigation. A study of 69 individuals on long-term ART found that nearly half of the participants had persistent HIV in cells in their CSF, and 30% of this subset experienced neurocognitive difficulties. These findings suggest that HIV can persist in the nervous system even when the virus is suppressed in a patient's blood with medication. The study was funded by the National Institute of Allergy and Infectious Diseases (NIAID) and the National Institute of Mental Health (NIMH), both parts of the National Institutes of Health.

Investigators from the University of North Carolina, the University of Pittsburgh, and Yale University studied participants enrolled in the AIDS Clinical Trials Group (ACTG) HIV Reservoirs Cohort Study. This primarily male group of long-term HIV survivors, aged 45 to 56, had infections controlled with ART for an average of nine years. Researchers analyzed each participant's CSF for HIV DNA and then compared these data to each participant's results from standard neurocognitive evaluations. About half of the participants had viral DNA in cells in the CSF, indicating the presence of latent virus, even though standard HIV RNA 'viral load' tests of the cell-free CSF were positive in only 4% of participants. Investigators also found that 30% of individuals with persistent HIV DNA in the CSF experienced clinical neurocognitive impairment, compared with 11% of individuals whose CSF did not contain viral DNA.
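The comparison in the paragraph above (30% impairment among those with CSF HIV DNA versus 11% among those without) amounts to a simple relative-risk calculation over the 69 participants. The group sizes below are approximate reconstructions based on "nearly half" of the cohort and are illustrative only, not the study's exact counts.

```python
# Relative-risk sketch for CSF HIV DNA vs. neurocognitive impairment.
# Group sizes are approximate, illustrative reconstructions of the 69-person cohort.
n_dna, impaired_dna       = 34, 10   # ~30% impaired among those with CSF HIV DNA
n_no_dna, impaired_no_dna = 35, 4    # ~11% impaired among those without

risk_dna    = impaired_dna / n_dna
risk_no_dna = impaired_no_dna / n_no_dna

print(f"risk with CSF HIV DNA:    {risk_dna:.0%}")
print(f"risk without CSF HIV DNA: {risk_no_dna:.0%}")
print(f"relative risk:            {risk_dna / risk_no_dna:.1f}x")   # ~2.6x
```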

Many researchers hypothesize that HIV-related inflammation causes HIV-associated neurocognitive disorder (HAND). The new findings suggest that the presence of persistent HIV-infected cells in the central nervous system (CNS), despite long-term ART, may play a role in neurocognitive impairment. The authors note that the overall frequency of neurocognitive impairment in this group was relatively low and that the association does not confirm that HIV DNA causes HAND. Overall, the current study found that examining CSF cells revealed a higher-than-expected prevalence of persistent HIV in the CNS, which may be a significant obstacle to efforts to eradicate HIV from the body.

Credit: 
NIH/National Institute of Allergy and Infectious Diseases

Dietary quality influences microbiome composition in human colonic mucosa

image: Dr. Li Jiao, the corresponding author of this work.

Image: 
Baylor College of Medicine

It is well established that diet influences health and disease, but the mechanisms underlying this effect are not fully understood. Shedding light on the diet-health connection, a team led by researchers at Baylor College of Medicine reports today in The American Journal of Clinical Nutrition an association between diet quality and microbiome composition in human colonic mucosa. The researchers found that a high-quality diet is linked to more potentially beneficial bacteria, while a low-quality diet is associated with an increase in potentially harmful bacteria. They propose that modifying the microbiome through diet may be part of a strategy to reduce the risk of chronic diseases.

"In this study, rather than looking at individual diets, we focused on dietary patterns as defined by the Healthy Eating Index (HEI)-2005 and how they relate to the microbiome," said corresponding author Dr. Li Jiao, associate professor of medicine-gastroenterology and member of the Dan L Duncan Comprehensive Cancer Center at Baylor College of Medicine. "In a previous study, we found that HEI-2005 is associated with reduced risk of pancreatic cancer."

Diet is considered a principal factor influencing the structure of the microbial community in the gut, which in turn significantly affects the ability of beneficial or harmful microbes to colonize it. The human gut microbiome also influences nutrient uptake, synthesis of vitamins, energy harvest, chronic inflammation, carcinogen metabolism and the body's immune and metabolic response, factors that can affect disease risk, Jiao explained.

"One new contribution to this work is that we looked at the microbiome associated with colonic mucosa," Jiao said. "Most other studies of the human gut microbiome have used fecal samples. We looked at colon mucosal-associated microbiome because we know that this microbiome is different from that in the fecal samples, and it is said to be more related to human immunity and the host-microbiome interaction than the microbiome in fecal samples."

The researchers used next-generation sequencing techniques to analyze the type and abundance of bacteria present in colonic mucosal biopsies. The samples were obtained endoscopically from enrolled consenting 50- to 75-year-old participants who had a colonoscopy at the Michael E. DeBakey Veterans Affairs Medical Center in Houston between 2013 and 2017. The participants were polyp-free and seemingly healthy. They reported their dietary consumption using a food frequency questionnaire before the colonoscopy.
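Analyses of this kind typically convert raw sequencing counts into per-sample relative abundances and then test whether a taxon's abundance tracks the diet-quality score. The sketch below illustrates that generic workflow with made-up counts and HEI-2005 scores; it is not the study's actual pipeline or data.

```python
# Illustrative diet-quality vs. microbiome association using relative abundances.
# Counts and HEI-2005 scores are hypothetical placeholders.
import numpy as np
from scipy.stats import spearmanr

# Rows = participants, columns = bacterial genera (raw read counts, hypothetical)
counts = np.array([
    [120,  30, 850],
    [ 60,  90, 450],
    [200,  10, 790],
    [ 40, 150, 310],
    [180,  20, 800],
])
hei_scores = np.array([78, 52, 81, 45, 74])   # hypothetical HEI-2005 scores

# Relative abundance: normalize each participant's counts so they sum to 1
rel_abundance = counts / counts.sum(axis=1, keepdims=True)

# Does the first genus track diet quality? (Spearman rank correlation)
rho, p_value = spearmanr(rel_abundance[:, 0], hei_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```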

Dietary quality significantly influences the colon's microbiome

Jiao and her colleagues found that a good-quality diet, such as the one recommended by the Dietary Guidelines for Americans (high in fruits, vegetables and whole grains, and low in added sugar, alcoholic beverages and solid fats), is associated with a higher abundance of beneficial bacteria, such as those with anti-inflammatory properties. A poor-quality diet, on the other hand, is associated with more potentially pathogenic bacteria, such as Fusobacteria, which has been linked to colorectal cancer.

The researchers propose that the effect diet has on the structure of bacterial communities in human colonic mucosa can lead to modifications of innate immunity, inflammation and the risk of chronic diseases.

Their next step is to confirm the study findings in a larger study population. In addition, they want to investigate how bacterial products, or metabolites, such as short-chain fatty acids or secondary bile acids, can modify tissue microenvironment into one that either inhibits or promotes tumor growth or development of other diseases. Also, Jiao and her colleagues are interested in investigating how the unfavorable gut microbiome in individuals consuming a poor diet would respond to tailored dietary intervention using diet, pre- or probiotics, as previous studies have produced mixed results.

"Other factors, such as aging, genetics or certain medications, also influence the risk of disease but we cannot modify them," Jiao said. "Diet, on the other hand, can be modified and thus provides a strategy to develop a microbiome that promotes healthy living. We suggest that modifying the microbiome through diet may be a part of a plan to reduce the risk of chronic diseases."

Credit: 
Baylor College of Medicine

Science of microdosing psychedelics remains patchy and anecdotal, say researchers

The practice of taking small, regular doses of psychedelic drugs to enhance mood, creativity, or productivity lacks robust scientific evidence, say scientists.

The process, called microdosing, has been lauded by some, with high-profile proponents in Silicon Valley. But to date, scientific evidence to support, or even fully explore, claims about its benefits and safety has been lacking.

Now, an international group of researchers, led by Imperial College London and Maastricht University, has approached the issue in a wide-ranging review paper, published today in the Journal of Psychopharmacology, to tackle some of the key questions - including what is microdosing? Is it safe? Is it legal? And are the claims of benefits from taking small amounts of psychedelics even plausible?

According to the researchers, their review aims to present the evidence around several themes of microdosing psychedelics, such as LSD or psilocybin (magic mushrooms), including discussion of concerns about impacts on cardiovascular health, as well as to provide a framework for future research in the area.

"Despite so much interest in the subject, we still don't have any agreed scientific consensus on what microdosing is - like what constitutes a 'micro' dose, how often someone would take it, and even if there may be potential health effects" said Professor David Nutt, Edmond J Safra Chair in Neuropsychopharmacology at Imperial College London and senior author of the review.

Professor Nutt and the team define microdosing as the practice of taking repeated, low doses of a psychedelic substance - doses that do not impair a person's 'normal' functioning (a fraction of a 'recreational' dose) - in order to improve well-being and enhance cognitive or emotional processes.

However, in practice, dosing frequency may vary widely - from a few consecutive days to weekdays only - as may the strength and potency of the substance, depending on what it is and where it comes from.

The review explains that while most reports on microdosing to date are anecdotal and have focused on positive experiences, future research should be expanded to focus on the potential risks.

Focusing on psilocybin - the active compound in magic mushrooms - as one of the two most commonly used psychedelic substances (alongside LSD), and being much further along the clinical pipeline to potential approval as a treatment, the team presents the available evidence on several aspects of microdosing.

Chief among the issues raised is the lack of controlled scientific studies, the standard measure in medical science - where the effect of a treatment is measured in those taking it against a control or placebo group (who do not take the compound). The authors also cite a lack of certainty around the doses used in previous trials, as well as where the substances came from, and their potency.

Regarding safety, they claim evidence for long-term, repeated dosing of psilocybin is lacking in humans and animals, and that there is some evidence to highlight cardiovascular risks.

Similarly, the authors describe how data on the behavioural effects of microdosing, such as increased concentration or creativity, remain patchy. Early-stage research has shown psilocybin targets specific receptors in the brain which bind to serotonin - a chemical messenger in the brain associated with feelings of happiness, as well as learning and memory. They speculate that these changes to the activity of networks of brain cells may explain some of the reported therapeutic benefits of microdosing, such as improvement in mood, memory or productivity.

Beyond the scientific issues, the legality and regulation of substances remains a significant barrier, say the researchers. Despite the renaissance in the science of psychedelic research, the drugs in the field - chiefly psilocybin, LSD and DMT - remain Schedule 1 Drugs under the UN Convention and Class A under the Misuse of Drugs Act in the UK. In the UK, this currently means only researchers with a licence from the Home Office are able to obtain and test substances, and anyone obtaining substances for microdosing without a licence could face prosecution.

The team hopes that the evidence laid out in their review will go some way to focus the attention of the research community in order to answer some of the major remaining questions in the field. They write "rigorous, placebo-controlled clinical studies need to be conducted with low doses of [psilocybin] to determine whether there is any evidence for the claims of microdosers".

Dr Kim Kuypers, from Maastricht University and first author of the review, said: "This review is timely as a lot of hope is generated by positive media reports about the alleged effects of microdosing. Patients might feel attracted by those reports to try it, but may actually not be helped by it. We try to emphasise the lack of scientific proof that microdosing is indeed effective in combatting certain symptoms and hope that this will give impetus to new lines of research in this area."

Professor Nutt added: "Researchers working in the area of psychedelics regularly receive requests from the media asking about microdosing. We hope that this critique will provide answers to all these questions in future as well as providing a framework for research."

Credit: 
Imperial College London

Paleontology: New light on cichlid evolution in Africa

A collaborative research project carried out under the auspices of the GeoBio-Center at Ludwig-Maximilians-Universitaet (LMU) in Munich has developed an integrative approach to the classification of fossil cichlids, and identified the oldest known member of the Tribe Oreochromini.

Cichlids (Cichlidae) are a group of small to medium-sized fish that are ubiquitous in freshwater habitats in the tropics. They are particularly notable in exhibiting a wide range of morphological and behavioral specializations, such as various modes of parental care, including mouthbrooding. Some species (mainly members of the genus Tilapia) have achieved fame as culinary delicacies and are of considerable economic significance. Cichlids have undergone rapid diversification in Africa, which is home to at least 1100 species. This process has been especially prominent in the Great Lakes in East Africa's Rift Valley (Lakes Tanganyika, Malawi and Victoria), where it is referred to as the East African Radiation.

"Cichlid diversification in East Africa has become a central paradigm in evolutionary biology. As a consequence, dating the onset of the process and understanding the mechanisms that drive it are issues of great interest to evolutionary biologists and paleobiologists," says LMU paleontologist Professor Bettina Reichenbacher, who is also member of the GeoBio-Center at LMU. Fossils from the area provide the sole source of direct evidence that would allow one to determine the timing and trace the course of lineage diversification within the group. However, the search for cichlid fossils has proven to be both arduous and extremely time-consuming. Indeed, only about 20 fossil species of cichlids from Africa have yet been formally described.

In a study that appears in the online journal Scientific Reports, a team of researchers led by Bettina Reichenbacher now describes a new fossil cichlid, which the authors assign to the new genus Oreochromimos. The name derives from the fact that the specimens, which the team discovered in Central Kenya, show similarities to members of the Tribe Oreochromini (hence the element 'mimos', meaning 'mimic', in the genus name), which are widely distributed in Africa today. "Determining whether or not the fossils could be assigned to any of the extant cichlid lineages was particularly challenging," says Stefanie Penk, first author of the study and a doctoral student in Reichenbacher's group. The difficulties are rooted in the great diversity of the modern cichlid fauna in Africa, and the fact that even distantly related species may be morphologically very similar to each other. "The architecture of the skeleton in cichlids is pretty conservative. All of them have a similar basic form, which undergoes very little change during speciation," Reichenbacher explains. In collaboration with Dr. Ulrich K. Schliewen, co-author of the new paper, Curator of Fishes at the Bavarian State Collection for Zoology in Munich (SNSB-ZSM) and also a member of the GeoBio-Center at LMU, the team adopted the 'best-fit approach' to the classification of the fossil specimens. This requires comparison of the fossil material with all the relevant modern lineages of cichlids. In light of their contemporary diversity, that might seem an impossible task. But thanks to Schliewen's knowledge - and the range of comparative material represented in the collection under his care - the strategy succeeded.
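One way to picture the 'best-fit approach' is as a nearest-match comparison between the fossil's measurements and the typical character values of each candidate extant lineage. The sketch below is a schematic illustration of that idea only; the characters, values and lineage means are invented and do not come from the study's dataset.

```python
# Schematic "best-fit" comparison: match a fossil to the closest extant lineage
# by distance to lineage mean character values. All numbers are hypothetical.
import numpy as np

# Mean values of a few hypothetical morphometric/meristic characters per lineage
lineage_means = {
    "Oreochromini":  np.array([16.0, 3.2, 29.0]),
    "Tilapiini":     np.array([15.0, 2.8, 27.0]),
    "Haplochromini": np.array([17.5, 3.6, 31.0]),
}

fossil = np.array([16.2, 3.1, 29.3])   # hypothetical measurements of the fossil specimen

best_fit = min(lineage_means, key=lambda name: np.linalg.norm(fossil - lineage_means[name]))
print(f"best-fit lineage: {best_fit}")   # -> Oreochromini in this toy example
```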

A unique glimpse of the past

Reichenbacher and colleagues recovered the Oreochromimos material from a fossil-fish Lagerstätte in Kenya's Tugen Hills, which lie within the Eastern Branch of the East African Rift System. This site provides a unique window into the region's past. The volcanic and sedimentary rocks deposited here date back 5-20 million years. They were overlain by younger material and subsequently uplifted to altitudes of as much as 2000 m by tectonic forces. As a result, the fossil-bearing rocks exposed in the Tugen Hills are either inaccessible to exploration or have been lost to erosion in other parts of Africa. Consequently, the strata here contain a unique assemblage of fossils. Undoubtedly the best known finds so far excavated are the 6-million-year-old remains of a hominin species, which has been named Orrorin tugenensis (orrorin means 'original man' in the local language).

But cichlid fossils are also among the paleontological treasures preserved in these sedimentary formations - and they are at the heart of Reichenbacher's Kenya Project, which began in 2011. The material collected so far was recovered in cooperation with Kenya's Egerton University, and is now on loan to LMU's Department of Earth and Environmental Sciences for further study.

The Oreochromimos specimens are about 12.5 million years old, which makes this genus the oldest known fossil representative of the Tribe Oreochromini. It is therefore also the oldest fossil taxon yet assigned to the Haplotilapiini, the lineage that gave rise not only to most of the species that make up the present-day diversity of African cichlids, but also to the East African Cichlid Radiation in the Great Lakes of the Rift Valley. By applying an innovative approach to comparative systematics, the authors of the new study have provided a basis for the taxonomic assignment of future finds of fossil cichlid material. "With the aid of this dataset, it will be possible to classify fossil cichlids much more reliably than before and thus to shed new light on their evolutionary history," says Bettina Reichenbacher.

Credit: 
Ludwig-Maximilians-Universität München

Defective potassium channels cause headache, not body pain

image: Top: Normal distribution of TRESK channels (green) in facial pain sensory neurons. Bottom: Absence of channels in the knock-out mouse model.

Image: 
Guo et al., eNeuro 2019

Defective potassium channels involved in pain detection can increase the chance of developing a headache and could be implicated in migraines, according to research in mice published in eNeuro.

A type of potassium channel called TRESK is thought to control the excitability of peripheral sensory neurons that detect pain, heat, cold, and touch. Even though these channels are found throughout the neurons sensing both body and facial pain, channel mutations are linked only with headaches and not body pain.

Yu-Qing Cao and colleagues at Washington University in St. Louis analyzed knock-out mice with defective TRESK channels and measured the resulting neural activity. The researchers found that only the facial pain-sensing neurons were more excitable, and that these neurons also showed more spontaneous activity. In behavioral tests, the knock-out mice displayed increased sensitivity to temperature and touch stimuli on their faces, as well as more headache-related behaviors, but no body pain behaviors.

These results indicate that TRESK channels have cell-specific roles and are responsible for regulating pain in facial sensory neurons, making them a target for migraine treatment research.

Credit: 
Society for Neuroscience

The rush to air conditioning in Europe pushed by urbanization and climate change

image: Enrica De Cian, professor in Environmental Economics, Ca' Foscari University of Venice

Image: 
Andrea Avezzù / Ca' Foscari University of Venice

A new study published in Environmental Science and Policy shows that without adequate and focused policies, many households will rely on air conditioners to adapt to climate change, thus generating even more greenhouse gas emissions.

The study, led by Enrica De Cian, professor at Ca' Foscari University of Venice and researcher at the Euro-Mediterranean Center on Climate Change, analyses for the first time the dynamics that lead households to adopt air conditioning and thermal insulation in eight countries, five of them European, from 1990 to 2040. The study is based on a survey carried out by the OECD on a sample of European households, combined with climatological data.

Beyond the important regional differences and the surge of AC in cities, the study shows that the wealth of the household, more than the income of the head of the family, is an essential factor in the decision to install an air conditioner at home. The presence of vulnerable persons in the household, such as children and the elderly, is also crucial. Other important factors are the type, ownership status and condition of the home, as well as the environmental awareness of the head of the family and his or her habit of putting energy-saving behaviours into practice.

Prof. De Cian also points out that "Europeans overall have a low inclination to install air conditioners in their homes (20% of households on average) compared with countries like Japan (90%) and Australia (72%), which are expected to reach 100% by 2040. Climatic and cultural differences, even within the same country, lead to very different adoption rates across households today, and for the next 20 years."

The number of households that will buy a new AC unit is projected to increase by an average of 4.3% between 2011 and 2040 across the five European countries covered by the study, and the study shows that this increase is due more to urbanization (three quarters of the effect) than to climate change (one quarter).
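
As a rough illustration (not taken from the paper itself), the split of that projected increase between the two drivers can be sketched as follows, assuming the 4.3% figure and the three-quarters/one-quarter shares quoted above apply directly:

    # Illustrative only: splits the projected average increase in AC adoption
    # between urbanization and climate change, using the shares quoted above.
    total_increase = 4.3          # average increase across the 5 countries, 2011-2040 (%)
    urbanization_share = 3 / 4    # portion of the increase attributed to urbanization
    climate_share = 1 / 4         # portion attributed to climate change

    urbanization_effect = total_increase * urbanization_share   # about 3.2
    climate_effect = total_increase * climate_share              # about 1.1
    print(f"urbanization: {urbanization_effect:.1f}%  climate: {climate_effect:.1f}%")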

FRANCE is a country with typically little inclination for AC, for both cultural and climatic reasons, and a high preference for thermal insulation, with 50% of its dwellings equipped today. In 1990 the percentage of households with AC was almost nil; since 2000 there has been a slow but steady increase, reaching 13% in 2011 and a projected 17.3% in 2040.

As in France, about 60% of households in THE NETHERLANDS are equipped today with some form of thermal insulation. As the number of hot days increased by 60% between 1990 and 2011, AC ownership also surged, from 0.5% of households in 1990 to 14% in 2014, and the study projects a worrying 19% by 2040.

SPAIN, by contrast, the only Mediterranean country covered by the study, shows markedly different characteristics and distributions. Partly because of the numerous heatwaves affecting the country, the 5% of households with AC in 1990 is projected to rise to 50% by 2040, while thermal insulation covers only a third of dwellings and is expected to remain at that level.

SWEDEN, as a northern European country, is traditionally less exposed to recurring heatwaves, but the number of households with AC has nevertheless grown 30-fold since 2005, and the study projects it will exceed one family in five by 2040. Sweden is also among the countries with the strongest inclination for thermal insulation, projected to reach almost one household in two by 2040.

SWITZERLAND, because of its particular cultural, territorial and climatic characteristics, has the fewest AC units installed today among the European countries analysed, but their number is expected to grow by 50% over the next 20 years, reaching 15% of households by 2040.

The EU is currently lagging behind its objectives for deep cuts in greenhouse gas emissions by 2020 and 2030. New buildings do consume on average 40% less energy than old ones, but only 1% of the current building stock in Europe is of this type.
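
A back-of-the-envelope illustration (our assumption, not a figure from the study) shows why such a small share of new buildings has little aggregate effect, assuming old and new buildings would otherwise consume similar amounts of energy:

    # Illustrative only: aggregate saving if 1% of the building stock
    # consumes 40% less energy than the rest.
    new_stock_share = 0.01            # share of the current stock that is new
    saving_per_new_building = 0.40    # energy saved by a new building vs an old one
    aggregate_saving = new_stock_share * saving_per_new_building
    print(f"aggregate saving across the stock: {aggregate_saving:.1%}")   # about 0.4%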

"As highlighted by a number of studies, improving thermal insulation of buildings through the adoption of building codes, is among the most effective policy instruments for reducing residential energy consumption and reduce adaptation needs for cooling" explains Filippo Pavanello co-author of the study and researcher at Ca' Foscari. "New policies should also seek to increase the environmental awareness of the public, as we show that this is an important factor for deciding to install AC in your home and choose how much to use it."

Credit: 
Università Ca' Foscari Venezia