Culture

Ancient genomics pinpoints origin and rapid turnover of cattle in the Fertile Crescent

image: A zebu-shaped weight from Tel Beth-Shemesh.

Image: 
A. Hay, courtesy of the Tel Beth-Shemesh Excavations exhibition.

The keeping of livestock began in the Ancient Near East and underpinned the emergence of complex economies and, later, cities. It was there, too, that the world's first empires rose and fell. Now, ancient DNA has revealed how the prehistory of the region's largest domestic animal, the cow, chimes with these events.

An international team of geneticists, led by researchers from Trinity College Dublin, has deciphered early bovine prehistory by sequencing 67 ancient genomes from both wild and domestic cattle sampled from across eight millennia.

"This allowed us to look directly into the past and observe genomic changes occurring in time and space, without having to rely on modern cattle genetic variation to infer past population events," said Postdpoctoral Researcher at Trinity,Marta Verdugo, who is first author of the article that has just been published in leading international journal, Science.

The earliest cattle were Bos taurus, with no ancestry from Bos indicus, or zebu - herds that had a separate origin in the Indus Valley.

"However, a dramatic change occurred around 4,000 years ago when we detect a widespread, wholesale influx of zebu genetics from the east," added Verdugo.

The rapid influx at this point - despite Near Eastern Bos taurus and zebu having coexisted for millennia beforehand - may be linked to a dramatic multi-century drought across the greater Near East, referred to as the 4.2-kiloyear climate event. At this time the world's first empires in Mesopotamia and Egypt collapsed, and breeding with arid-adapted zebu bulls may have been ancient herders' response to the changing climate.

Professor of Population Genetics at Trinity, Dan Bradley, said: "This was the beginning of the great zebu diaspora that continues to the present day - descendants of ancient Indus Valley cattle are herded across the tropics of every continent today."

Sequencing Near Eastern wild cattle, or aurochs, also allowed the team to unpick the domestication of this most formidable of beasts. While their similarity to the early cattle of Anatolia is consistent with a primary origin there, it is clear that different local wild populations also made significant additional genetic contributions to herds in Southeast Europe and in the southern Levant, adding to the distinctive makeup of both European and African populations today.

"There is a great power in ancient genomics to uncover new, unforeseen tales from our ancient history," added Professor Bradley.

Credit: 
Trinity College Dublin

Mustering a milder mustard

image: Indian mustard, like these yellow-flowered plants, is an important agricultural crop in many parts of the world. It is also a Brassica plant that shares a bitter taste with other cruciferous plants like broccoli and cabbage. Researchers from three continents -- including biologists from Washington University in St. Louis -- have mapped the crystal structure of a key protein that makes the metabolites responsible for the bitter taste in Brassicas.

Image: 
Soon Goo Lee

The mustards, broccolis and cabbages of the world share a distinct and bitter taste. Some consider the flavor of cruciferous plants their strongest attribute. But even in India and China, where Brassicas have been cultivated for more than 4,000 years, scientists have sought to tone down the chemical compounds responsible for their pungent flavor. It turns out that the same compounds that make these plants bitter also make them toxic at certain levels.

Now researchers from three continents -- including biologists from Washington University in St. Louis -- have mapped the crystal structure of a key protein that makes the metabolites responsible for the bitter taste in Brassicas. A study published this month in the journal The Plant Cell is the first snapshot of how the protein evolved and came to churn out such diverse byproducts in this agriculturally significant group of plants.

The results could be used along with ongoing breeding strategies to manipulate crop plants for nutritional and taste benefits.

The new work is born of a longtime collaboration initiated by Naveen C. Bisht, staff scientist at the National Institute of Plant Genome Research, in New Delhi, India, with Joseph Jez, professor of biology in Arts & Sciences at Washington University, and Jonathan Gershenzon, of the Max Planck Institute for Chemical Ecology, in Jena, Germany.

"All of the Brassicas -- be it Indian mustard, Arabidopsis, broccoli or brussel sprouts -- they all make these pungent, sulphur-smelling compounds, the glucosinolates," Jez said. The compounds have long been recognized as a natural defense against pests.

"Plants need to fight back," Jez said. "They can't really do anything, but they can make stuff."

"There's different profiles of glucosinolates in different plants," he said. "The question has always been if you could modify their patterns to make something new. If insects are eating your plants, could you change the profile and get something that could prevent crop loss?"

But there are a daunting number of glucosinolates: almost 130 different kinds recognized within the Brassicas. Each plant species within the genus makes a "collection" of several different kinds of glucosinolates -- its own flavor mix -- all of them secondary metabolites produced by a particular protein.

Researchers have known about the central role of this protein for decades. But prior to this study, no one had been able to complete the X-ray crystallography necessary to map it in detail.

The new work, co-led by Roshan Kumar, now a postdoctoral fellow in the Jez laboratory at Washington University, uses genetics, biochemistry and structural biology to help unravel the molecular basis for the evolution and diversification of glucosinolates.

"Glucosinolates are derived from amino acids," Kumar said. "Gene elongation is one of the important steps that provides most of the diversity in the glucosinolate profiles across all of the Brassicas. It decides which type of glucosinolates (the plant) is going to form."

The insight gained in the new study is an important step toward mustering a milder mustard, or building a bitter-free broccoli.

But will it help us to eat our greens?

Maybe. Mostly researchers are interested in the potential for modifying glucosinolates in seeds, not in the stems or leafy parts of Brassica plants, Kumar said.

The major oilseed crop Brassica juncea and related rapeseeds are used to make cooking oil in temperate and subtropical areas of the world. Plant breeders have sought to adjust the levels of glucosinolates in these crops so that the protein-rich seed cake leftovers can be used as a feed supplement for cattle and poultry.

"If you decrease glucosinolates from all over the plant, it becomes susceptible to pests and pathogens," Kumar said. "That is why there is a need for smart engineering of glucosinolates."

Credit: 
Washington University in St. Louis

Scientists opening the door to a new era of medicinal chemistry

image: A new molecular descriptor estimates molecular complexity and defines the evolution of small molecules in medicinal chemistry.

Image: 
Insilico Medicine

July 11, 2019 - Progress in the pharmaceutical industry depends largely on the achievements and advances in medicinal chemistry. Big pharma companies, which set the pace of the industry, can be regarded as major drivers of medicinal chemistry evolution. Since 2007 there has been a significant decline in the number of patent records involving new chemical entities, and many molecules identified during the HTS (high-throughput screening) boom were not considered attractive. Despite this, the dominant methods and principles of organic chemistry have evolved drastically, yielding molecules with increased 3D complexity.

Now, a team of researchers from the Medicinal Chemistry Department of Insilico Medicine has introduced an original descriptor, MCE-18, which defines key features of "next-generation" molecules and traces the evolution of medicinal chemistry through the years.

Yan Ivanenkov, Head of Medicinal Chemistry Department at Insilico Medicine, along with Bogdan Zagribelnyy, and Vladimir Aladinskiy, both scientists in the Medicinal Chemistry Department of Insilico Medicine, reported their findings on MCE-18 in the paper, "Are We Opening the Door to a New Era of Medicinal Chemistry or Being Collapsed to a Chemical Singularity?" in the Journal of Medicinal Chemistry.

MCE-18 can be applied to assess the effectiveness of new molecules and may help researchers in designing new chemical entities that have great potential in modern drug development.
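The exact MCE-18 formula is given in the Journal of Medicinal Chemistry paper. As a rough illustration of the kind of structural features such a complexity descriptor combines - ring systems, spiro centers, chirality and the fraction of sp3 carbons - the sketch below computes those ingredients with the open-source RDKit toolkit. It is a minimal sketch of the underlying idea, not the published descriptor.

```python
# Illustrative only: structural features of the kind MCE-18 combines.
# This is NOT the published MCE-18 formula (see the J. Med. Chem. paper).
from rdkit import Chem
from rdkit.Chem import rdMolDescriptors

def complexity_features(smiles: str) -> dict:
    """Return simple 3D-complexity-related features for one molecule."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"Could not parse SMILES: {smiles}")
    return {
        "aromatic_rings": rdMolDescriptors.CalcNumAromaticRings(mol),
        "aliphatic_rings": rdMolDescriptors.CalcNumAliphaticRings(mol),
        "spiro_atoms": rdMolDescriptors.CalcNumSpiroAtoms(mol),
        "chiral_centers": len(Chem.FindMolChiralCenters(mol, includeUnassigned=True)),
        "fraction_sp3_carbons": rdMolDescriptors.CalcFractionCSP3(mol),
    }

# A flat, HTS-era aromatic molecule versus a more saturated, 3D scaffold:
print(complexity_features("c1ccc2ccccc2c1"))        # naphthalene - flat
print(complexity_features("CC1(C)CC2(CCNC2)CC1O"))  # hypothetical spiro scaffold
```

On features like these, the flat aromatic scores near zero for saturation, spiro atoms and chirality, while the spiro scaffold scores high on all three - the shift toward the latter is the trend MCE-18 was designed to quantify.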

"Equipped with the newly developed MCE-18 descriptor and in silico tools, we have clearly shown that molecules and scaffolds are becoming increasingly sophisticated with higher degrees of 3D complexity for compounds against various biological targets such as kinases, GPCRs and proteases. Pharma has become more qualitative and smarter. We can reasonably regard this as a novel turning point in chemical evolution and state that medicinal chemistry has ushered in a new era of drug design and development" said Ivanenkov.

Credit: 
Insilico Medicine

New study discovers genetic changes linked to leukaemia in children with Down's syndrome

Researchers at the University of Oxford, in collaboration with colleagues from Hannover Medical School and Martin-Luther-University Halle-Wittenberg, have discovered the specific gene mutations that are required for the development of leukaemia in children with Down's syndrome. Children with Down's syndrome have a 150-fold increased risk of myeloid leukaemia, and while some of the genetic causes of this have been previously established, this is the first study to identify a wide range of mutations and how they functionally interact to lead to leukaemia. The study was published today in the journal Cancer Cell.

"We already knew that 30% of babies born with Down's syndrome have acquired a change in a gene called GATA1 in their blood cells. This is not an inherited genetic change, but one that occurs and will remain only in the baby's blood cells," says study author Professor Paresh Vyas, from the MRC Weatherall Institute of Molecular Medicine at the Radcliffe Department of Medicine, University of Oxford. "The abnormality in the GATA1 gene can be detected by a simple blood test at birth. Babies with an altered GATA1 gene have a predisposition to develop leukaemia, and we often refer to this as 'myeloid preleukaemia'."

Of the 30% of children with Down's syndrome who are found to have 'myeloid preleukaemia', only 10% will go on to develop myeloid leukaemia (3% of all children with Down's syndrome). Until now, it was not understood why only some children with the GATA1 mutation progressed to full leukaemia while others did not.

"90% of babies with Down's syndrome who have preleukaemia do not go on to develop leukaemia. But until now, we did not fully understand why some babies did," says Vyas, who is also a group leader at the MRC Molecular Haematology Unit. "To answer this question, we carefully characterised the mutations in genes required for leukaemia to develop. We found that additional genetic changes are required in the altered GATA1 blood cells, and these additional changes transform the preleukaemic blood cells into leukaemic blood cells." In total, 43 different altered genes were found.

The discovery of which specific genetic changes are required for leukaemia to develop has practical implications. While children with Down's syndrome are currently tested at birth for the GATA1 mutation, it may become possible in the future to test for the additional mutations too. "This would mean that we could identify the 10% of children who will develop leukaemia more quickly and easily, and importantly reassure the 90% of families whose children will not develop leukaemia," says Vyas. "The identification of these genetic changes may also mean we can develop and test new treatments specifically targeting the genetic changes we now know are required by the leukaemia - and so develop more targeted treatments with fewer side effects."

Current treatments for children with Down's syndrome who have leukaemia are already highly successful, and on the back of this research another possible drug treatment has come to light. The drug ruxolitinib, which is currently used to treat some blood conditions, could potentially be used to treat some of the specific genetic mutations found in the study. Clinical trials of the drug are a possibility for the future.

"The recent identification of a group of genes linked to leukaemia in children with Down's syndrome is an important first step towards developing early diagnostic tests and identifying effective treatments to help these patients," says Dr Mariana Delfino-Machin, Programme Manager at the Medical Research Council (MRC). "The MRC is proud to support the research undertaken at the MRC Molecular Haematology Unit, of which this early-stage study is a great example."

Credit: 
University of Oxford

Lack of crop diversity and increasing dependence on pollinators may threaten food security

image: A honey bee worker and a male Andrena sp. on apple blossom.

Image: 
Martin Husemann

A multinational team of researchers has identified countries where agriculture's increasing dependence on pollination, coupled with a lack of crop diversity, may threaten food security and economic stability. The study, which was published in the journal Global Change Biology on July 11, 2019, is the first global assessment of the relationship between trends in crop diversity and agricultural dependence on pollinators.

Using annual data from the U.N. Food and Agriculture Organization from 1961 to 2016, the study showed that the global area cultivated in crops that require pollination by bees and other insects expanded by 137%, while crop diversity increased by just 20.5%. This imbalance is a problem, according to the researchers, because agriculture dominated by just one or two types of crops only provides nutrition for pollinators during a limited window when the crops are blooming. Maintaining agricultural diversity by cultivating a variety of crops that bloom at different times provides a more stable source of food and habitat for pollinators.
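As an illustration of how such trends can be computed, the sketch below derives the two headline numbers from a FAOSTAT-style table of harvested area per crop and year. The file, the column names and the diversity measure (effective number of crops from a Shannon index) are assumptions for illustration, not the study's actual pipeline.

```python
# Hypothetical sketch: percent growth in pollinator-dependent crop area vs.
# crop diversity, from a FAOSTAT-style table. Column/file names are assumed.
import numpy as np
import pandas as pd

# Assumed columns: year, crop, harvested_area_ha, pollinator_dependent (bool)
df = pd.read_csv("faostat_crops_1961_2016.csv")  # hypothetical file

def pct_change(first: float, last: float) -> float:
    return 100 * (last - first) / first

# Total area in pollinator-dependent crops, per year.
area = df[df["pollinator_dependent"]].groupby("year")["harvested_area_ha"].sum()

def effective_crops(year: int) -> float:
    """Diversity as the effective number of crops (exponential Shannon index)."""
    shares = df[df["year"] == year].groupby("crop")["harvested_area_ha"].sum()
    p = shares / shares.sum()
    return float(np.exp(-(p * np.log(p)).sum()))

print(f"Pollinator-dependent area: +{pct_change(area.loc[1961], area.loc[2016]):.0f}%")
print(f"Crop diversity: +{pct_change(effective_crops(1961), effective_crops(2016)):.1f}%")
```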

"This work should sound an alarm for policymakers who need to think about how they are going to protect and foster pollinator populations that can support the growing need for the services they provide to crops that require pollination," said David Inouye, professor emeritus of biology at the University of Maryland and a co-author of the research paper.

Globally, a large portion of the total agricultural expansion and increase in pollinator dependence between 1961 and 2016 resulted from increases in large-scale farming of soybean, canola and palm crops for oil. The researchers expressed concern over the increase in these crops because it indicates a rapid expansion of industrial farming, which is associated with environmentally damaging practices such as large monocultures and pesticide use that threaten pollinators and can undermine productivity.

Particularly vulnerable to potential agricultural instability are Brazil, Argentina, Paraguay and Bolivia, where expansion of pollinator-dependent soybean farms has driven deforestation and replaced rich biodiversity that supports healthy populations of pollinators with large-scale single-crop agriculture (monoculture). Malaysia and Indonesia face a similar scenario from the expansion of oil palm farming.

"Farmers are growing more crops that require pollination, such as fruits, nuts and oil seeds, because there's an increasing demand for them and they have a higher market value," Inouye said. "This study points out that these current trends are not great for pollinators, and countries that diversify their agricultural crops are going to benefit more than those that expand with only a limited subset of crops."

Although the study found that countries replacing forests and diverse, heterogeneous agricultural landscapes with extensive pollinator-dependent monoculture are most vulnerable, other countries also face risks from growing dependence on pollinators.

In Europe, farmland is contracting as development replaces agriculture, but pollinator-dependent crops are replacing non-pollinator-dependent crops such as rice and wheat (which are wind pollinated). According to the study, increasing need for pollination services without parallel increases in diversity puts agricultural stability at risk in places like Australia, the United Kingdom, Germany, France, Austria, Denmark and Finland.

In the U.S., agricultural diversity has not kept pace with expansion of industrial-scale soybean farming.

"This work shows that you really need to look at this issue country by country and region by region to see what's happening because there are different underlying risks," Inouye said. "The bottom line is that if you're increasing pollinator crops, you also need to diversify crops and implement pollinator-friendly management."

Inouye said the researchers are hoping this work will spur policymakers and resource managers to reevaluate current trends and practices to introduce more pollinator-friendly management such as reducing insecticide use, planting edge rows and flower strips to provide nest sites and food for pollinators, and restoring seminatural and natural areas adjacent to crops.

Credit: 
University of Maryland

Study: Global farming trends threaten food security

image: A honey bee worker and a male mining bee on an apple flower.

Image: 
Martin Husemann

Citrus fruits, coffee and avocados: The food on our tables has become more diverse in recent decades. However, global agriculture does not reflect this trend. Monocultures are increasing worldwide, taking up more land than ever. At the same time, many of the crops being grown rely on pollination by insects and other animals. This puts food security at increased risk, as a team of researchers with help from Martin Luther University Halle-Wittenberg (MLU) writes in the journal Global Change Biology. For the study, the scientists examined global developments in agriculture over the past 50 years.

The researchers analysed data from the United Nations' Food and Agriculture Organization (FAO) on the cultivation of field crops between 1961 and 2016. Their evaluation has shown that not only is more and more land being used for agriculture worldwide, the diversity of the crops being grown has declined. Meanwhile, 16 of the 20 fastest growing crops require pollination by insects or other animals. "Just a few months ago, the World Biodiversity Council IPBES revealed to the world that up to one million animal and plant species are threatened with extinction, including many pollinators," says Professor Robert Paxton, a biologist at MLU and one of the authors of the new study. This particularly affects bees: honeybees are increasingly under threat from pathogens and pesticides, and populations of wild bees have been on the decline around the world for decades.

Fewer pollinators could mean that yields are much lower or even that harvests fail completely. However, risks are not spread equally across the world. The researchers used the FAO data to create a map showing the geographical risk of crop failure. "Emerging and developing countries in South America, Africa and Asia are most affected," says Professor Marcelo Aizen of the National Council for Scientific and Technological Research CONICET in Argentina, who led the study. This is not surprising, he says, since it is precisely in these regions where vast monocultures are grown for the global market. Soy is produced in many South American countries and then exported to Europe as cattle feed. "Soy production has risen by around 30 percent per decade globally. This is problematic because numerous natural and semi-natural habitats, including tropical and subtropical forests and meadows, have been destroyed for soy fields," explains Aizen.

According to the authors, current developments have little to do with sustainable agriculture, which focuses on the food security of a growing world population. And, although poorer regions of the world are at the greatest risk, the consequences of crop failure would be felt worldwide: "The affected regions primarily produce crops for the rich industrial nations. If, for example, the avocado harvest in South America fails, people in Germany and other industrial nations may no longer be able to buy them," concludes Robert Paxton, who is also a member of the German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig.

The researchers advocate for a trend reversal: Care should be taken to diversify agriculture worldwide and make it more ecological. This means, for example, that farms in particularly susceptible countries should grow a diversity of crops. In addition, farmers all over the world would need to make the areas under cultivation more natural, for example by planting strips of flowers or hedgerows next to their fields and by providing nesting habitats on field margins. This would ensure that there are adequate habitats for insects, which are essential for sustainable and productive farming.

Credit: 
Martin-Luther-Universität Halle-Wittenberg

Are the 'viral' agents of MS, ALS and schizophrenia buried in our genome?

What if the missing 'environmental' factor in some of our deadliest neurological diseases were really written in our genome?

Writing in Frontiers in Genetics, researchers from the University of Dusseldorf explain how viruses ended up in our DNA - and what puts them in the frame in unsolved diseases like multiple sclerosis.

The enemy within

A whopping 8% of our DNA comes from viruses. Specifically, ones called retroviruses - not because they're old, but because they reverse the normal process of reading DNA to write themselves into their host's genome.

Retroviruses are old though: they began merging with our earliest, primordial ancestors millions of years ago. Over the millennia, most of their remnants in our DNA - known as human endogenous retroviruses or HERVs - have been silenced by mutations. Others, which had evolved to fend off rival viruses, formed the prototypical immune system and to this day protect us from infection.

However, HERVs might also be the missing causative link in major 'unsolved' neurological diseases.

"HERVs have been implicated in the onset and progression of multiple sclerosis [MS], amyotrophic lateral sclerosis [ALS] and schizophrenia [SCZ]," says senior author Prof. Patrick Kuery. "Dormant HERVs can be reactivated by environmental factors such as inflammation, mutations, drugs, or infection with other viruses, so could provide a mechanism for their well-established epidemiological link to these disorders."

Role in MS

So far, the strongest evidence links HERVs to MS.

"MS is caused by direct autoimmune attacks on myelin - the fatty coating of nerve cells - in the brain and spinal cord," explains Kuery. "But we don't yet understand how these attacks are triggered."

A variety of studies suggest that reactivation of HERV could be just such a trigger.

"Retroviruses were first associated with MS in 1989, but only decades later was it realized that these are in fact HERVs.

"Subsequently, it was shown that levels of HERV RNA and protein - the 'readouts' from reactivated HERV DNA - are increased in the brain and spinal cord fluid [CSF] of sufferers, as well as in their brain tissue postmortem.

"Linking this HERV reactivation to autoimmune attacks in MS, it was found that HERV proteins can trigger an immune response against myelin, which triggers MS-like disease in mouse models."

Mechanistically, HERV proteins could trigger autoimmunity through 'molecular mimicry'.

"In addition to direct effects of HERV on myelinating cells, several groups report structural similarities between HERV and myelin oligodendrocyte glycoprotein - a molecule displayed on the surface of myelin. This similarity could fool the immune system into damaging myelin, when it mounts an attack on HERVs."

Experimental proof in humans

Similar experiments have linked HERVs to the peripheral demyelinating disease CIDP, as well as more distinct disease processes like progressive loss of motor neurons in ALS (Lou Gehrig's disease).

In schizophrenia, a complex neurodevelopmental disorder, the link to HERVs is more circumstantial.

"HERV proteins have been reported to increase expression of schizophrenia-linked genes in cultured human brain cells," reports Kuery. "However, studies on schizophrenia sufferers show inconsistent changes in HERV expression in blood, CSF and postmortem brain tissue compared to healthy controls."

Whether or not HERVs contribute to these and other unexplained neurological conditions requires further investigation. An important step will be to test the effects of HERV-neutralizing antibodies in humans.

"Of note, in relapsing MS patients a phase 2b clinical trial using HERV protein-neutralizing antibody Temelimab has been conducted. We're now waiting to see if the treatment showed beneficial effects on remyelination or attenuated neurodegeneration," Kuery concludes.

Credit: 
Frontiers

Training trials

When new rules capped training hours for medical residents at 80 hours per week in 2003, critics worried that the change would leave physicians-in-training unprepared for the challenges of independent practice.

Now, new research published July 11 in BMJ and led by scientists in the Department of Health Care Policy in the Blavatnik Institute at Harvard Medical School, shows that these dire warnings were largely unjustified.

The analysis--believed to be the first national study examining the impact of reduced hours on physician performance--found no evidence that reduced training hours had any impact on the quality of care delivered by new physicians.

Following a series of high-profile patient injuries and deaths believed to stem from clinical errors caused by fatigue, medical accreditation agencies initiated a series of sweeping changes to the regulations governing resident hours and other aspects of training. These efforts culminated in 2003 with the U.S. Accreditation Council for Graduate Medical Education capping the training of medical residents at 80 hours per week.

"This is probably the most hotly debated topic in medical education among physicians," said Anupam Jena, the HMS Ruth L. Newhouse Associate Professor of Health Care Policy in the Blavatnik Institute, a physician in the department of medicine at Massachusetts General Hospital and lead author of the study. "Many doctors trained under the old system think that today's residents don't get enough training under the new system. You hear a lot of senior physicians looking at younger doctors coming out of training and saying, 'They're not as prepared as we were.'"

The findings of the study should assuage these fears, Jena said.

The researchers found no significant differences in 30-day mortality, 30-day readmissions, or inpatient spending between physicians who completed their residency before and after the residency hour reforms.

"We found no evidence that the care provided by physicians who trained under the 80-hour-a-week model is suboptimal," Jena said.

Given the changes in hospital care over the past decade, the researchers knew that they couldn't just compare the difference between outcomes of recently trained doctors before and after the cap, since overall outcomes have improved thanks to better diagnoses and treatments, better coordination of care and new digital tools designed to prevent harmful drug interactions and other human errors.

Comparing new physicians trained before reform with those trained after would confound the effect of changes in training with the effect of overall changes in hospital care. To avoid conflating the two, the researchers compared new physicians before and after the reforms with senior physicians who had trained before the reform.

The study analyzed 485,685 hospitalizations of Medicare patients before and after the reform.

The training hour reforms were not associated with statistically significant differences in patient outcomes after the physicians left training.

For example, 30-day mortality rates among patients cared for by first-year attending internists during 2000-2006 and 2007-2012 were 10.6 percent (12,567/118,014) and 9.6 percent (13,521/140,529), respectively. In comparison, the 30-day mortality among patients cared for by tenth-year attending physicians was 11.2 percent (11,018/98,811) and 10.6 percent (13,602/128,331) for the same years.
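The logic of that comparison is a difference-in-differences: the change among new physicians is benchmarked against the change among senior physicians over the same period. A back-of-the-envelope version using only the rates quoted above looks like this (the study's full analysis additionally adjusts for other variables, as described next):

```python
# Unadjusted difference-in-differences from the 30-day mortality rates above.
# Illustrates the comparison's logic only; see the adjusted estimates below.
new_before, new_after = 10.6, 9.6         # first-year internists, 2000-06 vs 2007-12
senior_before, senior_after = 11.2, 10.6  # tenth-year physicians, same periods

change_new = new_after - new_before            # -1.0 percentage points
change_senior = senior_after - senior_before   # -0.6 percentage points

print(f"Raw difference-in-differences: {change_new - change_senior:.1f} points")
```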

Further statistical analysis to eliminate the unwanted effects of other variables showed that these differences translated into a less than 0.1 percentage point gap between the groups. The difference in hospital readmission rates was similarly minuscule: 20.4 percent for patients cared for by first-year physicians in both 2000-2006 and 2007-2012, compared with 20.1 percent and 20.5 percent, respectively, among patients treated by senior physicians.

Taken together, these findings suggest that U.S. residency work hour reforms have not made a difference in the quality of physician training, Jena said.

As a way of magnifying any possible gaps in care stemming from a difference in training hours, the researchers looked specifically at outcomes for high-risk patients, in whom even small differences in quality of care would become apparent.

"We looked at patients who were particularly ill. In these cases, one little mistake could mean the difference between life and death," Jena said. "Even for these sickest patients we found that the reduced training hours had no effect on patient mortality."

Credit: 
Harvard Medical School

Ancient epigenetic changes silence cancer-linked genes

image: A study in zebrafish indicates that some genes linked to cancers in humans have been strictly regulated throughout evolution.

Image: 
Michael Geng

An epigenetic change - a form of DNA control - that deactivates genes linked to cancer early in human development has been conserved for more than 400 million years, new research led by the Garvan Institute of Medical Research suggests.

Researchers uncovered that genes turned on in some cancers in humans also exist in zebrafish - but are 'silenced' within just hours of fertilisation. The study sheds new light on how our epigenetics can regulate genes, some of which are linked to cancer development later in life, over large evolutionary distances. It also uncovers significant differences between how the epigenome 'resets itself' in zebrafish and human embryos, which may guide future studies on epigenetic inheritance.

"We've shown that we have conserved this embryonic event that switches off genes linked to cancers in humans," says Dr Ozren Bogdanovic, Head of the Developmental Epigenomics Laboratory, who led the study. "It's intriguing and we still don't know why it's happening, but it suggests just how important to human health it is to keep these genes silenced."

The findings are published in the journal Nature Communications.

An unexpected relative

At first glance, humans and zebrafish (a tiny species of fish native to South Asia) hardly seem related - in fact, our common evolutionary ancestor dates back more than 400 million years.

But genetically, zebrafish and humans are not so different - we share around 70% of protein-coding genes. The Garvan-led team set out to investigate how conserved the epigenetic changes that control how DNA is 'read' are during the development of an embryo.

Genes are in part controlled by methylation - tags on DNA that 'block' genes from being read.

The researchers first isolated primordial germ cells, the precursor cells of sperm and egg, from developing zebrafish embryos and generated whole genome bisulfite sequencing (WGBS) data - a snapshot of all the DNA methylation in the cell.

The zebrafish genome's father figure

The team uncovered fundamental differences in how DNA is methylated in mammalian and zebrafish embryos.

In humans, these DNA methylation tags are mostly 'swept clean' when a sperm fertilises an egg, and then gradually methylated again to ensure the embryo can develop correctly. Zebrafish embryos, in contrast, retain the methylation pattern of the father.

In this study, the researchers found that primordial germ cells of zebrafish do not reset their methylation patterns either, but inherit paternal DNA methylation patterns. This contrasts with findings in mammalian primordial germ cells, which undergo a second 'sweep cleaning' of their DNA methylation tags. The researchers say this finding sheds light on the molecular principles of germline development and highlights zebrafish as a useful experimental model to study how epigenetic signatures are inherited throughout generations.

Further, the researchers screened how DNA is methylated in zebrafish embryos, at four stages of development. They discovered 68 genes that were methylated and turned off early during embryonic development, within 24 hours of fertilisation.

"What was interesting is that most of these genes belong to a group called cancer testis antigens," says Dr Ksenia Skvortsova, co-first author of the study. "Our work shows that these are some of the very first genes that are 'silenced', or targeted by DNA methylation, in both zebrafish and mammals."

Fresh insight on an ancient mechanism

In humans, the genes that code for cancer testis antigens, or CTAs for short, are active only in the male testis and are turned off in all other tissues. For reasons still unknown, CTA genes are turned on again in some cancers, such as melanomas.

"Mammals and fish have very different strategies when it comes to developing an embryo," says Dr Bogdanovic. "But in spite of these very different strategies, it appears that the control of CTA genes are conserved throughout evolution."

While the work sheds new light on our evolution, it may also have implications for the future of human health. Drugs that target CTAs are already being investigated as a potential therapy for cancers. The current study provides more evidence on how significant CTAs are, and how tightly controlled they have been over the course of evolution.

Credit: 
Garvan Institute of Medical Research

EPFL scientists map high-risk areas for hepatitis E

image: On this map, the researchers compared the location of the refugee camps with the hepatitis E outbreaks recorded to date.

Image: 
© LCE/LASIG EPFL 2019

EPFL scientists have created the first world map of regions with the highest prevalence of the hepatitis E virus (HEV). They hope that their map - freely available online - will help governments and NGOs design more effective prevention campaigns based on reliable data, particularly when it comes to setting up refugee camps. The scientists' research has just been published in Scientific Reports.

In Europe, China, Japan and North America, the main way people catch HEV is by eating undercooked pork, and the resulting disease is generally not fatal. However, in Mexico, India, Africa and most Asian countries, HEV is contracted by coming into contact with water from a river or well contaminated with fecal matter. According to the World Health Organization, there are around 20 million HEV infections worldwide every year and some 50,000 deaths from the disease. Hepatitis E epidemics are particularly deadly for pregnant women and generally occur after heavy rains and floods or after months-long droughts.

Machine learning

To build their map, the EPFL scientists compiled data on all hepatitis E epidemics recorded worldwide since 1980 and on environmental statistics like temperature, soil wetness and rainfall over the same period. They also factored in geographical location, population density and the rate of evapotranspiration, or how much river water evaporates during a drought. Evapotranspiration is important because the more that occurs, the more highly concentrated the intestinal pathogens are in the contaminated water that remains - water that is often used for cooking, washing or even religious ceremonies.

Thanks to machine learning, the scientists were able to crunch through all the data and come up with actionable results. "Our study confirmed that the areas most at risk are those with a high population density, heavy seasonal rainfall and high evapotranspiration rates," says Anna Carratalà, a scientist at EPFL's Environmental Chemistry Laboratory and the study's lead author. Her co-author, Stéphane Joost, works at EPFL's Laboratory of Geographic Information Systems. "One way to reduce that risk is to artificially increase river water flow rates during the hottest, driest periods of the year."
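The article does not name the model the team used, so the following is a hedged sketch of how outbreak records plus the environmental covariates listed above could be fed to an off-the-shelf classifier to produce a risk surface. The file, feature names and the choice of a random forest are illustrative assumptions, not EPFL's actual pipeline.

```python
# Hypothetical sketch: mapping hepatitis E risk from environmental covariates.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("hev_grid_cells.csv")  # assumed: one row per location/period
features = ["temperature", "soil_wetness", "rainfall",
            "population_density", "evapotranspiration"]
X, y = df[features], df["outbreak_recorded"]  # 1 if an epidemic was recorded

model = RandomForestClassifier(n_estimators=500, random_state=0)
print("Cross-validated AUC:", cross_val_score(model, X, y, cv=5,
                                              scoring="roc_auc").mean())

# Fit on all data; predicted probabilities become the mapped risk surface.
risk_surface = model.fit(X, y).predict_proba(X)[:, 1]
```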

The need for more data

The EPFL scientists have accomplished a monumental task in bringing together data from a number of online sources, yet their map is only one step towards developing prevention campaigns in high-risk areas. For instance, their map shows that measures urgently need to be taken in northern India. According to Carratalà, the next step would be to add information on annual HEV concentrations in the Ganges River to their dataset, along with the number of hepatitis E cases recorded at local hospitals. That would give them greater insight into how environmental factors affect hepatitis E epidemics in that region.

The scientists worked with India's National Institute of Epidemiology to collect data about the country. In a new project, they will look at how human activity affects the concentrations of HEV and other contaminants - like antibiotic-resistant genes - in the Rhone in Switzerland and compare that with concentrations in the Ganges.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Study finds no correlation between brain function & head impacts after 2 seasons of tackle football

COLUMBUS, Ohio - Many parents, potential players and medical providers are increasingly wary of youth contact sports participation. The concern over the potential short- and long-term effects of head impacts experienced by youth football players has likely driven decreasing participation, according to a group of researchers.

To date, most studies that have attempted to understand connections between neurocognitive function and sub-concussive head impacts have been retrospective - and inconclusive. Tracking athletes in real time can account for confounding factors such as pre-participation cognitive ability.

This was the idea behind a study led by Sean Rose, MD, pediatric sports neurologist and co-director of the Complex Concussion Clinic at Nationwide Children's Hospital. He collaborated with MORE Foundation, The Sports Neurology Clinic, and other researchers to follow more than 150 youth tackle football players ages 9 to 18.

Recently, they published the results of the first two years of the study in the Journal of Neurotrauma, and the researchers plan to continue the study for an additional two years. The data from the first season were published in 2018 in the Journal of Head Trauma Rehabilitation.

"When trying to determine the effects of repeated, sub-concussive head impacts, prospective outcomes studies are an important addition to the existing retrospective studies," says Dr. Rose. "We designed this study to include a wide variety of neurocognitive outcomes tests, to give us new insights into how repeated hits might influence outcomes."

The pre- and postseason assessments used to measure outcomes included:

Neuropsychological (cognitive) testing

Symptoms assessment

Vestibular and ocular-motor screening

Balance testing

Parent-reported ADHD symptoms

Self-reported behavioral adjustment

Sensors placed in the helmets recorded sub-concussive head impacts during practices and games. In the full 166-player group, a computerized test of processing speed declined over time; the other 22 outcome measures improved or did not change. Neither the total number of impacts nor the intensity of impacts correlated with changes in outcomes from before season 1 to after season 2 in the 55 players who participated in both seasons of the study.

"So far, the study is showing us that sub-concussive impacts don't seem to be associated with changes in neurocognitive function over two seasons of youth football. And we're finding that other factors, such as ADHD and younger age are more predictive of worsening scores on our pre and post-season tests," says Dr. Rose. "However, we remain concerned about repetitive head impacts in children, and longer follow up times are necessary to look for delayed effects on neurocognition."

Credit: 
Nationwide Children's Hospital

Flu fact sheet for parents increases vaccination rate in children

NEW YORK, NY (July 10, 2019)--Young children are more likely to suffer severe, even life-threatening complications from the flu, but only around half of children in the US get the flu vaccine.

A cheap and simple pamphlet about the flu, handed to parents in their pediatrician's waiting room, can increase the number of children who get the flu vaccine, a new study from researchers at Columbia University has found.

The study--a randomized, controlled clinical trial--is one of the first to look at the effect of educational information on influenza vaccination rates in children.

Background

"Parents' concerns and misperceptions about vaccines are on the rise," says Melissa Stockwell, MD, MPH, associate professor of pediatrics and population and family health at Columbia University Vagelos College of Physicians and Surgeons and senior author on the paper. "But previous studies have shown that offering information to disprove vaccine myths, in some cases, only reinforces parents' beliefs about vaccination and can even reduce the number of vaccine-hesitant parents who intend to get their kids vaccinated."

Influenza spreads easily and affects about 8% of children each year. In young children, especially those under 2 years of age, the flu is more likely to cause pneumonia and severe inflammatory responses, which can result in hospitalization and even death.

The best way to prevent influenza is with the influenza vaccine, aka 'flu shot,' and both the CDC and American Academy of Pediatrics recommend annual flu vaccination for children age 6 months and up.

"In our study, we hoped to identify educational content that would encourage parents to get their children vaccinated against the flu," says Vanessa P. Scott, MD, first author who was previously a general academic pediatrics fellow at Columbia University Irving Medical Center and is currently an assistant clinical professor of pediatrics at University of California San Diego.

What the Study Found

The study included 400 parent-and-child pairs at pediatric clinics in northern Manhattan. The parents answered a brief questionnaire to assess their attitudes toward the flu shot and their intent to vaccinate. One-third received a one-page handout with local information about the flu, another third received a one-page handout with national information about the flu, and the rest received usual care (no handout). Both handouts emphasized the risk of getting the flu, the seriousness of the disease, and vaccine effectiveness. Providers were unaware of the parent's study participation.

The researchers found that nearly 72% of children whose parents were given either fact sheet were vaccinated before the end of the season, compared with around 65% of those who got usual care.

Parents who received the national handout were more likely to have their child vaccinated on the day of the clinic visit (59%) compared to those who didn't receive either handout (53%).

Parents who had fewer concerns about vaccination were more likely to vaccinate their children by the end of the season (74% versus 59% of parents with significant concerns) and on the day of the clinic visit (59% and 45%, respectively). Approximately 90% of parents who said they planned to vaccinate their children did so by the end of the flu season.

What the Study Means

"We found that a low-cost handout that can be easily implemented in any pediatrics practice had a significant and meaningful impact on influenza vaccination in children," Stockwell says.

The handout is available in the paper.

Although Stockwell expected the handout with local information to have a bigger impact, the handout with national data improved vaccination rate on the day of the office visit.

"The difference in magnitude of the number of deaths from influenza may have made the national handout more impactful," Stockwell says.

Next Steps

Future research will compare the effectiveness, cost-effectiveness, and feasibility of different methods of delivering educational information about influenza--including handouts, text messages, video, and interactive social media.

Credit: 
Columbia University Irving Medical Center

Preeclampsia risk may be reduced by a healthy high-fibre diet

image: Pregnancy outcomes and infant immunity have been linked to gut bacteria.

Image: 
Pixabay user StockSnap

A healthy diet rich in fibre is generally recommended, but new research shows it could be even more important during pregnancy to promote the wellbeing of the mother and child.

Plant-based fibre is broken down in the gut by bacteria into factors that influence the immune system.

Researchers from the University of Sydney's Charles Perkins Centre, the Barwon Infant Study from Deakin University, Monash University, James Cook University and the Australian National University collaborated to investigate the role of these metabolic products of gut bacteria during pregnancy.

Senior author of the study Professor Ralph Nanan said the simple recommendation to 'eat real food, mostly plants, and not too much' might be the most effective primary prevention strategy for some of the most serious conditions of our time.

"The mother's gut bacteria and diet appear to be crucial to promoting a healthy pregnancy," Professor Nanan, from the University of Sydney School of Medicine and Charles Perkins Centre, said.

Published today in Nature Communications, the study found that in humans, reduced levels of acetate, which is mainly produced by fibre fermentation in the gut, are associated with the common and serious pregnancy-related condition preeclampsia.

Preeclampsia occurs in up to 10 percent of pregnancies and is characterised by high blood pressure, protein in the urine and severe swelling in the mother. It also interferes with the child's immune development whilst in the womb, with some evidence suggesting a link to higher rates of allergies and autoimmune disease later in life.

The current study found that preeclampsia affected the development of an important fetal immune organ, the thymus, which sits just behind the breastbone.

Fetuses in preeclamptic pregnancies were found to have a much smaller thymus than children from healthy pregnancies.

The cells the thymus normally generates, called T cells (thymus-derived cells) - specifically those associated with the prevention of allergies and autoimmune conditions such as diabetes - also remained lower in infants after preeclampsia, even four years after delivery.

The effects of acetate on the developing fetal immune system were further examined in separate experiments in mice, which showed that acetate was central in driving fetal thymus and T cell development.

Together, these results showed that promoting specific metabolic products of gut bacteria during pregnancy might be an effective way to maintain a healthy pregnancy and to prevent allergies and autoimmune conditions later in life.

They may also, in part, explain the rapid increase of allergies and autoimmune conditions as Western diets are increasingly dominated by highly processed foods, which are very low in fibre.

"More studies are urgently needed to understand how we can best target this system to reduce the growing burden of immune related diseases in the modern world," said co-author Peter Vuillermin, co-lead of the Barwon Infant Study, a major birth cohort study being conducted by the Child Health Research Unit at Barwon Health in collaboration with the Murdoch Children's Research Institute (MCRI) and Deakin University.

Credit: 
University of Sydney

New evidence shows cytotoxic T cells can identify, invade, and destroy targets of large mass like Toxoplasma gondii tissue cysts

image: Three-dimensional images of T. gondii cysts containing CD8+ T cells that had fully invaded into the cysts, detected in the brains of infected nude mice that received a transfer of CD8+ immune T cells. Sections (4 µm thick) of their brains were applied for immunohistochemical staining for T. gondii (brown) and CD3, the T cell marker (red), and Z-stack images were obtained using light microscopy. Upper panels show the images taken at the top and bottom of the histological section. The presence of the T cells (arrows) can be seen in both images at the top and bottom of the section. The lower panel shows a three-dimensional image generated from the Z-stack images of the cyst at the cut-line indicated by a green arrow and line. These Z-stack images demonstrate the presence of the T cells (arrows) all the way through the sections. Scale bar = 10 µm.

Image: 
The American Journal of Pathology

Philadelphia, July 10, 2019 - CD8+ cytotoxic T lymphocytes can kill host cells infected with various microorganisms, as well as individual cancer cells, through direct cell-to-cell contact, but their ability to destroy a target of large mass has remained unexplored. A study in The American Journal of Pathology, published by Elsevier, provides novel evidence of the immune system's capability to eliminate the large parasite-filled cysts associated with chronic Toxoplasma gondii (T. gondii) infection by utilizing the aggressive invader activity of cytotoxic T cells. These cells may also prove effective for attacking other sizable targets, including solid cancers.

"The present study provided clear evidence that the immune system has the capability to attack and eliminate the tissue cysts of T. gondii. This sheds light on the possibility of developing a vaccine to activate these invasive cytotoxic T cells to prevent establishment of chronic infection with this parasite. This vaccine may also be applied to individuals chronically infected with T. gondii to eradicate existing tissue cysts of the parasite and cure this widespread chronic infection. This study also suggests the possibility of developing a new cancer immunotherapy that can be used to eliminate various types of solid cancers by activating the invasive cytotoxic T cells that specifically attack and penetrate into the target cancers," explained lead investigator Yasuhiro Suzuki, PhD, of the Department of Microbiology, Immunology and Molecular Genetics, University of Kentucky College of Medicine, Lexington, KY, USA.

One third of the world's human population, as well as many other warm-blooded animals, are currently infected with T. gondii. Although T. gondii infection usually produces few, if any, obvious symptoms, the latent infection can erupt into a serious and occasionally fatal illness, toxoplasmic encephalitis, particularly in individuals with weakened immunity such as patients with cancer, HIV/AIDS, or organ transplants. In addition, recent epidemiological studies have reported an increased incidence of brain cancers in T. gondii-infected individuals. Chronic infection is associated with the formation of cysts that can grow to more than 50 µm in diameter, filled with hundreds to thousands of the parasites, and situated most often in the brain, eyes, and striated muscle including the heart. A common cause of T. gondii infection is the consumption of tissue cysts in the raw or undercooked meat of infected animals, such as pork and mutton.

In this study, scientists investigated the effects of injecting CD8+ immune T cells, purified from the spleens of chronically infected mice, into mice that had ingested multiple T. gondii cysts. A few days after the T cell injection, numbers of T. gondii cysts fully invaded by the T cells were found in the brains of the recipient mice.

The T-cell-invaded cysts displayed structural signs of deterioration and destruction. Within these deteriorated cysts, granular structures appeared intensely positive for granzyme B, a major cytotoxic protein secreted by cytotoxic T cells. These granular structures were detected in association with T. gondii bradyzoites (the slowly-multiplying encysted form of the parasite associated with the dormant stage of infection). Furthermore, the bradyzoites within the destroyed cysts were located within accumulated scavenger cells, including microglia and macrophages. The investigators also showed that perforin (a protein, released by killer cells of the immune system, which destroys targeted cells by creating pore-like lesions in their membranes) was necessary for the CD8+ T cell invasion and cyst elimination process.

In addition to becoming a powerful weapon against T. gondii infection, the investigators suggest the same principles can be used to attack solid cancers. "The invasion of the T cells into tumors could induce an infiltration of large numbers of phagocytic cells capable of attacking the cancer cells as was observed against T. gondii cysts," noted Dr. Suzuki. "An effective activation of the penetrating capability of cytotoxic T cells, which specifically recognize the target solid cancers, will most likely become a powerful therapeutic approach applicable to various types of solid cancers."

Credit: 
Elsevier

How does playing with other children affect toddlers' language learning?

Toddlers are surprisingly good at processing the speech of other young children, according to a new study. And toddlers who have more exposure to other children, such as those in daycare, may be particularly good at certain word learning skills.

Researchers at the University of Waterloo examined the word processing skills of toddlers who spend most of their time with adults compared with those who have more exposure to groups of children. They focused on how well the toddlers understood the speech of other children.

Although all of the toddlers were very good at processing child speech, the study found that toddlers who had more exposure to other children were better at associating a new word to a new object, an important part of word learning.

Child speech differs from adult speech in many ways. Even a child who is six or seven years old pronounces words a bit differently than adults do. "We wanted to know if more exposure to hearing other children speak would affect toddlers' ability to process child speech," said Katherine White, professor of psychology at Waterloo, who co-authored the study with PhD candidate Dana Bernier.

In the study, the researchers conducted two experiments with a total of 88 toddlers (and their parents), some of whom spent eight hours or less per week with other children, and others who had more weekly experience in child groups.

Experiment 1 compared the toddlers' processing of instructions from a seven-year-old child speaker and from an adult speaker pronouncing a familiar or novel object's name in the standard way. Experiment 2 tested the sensitivity of the toddlers' speech processing by having the child speaker mispronounce the object names.

"Our study demonstrates that toddlers are extremely good at processing the speech of young children, and that this is true even for toddlers who do not have a lot of experience with other children. This means that they could use this kind of speech, in addition to adult speech, to learn about their native language(s)," said White.

"However, we also found an intriguing difference in how toddlers processed new words that was related to how much exposure they had to other children."

"Most studies focus on how toddlers learn from adult speakers. But we think it's important to explore how toddlers process the speech of children of various ages and how much they use speech from other children to guide their language learning," said White.

Credit: 
University of Waterloo