Culture

Assisted reproduction technology leaves its mark on genes temporarily, study shows

image: Any effect that assisted reproduction technology has on babies' genes is largely corrected by adulthood, study shows.

Image: 
Murdoch Children's Research Institute

Any effect that assisted reproduction technology has on babies' genes is largely corrected by adulthood, new research led by the Murdoch Children's Research Institute has found.

Published in the latest edition of Nature Communications, the study* found that events occurring in early development, including ovarian stimulation, manipulation of the embryo and the extra hormones common in fertility treatment cycles, can affect gene regulation (epigenetics), but that these effects are short-lived.

Epigenetics is a process that controls how genes are turned on and off. Diet and other external environmental influences can play a role in this gene expression.

The study was designed to see how often epigenetic changes occur due to assisted reproduction technology and whether there were any differences in these changes from birth to adulthood.

"In two independent groups, we found the same effects of assisted reproduction on genes when examining heel prick blood spots collected soon after birth," study senior MCRI Professor Richard Saffery says. "These epigenetic changes were not evident in the adult blood samples."

MCRI Professor Jane Halliday, who established the cohort and has studied the health of these individuals in adulthood, said assisted conception is linked to a small increased risk of preterm birth, low birth weight, being small for gestational age or perinatal mortality.

"Given the interventions associated with assisted reproduction technology at the time of conception, there were concerns that epigenetic changes may be taking place, silencing important genes and resulting in a heightened risk of health problems," she says.

More than seven million people around the world, including more than 200,000 in Australia, have been conceived through assisted reproduction technology since 1978.

Dr Boris Novakovic, who performed most of the analysis for the study, says that despite the expansion of assisted reproduction technology worldwide, few studies have investigated the potential underlying effects on genes.

"Previous studies have found some epigenetic changes in embryos grown in labs. However, no study has looked for these changes in the same individuals at birth and adulthood as we have done," he says.

"Our results are reassuring for families as they suggest that environment and lifestyle experienced from birth can repair any epigenetic deviations associated with fertility treatments."

The study looked at a cohort of 158 Australians aged 22-35 years conceived through assisted reproduction technology (IVF and GIFT**) and 75 people conceived naturally.

Dr Novakovic says more studies of larger sample sizes are needed in order to replicate the current findings.

Researchers from the University of Melbourne, Monash University, University of Turku, Turku University Hospital, Victorian Assisted Reproductive Treatment Authority, The Royal Women's Hospital, The Royal Children's Hospital, Hudson Institute of Medical Research, and the Monash IVF Group also contributed to the findings.

*Publication: Boris Novakovic, Sharon Lewis, Jane Halliday, Joanne Kennedy, David P Burgner, Anne Czajko, B Kim, Alex Sexton-Oates, Markus Juonala, Karin Hammarberg, David J Amor, Lex W Doyle, Sarah Ranganathan, Liam Welsh, Michael Cheung, John McBain, Robert McLachlan and Richard Saffery. 'Assisted reproductive technologies induce limited epigenetic variation at birth that largely resolves by adulthood', Nature Communications.

**80 per cent of participants were born using IVF and 20 per cent using GIFT. GIFT involves removing a woman's eggs, mixing them with sperm, and immediately placing them into a fallopian tube, unlike IVF where the fertilised egg is grown in a lab for a few days before being transferred to the womb.

Credit: 
Murdoch Childrens Research Institute

Unique fingerprint: What makes nerve cells unmistakable?

image: Protein variants shape the wiring of nerve cells in the brain.

Image: 
Biozentrum, University of Basel

Protein variations that result from the process of alternative splicing control the identity and function of nerve cells in the brain. This allows organisms to build a highly complex neuronal network with only a limited number of genes. The study, conducted by a research team at the Biozentrum, University of Basel, describes a detailed map of neuronal splicing and has now been published in Nature Neuroscience.

Our brain consists of hundreds, if not thousands, of different types of nerve cells whose individual characteristics underpin our brain functions. But how do the different cell types manage to develop their diverse traits? In a genome-wide analysis, the team led by Prof. Peter Scheiffele at the Biozentrum, University of Basel, has now discovered that alternative splicing leads to a broad range of variants of individual proteins, which ultimately allows different types of nerve cells to be distinguished.

Alternative splicing determines cell types

Alternative splicing can generate multiple different protein variants from a single gene. In the mouse model, Scheiffele's team investigated splice variants in a panel of neuronal cell types. "We have been able to identify hundreds of splice variants that enable us to differentiate between different types of neurons," says Scheiffele. "There are unique repertoires of variants in each nerve cell type."

These repertoires of splice variants significantly shape the identity and function of nerve cells. "Although all neuronal cell types contain the same set of genes, even closely-related cell types produce different splice variants," explains Scheiffele. In particular, proteins located at the neuronal contact points - the synapses, which mediate the transmission and processing of information - are extremely diverse. Thus, the splicing process also controls the function of the neuronal circuits in the brain.

Data platform for scientists

The generation and analysis of the extensive data sets is part of the EU-funded project "SPLICECODE". In collaboration with the "Center for Scientific Computing" (sciCORE), a user-friendly website has been set up which allows scientists worldwide to investigate the role of individual splice variants in brain function.

Credit: 
University of Basel

Researchers reveal how bacteria behind hospital infections block out antibiotics

image: The structure of the pore protein whose modified form closes the 'door' to antibiotics trying to enter the bacterial cell.

Image: 
Wong et al. (2019)

Drug-resistant bacteria responsible for deadly hospital-acquired infections shut out antibiotics by closing tiny doors in their cell walls.

The new finding by researchers at Imperial College London could allow researchers to design new drugs that 'pick the locks' of these closed doors and allow antibiotics into bacterial cells. The result is published today in Nature Communications.

The bacterium Klebsiella pneumoniae causes infections in the lungs, blood and wounds of those in hospitals, and patients with compromised immune systems are especially vulnerable. More than 20,000 K. pneumoniae infections were recorded in UK hospitals in the past year.

Like many bacteria, K. pneumoniae is becoming increasingly resistant to antibiotics, particularly a family of drugs called Carbapenems. Carbapenems are used as antibiotics in hospitals when others have failed or are ineffective.

Therefore, rising resistance to Carbapenems could dramatically affect our ability to cure infections. For this reason, Carbapenem-resistant K. pneumoniae are classified as 'critical' World Health Organization Priority 1 organisms.

Now, the team from Imperial has discovered one mechanism by which K. pneumoniae is able to resist Carbapenems. Antibiotics usually enter the K. pneumoniae bacteria through surface doorways known as pores. The team investigated the structure of the pores and showed that by shutting these doorways K. pneumoniae becomes resistant to multiple drugs, since antibiotics cannot enter and kill them.

First author, Dr Joshua Wong, from the Department of Life Sciences at Imperial, said: "The prevalence of antibiotic resistance is increasing, so we are becoming more and more reliant on drugs like Carbapenems that work against a wide range of bacteria.

"But now with important bacteria like K. pneumoniae gaining resistance to Carbapenems it's important we understand how they are able to achieve this. Our new study provides vital insights that could allow new strategies and drugs to be designed."

The team compared the structures of K. pneumoniae bacteria that were resistant to Carbapenems to those that weren't, and found the resistant bacteria had modified or absent versions of a protein that creates pores in the cell wall. Resistant bacteria have much smaller pores, blocking the drug from entering.

The closed doors aren't all good news for bacteria. They also mean that the bacteria can take in fewer nutrients, and tests in mice showed that the bacteria grow slower as a result.

However, the advantage in terms of avoiding antibiotics outweighed the negative impact of slower growth for the bacteria, allowing them to maintain a high level of infection.

The project was conducted in close collaboration with Dr Konstantinos Beis from the Department of Life Sciences, who is based at the Research Complex at Harwell in Oxfordshire.

The team was led by Professor Gad Frankel, from the Department of Life Sciences at Imperial, who said: "The modification the bacteria use to avoid antibiotics is difficult to get around. Any drugs to counteract this defence mechanism would likely also get blocked out by the closed doors.

"However, we hope that it will be possible to design drugs that can pick the lock of the door, and our data provides information to help scientists and pharmaceutical companies make these new agents a reality."

As resistant bacteria are weaker, these results suggest that the selective pressure created by the extensive use of Carbapenems in hospital settings is a major driver in the spread of these superbugs. The study provides a direct scientific basis for the implementation of restrictive prescribing policies that would minimise the use of broad-spectrum agents such as Carbapenems.

The team are all part of the Antimicrobial Research Collaborative at Imperial, a multidisciplinary centre that addresses antibiotic resistance by advancing basic research, translating research into new prevention strategies and healthcare interventions, and informing public health policy.

Credit: 
Imperial College London

People believe achieving environmental sustainability could hinder quality of life

Attempts such as the U.S. 'Green New Deal' to create a more environmentally, economically, and socially sustainable world face a major challenge - people are sceptical you can achieve all three. And they believe that targeting environmental outcomes like climate change and pollution comes at a cost to efforts to improve their quality of life.

The study, led by Dr Paul Bain in collaboration with his colleague Dr Tim Kurz at the University of Bath's Department of Psychology and international collaborators, aimed to identify where people saw areas of compatibility and tension for achieving sustainability. To do this, over 2100 people from 12 developed and developing countries were asked what the United Nations' 17 Sustainable Development Goals (SDGs) aimed to achieve - environmental, economic, and/or social sustainability.

The findings, published today (Monday 2 September 2019) in Nature Sustainability, showed that people understood sustainability in four distinct ways. However, the dominant view, common across all countries, showed that most people saw environmental sustainability in tension with social sustainability, but not with economic sustainability.

"Vocal opponents of environmental sustainability issues such as climate change often warn of their devastating macro-economic consequences. But economic consequences may not resonate so strongly with ordinary people, whose concerns appear to be more about wellbeing and quality of life in their society," Dr Bain said.

"A minority of people believed you can achieve it all, but most believed you can't solve all problems at once, and where we direct our resources has consequences elsewhere."

"It's a bit like households - investing in solar panels and insulation for your house to address climate change may mean not sending kids to schools with high fees or getting the top health insurance, or vice versa. Yes there are economic consequences, but the main concern is how to trade-off these environmental and social outcomes - environment versus education / health."

He added: "Our research mirrors these types of personal trade-offs 'writ large' to society - people believe addressing environmental sustainability means less attention to solving social issues like education and health, and also to things like reducing inequality, fostering peace, and improving infrastructure. While we expected people to think sustainability involved trade-offs, until now we didn't know exactly what these trade-offs were."

The findings may help design better policies and the communication activity around them to overcome public scepticism that a more sustainable world is achievable. They may also have consequences for developing large-scale sustainability policies such as a 'Green New Deal'.

Dr Bain explained: "We wanted to understand what people think about sustainability across cultures, to inform ways to communicate about it more effectively."

"We can't tell people we can achieve environmental and social sustainability and expect them to just accept it - many people won't because it conflicts with their intuitions. To reduce this conflict and increase acceptance people need to be told how these tensions will be addressed, making it clear how an environmental policy such as shutting down coal mines or reducing beef consumption is supported by social initiatives to support the mining and farming communities affected."

Definitions:

Environmental sustainability refers to maintaining the viability and health of the natural world. This includes using renewable environmental resources, rationing non-renewable resources until renewable substitutes are found, and controlling pollution.

Social sustainability refers to providing an acceptable level of wellbeing and quality of life for people over time. This includes minimizing destructive conflicts, having acceptable levels of fairness, opportunity, and diversity, and meeting basic needs for health and wellbeing.

Economic sustainability refers to managing finances and investments to promote productive economic activity into the future. This includes avoiding activities leading to excessive debt and interest payments, and making optimal use of available resources.

Credit: 
University of Bath

Methylation of microRNA may be a new powerful biomarker for cancer

image: By measuring the methylation rate of microRNA, various cancer states can be detected.

Image: 
Osaka University

Levels of molecules associated with genetic function, such as microRNA, can be an important indicator of abnormal activity associated with cancer. However, little is known about how different molecules are altered in cancerous cells. Now, researchers from Japan have found a new way of distinguishing cancerous from non-cancerous tissues.

In a study published in August in Nature Communications, researchers from Osaka University revealed that the rate at which microRNA molecules undergo a process called methylation is able to discriminate cancer patients from healthy individuals.

MicroRNAs exhibit abnormal expression in cancer tissues and are stable in body fluids, making them a useful biomarker for cancer. Although microRNAs are generally measured in terms of RNA expression levels, this technique lacks sensitivity and accuracy. In particular, microRNAs are measured on the assumption that they recognize and regulate their targets regardless of whether or not they are methylated, yet their action may actually vary according to methylation status. This is something the researchers at Osaka University aimed to address.

"We found that a small group of mature microRNAs are methylated, which could potentially alter their stability and target recognition," says Masamitsu Konno, co-lead author of the study. "Thus, we wanted to investigate whether methylation could be an important indicator of abnormal microRNA function."

To evaluate the potential of microRNA methylation as a biomarker for early cancer diagnosis, the researchers determined whether levels of methylated RNAs increase or decrease in cancer cells. To do this, they measured microRNA methylation levels in serum samples from patients with pancreatic cancer and healthy controls.

"While we found methylated microRNA in the samples from pancreatic cancer patients, it was either present in very low levels or absent in the control group," explains senior author of the study Hideshi Ishii. "Further, methylation levels in serum samples were able to distinguish early pancreatic cancer patients from healthy controls with extremely high sensitivity and specificity."

The researchers also found that compared with established biomarkers, microRNA methylation was a more powerful indicator of early-stage pancreatic cancer.

"Our data indicate that levels of methylated microRNA may be more useful than those of microRNA as a biomarker for gastrointestinal cancer," says Jun Koseki, co-lead author of the study. "Clarifying the mechanisms by which methylation regulates microRNA function throughout the different stages of cancer may facilitate the development of targeted therapies, leading to improved patient outcomes."

As early detection and treatment of cancer can have a substantial effect on patient outcome, new ways to screen for cancer could be vitally important. Given the advantages with respect to existing biomarkers for cancer, it is possible that RNA methylation will be an important component of future systems for early cancer detection.

Credit: 
Osaka University

AI learns complex gene-disease patterns

Artificial intelligence (AI) is being harnessed by researchers to track down genes that cause disease. A KAUST team is taking a creative, combined deep learning approach that uses data from multiple sources to teach algorithms how to find patterns between genes and diseases.

Machine learning uses algorithms and statistical models to identify patterns and associations among data to solve specific problems. By inputting enough known data, like tagged images of "Jack," the system can eventually learn to suggest other nontagged images that include Jack.

Researchers are using this application of AI to find genes that cause diseases. However, only a limited number of genes have been experimentally confirmed to be causative. This means that scientists do not have a lot of data to input into their programs to help them learn the patterns depicting gene-disease associations. Thus, they need to be creative to find ways to teach machine learning algorithms to learn and then look for these patterns.

Database and information management specialist Panagiotis Kalnis, computational bioscientist Xin Gao and colleagues have developed a deep learning model they say outperforms current state-of-the-art methods.

First, they drew on known databases to extract information on gene locations and functions and on how and when genes turn on and off. This data was used to teach algorithms to find genes that work together. Then, they obtained data on the features of genetic diseases from other databases. This taught the algorithms how to identify diseases with similar manifestations. They combined these datasets with data on the known associations between 12,231 genes and 3,209 diseases.

The KAUST model extracts the patterns learned from how genes network and about the similarities among genetic diseases and transfers them to a deep learning model called a graph convolutional network. This delivers another set of data that is placed in matrices, such as those used in recommendation systems, to predict gene-disease association.
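
The press release does not include the model itself, but the data flow it describes - graph convolutions over gene and disease similarity graphs feeding a recommendation-style matrix of pairwise scores - can be sketched in a few lines. The NumPy toy below uses random placeholder graphs, features and untrained weights, and a single simplified convolution step; it illustrates the general pattern only and does not reproduce the KAUST data, architecture or code.

```python
# Minimal sketch of the idea described above, using NumPy only: one
# graph-convolution step over a toy gene graph and a toy disease graph,
# followed by a matrix-factorisation-style score for gene-disease pairs.
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_diseases, dim = 50, 20, 16

# Toy adjacency matrices standing in for "genes that work together" and
# "diseases with similar manifestations".
A_gene = (rng.random((n_genes, n_genes)) > 0.9).astype(float)
A_gene = np.maximum(A_gene, A_gene.T)
A_dis = (rng.random((n_diseases, n_diseases)) > 0.9).astype(float)
A_dis = np.maximum(A_dis, A_dis.T)

def gcn_layer(A, H, W):
    """One graph-convolution step: normalised neighbourhood averaging + ReLU."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # symmetric normalisation
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Random node features and (untrained) weights; in the real model these would
# come from gene/disease annotations and be learned against the known
# 12,231-gene x 3,209-disease association matrix.
H_gene = gcn_layer(A_gene, rng.normal(size=(n_genes, dim)), rng.normal(size=(dim, dim)))
H_dis = gcn_layer(A_dis, rng.normal(size=(n_diseases, dim)), rng.normal(size=(dim, dim)))

# Recommendation-style scoring: the dot product of a gene embedding and a
# disease embedding ranks candidate gene-disease associations.
scores = H_gene @ H_dis.T
print("top candidate pair:", np.unravel_index(scores.argmax(), scores.shape))
```

The actual KAUST model is trained end-to-end on the curated gene and disease datasets and the known association matrix; the point of the sketch is only the pipeline from graph convolutions to a recommendation-style score matrix.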

The model was able to identify complex, nonlinear associations between genes and diseases, allowing it to go on to predict new associations. "By making use of more information, we achieved better accuracy than the state-of-the-art methods currently in use," says Peng Han, the first author of the study. "But, even though we outperformed other methods in our experiments, it is still not accurate enough to be applied to industry," he adds.

The team next plans on improving their model's accuracy by incorporating more kinds of data. They will also apply the method to solve other types of problems where only limited data is available, such as recommending new locations to visit based on a user's past preferences.

Credit: 
King Abdullah University of Science & Technology (KAUST)

A life of low cholesterol and BP slashes heart and circulatory disease risk by 80 per cent

Modest and sustained decreases in blood pressure and cholesterol levels reduce the lifetime risk of developing fatal heart and circulatory diseases, such as heart attack and stroke, according to research part-funded by the British Heart Foundation (BHF) and supported by the National Institute for Health Research (NIHR).

The findings are being presented at the European Society of Cardiology (ESC) Congress in Paris and published in the Journal of the American Medical Association (JAMA).

Researchers have found that a long-term reduction of 1 mmol/L in low-density lipoprotein (LDL), or 'bad' cholesterol, in the blood, combined with a 10 mmHg reduction in blood pressure, led to an 80 per cent lower lifetime risk of developing heart and circulatory disease.

This combination also reduced the risk of death from these conditions by 67 per cent.

The team found that even small reductions can provide health benefits. A decrease of 0.3 mmol/L LDL cholesterol in the blood and 3 mmHg lower blood pressure was associated with a 50 per cent lower lifetime risk of heart and circulatory disease.

Scientists have previously found that lowering blood pressure and lowering the amount of 'bad' cholesterol in the blood are two ways to prevent the onset of heart and circulatory disease. However, the effect of sustained reductions on lifetime risk, which accumulates over time, had not been quantified before.

In this study, Professor Brian Ference and his team studied 438,952 participants in the UK Biobank, who had a total of 24,980 major coronary events - defined as the first occurrence of non-fatal heart attack, ischaemic stroke or death due to coronary heart disease. They used an approach called Mendelian randomisation, which uses naturally occurring genetic differences to randomly divide the participants into groups, mimicking the effects of running a clinical trial.

People with genes associated with lower blood pressure, lower LDL cholesterol, or a combination of both were put into different groups and compared against those without these genetic associations. Differences in blood LDL cholesterol and systolic blood pressure (the highest level that blood pressure reaches when the heart contracts), along with the number of cardiovascular events, were compared between groups.
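
As an illustration of that grouping logic only - not the study's actual analysis - the sketch below simulates genetic "lowering" scores, splits participants into the four naturally randomised groups described above, and compares event rates against the reference group. All thresholds, effect sizes and variable names are invented for the example.

```python
# Illustrative sketch of the 2x2 Mendelian-randomisation-style grouping on
# simulated data; nothing here reproduces the UK Biobank analysis.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Genetic scores standing in for LDL-lowering and BP-lowering variants.
ldl_score = rng.normal(size=n)
sbp_score = rng.normal(size=n)

# Split by whether participants carry above-median "lowering" scores,
# mimicking random allocation at conception.
low_ldl = ldl_score > np.median(ldl_score)
low_sbp = sbp_score > np.median(sbp_score)

# Simulated lifetime event risk: a baseline risk reduced by each exposure
# (multiplicative effects chosen arbitrarily for the illustration).
base_risk = 0.06
risk = base_risk * np.where(low_ldl, 0.6, 1.0) * np.where(low_sbp, 0.7, 1.0)
events = rng.random(n) < risk

groups = {
    "neither": ~low_ldl & ~low_sbp,
    "lower LDL only": low_ldl & ~low_sbp,
    "lower SBP only": ~low_ldl & low_sbp,
    "both": low_ldl & low_sbp,
}
ref_rate = events[groups["neither"]].mean()
for name, mask in groups.items():
    rate = events[mask].mean()
    print(f"{name:15s} event rate {rate:.3%}  relative risk {rate / ref_rate:.2f}")
```

The appeal of this design, as described above, is that genetic variants are fixed at conception, so the groups behave like the arms of a randomised trial without one having to be run.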

Professor Brian Ference now hopes that these findings can bring about changes in the healthcare of people at greater risk of developing heart and circulatory complications, and improve guidance for those requiring lifestyle changes.

Professor Brian A Ference, lead researcher of the study at the University of Cambridge, said:

"Heart and circulatory diseases steal the lives of 168,000 people each year in the UK, which is just greater than the population of the city of Cambridge. It's vital we do everything possible to help prevent people developing these life-threating conditions.

"Even small reductions in both 'bad' cholesterol and blood pressure for sustained periods of time can pay very big health dividends, and dramatically reduce the lifetime risk of developing heart and circulatory disease."

"We now plan to take the results from this study to create a lifetime cardiovascular risk calculator and to support the development of new prevention guidelines."

Professor Sir Nilesh Samani, Medical Director of the British Heart Foundation said:

"This research again demonstrates that high blood pressure and raised cholesterol are key risk factors for heart attacks and strokes. But how many of us know our numbers for these, or have made sustained efforts to lower them? Hopefully, the findings reported today that the risk could be reduced by as much as 80 per cent, can act as a motivator for long-term change.

"Millions of people are living with untreated high blood pressure or raised cholesterol, both of which can be lowered with lifestyle changes and medication. Huge numbers of heart attacks and strokes can be prevented simply by getting to know your numbers and taking your health into your own hands.

"Simple devices are now available for measuring blood pressure. Also, everyone between the ages of 40-74 is eligible for a free NHS health check, which assesses your risk of developing heart and circulatory diseases, and includes cholesterol and a blood pressure reading. It's important that we all take advantage of this."

Credit: 
British Heart Foundation

Researchers find a new pathological mediator of ALS

image: Aberrant axon morphologies in FUS-mutant ALS patient-derived MNs with Fos-B dysregulation.

Image: 
Masashi Aoki, Tohoku University

A research collaboration based in Japan has found a new pathological mediator of amyotrophic lateral sclerosis (ALS), which could have further implications for understanding the molecular breakdown that gives rise to the neurodegenerative disease that affects nearly half a million people around the world.

They published their results on June 28 in EBioMedicine, a journal issued by The Lancet.

The team, led by Masashi Aoki, Professor and chair of the Department of Neurology at Tohoku University Graduate School of Medicine and an author of the paper, worked in collaboration with Hideyuki Okano, Professor of Physiology at Keio University School of Medicine, and focused on aberrant behavior in motor neurons.

Motor neurons populate the base of the brain and run along the spine. These cells are shaped like a hand with a palm and outstretched fingers, awaiting a signal from another cell. They take the signal and pulse it down their long arm, called an axon, to "touch" muscle fibers. The signal tells the muscle to flex or relax, but, in the case of patients with ALS, the axon divides and branches abnormally. Without connecting to its target, the axon can shrivel away. The message is lost, as is the ability to control the muscles.

Researchers still do not know how the message is disrupted, despite more than 25 genes identified as having a role in inherited ALS since 1993, according to Aoki.

"However, among all the causative genes of ALS, one gene, called FUS, was confirmed to locate at axon ends," Aoki said, referring to the "wrist" of the motor neuron, between the part of the cell that collects the signal and the part that sends the message to the muscles.

Aoki and the team obtained stem cells from a 43-year-old patient with ALS and used those cells to sequence the genetic map of the motor neuron axons.

They found that a mutated version of FUS introduced what Aoki calls a toxic gain of function. It encouraged an abundance of another gene named Fos-B, which leads to axonal branching. Compare an axon to a grassy pathway. If there's a clear starting point and ending point, the path will become well-worn and clear. If there's one starting point and many ending points, no single path will emerge as the right one. With motor neurons, however, the signals can't forge ahead, and the paths will ultimately disappear completely.

To further test the effect of Fos-B on motor neurons, Aoki and the team developed a Fos-B zebrafish model; zebrafish brain development bears a striking resemblance to that of humans. The abnormally elevated expression of the Fos-B gene induced over-branching of motor neuron axons.

"This result suggests a promising target for which ALS-linked mutations cause axonal refraction and degeneration- one of the earliest events in the disease," Aoki said, noting their results were in developing motor neurons. "We need more information to elucidate the relationship in mature motor neurons."

The researchers will continue to investigate their findings, delving deeper into the effect of Fos-B on developed motor neurons.

Credit: 
Tohoku University

Plagiarism and inclusivity emerge as key concerns in new study of the arts, humanities and social sciences

A new study of the publication ethics issues that journal editors face within the arts, humanities and social sciences has found that detecting plagiarism in papers submitted to a journal is the most serious issue they tackle, with over half of editors reporting having encountered it.

The findings of this new research also reveal that remaining inclusive whilst addressing issues around language and writing quality barriers is the most prevalent issue they experience, as global research output continues to grow.

The report has been carried out by COPE (the Committee on Publication Ethics), which provides leadership on publication ethics and offers a range of resources to support journal editors and publishers on all aspects of ethical issues in research publishing. The study was commissioned to address perceptions within COPE that Arts, Humanities, and Social Science members may not consider COPE to be as relevant to them as to Science, Technology, Medicine members.

This primary research project was the first COPE has carried out focused exclusively on arts, humanities and social sciences disciplines. The organisation is exploring how it should extend its guidance to journal editors and, following these results, is encouraging editors in the arts, humanities and social sciences to make further use of that guidance.

Completed by more than 650 journal editors (not solely COPE members), the study showed the following key findings:

64% of respondents encountered issues addressing language and writing quality barriers while seeking to remain inclusive.

58% reported detecting plagiarism as the most serious issue they dealt with, followed by fraudulent submissions (44%) and data or image fabrication (31%).

Recognising and dealing with bias in peer reviewer comments was an issue encountered by 55% of journal editors.

Journal editors felt least confident in dealing with data and/or image fabrication issues (24%), fraudulent submissions (23%), and intellectual property and copyright issues (21%).

There were no significant differences in the concerns reported by journal editors from different subject areas, or regionally, suggesting that many of the issues are experienced across multiple disciplines within the arts, humanities and social sciences. However, there was some evidence that Business, Finance and Economics journal editors were more likely to encounter or hear about publication ethics issues than other fields of study.

Following the results of the study, COPE stressed the importance of working alongside journal editors to develop existing, or create new, publication ethics guidance. The organisation is encouraging journal editors to make the most of the COPE resources available to them.

Commenting on the project, Deborah Poff, COPE Chair, added: "This research is part of a renewed commitment by COPE to increase the diversity of our services for all disciplinary and interdisciplinary fields.

"These findings provide important information about the specific resource needs of our editors and publishers in numerous arts, humanities, and social sciences fields.

"The study is consistent with the focus of topics at our North American seminar held earlier this year. In the coming months, we will continue to roll out discussion material and resources specifically focused on issues in these fields."

The research was devised in collaboration with and support from Routledge, part of the Taylor & Francis Group.

Tracy Roberts, Publishing Director for the Arts, Humanities and Social Sciences at Taylor & Francis, said: "As the world's largest publisher of arts, humanities and social sciences journals, we understand the publication ethics challenges faced by journal editors in these fields and work alongside editors and editorial boards, providing support on individual cases in line with existing COPE guidance.

"However, we know that some of the challenges faced by those in the arts, humanities and social sciences does differ to STM fields.

"It is because of these differences that we believe it is incredibly valuable to support COPE in this study. Its findings offer a unique opportunity to gather an evidence-base for the development of further publication ethics guidance specifically for these disciplines, whilst also providing the foundation for more research into this crucial area."

Credit: 
Taylor & Francis Group

What drives plate tectonics?

image: Global paleomagnetic plate reconstructions at (a) 270 Ma and (b) 180 Ma, with the present Tethyan Realm shown inset.

Image: 
©Science China Press

The theory of plate tectonics was founded in the late 1960s, and it concerns the distribution and movements of plates, the uppermost layer of the Earth. Plate movements not only control the distribution of earthquakes, volcanoes, and mineral resources in the crust, but also affect the ocean and atmospheric circulations above the crust. Therefore, plate tectonics has been regarded as the fundamental unifying paradigm for understanding the history of Earth.

However, unlike the widely accepted kinematics of plate tectonics, the driving force of plate tectonics has remained one of the most challenging problems since the birth of the theory. The subduction of oceanic slabs is considered the dominant driving force, based on observations of Cenozoic subduction systems along the circum-Pacific region. However, the difficulty of observing oceanic subduction slabs beneath collisional orogens hampers the ability to quantitatively evaluate the role of subducting oceanic slabs. Alternative driving forces such as ridge push, continental slab-pull, plume upwelling and large-scale mantle convection have been proposed at different subduction-collision belts along the Tethyan Realm (Fig 1), the largest continental collisional zone. The Tethyan evolution can be summarized as a sequence of continental fragments rupturing from Gondwana and then drifting towards Laurasia/Eurasia.

Scientists from the State Key Laboratory of Lithospheric Evolution, Institute of Geology and Geophysics, Chinese Academy of Sciences in Beijing found "switches" between continental rupture, continental collision, and oceanic subduction initiation in the Tethyan evolution after a reappraisal of geological records from the surface and new global-scale geophysical images at depth. They proposed that these "switches" were all controlled by oceanic subduction. All oceanic Tethyan slabs acted as a 'one-way train' that transferred the Gondwana-detached continents in the south to the terminal in the north, so the researchers depicted the whole scenario as a "Tethyan one-way train" (Figure 2a and b). The engine of the "train" was the negative buoyancy of the subducting oceanic slabs. The results also shed light on supercontinent assembly and breakup cycles: subduction not only assembles supercontinents but also effectively breaks them up.

The new results will not close the discussion on the driving force of plate tectonics, but future Tethyan research may test the new proposal and improve our understanding of how plate tectonics works.

Credit: 
Science China Press

Protein shakes may not be the answer for post-gym muscle pain

Protein shakes have long been touted as a gym bag essential, consumed by gym-goers in an effort to boost muscle recovery and minimise post-workout muscle soreness, but they may not be the most effective way to relieve aching muscles, according to a new study.

Sports scientists at the University of Lincoln, UK, found that neither whey-protein based shakes nor milk-based formulas enhanced the rate of muscle recovery following resistance training when compared to a carbohydrate only drink. The study is the first to compare the effectiveness of the two different protein formulas.

The blind experiment involved 30 male participants, all of whom had at least a year's resistance training experience. Researchers divided participants into three groups with each group consuming either a whey hydrolysate based drink, a milk based drink or a flavoured dextrose (carbohydrate) drink following a prescribed intensive resistance training session.

Re-testing took place after a 24 and 48-hour period following the resistance training session. Researchers asked participants to rate their levels of muscle soreness on a visual scale from 'no muscle soreness' (0) through to 'muscle soreness as bad as it could be' (200). Participants also completed a series of strength and power assessments to test their muscle function.

Results showed a significant rise in the levels of muscle soreness across the three groups 24 hours and 48 hours after the initial resistance training session, with ratings for all groups rising to over 90, significantly higher than the groups' baseline ratings, which ranged from 19 to 26. Results also showed reductions in muscle power and function. The findings suggest there was no difference in recovery response between the different formulas and no additional benefit of protein consumption on muscle recovery.

Lead author Dr Thomas Gee, Programme Leader of BSc Strength and Conditioning in Sport at the University of Lincoln, said: "While proteins and carbohydrates are essential for the effective repair of muscle fibres following intensive strength training, our research suggests that varying the form of protein immediately following training does not strongly influence the recovery response or reduce muscle pain.

"We would hypothesise that well balanced daily nutrition practices would influence recovery from delayed onset muscle soreness to a greater extent."

Credit: 
University of Lincoln

Heart attack care in Sweden superior to UK

People suffering from heart attacks in Sweden were less likely to die from them in the short and long-term than those in England and Wales, according to a new study.

Researchers, led by the University of Leeds, have for the first time compared the care of the entire populations of Sweden, England and Wales, revealing the superior quality of care in the Scandinavian state.

Lead author Dr Oras Alabas, from the University of Leeds' School of Medicine, said: "Whilst Sweden and the UK both have high performing nationwide health systems, these results demonstrate that there are still improvements to be made when caring for heart attack patients.

"Our findings suggest that the increased use of invasive treatments and recommended medications in Sweden could be causing these differences in mortality.

"The NHS generally does a very good job of looking after heart attack patients, but this data suggests we can build on this great care by learning from our European neighbours."

Their study, published in the journal Cardiovascular Research, compared the treatment and outcomes of over 180,000 patients in Sweden to over 660,000 patients in the UK.

They found that patients who experienced the most severe form of heart attack, known as STEMI, had a net probability of death, defined as the probability of dying due to heart attack and not from other causes, of 6.7% in Sweden compared to 8.0% in the UK, in the one-month following their heart attacks. There was no difference in mortality due to their heart attack from one year after the event onwards for STEMI patients in Sweden compared with the UK.

For the less severe form of heart attack, called NSTEMI, they found that in the following one month from the onset of heart attack, patients from Sweden had a net probability of death of 4.9% compared to 6.8% for those from the UK. The net probability of death from one year after the heart attack onwards was 18.3% for patients in Sweden compared with 21.4% for patients in the UK.

The probabilities are not an individual's probability of dying, but the probability that if they die it was because of their heart attack, and not a different cause of death.

These probabilities therefore indicate how likely it is that a particular health problem will be an individual's cause of death, and thus how well patients are being treated for their heart attacks.
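
For readers who want the formal version: in competing-risks analysis, the net probability of death from a given cause by time t is usually written in terms of the cause-specific hazard, under the hypothetical assumption that all other causes of death are removed. The formulation below is a standard gloss of that estimand, not a formula quoted from the paper, whose exact estimator may differ.

```latex
% Net probability of death from the index heart attack by time t,
% expressed via the cause-specific hazard \lambda_{MI}(u)
% (a standard competing-risks formulation; assumed here, not quoted).
P_{\mathrm{net}}(t) = 1 - \exp\!\left( - \int_0^t \lambda_{\mathrm{MI}}(u)\,\mathrm{d}u \right)
```

On this reading, the 6.7% and 8.0% figures quoted above are estimates of this quantity at one month after a STEMI for patients in Sweden and the UK respectively.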

Co-author Professor Chris Gale, from the University of Leeds, said: "Population-based international research is a critical next step if we are to improve how we care for patients with cardiovascular disease.

"This study clearly shows that whilst the NHS is delivering very good care for patients with heart attack, improvements can still be made, which will result in a reduction in potentially avoidable deaths."

Credit: 
University of Leeds

Heart failure patients have similar odds of dementia-type brain lesions as stroke patients

Paris, France - 2 Sept 2019: A type of brain damage linked with dementia and cognitive impairment is as common in heart failure patients as it is in patients with a history of stroke, according to findings from the LIFE-Adult-Study presented today at ESC Congress 2019 together with the World Congress of Cardiology.(1)

The probability of this damage, called white matter lesions (WML), was also linked to the duration of heart failure. Patients with a long-standing diagnosis had more WML compared to those more recently diagnosed.

"Up to 50% of older patients with heart failure have cognitive impairment and heart failure is associated with an increased risk for dementia," said study author Dr Tina Stegmann of Leipzig University Hospital, Germany. "However, it is still unclear what the pathological pathways are. Some investigators have identified changes in brain structure in patients with heart failure and cognitive dysfunction, but the findings are inconsistent."

LIFE-Adult (2) is a population-based cohort study conducted in Leipzig. Between 2011 and 2014, 10,000 residents aged 18 to 80 were randomly selected for inclusion in the study. Participants underwent assessments such as a physical examination and medical history during which information on health conditions - for example heart failure and stroke - was collected.

This subgroup analysis included the 2,490 participants who additionally underwent magnetic resonance imaging (MRI) of the brain. The purpose of the analysis was to determine the frequency and associated risk factors for WML in a population cohort and potentially discover a connection with heart failure.

Most participants in the subgroup analysis had no or mild WML (87%), and 13% had moderate or severe WML. Mild WML are common and increase with age. In contrast, moderate or severe WML are associated with cognitive impairment and dementia.

There were significant independent associations between WML and age, high blood pressure, stroke and heart failure. Patients with heart failure had a 2.5 times greater probability of WML than those without heart failure. Similarly, stroke patients had a two times higher likelihood of WML than those with no stroke history.

The odds of WML increased with the duration of heart failure: from 1.3 for a diagnosis of less than three years, to 1.7 for a diagnosis of four to six years, and 2.9 for a diagnosis of longer than six years.

Dr Stegmann said the connections between heart failure, stroke, and WML could be due to shared risk factors such as age and high blood pressure. In addition, there may be a causal link between heart failure and stroke. It is well known, for instance, that the risk of stroke is higher in patients with heart failure than without.

"The role of dementia and its prevention is of growing interest in heart failure research as the overall heart failure population is ageing and suffering from numerous comorbidities," she added. "Studies are needed to see if WML could be a therapeutic target for treating cognitive decline in patients with heart failure."

Dr Stegmann concluded: "After cancer, dementia is the most feared disease by patients. But there is currently no clear indication to screen for WML in heart failure patients using brain MRI."

Credit: 
European Society of Cardiology

Decline in sports-related sudden cardiac death linked with rise in bystander resuscitation

Paris, France - 2 Sept 2019: Fewer sports-related sudden cardiac arrest victims die nowadays, a trend linked with increased bystander cardiopulmonary resuscitation (CPR), reports a study presented today at ESC Congress 2019 together with the World Congress of Cardiology.(1) The late breaking study also found that the incidence of sudden cardiac arrest during sports has not changed over the last decade.

Sudden cardiac arrest is lethal within minutes if left untreated and rapid initiation of CPR improves survival. Pre-participation screening of athletes aims to identify those at high risk and potentially exclude them from sports, with the ultimate goal of reducing the incidence of sudden cardiac arrest. In most cases, decisions on who to screen are made by international sporting bodies rather than national healthcare systems.(2)

"In our study, bystander CPR was associated with a nearly eight times greater likelihood of sudden cardiac arrest victims surviving to hospital discharge," said principal investigator Professor Xavier Jouven of the Paris-Sudden Death Expertise Centre. "Failure to reduce the incidence of sports-related sudden cardiac arrest is disappointing and questions the efficacy of screening programmes."

Prof Jouven said: "This study was done to assess trends in incidence and survival of sports-related sudden cardiac arrest. We initially expected both a decrease in incidence due to screening programmes and an increase in survival due to CPR."

The analysis was conducted using two prospective registries carried out by the Paris-Sudden Death Expertise Centre. All sudden cardiac arrests occurring during or immediately after competitive and amateur sports in Paris and the surrounding suburbs were recorded in 2005 to 2010 and 2011 to 2016.

There were 158 sports-related sudden cardiac arrests in 2005 to 2010, and 162 in 2011 to 2016. Incidence remained stable across the two periods, at around 6.9 cases per million inhabitants of Paris and the surrounding suburbs per year. There were no significant differences between time periods in average age (49 to 52 years), proportion of men (94% to 96%), and prevalence of previously known heart disease (14% to 17%).

Bystander CPR was significantly more common in 2011 to 2016 (81%) compared to 2005 to 2010 (46%). Automated external defibrillator (AED) use was significantly more frequent in the later period (11.9%) compared to the earlier one (1.3%).

The overall survival rate of athletes with cardiac arrest tripled, from 20% in the earlier survey to 60% in the later survey.

Survival rates to hospital admission and discharge were significantly higher in the later period. In 2011 to 2016, 85% of patients survived to hospital admission compared to the 51% in 2005 to 2010. The corresponding rates of survival to hospital discharge were 43% versus 26%, respectively. In-hospital mortality remained stable at around 51%. The overall burden of death due to sports-related sudden cardiac arrest decreased from 4.3 to 3.4 deaths per million inhabitants per year.

In multivariable analysis, the only factors independently associated with increased survival to hospital discharge were a shockable rhythm (odds ratio 6.82; 95% confidence interval [CI] 2.77-19.80) and bystander CPR.

Prof Jouven said: "We observed an important decrease in deaths due to sudden cardiac arrest during sports over a 12-year period which was related to more frequent CPR. The static incidence is probably caused by difficulties in early identification of individuals at high risk for sudden cardiac arrest during sports."

"To further improve survival from cardiac arrest, CPR should be taught to the general public and particularly to sports medicine practitioners," said Prof Jouven. "An AED should be available in all sports venues. Preventing sudden cardiac arrest remains the ideal goal - in the future, smartwatches and internet-connected T-shirts may alert us to warning signs occurring minutes or hours before, allowing early resuscitation and prevention."

Credit: 
European Society of Cardiology

Bacteria in pneumonia attack using bleaching agent

image: Nelson Gekara in lab

Image: 
Mattias Pettersson

Research shows that bacteria use hydrogen peroxide to weaken the immune system and cause pneumonia, according to a study at Umeå University and Stockholm University, Sweden. Hydrogen peroxide is also known as a bleaching agent that is used to whiten teeth or hair, as a stain remover, as well as for cleaning surfaces and disinfecting wounds.

"By using hydrogen peroxide to defeat the immune system, you could say that the bacteria are fighting fire with fire. The body itself also produces hydrogen peroxide as a defence against the bacteria. Therefore, it was surprising to see that many types of bacteria actually use the same substance to overcome the body´s defences," says Nelson Gekara, research leader.

Saskia Erttmann and Nelson Gekara mainly focused their studies on Streptococcus pneumoniae. This bacterium, often called pneumococcus, is the most common bacterium causing pneumonia but can also cause, among other illnesses, meningitis or severe sepsis. In addition, this bacterium can pave the way for other microbes to attack. This makes the bacterium one of the most deadly in the world. At the same time, many people have the bacterium in the upper respiratory tract as a part of the normal flora without falling ill or even knowing about it. It is therefore important to understand how pneumococci affect the body's immune system.

The ultimate goal of any invading microbe is to reside peacefully within our bodies without evoking a strong inflammatory reaction that may result in the elimination of the microbe or cause us harm. The researchers have found that pneumococcus and other bacteria accomplish this by targeting a key component of the immune system - the inflammasomes. Inflammasomes are protein complexes, which upon recognizing foreign molecules, for example those found in microbes or damaged cells, initiate reactions to kill microbes and to clear diseased cells. The researchers found that bacteria such as pneumococci release large quantities of hydrogen peroxide, and that this causes inactivation of inflammasomes thereby weakening the immune system.

In mouse models, the researchers observed that bacteria manipulated to produce less hydrogen peroxide were unable to inactivate inflammasomes and therefore elicited a faster inflammatory response that effectively cleared the bacteria from mouse lungs. The researchers also found that by inoculating the mice with a special enzyme, catalase, which breaks down hydrogen peroxide, one could increase the inflammation and inflammatory symptoms, leading to faster elimination of pneumococci from the lung.

"Inflammation often has negative connotations. However, for the body inflammation is an important process in the immune system's defence against attacking microbes. Most microbes produce hydrogen peroxide to varying degrees. Our studies demonstrate that hydrogen peroxide is an inhibitor of an important component of the inflammatory machinery suggesting that the mechanism we have uncovered is a common strategy employed by many microbes to thrive within us," says Saskia Erttmann, first author in the study and former member of Nelson Gekara's research group.

"One of the best known substances with the ability to neutralize hydrogen peroxide and that could hence boost anti-bacterial immunity are vitamins such as Vitamin C found in fruits. Perhaps the old adage 'an apple a day keeps the doctor away' is not off the mark," adds Nelson Gekara.

Credit: 
Umea University