Culture

Stem cell trial for osteoarthritis patients reduces pain, improves quality of life

DURHAM, N.C. (PRWEB) April 09, 2019 -- Stem cells collected from the patient's own bone marrow hold great interest as a potential therapy for osteoarthritis of the knee (KOA) because of their ability to regenerate damaged cartilage. The results of a clinical trial testing this approach were released today in STEM CELLS Translational Medicine (SCTM).

KOA is a common, debilitating disease of the aging population in which the cartilage wears away, resulting in bone wearing upon bone and subsequently causing great pain. In its end stages, joint replacement is currently the recommended treatment. In the first clinical trial of its kind to take place in Canada, researchers used mesenchymal stromal cells (MSCs), collected from the patient's own bone marrow under local anesthesia, to treat KOA.

The study was conducted by a research team from the Arthritis Program at the Krembil Research Institute, University Health Network, Toronto, led by Sowmya Viswanathan, Ph.D., and Jaskarndip Chahal, M.D. "Our goal was to test for safety as well as to gain a better understanding of MSC dosing, mechanisms of action and donor selection," Dr. Viswanathan said.

It involved 12 patients, aged 45 to 65, with moderate to severe KOA. They were divided into three groups, with each group receiving a different dose of MSCs. (Each patient was injected with his or her own cells.) The researchers then followed the patients for the next 12 months, using analytical methods that included imaging, biomarkers, molecular fingerprinting and the patient's own assessment of how he or she felt.

At the end of the 12-month period, the team noted significant improvements in the patients' pain levels and quality of life. The study also showed that the MSCs were safe at all the doses tested and that the higher the dose, the more effective the outcome.

Dr. Viswanathan said, "We also obtained novel insights into a potential anti-inflammatory mechanism of action of these cells in osteoarthritic knee joints. We noted that donor heterogeneity is an important factor, and our assembled panel of genes helps us identify cells which are potent in osteoarthritis. These are important findings which we hope to translate into a larger, powered clinical trial as part of our next steps."

"Furthermore," added Dr. Chahal, "we have been able to show that through an anti-inflammatory mechanism of action, such patients have an improvement in pain, function and quality of life. This sets the stage for the future of cell-based therapy and trials in Canada."

"This clinical pilot study advances the field of stem cell research for patients with arthritis, showing safety, and giving insights into potential therapy efficacy guidelines", said Anthony Atala, M.D., Editor-in-Chief of STEM CELLS Translational Medicine and director of the Wake Forest Institute for Regenerative Medicine. "We look forward to larger scale trial results."

Credit: 
University Health Network

Colombia experience could help reduce UK knife crime and street violence

A leading public health expert says the UK should learn lessons from systematic violence reduction work in Cali, Colombia, to tackle rising rates of knife crime on British streets. The work in Colombia resulted in significant reductions in homicides between 1995 and 2018.

Writing in the Journal of the Royal Society of Medicine, Professor John Ashton describes how, faced with a horrendous toll of over 1,000 drug related homicides each year, the Mayor of Cali, public health professor Rodrigo Guerrero and his colleague Dr Alberto Concha-Eastman, adopted a classical public health model to tackle the problem.

The model is based on an understanding and detailed mapping of time, place and person, with appropriate interventions to match. Interventions in Cali included restricting alcohol sales and access to weapons in the affected neighbourhoods; police surveillance and enforcement using 24-hour courts; tackling organised crime; and a holistic approach to poverty reduction, increased educational and employment opportunity, and the mobilisation of communities, especially the mothers of young men fearful that their sons would be next in the mortuary.

In 1995 the homicide rate in Cali per 100,000 inhabitants was 100. This reduced to 47.3 per 100,000 inhabitants in 2018.

Professor Ashton says: "The current popular refrain for a public health response to violence is being linked to recent efforts in Glasgow which seem to be having some impact. However, the roots of this approach can be traced to systematic work in Cali over the past 30 years.

"The UK is now at a stage which requires stronger community organisation and participation linked to whole systems action, if knife crime and street violence is to be reduced. We have much to learn from our colleagues in Cali."

Credit: 
SAGE

Lack of awareness of inequality means we penalize those who have least money

People can automatically assume that someone who gives less money to charity is less generous, according to new research. The assumption was made in the study when people had no knowledge of how much someone had donated as a percentage of their overall income.

The online study was carried out by researchers at the University of Exeter Business School, Yale University, MIT and Harvard Business School. Participants were able to choose to 'penalise' different groups of people based on their contributions to society. The researchers found that participants tended to 'penalise' those who had given smaller cash amounts to charity in real terms, without realising that those people had actually given more as a proportion of their income than their wealthier counterparts.

However, the participants' behaviour changed completely when they were made aware of others' incomes. Participants then 'penalised' the rich for giving a lower percentage, even when the cash amount was actually more in real terms.

"This lack of awareness of inequality can have substantial consequences for society - how we treat each other and what we expect others to contribute to society," said Dr Oliver Hauser, Senior Lecturer in Economics at the University of Exeter and lead author of the report.

In one experiment, participants were given actual figures from a list of five U.S. school districts and told the annual donation given to each of their Parent-Teacher Associations (PTAs). They then had to choose which school district should pay an additional tax bill, which would benefit all five districts collectively. Those participants who didn't know the average income of parents in each area chose to levy the tax on the poorest school district, which had given the least amount of money to the PTA in real terms. Those who were made aware of the average incomes chose to give the tax bill to a school district with richer parents, who had given less as a proportion of their income.
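To make the absolute-versus-proportional distinction concrete, here is a minimal sketch with invented figures (they are not taken from the study or its PTA data): the donor who gives the smaller cash amount can still be giving the larger share of income.

```python
# Hypothetical donors; incomes and donations are invented for illustration.
donors = {
    "wealthier parent": {"income": 200_000, "donation": 500},
    "poorer parent": {"income": 25_000, "donation": 250},
}

for name, d in donors.items():
    share = d["donation"] / d["income"] * 100
    print(f"{name}: gave ${d['donation']:,} = {share:.2f}% of income")

# Judged on cash amounts alone, the poorer parent looks less generous;
# judged as a share of income, the ranking reverses.
```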

"If people don't realise how little the poor actually have, they may be less sympathetic to the lower contributions made by that group," said Professor Michael Norton from Harvard Business School, who is a co-author of the report.

"Conversely, they are less likely to view the rich negatively for not contributing their 'fair share.' This can have real implications for the on-going gap between the rich and poor in society, particularly when it comes to making decisions on how to distribute public resources fairly."

The income distribution used in the article's four studies was based on recent figures for the actual U.S. income distribution, which ranks among the most unequal in the Western world.

The research was carried out by Dr Hauser, Professor Norton and co-authors Dr Gordon Kraft-Todd from Yale University, Associate Professor David G. Rand from Massachusetts Institute of Technology and Professor Martin Nowak from Harvard University.

The research is published in the peer-reviewed academic journal Behavioural Public Policy. Funding was provided by the Harvard Foundations of Human Behavior Initiative and Harvard Business School.

Credit: 
University of Exeter

Autism rate rises 43 percent in New Jersey, Rutgers study finds

image: Walter Zahorodny, an associate professor of pediatrics at Rutgers New Jersey Medical School who directed the New Jersey portion of the study, called the results 'consistent, broad and startling.'

Image: 
Photo: Nick Romanenko / Rutgers University

A new report by the Centers for Disease Control and Prevention, which uses research by Rutgers University, shows a significant increase in the percentage of 4-year-old children with autism spectrum disorder in New Jersey.

The study found the rate increased 43 percent from 2010 to 2014 in the state.

The report, released April 11, found that about one in 59 children has autism. New Jersey's rate was the highest of the states studied: one in 35. That puts the national rate of autism at 1.7 percent of the childhood population and New Jersey's autism rate at 3 percent.

New Jersey is known for excellent clinical and educational services for autism spectrum disorder, so the state's higher rates are likely due to more accurate or complete reporting based on education and health care records, the researchers said. Similar studies were conducted in Arizona, Colorado, Missouri, North Carolina, Utah and Wisconsin.

Walter Zahorodny, an associate professor of pediatrics at Rutgers New Jersey Medical School who directed the New Jersey portion of the study, called the results "consistent, broad and startling." The analysis of this young group of children shows U.S. autism rates are continuing to rise without plateauing.

"It's very likely that the next time we survey autism among children, the rate will be even higher," he said.

The researchers analyzed information from the health and special education records of 129,354 children who were 4 years old between 2010 and 2014 and 128,655 children who were 8 years old in that time period. They used the guidelines for autism spectrum disorder in the American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders-IV for their primary findings.

Across the network, the researchers found the prevalence of autism spectrum disorders ranged from a low of 8 per 1,000 children in Missouri to a high of 28 per 1,000 children in New Jersey. The average was 13 per 1,000 children. The disorder is about two times more common among boys than girls and white children are more often diagnosed than black or Hispanic children.

Although the estimates are not representative of the country as a whole, they are considered the benchmarks of autism spectrum disorder prevalence, Zahorodny said.

The age that children received their first evaluation ranged from 28 months in North Carolina to 39 months in Wisconsin. The researchers discovered that children with an intellectual disability or other condition were more likely to be evaluated earlier than age 4, which gives them an advantage.

"Children who are evaluated for autism early - around their second birthday - often respond better to treatment than those who are diagnosed later," Zahorodny said. "However, it appears that only the most seriously affected children are being evaluated at the crucial time, which can delay access to treatment and special services."

The average age of diagnosis - 53 months - has not changed in 15 years.

"Despite our greater awareness, we are not effective yet in early detection," he said. "Our goal should be systematic, universal screening that pediatricians and other health providers provide at regular visits starting at 18 months to identify autism as soon as possible."

The researchers can't explain why autism rates have increased across the United States. Factors associated with a higher risk include advanced parental age (children of parents over age 30 have heightened risk), maternal illness during pregnancy, genetic mutations, birth before 37 weeks gestation and multiple births.

"These are true influences exerting an effect, but they are not enough to explain the high rate of autism prevalence," said Zahorodny. "There are still undefined environmental risks that contribute to this significant increase, factors that could affect a child in its development in utero or related to birth complications or to the newborn period. We need more research into non-genetic triggers for autism."

Credit: 
Rutgers University

More Michigan students taking, passing advanced math

Michigan high school students are going above and beyond the required math curriculum, likely an effect of the state's graduation requirements, finds new research from Michigan State University.

The Michigan Merit Curriculum, which went into effect with the class of 2011 and requires students to take four years of math, at least up to algebra 2, also seems to be influencing more students to enroll in college.

"Our research indicates that the policy is working in terms of providing more opportunities to the most disadvantaged students," said Soobin Kim, author of the study and a researcher in the MSU College of Education. "It has been successful in equalizing access to algebra 2, which is a well-established predictor for postsecondary readiness."

The researchers, including MSU faculty members Barbara Schneider and Ken Frank, found students from low-income schools were particularly affected by the curriculum, completing on average a full semester more math.

Although 28 states now have similar requirements for math course-taking in high school, the MSU researchers' work is among the first to explore whether the policies improve course-taking patterns and college enrollment outcomes.

Like those in other states that have passed similar policies, Michigan's course-taking expectations are intended to make learning opportunities more equitable and prepare young people for success in college and the workforce.

The study, published in Educational Evaluation and Policy Analysis in March, also found Michigan students are now more likely to enroll in four-year colleges.

And those effects were greatest for students who were already better prepared based on their eighth-grade test scores.

The researchers used transcripts from a representative sample of 129 Michigan high schools to analyze patterns of course-taking for more than 300,000 students over 10 years -- allowing for comparison before and after the policy change. They also matched their data to college enrollment information for each student from the National Student Clearinghouse.

Among subject areas, the research team focused on math because students tend to follow a standard sequence of courses with less variation from school to school.

Previous research found the Michigan Merit Curriculum had little to no impact on students' ACT scores or graduation rates.

Both studies were conducted by the Michigan Consortium for Education Research, a research partnership between MSU, University of Michigan and the Michigan Department of Education, which has received $6 million in grant funding from the U.S. Department of Education.

The massive dataset, created through the partnership, will continue to generate new insights.

Kim said more information is needed about how higher expectations lead to changes for students, and that requires looking in classrooms -- at factors such as academic preparation prior to high school, class size and characteristics of classmates and teachers.

"This is just the beginning," he said. "One policy will not change outcomes for all students in the same way. It takes time to explore each factor but it's our job to study them with diligence."

Credit: 
Michigan State University

Keeping the taste, reducing the salt

image: While humans need salt, Americans consume significantly more of it than is necessary or even healthy, including from snacks like potato chips.

Image: 
Public Domain (https://en.wikipedia.org/wiki/Potato_chip#/media/File:Potato-Chips.jpg)

PULLMAN, Wash. - Washington State University researchers have found a way to make food taste salty but with less of the sodium chloride tied to poor health.

"It's a stealth approach, not like buying the 'reduced salt' option, which people generally don't like," said Carolyn Ross, a Food Science professor at WSU. "If we can stair-step people down, then we increase health while still making food that people want to eat."

In a paper published in the Journal of Food Science, Ross and colleagues looked at salt blends that use less sodium chloride and include other salts like calcium chloride and potassium chloride.

Both of those salts have no adverse health effects on people, Ross said. Potassium can actually help reduce blood pressure. Unfortunately, they aren't very tasty.

"Potassium chloride, especially, tastes really bitter and people really don't like it," Ross said.

The researchers used tasting panels and WSU's electronic tongue to see just how much of the replacement salts they could substitute for standard sodium chloride before people found the food unacceptable to eat.

Some tasting panels tested a variety of salt solutions, or salt in water, while others tested different salt combinations in tomato soup.

Using the e-tongue and panels, they found that a blend using approximately 96.4 percent sodium chloride with 1.6 percent potassium chloride and 2 percent calcium chloride was the ideal reduction.

They had a higher reduction when they added only calcium chloride, getting acceptable rates with a combination of 78 percent sodium chloride and 22 percent calcium chloride.

"This combination of the two salts did not significantly differ compared to 100 percent sodium chloride," Ross said. "But when we added potassium chloride, consumer acceptance decreased."

While humans need salt, Americans consume significantly more than is necessary or even healthy. According to the U.S. Office of Disease Prevention and Health Promotion, the recommended maximum intake is less than 2,300 mg of sodium per day. The average American adult female consumes 2,980 mg per day, while males average over 4,000 mg per day.
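As a rough illustration of what those blend ratios imply, the sketch below estimates sodium per gram of each blend and how the quoted average intakes would scale if all dietary salt were replaced by the 78/22 blend. It assumes the blends are mixed by mass and that the replacement salts contribute no sodium; the figures are back-of-the-envelope, not results from the paper.

```python
# A minimal sketch: sodium content of each blend relative to pure NaCl.
NA_FRACTION_OF_NACL = 22.99 / 58.44   # sodium's share of NaCl by mass, ~39%

blends = {
    "100% NaCl (reference)": 1.000,
    "96.4% NaCl / 1.6% KCl / 2% CaCl2": 0.964,
    "78% NaCl / 22% CaCl2": 0.780,
}

for label, nacl_share in blends.items():
    sodium_mg_per_g = nacl_share * NA_FRACTION_OF_NACL * 1000
    relative_cut = (1 - nacl_share) * 100
    print(f"{label}: {sodium_mg_per_g:.0f} mg sodium per gram of blend "
          f"({relative_cut:.0f}% less sodium than pure NaCl)")

# Illustrative only: if all dietary salt were the 78/22 blend, the quoted
# average intakes would scale down roughly in proportion.
for who, mg_per_day in {"adult female": 2980, "adult male": 4000}.items():
    print(f"{who}: {mg_per_day} mg/day -> ~{0.78 * mg_per_day:.0f} mg/day")
```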

Recent findings have suggested that gradual reductions in salt over a period of years are the best way to reduce salt consumption. Using one of the new blends for a specified time frame could lead to greater reductions down the road.

Credit: 
Washington State University

New non-antibiotic strategy for the treatment of bacterial meningitis

image: Seen here are the white blood cells (neutrophils) in the spinal fluid during bacterial meningitis. The nuclear material DNA is stained blue and proteins in neutrophils are stained red. The large aggregates as represented by overlapping areas of blue (DNA) and red (protein) portrays the extracellular DNA-protein networks called neutrophil extracellular traps (NETs). NETs are released as a defence response against invading bacteria in bacterial meningitis.

Image: 
Tirthankar Mohanty

With the increasing threat of antibiotic resistance, there is a growing need for new treatment strategies against life threatening bacterial infections. Researchers at Lund University in Sweden and the University of Copenhagen may have identified such an alternative treatment for bacterial meningitis, a serious infection that can lead to sepsis. The study is published in Nature Communications.

Our immune system has several important defenders to call on when an infection affects the central nervous system. The researchers have mapped what happens when one of them, the white blood cells called neutrophils, intervene in bacterial meningitis.

If there is an infection, the neutrophils deploy to the infected area in order to capture and neutralise the bacteria. It is a tough battle and the neutrophils usually die, but if the bacteria are difficult to eliminate, the neutrophils resort to other tactics.

"It is as though in pure frustration they turn themselves inside out in a desperate attempt to capture the bacteria they have not been able to overcome. Using this approach, they capture a number of bacteria at once in net-like structures, or neutrophil extracellular traps (NETs). This works very well in many places in the body where the NETs containing the captured bacteria can be transported in the blood and then neutralised in the liver or spleen, for example. However, in the case of bacterial meningitis these NETs get caught in the cerebrospinal space, and the cleaning station there is not very effective", explains Adam Linder, associate professor at Lund University and specialist in Infectious Diseases at Skåne University Hospital.

Using advanced microscopy, the researchers observed that the cerebrospinal fluid of patients with bacterial meningitis was cloudy and full of lumps, which proved to be NETs. Among patients with viral meningitis, however, the cerebrospinal fluid was free from NETs. When captured bacteria get caught in the cerebrospinal fluid, this adversely affects the immune system's work of clearing away bacteria and also impedes standard antibiotics from reaching the bacteria, says Adam Linder.

Would it be possible to cut up the nets so that the bacteria are exposed to the body's immune system, as well as to antibiotics, making it easier to combat the infection? As the NETs consist mainly of DNA, the researchers investigated what would happen if you brought in drugs used for cutting up DNA, so-called DNase.

"We gave DNase to rats infected with pneumococcus bacteria, which caused bacterial meningitis, and could show that the NETs dissolved and the bacteria disappeared. It seems that when we cut up the NETs, the bacteria are exposed to the immune system, which finds it easier to combat the bacteria single-handed. We were able to facilitate a significant reduction in the number of bacteria without antibiotic intervention, says Tirthankar Mohanty, one of the researchers behind the study.

Before antibiotics, the mortality rate for bacterial meningitis was around 80 per cent. With the advent of antibiotics, the mortality rate quickly fell to around 30 per cent.

In the 1950s, Professor Tillett at the Rockefeller University in the USA found lumps in the cerebrospinal fluid of patients with bacterial meningitis. Professor Tillett discovered that these lumps could be dissolved using DNase. This was effective in combination with antibiotics and reduced the mortality rate for meningitis from around 30 per cent to about 20 per cent. However, the treatment had drawbacks: the DNase was extracted from animals and could therefore trigger allergic reactions.

"At that time, everyone was so happy about the antibiotics, they reduced mortality for the infections and it was thought that we had won the war against bacteria. I believe we need to go back and take up a part of the research that took place around the time of the breakthrough for antibiotics. We can perhaps learn from some of the discoveries that were then flushed down the drain", says Adam Linder.

"The development of resistance in bacteria is accelerating and we need alternatives to antibiotics. The drug we use in the studies is a therapeutic biological product derived from humans and has already been approved for human use. They are not expensive and have also been tested against many different bacteria and infections. Bacterial meningitis is a major challenge in many parts of the world. In India, for example, it is a major cause of death among children, so there would be significant benefits there from using such a treatment strategy", says Tirthankar Mohanty.

The researchers want to go on to set up a major international clinical study and use DNase in the treatment of patients with bacterial meningitis.

Credit: 
Lund University

CSI meets conservation

image: A wild tiger in India.

Image: 
Prasenjeet Yadav

The key to solving a mystery is finding the right clues. Wildlife detectives aiming to protect endangered species have long been hobbled by the near impossibility of collecting DNA samples from rare and elusive animals. Now, researchers at Stanford and the National Centre for Biological Sciences at India's Tata Institute of Fundamental Research have developed a method for extracting genetic clues quickly and cheaply from degraded and left-behind materials, such as feces, skin or saliva, and from food products suspected of containing endangered animals.

Their proof of concept - outlined April 10 in Methods in Ecology and Evolution - could revolutionize conservation approaches and policies worldwide, the researchers said.

"It's CSI meets conservation biology," said co-author Dmitri Petrov, the Michelle and Kevin Douglas Professor in the School of Humanities and Sciences.

The specter of extinction hangs over more than a quarter of all animal species, according to the best estimate of the International Union for Conservation of Nature, which maintains a list of threatened and extinct species. Conservationists have documented extreme declines in animal populations in every region of Earth.

Clues from DNA

Helping species recover often depends on collecting DNA samples, which can reveal valuable information about details ranging from inbreeding and population history to natural selection and large-scale threats such as habitat destruction and illegal wildlife trade. However, current approaches tend to require relatively large amounts of DNA, or expensive and often inefficient strategies for extracting the material. Getting meaningful information rapidly from lower-concentration, often degraded and contaminated DNA samples requires expensive and specialized equipment.

A solution may lie in an ongoing collaboration between Stanford's Program for Conservation Genomics, including the labs of Petrov and co-authors Elizabeth Hadly and Stephen Palumbi, and India's National Centre for Biological Sciences, including the lab of co-author Uma Ramakrishnan, a molecular ecologist and former Fulbright faculty fellow at Stanford.

"I have been working on tiger conservation genetics for over a decade, but have been frustrated at how slow and unreliable the process of generating genetic data can be," Ramakrishnan said. "Conservation needs answers fast, and our research was not providing them fast enough."

The researchers looked at endangered wild tigers in India and overfished Caribbean queen conchs, examining tiger feces, shed hair and saliva found on killed prey, as well as fried conch fritters purchased in U.S. restaurants. All of the samples were too impure, mixed or degraded for conventional genetic analysis.

"Our goal was to find extremely different species that had strong conservation needs, and show how this approach could be used generally," said Palumbi, the Jane and Marshall Steele Jr. Professor of Marine Biology. "The King of the Forest - tigers - and Queen of the Caribbean - conch - were ideal targets."

Inexpensive and effective

Together, the team improvised a new approach, using a sequencing method that amplifies and reads small bits of DNA with unique differences in each sample. By doing this simultaneously across many stretches of DNA in the same test tubes, the researchers kept the total amount of DNA needed to a minimum. Making the procedure specific to tiger and conch DNA allowed for the use of samples contaminated with bacteria or DNA from other species.

The technology proved highly effective at identifying and comparing genetic characteristics. For example, the method worked with an amount of tiger DNA equivalent to about one hundred-thousandth of the amount in a typical blood sample. The method had a higher failure rate in conchs because the researchers did not have whole genomes at their disposal.

The approach's effectiveness, speed and affordability - implementation costs could be as low as $5 per sample, according to the researchers - represents a critical advance for wildlife monitoring and forensics, field-ready testing, and the use of science in policy decisions and wildlife trade.

"It is easy to implement and so can be done in labs with access to more or less basic equipment," said co-author Meghana Natesh of the National Centre for Biological Sciences and Sastra University in India. "If a standard procedure is followed, the data generated should be easy to share and compare across labs. So monitoring populations across states or even countries should be easier."

The scientists have made their methods freely available.

Credit: 
Stanford University

Autoimmune diseases of the liver may be triggered by exposure to an environmental factor

11 April 2019, Vienna, Austria: Investigators from a large population-based study conducted in northern England have suggested that exposure to a persistent, low-level environmental trigger may have played a role in the development of autoimmune diseases of the liver within that population. The study, which was discussed today at The International Liver Congress™ 2019 in Vienna, Austria, found a significant clustering of cases of primary biliary cholangitis (PBC), autoimmune hepatitis (AIH), and primary sclerosing cholangitis (PSC) in well-defined regions of north-east England and North Cumbria, suggesting an environmental agent (or agents) may have been involved.

The autoimmune liver diseases, PBC, AIH, and PSC, are relatively rare diseases that are associated with significant morbidity and mortality. These conditions affect people of all ages and are chronic, life-long conditions. The underlying cause of these autoimmune liver diseases is not fully defined, although an interaction between a genetic predisposition to autoimmunity and environmental factors has been proposed.

Disease clustering, whereby an abnormally large number of cases have been found in a well-defined geographical region, has been reported previously for PBC in two areas of Northern England and New York. However, according to the investigators in today's study, equivalent studies have not been performed in AIH or PSC.

The study reported today was conducted by a team of researchers from Newcastle in Northern England, supported by the National Institute for Health Research Newcastle Biomedical Research Centre. The team identified a large cohort of individuals from North-East England and North Cumbria who had PBC (n=2,150), AIH (n=963), or PSC (n=472). Spatial point analyses were used to investigate disease clustering using postal addresses, and, for those with a known year of diagnosis, spatio-temporal analyses were undertaken.

Areas with a higher than expected number of patients were found for each of the three conditions at a scale of approximately 1-2 km, with additional clusters for AIH and PSC at approximately 10 km and for PBC at 7.5 km. There was no sign of more patients being diagnosed within any particular timeframe, which suggests that an infection is unlikely to be associated with the development of these diseases.

"This study suggests that exposure to a persistent, low-level environmental agent may have played a role in the pathogenesis of all three autoimmune liver diseases studied, not just PBC," said Dr Jessica Dyson, Associate Clinical Lecturer at Newcastle University and Consultant Hepatologist at Newcastle upon Tyne Hospitals NHS Foundation Trust in the UK.

"The varying distances of peak clustering raises the possibility that different environmental factors contribute to PBC, AIH, and PSC. In previous PBC clustering studies, water reservoirs, industrial or coal mining factors, or waste disposal site toxins have been implicated," noted Dr Dyson. "Further work is ongoing to try to identify factors that may potentially be associated with the clustering observed in our study."

"This study is very important, since autoimmune diseases of the liver are infrequent but have an increasing incidence overall," said Professor Marco Marzioni from the Università Politecnica delle Marche, Ancona, Italy, and an EASL Governing Board Member. "However, their triggers are as yet unknown. Environmental factors have been considered, but no solid data have emerged so far. The study presented today has sufficient scientific rigour to reinforce the idea that environmental exposure may play a major role in triggering autoimmune diseases of the liver."

Credit: 
Spink Health

Genome analysis shows the combined effect of many genes on cognitive traits

Individual differences in cognitive abilities in children and adolescents are partly reflected in variations in their DNA sequence, according to a study published in Molecular Psychiatry. These tiny differences in the human genome can be used together to create so-called polygenic scores: the sum of a number of genetic variants an individual carries, reflecting their genetic predisposition to a particular trait. Such traits include educational achievement (how well pupils do in English, maths, and science), how many years of education they complete, and their IQ at age 16.

Researchers at King's College London, UK analysed genetic information from 7,026 UK children at ages 12 and 16 included in the Twins Early Development Study, a longitudinal study of twins born in England and Wales between 1994 and 1996. Intelligence and educational achievement at ages 12 and 16, and their associated genetic variants, were analysed. Intelligence was assessed via verbal and non-verbal web-based IQ tests. Educational achievement was assessed by how well pupils did in English, maths, and science, which are compulsory in the UK.

The researchers showed that polygenic scores, which reflect the combined effect of multiple genetic variants, may predict up to 11% of the difference in intelligence and 16% of the difference in educational achievement between individuals.
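For readers unfamiliar with the mechanics, a polygenic score is simply a weighted sum of the variants a person carries, and "variance explained" is the squared correlation between that score and the measured trait. The sketch below uses simulated genotypes and effect sizes purely for illustration; it is not the study's data or analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data (illustrative only): 1,000 people, 5,000 variants,
# each genotype coded as 0, 1 or 2 copies of the effect allele.
n_people, n_variants = 1_000, 5_000
genotypes = rng.integers(0, 3, size=(n_people, n_variants)).astype(float)

# Per-variant effect sizes; in practice these come from an external GWAS,
# here they are simply drawn at random.
effect_sizes = rng.normal(0, 0.01, size=n_variants)

# Polygenic score: weighted sum of allele counts for each person.
polygenic_score = genotypes @ effect_sizes

# A trait influenced by the same variants plus noise (simulated).
trait = genotypes @ effect_sizes + rng.normal(0, 1.0, size=n_people)

# "Variance explained" is the squared correlation between score and trait.
r = np.corrcoef(polygenic_score, trait)[0, 1]
print(f"variance explained: {r**2:.1%}")
```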

Andrea Allegrini, the corresponding author, said: "The effects of single variants on a given trait are often extremely small, and difficult to capture accurately. However, most behavioural traits share a substantial proportion of genetic variation; that is, a proportion of genetic variants affect multiple traits at the same time. The degree to which shared genetic influences account for similarities between traits is known as genetic correlation.

"Multivariate (so-called multi-trait) genomic approaches make use of genetic correlations between traits to more accurately estimate the effect of genetic variants on a given trait. These can be used to increase the predictive power of polygenic scores. We compared several novel, state-of-the-art, multi-trait genomic methods to maximise polygenic score prediction."

The authors found that when analysing genetic variants associated with intelligence, they were able to predict 5.3% of the difference in intelligence between individuals at age 12 and 6.7% of the difference at age 16. For educational achievement, analysing genetic variants associated with educational attainment (years of schooling), they predicted a maximum of 6.6% of the difference at age 12 and 14.8% at age 16. The authors also showed that analysing variants associated with educational attainment allowed them to predict 7.2% of the variance in intelligence at age 12 and 9.9% at age 16, because of the genetic correlation between the two traits.

When taking a multivariate/multi-trait approach, and adding three other, genetically correlated traits and their associated genes to the analysis, prediction accuracy improved to 10% of the difference in intelligence at age 16 and 15.9% of the difference in educational achievement. The authors also tested three different genomic methods to show that their predictive accuracy was similar.

Andrea Allegrini said: "Our findings indicate that there are no notable differences between the multi-trait prediction methods we tested. Even though these methods employ different mathematical models, they arrive at similar conclusions. This is extremely encouraging as it indicates that our estimates are robust, in that they are generally stable across methods tested."

He added: "However, it is also important to understand that these are average differences, which means that many people with a low genetic predisposition to educational attainment can still do very well in school, and vice versa. As such, these scores are probabilistic; they do not show that education or intelligence are determined by a person's genes."

Credit: 
Springer

Stress-related disorders linked to heightened risk of cardiovascular disease

Stress related disorders--conditions triggered by a significant life event or trauma--may be linked to a heightened risk of cardiovascular disease (CVD), finds a large Swedish study published in The BMJ today.

The risk of severe and acute CVD events, such as cardiac arrest and heart attack, was particularly high in the first six months after diagnosis of a stress related disorder, and within the first year for other types of CVD.

Most people are, at some point during their life, exposed to psychological trauma or stressful life events such as the death of a loved one, a diagnosis of a life threatening illness, natural disasters, or violence, write the authors.

And there is building evidence which suggests that severe stress reactions to significant life events or trauma are linked to the development of CVD.

But previous studies have mainly focused on male veterans or those currently active in the military with posttraumatic stress disorder (PTSD), or PTSD symptoms. And because of the smaller size of these samples, data on the effects of stress reactions on different types of CVD are limited.

So to shed some light on this, researchers used Swedish population and health registers to explore the role of clinically diagnosed PTSD, acute stress reaction, adjustment disorder, and other stress reactions in the development of CVD.

They controlled for family background, medical history, and underlying psychiatric conditions.

The researchers matched 136,637 people from an "exposed cohort" who were diagnosed with a stress related disorder between January 1987 and December 2013 with 171,314 full siblings who were free of stress related disorders and CVD.

For each exposed person, 10 people from the general population who were unaffected by stress related disorders and CVD at the date of diagnosis of the "exposed" patient were randomly selected.

Exposed and unexposed people were then individually matched by birth year and sex.
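In outline, this kind of matched-cohort construction can be sketched as follows (generic pandas code with made-up records and column names, not the authors' registry pipeline): for each exposed person, up to 10 unexposed people with the same birth year and sex are drawn from the population and assigned the exposed person's diagnosis date as their index date.

```python
import pandas as pd

# Illustrative tables; column names are assumptions, not the registry's schema.
exposed = pd.DataFrame({
    "person_id": [1, 2],
    "birth_year": [1970, 1985],
    "sex": ["F", "M"],
    "diagnosis_date": pd.to_datetime(["2005-03-01", "2010-06-15"]),
})
population = pd.DataFrame({
    "person_id": range(100, 160),
    "birth_year": [1970, 1985, 1990] * 20,
    "sex": ["F", "M"] * 30,
})

matched = []
for _, row in exposed.iterrows():
    # Candidate controls share the exposed person's birth year and sex.
    pool = population[
        (population["birth_year"] == row["birth_year"])
        & (population["sex"] == row["sex"])
    ]
    controls = pool.sample(n=min(10, len(pool)), random_state=0).copy()
    controls["matched_to"] = row["person_id"]
    # Controls inherit the exposed person's diagnosis date as their index date.
    controls["index_date"] = row["diagnosis_date"]
    matched.append(controls)

controls_df = pd.concat(matched, ignore_index=True)
print(controls_df.head())
```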

Severe stress reactions to significant life events or trauma were linked to a heightened risk of several types of CVD, especially during the first year after diagnosis, with a 64% higher risk among people with a stress related disorder compared to their unaffected sibling.

The findings were similar for people with a stress related disorder compared to the general population.

And there was a stronger link between stress related disorders and early onset CVD - cases of disease which developed before the age of 50 - than later onset ones.

Out of all studied CVDs, the excess risk during the first year was strongest for heart failure, and for major blood clots (embolism and thrombosis) after one year.

There were similar associations across sex, calendar period, medical history, and family history of CVD. But those who were diagnosed with a stress disorder at a younger age had a heightened risk of CVD.

This is an observational study based on the Swedish population and, as such, can't establish cause. The authors point out evidence from other studies suggesting a biological link between severe stress reactions and cardiovascular disease development. And they can't rule out the role of other unmeasured behavioural factors, such as smoking and alcohol intake.

But they say that their study is the first to explore the association between a number of stress related disorders, including but not limited to PTSD, and several types of CVD using sibling-based comparisons, among both men and women.

And doctors need to be aware of the "robust" link between stress related disorders and a higher subsequent risk of cardiovascular disease, particularly during the months after diagnosis, they add.

"These findings call for enhanced clinical awareness and, if verified, monitoring or early intervention among patients with recently diagnosed stress related disorders," they conclude.

In a linked editorial, Professor Simon Bacon from Concordia University in Canada, says that the design of the study "allows us to make reasonable assumptions about the similarity of the environment, lifestyles, and health behaviours between those with a disorder and their paired siblings without one. Such assumptions allow inferences about other alternative potential pathways linking these disorders to CVD outcomes."

"In the future, well designed studies evaluating more appropriate interventions will be critical, not only to confirm the inferences of the new study but also to provide real benefits to patients," he concludes.

Credit: 
BMJ Group

Millions of children worldwide develop asthma annually due to traffic-related pollution

video: Susan C. Anenberg and Pattanun Achakulwisut discuss the findings of their paper, "Global, national, and urban burdens of paediatric asthma incidence attributable to ambient NO2 pollution: estimates from global datasets," published on April 10 in The Lancet Planetary Health.

Image: 
Matthew Golden, GW Milken Institute School of Public Health

WASHINGTON, D.C. (April 10, 2019) - About 4 million children worldwide develop asthma each year because of inhaling nitrogen dioxide air pollution, according to a study published today by researchers at the George Washington University Milken Institute School of Public Health (Milken Institute SPH). The study, based on data from 2010 to 2015, estimates that 64 percent of these new cases of asthma occur in urban areas.

The study is the first to quantify the worldwide burden of new pediatric asthma cases linked to traffic-related nitrogen dioxide by using a method that takes into account high exposures to this pollutant that occur near busy roads, said Susan C. Anenberg, PhD, the senior author of the study and an associate professor of environmental and occupational health at Milken Institute SPH.

"Our findings suggest that millions of new cases of pediatric asthma could be prevented in cities around the world by reducing air pollution," said Anenberg. "Improving access to cleaner forms of transportation, like electrified public transport and active commuting by cycling and walking, would not only bring down NO2 levels, but would also reduce asthma, enhance physical fitness, and cut greenhouse gas emissions."

The researchers linked global datasets of NO2 concentrations, pediatric population distributions, and asthma incidence rates with epidemiological evidence relating traffic-derived NO2 pollution with asthma development in kids. They were then able to estimate the number of new pediatric asthma cases attributable to NO2 pollution in 194 countries and 125 major cities worldwide.
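The broad arithmetic behind such an attributable-burden estimate can be sketched as follows. This is a simplified, hypothetical illustration: the slope, baseline concentration, incidence rate, and population below are invented, and the published study's concentration-response function, counterfactual, and gridded datasets are more involved.

```python
import math

# Illustrative inputs (made-up values, not the study's data).
no2_ppb = 20.0            # annual average NO2 exposure for this population
counterfactual_ppb = 2.0  # assumed low-exposure baseline
beta = 0.01               # assumed log-linear slope per ppb of NO2
asthma_incidence = 0.008  # new cases per child per year (illustrative)
children = 1_000_000      # exposed pediatric population (illustrative)

# Relative risk under a log-linear concentration-response function.
rr = math.exp(beta * max(no2_ppb - counterfactual_ppb, 0.0))

# Population attributable fraction, then attributable case count.
paf = (rr - 1.0) / rr
attributable_cases = asthma_incidence * children * paf

print(f"RR = {rr:.2f}, PAF = {paf:.1%}, "
      f"attributable cases ~ {attributable_cases:.0f} per year")
```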

Key findings from the study published in The Lancet Planetary Health:

An estimated 4 million children developed asthma each year from 2010 to 2015 due to exposure to NO2 pollution, which comes primarily from motor vehicle exhaust.

An estimated 13 percent of annual pediatric asthma incidence worldwide was linked to NO2 pollution.

Among the 125 cities, NO2 accounted for 6 percent (Orlu, Nigeria) to 48 percent (Shanghai, China) of pediatric asthma incidence. NO2's contribution exceeded 20 percent in 92 cities located in both developed and emerging economies.

The top 10 highest NO2 contributions were estimated for eight cities in China (37 to 48 percent of pediatric asthma incidence) and for Moscow, Russia and Seoul, South Korea at 40 percent.

The problem affects cities in the United States as well: Los Angeles, New York, Chicago, Las Vegas and Milwaukee were the top five cities in the U.S. with the highest percentage of pediatric asthma cases linked to polluted air.

Nationally, the largest burdens related to air pollution were found in China at 760,000 cases of asthma per year, followed by India at 350,000 and the United States at 240,000.

Asthma is a chronic disease that makes it hard to breathe and results when the lung's airways are inflamed. An estimated 235 million people worldwide currently have asthma, which can cause wheezing as well as life-threatening attacks.

The World Health Organization calls air pollution "a major environmental risk to health" and has established Air Quality Guidelines for NO2 and other air pollutants. The researchers estimate that most children lived in areas below the current WHO guideline of 21 parts per billion for annual average NO2. They also found that about 92 percent of the new pediatric asthma cases that were attributable to NO2 occurred in areas that already meet the WHO guideline.

"That finding suggests that the WHO guideline for NO2 may need to be re-evaluated to make sure it is sufficiently protective of children's health," said Pattanun Achakulwisut, PhD, lead author of the paper and a postdoctoral scientist at Milken Institute SPH.

The researchers found that in general, cities with high NO2 concentrations also had high levels of greenhouse gas emissions. Many of the solutions aimed at cleaning up the air would not only prevent new cases of asthma and other serious health problems but they would also attenuate global warming, Anenberg said.

Additional research must be done to more conclusively identify the causative agent within complex traffic emissions, said the researchers. This effort, along with more air pollution monitoring and epidemiological studies conducted in data-limited countries, will help to refine the estimates of new asthma cases tied to traffic emissions, Anenberg and Achakulwisut added.

Credit: 
George Washington University

Obeticholic acid improves liver fibrosis and other histological features of NASH

11 April 2019, Vienna, Austria: A prespecified interim analysis of the ongoing Phase 3 REGENERATE study has confirmed that obeticholic acid (OCA) is effective in the treatment of nonalcoholic steatohepatitis (NASH) with liver fibrosis. The 18-month analysis, which was reported today at The International Liver Congress™ 2019 in Vienna, Austria, demonstrated that the 25 mg dose of OCA studied improved fibrosis in almost one-quarter of recipients, with significant improvements also reported in other histological markers of NASH.

Nonalcoholic steatohepatitis is a severe form of nonalcoholic fatty liver disease (NAFLD) and is characterized by the presence of steatosis, hepatocellular ballooning, and lobular inflammation. The condition is associated with rapid progression of fibrosis, which can eventually lead to the development of cirrhosis and hepatocellular carcinoma. The global prevalence of NASH has been estimated to range from 1.50% to 6.45%, with almost 60% of individuals with NAFLD who undergo biopsy found to have NASH. There are currently no medications approved in Europe or the USA specifically for the treatment of NASH.

Obeticholic acid is a potent activator of the farnesoid X nuclear receptor that was shown to improve liver histology and fibrosis in a Phase 2 clinical trial (FLINT) published in 2015. The Phase 3 trial reported today is the first study in NASH to be designed in conjunction with regulatory authorities, with the aim of achieving approval for OCA in NASH with fibrosis.

In the analysis reported today, 931 individuals with biopsy-confirmed NASH and significant or severe fibrosis (stages F2 or F3) were randomized to receive OCA 10 mg/day (n=312), OCA 25 mg/day (n=308), or placebo (n=311). The primary endpoints of the study were either fibrosis improvement (greater than or equal to 1 stage) with no worsening of NASH, or NASH resolution with no worsening of liver fibrosis on liver biopsy. The most pronounced benefits were observed in the OCA 25 mg treatment group. Once-daily OCA 25 mg met the primary endpoint of fibrosis improvement (greater than or equal to 1 stage) with no worsening of NASH in 23.1% of patients (p=0.0002 vs placebo). Although the NASH resolution primary endpoint was not met, 35.1% of patients receiving OCA 25 mg showed improvements in hepatocellular ballooning (p=0.0011 vs placebo) and 44.2% showed improvements in lobular inflammation (p=0.0322 vs placebo). Dose-dependent reductions in liver enzymes were also observed.

Pruritus, the most commonly-reported adverse event (AE), affected 51% of the OCA 25 mg/day treatment group, 28% of the OCA 10 mg/day treatment group, and 19% of the placebo group. More participants withdrew from the study due to pruritus in the OCA 25 mg/day group (9%) than in the OCA 10 mg/day (

'There is an urgent need for effective treatment regimens for NASH, a common liver disease which can lead to cirrhosis, liver failure and need for transplant,' said Dr Zobair Younossi, Professor and Chairman of the Department of Medicine at Inova Fairfax Medical Campus in Falls Church, Virginia, USA, who presented the study results. 'These first results from the REGENERATE study give us hope that a new targeted approach to NASH treatment may soon become available and potentially reverse some of the liver damage associated with this important liver disease.'

Professor Philip Newsome, Vice-Secretary of EASL, said: 'These data are very exciting as they demonstrate for the first time in a phase 3 trial that medical therapy, in this case obeticholic acid, is able to improve liver fibrosis compared to placebo - a key treatment goal in NASH.'

Credit: 
Spink Health

High rates of liver disease progression and mortality observed in patients with NAFLD/NASH

11 April 2019, Vienna, Austria: Two independent national studies have reported high rates of liver disease progression and mortality among patients with non-alcoholic fatty liver disease/non-alcoholic steatohepatitis (NAFLD/NASH). The studies reported today at The International Liver Congress™ 2019 in Vienna, Austria, found that within 10 years of diagnosis, up to 11% of patients with NAFLD/NASH had progressed to advanced liver diseases (defined as NAFLD/NASH patients with compensated cirrhosis [CC], decompensated cirrhosis [DCC], liver transplant [LT] or hepatocellular carcinoma [HCC]), and up to 27% of patients with NAFLD/NASH and CC had developed liver decompensation.

Fatty liver is a complex condition that affects up to one-quarter of adults worldwide. The condition is considered to be the liver manifestation of metabolic syndrome and encompasses a histological spectrum from the relatively benign non-alcoholic fatty liver to NASH, which typically has an aggressive course. NAFLD/NASH can lead to cirrhosis or HCC, and is set to become the predominant cause of liver disease in many parts of the world; however, their natural history remains incompletely defined.

In the first study, 215,655 NAFLD/NASH patients were identified retrospectively from a German insurance claims database (InGef; 2011-2016) with 100,644 new events of different liver severity stages identified during the follow-up: 79,245 events (78.7%) of non-progressive NAFLD/NASH, 411 events (0.4%) of CC, 20,614 events (20.5%) of DCC, 11 events (0.01%) of LT and 363 events (0.4%) of HCC. Amongst those with advanced liver diseases, mortality rate during 1 year of follow-up increased by up to 50% (range 8.8-51.2%), compared with non-progressive NAFLD/NASH patients (1.2%, p

'Perhaps most worryingly, during the 5-year study period, 11% of the NAFLD/NASH patients progressed to advanced liver diseases and 17% of CC patients progressed to DCC, after accounting for patients who died,' said Professor Ali Canbay from the University of Magdeburg Medical School in Magdeburg, Germany, who presented the study findings. 'This demonstrates very clearly the need for early detection and effective treatment to prevent progression and potentially reduce mortality.'

In the second study, French investigators identified 125,052 NAFLD/NASH patients from the French National Database on hospital care (PMSI; 2009-2015), of whom 1,491 (1.2%) were diagnosed with CC, 7,846 (6.3%) with DCC, and 1,144 (0.9%) with HCC. As was seen in Germany, a small cohort of patients progressed rapidly, with 5.6% of NAFLD/NASH patients progressing to more severe liver disease during 7 years of follow-up, and 27.5% of NAFLD/NASH patients with CC progressing to DCC. Mortality was high across all cohorts and increased with liver disease progression. After 1 year, 2.1% of NAFLD/NASH patients, 4.6% of CC patients, and 19.1% of DCC patients had died. The corresponding mortality rates after 7 years of follow-up were 7.9%, 16.3%, and 34.6% respectively.

'Before this study, we had very limited data on the disease progression and mortality of NAFLD/NASH patients in our country,' explained Professor Jerome Boursier from Angers University Hospital in Angers, France. 'We were surprised by the high overall mortality rate among these patients (7.9%) - almost twice that of the general population of a similar age - as well as the apparent rate of under-diagnosis of cirrhotic patients, the majority only being identified following a decompensation event.'

'This shows us we must direct greater effort into finding and treating NAFLD/NASH patients as early as possible, so we can stop or even reverse disease progression.'

Professor Philip Newsome (Vice-Secretary EASL) said, "These data demonstrate the significant morbidity and mortality found in patients with NAFLD and reinforces the need to identify those patients most at risk for appropriate treatment."

Credit: 
Spink Health

'Cthulhu' fossil reconstruction reveals monstrous relative of modern sea cucumbers

image: This is a life reconstruction of Sollasina cthulhu.

Image: 
Elissa Martin, Yale Peabody Museum of Natural History

An exceptionally preserved fossil from Herefordshire in the UK has given new insights into the early evolution of sea cucumbers, the group that includes the sea pig and its relatives, according to a new article published today in the journal Proceedings of the Royal Society B.

Palaeontologists from the UK and USA created an accurate 3D computer reconstruction of the 430 million-year-old fossil which allowed them to identify it as a species new to science. They named the animal Sollasina cthulhu due to its resemblance to monsters from the fictional Cthulhu universe created by author H.P. Lovecraft.

Although the fossil is just 3 cm wide, its many long tentacles would have made it appear quite monstrous to other small sea creatures alive at the time. It is thought that these tentacles, or 'tube feet', were used to capture food and crawl over the seafloor.

Like other fossils from Herefordshire, Sollasina cthulhu was studied using a method that involved grinding it away, layer-by-layer, with a photograph taken at each stage. This produced hundreds of slice images, which were digitally reconstructed as a 'virtual fossil'.
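Conceptually, turning such a stack of serial slice photographs into a 'virtual fossil' means stacking the registered images into a 3D volume and extracting a surface from it. The sketch below is a generic illustration in Python, not the authors' actual software; the directory, file names, and threshold are assumptions.

```python
# A minimal sketch of serial-slice reconstruction, assuming the slice
# photographs are already registered and stored as same-sized grayscale
# images in ./slices/ (paths and threshold are illustrative).
from pathlib import Path

import imageio.v3 as iio
import numpy as np
from skimage import measure

slice_files = sorted(Path("slices").glob("*.png"))
volume = np.stack([iio.imread(f) for f in slice_files], axis=0)

# Threshold separates fossil from surrounding matrix, then marching cubes
# extracts a triangle mesh of the fossil surface from the 3D volume.
threshold = 128
verts, faces, normals, values = measure.marching_cubes(
    volume.astype(float), level=threshold
)
print(f"{len(slice_files)} slices -> mesh with {len(verts)} vertices")
```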

This 3D reconstruction allowed palaeontologists to visualise an internal ring, which they interpreted as part of the water vascular system - the system of fluid-filled canals used for feeding and movement in living sea cucumbers and their relatives.

Lead author Dr Imran Rahman, Deputy Head of Research at Oxford University Museum of Natural History, said:

"Sollasina belongs to an extinct group called the ophiocistioids, and this new material provides the first information on the group's internal structures. This includes an inner ring-like form that has never been described in the group before. We interpret this as the first evidence of the soft parts of the water vascular system in ophiocistioids."

The new fossil was incorporated into a computerized analysis of the evolutionary relationships of fossil sea cucumbers and sea urchins. The results showed that Sollasina and its relatives are most closely related to sea cucumbers, rather than sea urchins, shedding new light on the evolutionary history of the group.

Co-author Dr Jeffrey Thompson, Royal Society Newton International Fellow at University College London, said:

"We carried out a number of analyses to work out whether Sollasina was more closely related to sea cucumbers or sea urchins. To our surprise, the results suggest it was an ancient sea cucumber. This helps us understand the changes that occurred during the early evolution of the group, which ultimately gave rise to the slug-like forms we see today."

The fossil was described by an international team of researchers from Oxford University Museum of Natural History, University of Southern California, Yale University, University of Leicester, and Imperial College London. It represents one of many important finds recovered from the Herefordshire fossil site in the UK, which is famous for preserving both the soft as well as the hard parts of fossils.

The fossil slices and 3D reconstruction are housed at Oxford University Museum of Natural History.

Credit: 
University of Oxford