Culture

'Flash mob' study puts clinical decision rules for ACS to the test

A novel "flash mob" study finds that, in emergency care, acute coronary syndrome (ACS) cannot be safely ruled out using the Marburg Heart Score (MHS) or the family physician's clinical assessment. In a period of only two weeks, researchers at Maastricht University collected data on 258 patients with suspected ACS by mobilizing one in five family physicians (FPs) throughout the Netherlands to participate in the study. This mobilization was achieved by enlisting ambassadors within the Dutch FP community, who then spread the word through traditional professional and social networks. Among the 243 patients who received a final diagnosis, 45 (18.5%) were diagnosed with ACS. Sensitivity was 86.7% for the FP rating and 94.4% for the MHS. While large prospective studies can be time consuming and costly, this innovative "flash mob" method of research, named after the large-scale public gatherings organized through social media, allowed a single simple question to be investigated on a large scale in a short timeframe.
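Sensitivity here is the fraction of true ACS cases a rule correctly flags. A minimal sketch of the calculation (the 39-of-45 split is an illustrative reconstruction consistent with the reported 86.7%, not counts taken from the paper):

```python
def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Fraction of actual disease cases that the test flags."""
    return true_positives / (true_positives + false_negatives)

# 86.7% sensitivity for the FP rating is consistent with 39 of the
# 45 ACS patients being flagged (illustrative, not reported counts).
fp_sens = sensitivity(39, 6)
print(f"{fp_sens:.1%}")  # 86.7%
```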

Credit: 
American Academy of Family Physicians

Evaluating risk of death, complications in patients with heart failure after ambulatory, noncardiac surgery

What The Study Did: Veterans Affairs data for 355,121 patients undergoing ambulatory, elective, noncardiac surgery were used to compare the risk of death and complications in patients with and without heart failure.

Authors: Sherry M. Wren, M.D., of the Stanford University School of Medicine in Palo Alto, California, is the corresponding author.

(doi:10.1001/jamasurg.2019.2110)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

#  #  #

Media advisory: The full study is linked to this news release.

Embed this link to provide your readers free access to the full-text article. This link will be live at the embargo time: https://jamanetwork.com/journals/jamasurgery/fullarticle/2738041?guestAccessKey=eb207a42-43be-4efd-91ad-407a8787f81f&utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_content=tfl&utm_term=071019

Credit: 
JAMA Network

A human liver cell atlas

image: Single cells were isolated from liver tissue derived from 9 different patients to perform single-cell RNA-sequencing. After applying computational single-cell analysis methods, a human liver cell atlas, or cell type map, was established, enabling the identification of previously unknown sub-types (depicted as clusters represented by numbers on the map). We discovered rare sub-types of bile duct cells in the liver, which include a population of liver epithelial progenitors that can give rise to organoids capable of developing into hepatocytes or bile duct cells.

Image: 
MPI of Immunobiology and Epigenetics, Freiburg, Grün

The liver is one of the largest and most versatile organs of the human body. It turns sugars, proteins, and fats from our food into substances useful for the body and releases them to the cells. In addition to its role in metabolism, the liver is an immunological organ and is indispensable for detoxifying the blood. Most strikingly, the liver is the only internal organ that can regenerate to its full size from as little as 25% of its original mass.

Liver diseases are among the biggest health problems in the world and a leading cause of death. In Germany alone, at least five million patients suffer from fatty liver disease, liver cancer, or hepatitis. Despite the immense importance of the liver for human health, the diversity of individual liver cell types and the associated molecular and cellular processes in both healthy and diseased tissue have not yet been fully investigated.

Scientists from the Max Planck Institute of Immunobiology and Epigenetics in Freiburg and colleagues from the University of Strasbourg are now presenting a comprehensive cell atlas of the human liver, published in the science journal Nature. Using what is known as single-cell RNA sequencing, the researchers led by Max Planck Group Leader Dominic Grün, in cooperation with the Baumert Lab, succeeded in creating a detailed map of the cell populations in the healthy human liver. Based on the analysis of 10,000 cells from nine human donors, the cell atlas maps all important liver cell types, including hepatocytes, the major metabolic cells of the liver, endothelial cells lining the blood vessels, liver-resident macrophages and other immune cell types, as well as bile duct cells and liver epithelial progenitors. With these data, it is possible to capture the diversity of cell types and cell states at an unprecedented resolution and to understand how they change during development or upon disease progression.

The cell's fingerprint

The researchers also discovered an astonishing diversity among individual cells of supposedly the same cell type. They found new subtypes of hepatocytes, endothelial cells, and macrophages which, although hardly distinguishable in their morphological appearance, have distinct gene expression profiles. These discoveries were made possible by the significant progress of experimental and computational single-cell analysis methods, which enable cells to be examined at high resolution.

In single-cell RNA sequencing, the organ tissue to be investigated is dissociated into individual cells; these cells are then isolated and sequenced separately. The sequencing is used to determine how many messenger RNA molecules (mRNA) of each gene are present in the cell. "The messenger RNA transmits the blueprints stored in the DNA to the protein factories. By measuring which RNA molecules are present in a cell at a certain point in time, we can identify which genes are active. This gives us a kind of fingerprint that provides us with a comprehensive insight into the very nature of each cell. This enables us to understand which functions the cell performs, how it is regulated and also what happens when diseases develop," explains Dominic Grün.

The data obtained in this way are not only extremely extensive, but also very complex since the RNA molecules of thousands of genes in thousands of cells have to be quantified and interpreted simultaneously. In recent years, Dominic Grün has developed tailor-made algorithms helping him and his team to characterize the different cell types and understand their developmental pathways.
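The general workflow behind such analyses, normalizing counts, reducing dimensionality, then clustering cells into types, can be sketched in a few lines of NumPy. This is a toy illustration on simulated data with a minimal k-means, not the group's actual algorithms:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy count matrix: 100 cells x 50 genes, two simulated cell types
# (hypothetical data; a real atlas has thousands of cells and genes).
type_a = rng.poisson(lam=5.0, size=(50, 50))
type_b = rng.poisson(lam=5.0, size=(50, 50))
type_b[:, :10] += rng.poisson(lam=20.0, size=(50, 10))  # marker genes up in type B
counts = np.vstack([type_a, type_b]).astype(float)

# 1. Normalize each cell to the same total count, then log-transform
norm = counts / counts.sum(axis=1, keepdims=True) * 1e4
logx = np.log1p(norm)

# 2. Reduce dimensionality with PCA (via SVD on centered data)
centered = logx - logx.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
pcs = u[:, :2] * s[:2]           # each cell as a point in 2 principal components

# 3. Cluster cells with a minimal k-means (k=2)
centers = pcs[[0, -1]].copy()    # initialize from one cell of each type
for _ in range(20):
    labels = np.argmin(((pcs[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([pcs[labels == k].mean(axis=0) for k in (0, 1)])

# Cells of the same simulated type should end up sharing a cluster label
print(labels[:50].mean(), labels[50:].mean())
```

The per-cell expression "fingerprints" described above are exactly the rows of the normalized matrix; clustering them recovers the two simulated cell types.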

Identifying progenitors of liver cells

Using such fingerprints of cells, the Freiburg researchers also identified previously unknown properties of a subpopulation of bile duct cells. Bile ducts run through the entire liver to transport bile to the gallbladder. "Our data show that cells within this rare subpopulation are precursor cells or progenitors. They are not only able to form organoids, which is a hallmark of stem cells, but also have the potential to develop into different cell types," explains Nadim Aizarani, the first author of the study. These progenitor cells differentiate into either hepatocytes or bile duct cells when cultivated in a culture medium. The Max Planck researchers are convinced that this precursor cell population plays an important role in liver regeneration and could also be involved in the development of liver diseases or tumors.

Important reference data for cancer patients

The cell atlas and the method of single-cell RNA-sequencing therefore have great potential for cancer therapy. Current approaches to analyzing diseased tissue, such as tumor tissue, provide only an average value of gene activity for the entire sample and thus only an average view of the tumor's molecular profile. "The contribution of rare cell types or even individual cells is lost in this average value. Yet it is perhaps precisely these few cells that determine whether a tissue stays healthy or degenerates into cancer," explains Dominic Grün.

Single-cell sequencing, on the other hand, captures the molecular signature of each healthy or diseased cell in the sample under examination. Comparison with reference data from healthy tissue enables scientists to pinpoint the disease-causing molecular properties of tumor cells and may help to develop improved treatment options in the future.

The Freiburg and Strasbourg researchers demonstrate in their newest study that the cell atlas of the human liver will be an essential reference database for liver cancer research. They compared the data of healthy tissue from the cell atlas with cells from hepatocellular carcinoma, the most common form of primary liver cancer. The comparison enabled conclusions to be drawn, such as the identification of new tumor cell markers and perturbed gene activity patterns of different cell types within the tumor. "I think that research into cancer using single-cell sequencing will help to improve the diagnosis and eventually the treatment of tumors even further. In the future, we will not only be able to uncover possible interactions between different cell types in tumors. It will also be possible to observe these interactions as the disease progresses," says Dominic Grün.

The researchers are convinced that their cell atlas of the human liver and the developed methods have laid an important foundation within biomedicine, which will advance the research and understanding of liver diseases on the molecular level to possibly create new therapeutic strategies against liver diseases in the future.

Credit: 
Max Planck Institute of Immunobiology and Epigenetics

Researchers identify cancer killing capability of lesser-known immune cells

Researchers at Trinity College Dublin have identified, for the first time in oesophageal cancer, the cancer-killing capability of a lesser-known type of immune cell, presenting a new potential therapeutic target. Their research was published today, Wednesday, July 10th 2019, in the international journal Frontiers in Immunology.

Oesophageal cancer is a very aggressive type of cancer with poor prognosis, and the 5-year survival rate is typically less than 15%. Linked with obesity, oesophageal cancer is one of the fastest growing cancers in the Western world, and incidence is expected to double in Ireland within the next few decades. Current treatment strategies work well, but only for a minority (approximately 25%) of patients, so new treatment options are urgently needed.

New treatment strategies targeting the immune system have had revolutionary effects in other cancer types, but the latest clinical trials show that, disappointingly, immunotherapy offers no real benefit for the majority of patients with oesophageal cancer.

The Cancer Immunopathology research team from the Trinity Translational Medicine Institute (TTMI) based in St James's Hospital, are studying unconventional types of immune cells with lesser known functions. T cells (a type of white blood cell) are very important in fighting cancer; preventing tumours arising and killing off established tumours if activated in the correct way. Up until now, the majority of immune-based studies have largely focused on conventional CD8 (cytotoxic) T cells, but unconventional T cells, although less abundant, can also have potent cancer killing ability.

The team investigated a particular type of T cell, known as a MAIT cell (mucosal-associated invariant cell) in oesophageal cancer. MAIT cells are known to protect against bacterial infections but little is known about what they do in cancer.

The team at Trinity are the first to report the characterisation of MAIT cells in the oesophageal cancer setting. They looked at MAIT cells in blood and tumours from patients with oesophageal cancer or a pre-cancerous disorder called Barrett's Oesophagus, and found that:

MAIT cells are decreased in the blood of cancer patients, compared to healthy donors but are found in oesophageal tumours at higher levels than healthy tissues.

MAIT cell levels are not affected by chemoradiotherapy treatment, unlike other T cell types.

Healthy MAIT cells can kill oesophageal cancer cells in a test tube, but this killing is reduced when liquid from fresh tumour biopsies is present, meaning that factors from the tumour can prevent MAIT cell killing.

MAIT cells taken from oesophageal tumours showed high levels of markers associated with functional inhibition. This means that oesophageal tumours seem to be able to stop MAIT cells from killing them, by using these inhibitory markers to deliver a "do not kill" signal.

Overall, these results reveal an anti-tumour function for a new potential therapeutic target cell in this aggressive cancer type, which is being inhibited by the tumour itself. Finding new ways to reverse the inhibition of MAIT cell tumour-killing ability may offer a new therapeutic strategy in the fight against cancer.

Research Assistant Professor at TTMI and Principal Investigator, Dr Margaret Dunne said: "Oesophageal cancer rates are rising in Ireland, and improved treatment strategies are urgently needed. By revealing how lesser studied immune cells work in cancer, we can better understand the shortcomings of current immunotherapies and investigate new ways to boost the anti-cancer immune response."

"Immunotherapies have revolutionised cancer treatment but still only work for a minority of people. A more in-depth understanding of underlying biology will be critical to unravel why this is - and to allow more patients to benefit" she added.

Credit: 
Trinity College Dublin

The Zika epidemic in Cuba, reflected by imported cases in Barcelona

image: Returning travellers may introduce infectious diseases such as Zika into non-endemic regions.

Image: 
Ross Parmly / Unsplash

Travellers returning to Barcelona mirrored the 2017 Zika outbreak in Cuba, according to a study led by the Hospital Clínic of Barcelona and the Barcelona Institute for Global Health, an institution supported by "la Caixa".

Zika virus spread throughout Latin America between 2015 and 2016, followed by a decrease in the number of new cases. Cuba, however, was one of the last countries to report cases: the first autochthonous case was confirmed in March 2016, and recent data indicate that an outbreak with over 600 reported cases occurred mid-2017 in Cienfuegos.

This outbreak was reflected in a study initiated by Hospital Clinic in 2016 with the aim of detecting imported cases of Zika and other mosquito-borne viral diseases (known collectively as arboviral diseases). Over a period of almost three years, the Tropical Medicine Service detected 42 imported cases of Zika. While in 2016 Zika-infected travellers had visited different countries in Latin America and the Caribbean, the cases diagnosed from the end of 2017 onwards came only from Cuba. "These cases could reflect an absence of herd immunity in the Cuban population, as well as the possibility of it being one of the last places in America with ongoing virus transmission," explains Alex Almuedo, first author of the study.

For Jose Muñoz, study coordinator, ISGlobal researcher and Head of the Tropical Medicine Service at the Hospital Clínic, these results underscore the need to consider possible Zika infection among travellers returning from Cuba. In 2016 alone, more than one million European travellers visited the island.

"It is also important to note that 70% of the travellers that got infected with Zika did not seek advice before their trip," adds Muñoz, highlighting that the imported diseases clinics are "a key element to survey and prevent the introduction of these diseases in non-endemic regions."

Mission: detect autochthonous transmission of arboviral diseases

Along this line, the study AVATAR (Autochthonous arboVirAl Transmisión bARcelona) will be launched this season in collaboration with the Barcelona Public Health Agency and the Instituto de Salud Carlos III, to evaluate the possibility of silent autochthonous transmission in Barcelona of dengue, chikungunya and Zika - viral diseases that can be transmitted by a vector present in the region: Aedes albopictus, also known as tiger mosquito.

Credit: 
Barcelona Institute for Global Health (ISGlobal)

Impaired learning linked to family history of Alzheimer's

Adults with a first-degree relative with Alzheimer's disease perform more poorly on online paired-learning tasks than adults without such a family history, and this impairment appears to be exacerbated by having diabetes or a genetic variation in the apolipoprotein E (APOE) gene linked to the disease.

The findings, published on Tuesday in eLife, may help identify people who have increased risk for developing Alzheimer's disease and could uncover new ways to delay or prevent the disease.

"Identifying factors that reduce or eliminate the effect of a family history of Alzheimer's disease is particularly crucial since there is currently no cure or effective disease-slowing treatments," says lead author Joshua Talboom, PhD, a Postdoctoral Fellow at the Translational Genomics Research Institute in Arizona, US.

Having a family history of Alzheimer's disease is a well-known risk factor for developing the condition, but the effects on learning and memory throughout a person's life are less clear. Some studies have been conducted in this area, but most have been too small to draw significant conclusions.

To enable a larger study, Talboom and colleagues created an easy-to-use website, http://www.mindcrowd.org, that participants could log on to and complete a memory test. Participants were asked to learn 12-word pairs and were then tested on their ability to complete the missing half of the pair when presented with one of the words.

The 59,571 individuals who participated were also asked to answer questions about their sex, education, age, language, country and health, including a question about whether one of their parents or siblings had been diagnosed with Alzheimer's disease. Those with a family history of Alzheimer's were able to match about two and one-half fewer word pairs than individuals without a family history. Having diabetes appeared to compound the learning impairments seen in individuals with a family history.

A subset of 742 participants who had a close relative with Alzheimer's submitted a sample of dried blood or saliva that the researchers tested for a genetic variation in the APOE gene linked to the disease. "The APOE genotype is an important genetic factor that influences memory, and we found that those with the variation performed worse on the memory test than those without the variation," Talboom explains.

Some characteristics, however, appeared to protect against memory and learning impairments in people with a family history of Alzheimer's disease. Participants with higher levels of education experienced less of a decline in scores on the learning and memory test than people with lower levels of education, even with a family history of the disease. Women also appeared to fare better despite having Alzheimer's disease risk factors.

"Our study supports the importance of living a healthy lifestyle, properly treating diseases such as diabetes, and building learning and memory reserve through education to reduce the cognitive decline associated with Alzheimer's disease risk factors," concludes senior author Matthew Huentelman, Professor of Neurogenomics at the Translational Genomics Research Institute, Arizona.

Credit: 
eLife

Food may have been scarce in Chaco Canyon

Chaco Canyon, a site that was once central to the lives of pre-colonial peoples called Anasazi, may not have been able to produce enough food to sustain thousands of residents, according to new research. The results could cast doubt on estimates of how many people were able to live in the region year-round.

Located in Chaco Culture National Historic Park in New Mexico, Chaco Canyon hosts numerous small dwellings and a handful of multi-story buildings known as great houses. Based on these structures, researchers think that it was once a bustling metropolis that was home to as many as 2,300 people during its height from 1050 to 1130 AD.

But Chaco also sits in an unforgiving environment, complete with cold winters, blazing-hot summers and little rainfall in any season.

"You have this place in the middle of the San Juan Basin, which is not very habitable," said Larry Benson, an adjoint curator at the CU Museum of Natural History.

Benson and his colleagues recently discovered one more wrinkle in the question of the region's suitability. The team conducted a detailed analysis of Chaco Canyon's climate and hydrology and found that its soil could not have supported the farming necessary to feed such a booming population.

The findings, Benson said, may change how researchers view the economy and culture of this important area.

"You can't do any dryland farming there," Benson said. "There's just not enough rain."

Today, Chaco Canyon receives only about nine inches of rain every year and historical data from tree rings suggest that the climate wasn't much wetter in the past.

Benson, a retired geochemist and paleoclimatologist who spent most of his career working for the U.S. Geological Survey, set out to better understand if such conditions might have limited how many people could live in the canyon. In the recent study, he and Ohio State University archaeologist Deanna Grimstead pulled together a wide range of data to explore where Chaco Canyon residents might, conceivably, have grown maize, a staple food for most ancestral Pueblo peoples.

They found that these pre-colonial farmers not only contended with scarce rain, but also destructive flash floods that swept down the canyon's valley floor.

"If you're lucky enough to have a spring flow that wets the ground ahead of planting, about three-quarters of the time you'd get a summer flow that destroys your crops," Benson said.

The team calculated that Chacoans could have, at most, farmed just 100 acres of the Chaco Canyon floor. Even if they farmed all of the surrounding side valleys--a monumental feat--they would still have only produced enough corn to feed just over 1,000 people.

The researchers also went one step further, assessing whether past Chaco residents could have supplemented this nutritional shortfall with wild game like deer and rabbits. They calculated that supplying the 185,000 pounds of protein needed by 2,300 people would have quickly cleared all small mammals from the area.

In short, there would have been a lot of hungry mouths in Chaco Canyon. Benson and Grimstead published their results this summer in the Journal of Archaeological Science.

For Benson, that leaves two possibilities. Chaco Canyon residents either imported most of their food from surrounding regions 60 to 100 miles away, or the dwellings in the canyon were never permanently occupied, instead serving as temporary shelters for people making regular pilgrimages.

Either scenario would entail a massive movement of people and goods. Benson estimates that importing enough maize and meat to feed 2,300 people would have required porters to make as many as 18,000 trips in and out of Chaco Canyon, all on foot.
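The order of magnitude of that estimate can be checked with a back-of-envelope calculation. The daily ration and porter load below are my assumptions for illustration, not figures from the study; they happen to reproduce the ~18,000-trip scale:

```python
# Back-of-envelope reconstruction (assumed figures, not from the study)
population = 2300                # estimated peak residents
maize_kg_per_person_day = 0.5    # assumed daily maize ration
porter_load_kg = 23.0            # assumed load carried per trip

annual_maize_kg = population * maize_kg_per_person_day * 365
trips = annual_maize_kg / porter_load_kg
print(round(trips))  # on the order of 18,000 trips per year
```

Small changes to either assumption shift the total by thousands of trips, but any plausible combination still implies a massive, continuous flow of foot traffic into the canyon.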

"Whether people are bringing in maize to feed 2,300 residents, or if several thousand visitors are bringing in their own maize to eat, they're not obtaining it from Chaco Canyon," Benson said.

Credit: 
University of Colorado at Boulder

Can computer use, crafts and games slow or prevent age-related memory loss?

MINNEAPOLIS - A new study has found that mentally stimulating activities like using a computer, playing games, crafting and participating in social activities are linked to a lower risk or delay of age-related memory loss called mild cognitive impairment, and that the timing and number of these activities may also play a role. The study is published in the July 10, 2019, online issue of Neurology®, the medical journal of the American Academy of Neurology.

Mild cognitive impairment (MCI) is a medical condition that is common with aging. While it is linked to problems with thinking ability and memory, it is not the same as dementia. People with MCI have milder symptoms. They may struggle to complete complex tasks or have difficulty understanding information they have read, whereas people with dementia have trouble with daily tasks such as dressing, bathing and eating independently. However, there is strong evidence that MCI can be a precursor of dementia.

"There are currently no drugs that effectively treat mild cognitive impairment, dementia or Alzheimer's disease, so there is growing interest in lifestyle factors that may help slow brain aging believed to contribute to thinking and memory problems--factors that are low cost and available to anyone," said study author Yonas E. Geda, MD, MSc, of the Mayo Clinic in Scottsdale, Ariz., and a member of the American Academy of Neurology. "Our study took a close look at how often people participated in mentally stimulating activities in both middle-age and later life, with a goal of examining when such activities may be most beneficial to the brain."

For the study, researchers identified 2,000 people with an average age of 78 who did not have mild cognitive impairment. At the start of the study, participants completed a questionnaire about how often they took part in five types of mentally stimulating activities during middle-age, defined as ages 50 to 65, and in later life, age 66 and older. Participants were then given thinking and memory tests every 15 months and were followed for an average of five years. During the study, 532 participants developed mild cognitive impairment.

Researchers found that using a computer in middle-age was associated with a 48-percent lower risk of mild cognitive impairment. A total of 15 of 532 people who developed mild cognitive impairment, or 2 percent, used a computer in middle age compared to 77 of 1,468 people without mild cognitive impairment, or 5 percent. Using a computer in later life was associated with a 30-percent lower risk, and using a computer in both middle-age and later life was associated with a 37-percent lower risk of developing thinking and memory problems.
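One plausible reading of how the counts above relate to the headline figure is as an odds ratio; the sketch below uses the raw counts from this release, whereas the published estimate may come from an adjusted statistical model:

```python
def odds_ratio(exposed_cases: int, total_cases: int,
               exposed_controls: int, total_controls: int) -> float:
    """Odds of exposure among cases divided by odds among controls."""
    odds_cases = exposed_cases / (total_cases - exposed_cases)
    odds_controls = exposed_controls / (total_controls - exposed_controls)
    return odds_cases / odds_controls

# Computer use in middle age: 15 of 532 people who developed MCI
# vs 77 of 1,468 who did not (counts as reported in the release).
or_mci = odds_ratio(15, 532, 77, 1468)
print(f"{1 - or_mci:.0%} lower odds")  # 48% lower odds
```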

Engaging in social activities, like going to movies or going out with friends, or playing games, like doing crosswords or playing cards, in both middle-age and later life were associated with a 20-percent lower risk of developing mild cognitive impairment.

Craft activities were associated with a 42-percent lower risk, but only in later life.

The more activities people engaged in during later life, the less likely they were to develop mild cognitive impairment. Those who engaged in two activities were 28 percent less likely to develop memory and thinking problems than those who took part in no activities, while those who took part in three activities were 45 percent less likely, those with four activities were 56 percent less likely and those with five activities were 43 percent less likely.

"Our study was observational, so it is important to point out that while we found links between a lower risk of developing mild cognitive impairment and various mentally stimulating activities, it is possible that instead of the activities lowering a person's risk, a person with mild cognitive impairment may not be able to participate in these activities as often," Geda said. "More research is needed to further investigate our findings."

One strength of the study was the large number of participants; however, a limitation was that participants were asked to remember how often they participated in mentally stimulating activities in middle-age, up to two decades before the study began, and their memories may not have been completely accurate.

Credit: 
American Academy of Neurology

'Chaos' in the home linked to poor asthma control in children

A chaotic household - one where things just don't seem to run smoothly, there's lots of noise, little gets taken care of in a timely manner, and where relaxation is difficult - as well as child and parent depression, are risk factors for worse asthma outcomes in urban minority children, according to a new paper published in the journal Pediatrics.

"Higher levels of chaos - lack of organization or set routines, among other things - seems to be a pathway linking parental depression and worse child asthma control," said Sally Weinstein, associate professor of clinical psychiatry in the University of Illinois at Chicago College of Medicine and first author on the paper.

Minority urban youth have higher rates of asthma and are more likely to have poor outcomes or even die of asthma compared to the general population. While much research exists on medications and prevention, researchers are just starting to understand how psychosocial factors affect asthma and how they might contribute to disparities.

Several studies have found that children with depression and anxiety have worse asthma outcomes, including more severe asthma and more use of rescue medications. Some studies have linked parents' depression with worse asthma outcomes in their children, while others have shown that family conflict is associated with higher levels of asthma severity.

Weinstein and colleagues wanted to look at the interplay between parent, child and family functioning and child asthma control in urban minority youth with uncontrolled asthma. Uncontrolled asthma is when children have excessive asthma symptoms and rescue medication use. The consequences of uncontrolled asthma can be severe.

The researchers looked at the relationship between parent depression and post-traumatic stress disorder, or PTSD, symptoms; child depression and PTSD symptoms; and child asthma control among 223 children between the ages of 5 and 16 years and one of their parents. The participants were enrolled in a longitudinal study examining educational interventions to help improve asthma control called the Asthma Action at Erie Trial.

Weinstein and colleagues collected data on depression, PTSD and family chaos via in-person interviews before the parents and children started the study intervention. Asthma control was measured with the Asthma Control Test, a standardized survey that evaluates asthma severity and symptoms in children. Parents were also asked about the number of days in the last two weeks when the child's activity was limited due to asthma symptoms, and the child's asthma medications.

The researchers found that parent and childhood depressive symptoms, but not PTSD symptoms, were associated with worse child asthma control. Higher levels of family chaos were also associated with worse child asthma control even when the researchers controlled for parent and child depression. Family chaos was evaluated using a 15-item questionnaire that asked respondents to rate statements such as "No matter how hard we try, we always seem to be running late;" "We can usually find things when we need them;" "We always seem to be rushed;" and "Our home is a good place to relax."

The researchers found that family chaos explained part of how parent depression affected child asthma control.

"When a parent is depressed, it's harder to keep the family routines running smoothly, and it's also harder to manage the daily demands of caring for their child's asthma, which can require multiple medications and avoidance of triggers," said Weinstein, who is also associate director of the University of Illinois Center on Depression and Resilience. "We saw that in families with greater household chaos, child asthma control tended to be worse."

"Our findings highlight the role of family chaos in worse asthma outcomes for children in these families," said Dr. Molly Martin, associate professor of pediatrics in the UIC College of Medicine and the study's principal investigator. "Pediatricians and asthma specialists should consider and address parent and child depression and provide support to optimize household routines as a way to help improve children's asthma control."

Credit: 
University of Illinois Chicago

Decentralising science may lead to more reliable results

Research results on drug-gene interactions are much less likely to be replicated if the underlying studies are performed by hierarchical communities or close-knit groups of frequent collaborators who use similar methods, rather than by independent groups of scientists using different methods, suggests a paper published last week in eLife.

The findings may improve the reliability of scientific results by identifying factors that contribute to the publication of unreliable, misleading or false results about potential drug-gene interactions.

"The way science is often produced may inadvertently contribute to unreliable results," says senior author James Evans, Professor of Sociology at the University of Chicago, and External Professor at the Santa Fe Institute, US. "For example, a large group of scientists who frequently collaborate, use similar methods, share equipment, and frequently cite similar works are prone to producing the same, self-confirming results. Although such a group may produce repeated published experiments, our results demonstrate that their findings are not independent. Independent labs perform experiments in different ways with different expectations and are less prone to peer pressure than densely connected networks of scientists."

To better understand how such factors may contribute to unreliable results in studies of interactions between drugs and genetics, Evans and his colleagues compared the results of 3,363 published studies on 51,292 drug-gene interactions in the Comparative Toxicogenomics Database with results from the LINCS L1000 program, which used robots to test thousands of drug-gene interactions.

They found that drug-gene interactions identified by multiple studies were verified by the LINCS L1000 results 45% of the time, while results from single studies could only be verified 19% of the time.
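The comparison above is a simple conditional rate: for each claim, check how many papers support it, then compute the fraction of claims in each group that the high-throughput screen verified. A minimal sketch with made-up records (the claim names, field names and values below are hypothetical illustrations, not data from the Comparative Toxicogenomics Database or LINCS L1000):

```python
# Toy illustration: verification rate of drug-gene interaction claims,
# split by how many independent papers support each claim.
# All records below are invented for illustration only.
claims = [
    {"claim": "drugA-gene1", "n_papers": 3, "verified": True},
    {"claim": "drugB-gene2", "n_papers": 1, "verified": False},
    {"claim": "drugC-gene3", "n_papers": 2, "verified": True},
    {"claim": "drugD-gene4", "n_papers": 1, "verified": False},
    {"claim": "drugE-gene5", "n_papers": 1, "verified": True},
]

def verification_rate(records):
    """Fraction of claims confirmed by the independent screen."""
    return sum(r["verified"] for r in records) / len(records)

multi = [c for c in claims if c["n_papers"] > 1]
single = [c for c in claims if c["n_papers"] == 1]
print(f"multi-study rate:  {verification_rate(multi):.0%}")   # 100% on this toy data
print(f"single-study rate: {verification_rate(single):.0%}")  # 33% on this toy data
```

The study's key twist is that the multi-study group is further split by author connectedness, which is what reveals that repetition within one tight-knit community adds little independent evidence.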

They also looked at a subset of gene-drug interactions that were investigated in more than one study. They found that interconnected groups of authors using similar methods were more likely to confirm each other's results than scientists with no apparent connections. The results of these interconnected groups were less likely to be replicated by LINCS L1000 than results from independent groups.

"Even if a drug-gene interaction claim garners support from many papers, if it is studied exclusively by a centralised scientific community, the claim has a predicted probability of replication that is similar to that for a claim reported in a single paper," says lead author Valentin Danchev, PhD, a postdoctoral scholar previously in the Department of Sociology at the University of Chicago, and now at the Meta-Research Innovation Center at Stanford (METRICS), Stanford University, US.

He adds that the way science is currently organised encourages frequent collaborations between interconnected groups of researchers. These interconnected groups and the 'star' scientists that lead them not only publish results that are less likely to be replicated but may also gain disproportionate influence, potentially discouraging independent groups from researching the same drug-gene interactions or publishing findings that disagree. "Our findings highlight the importance of introducing science policies that promote decentralised and non-repeated collaboration as a path to independent replications that are robust across diverse teams, methods and settings," Danchev concludes.

Credit: 
eLife

Body plan evolution not as simple as once believed

image: The left side is a rendering of a Drosophila yakuba male fly while the right side is a Drosophila santomea male. Drosophila santomea has lost most of its body coloration during the 0.5 million years since the two species diverged, including bands of pigmentation that adorn each abdominal segment and the full pigmentation of posterior segments.

Image: 
Nicolas Gompel

The role of Hox genes in changing the layout of different body parts during evolution has been challenged by a study led by researchers in the University of Pittsburgh's Department of Biological Sciences.

Hox genes are vital to developing differences in repeated body parts such as vertebrae, limbs, or digits in most animal species, including human beings. Ever since their discovery, scientists have thought that modifications to Hox genes could be the primary way that the animal body plan has been altered during evolution.

The paper, "Changes throughout a genetic network mask the contribution of Hox gene evolution," published in Current Biology on June 27, describes experiments that pinpointed evolutionary changes in a Hox gene but found that several other genes had evolved alongside it to generate a difference in pigmentation along the fruit fly body plan.

The experiment identified evolutionary modifications in the Hox gene Abd-B that caused a drastic loss of its expression on the body of the Drosophila santomea (D. santomea) fruit fly. The same gene is necessary for the fruit fly's sister species, Drosophila yakuba (D. yakuba), to express body pigmentation, so changes to that gene were expected to cause a loss of pigmentation across the species.

However, when researchers restored the D. yakuba Abd-B gene to D. santomea, it did not restore or increase the amount of pigmentation shown. The researchers attributed that outcome to four other genes within the D. santomea pigmentation network, three of which had evolved in ways that prevent the network from responding to the Hox gene Abd-B.

"Hox genes are clearly very important regulators of animal development, setting up animal body plans and showing signs of change in all sorts of creatures whose body plans differ. This work shows just how complex the process of evolving those differences can be. It takes all sorts of genes working together to generate these phenotypes," said Mark Rebeiz, an associate professor of evolutionary development who was a lead author on the paper.

Credit: 
University of Pittsburgh

Unusual eating behaviors may be a new diagnostic indicator for autism

HERSHEY, Pa. -- Atypical eating behaviors may be a sign a child should be screened for autism, according to a new study from Penn State College of Medicine.

Research by Susan Mayes, professor of psychiatry, found that atypical eating behaviors were present in 70% of children with autism, making them 15 times more common than in neurotypical children.

Atypical eating behaviors may include severely limited food preferences, hypersensitivity to food textures or temperatures, and pocketing food without swallowing.

According to Mayes, these behaviors are present in many 1-year-olds with autism and could signal to doctors and parents that a child may have autism.

"If a primary care provider hears about these behaviors from parents, they should consider referring the child for an autism screening," Mayes said.

Mayes said that the earlier autism is diagnosed, the sooner the child can begin treatment with a behavior analyst. Previous studies have shown applied behavior analysis to be most effective if implemented during the preschool years. Behavior analysts use a number of interventions, including rewards, to make positive changes in the children's behavior and teach a range of needed skills.

Keith Williams, director of the Feeding Program at Penn State Children's Hospital, uses this therapy to help a variety of individuals with unusual eating behaviors. He said that identifying and correcting these behaviors can help ensure children are eating a proper diet.

"I once treated a child who ate nothing but bacon and drank only iced tea," Williams said. "Unusual diets like these don't sustain children."

Williams also noted that there is a distinct difference between worrisome eating behaviors and the typical picky eating habits of young children. He explained that most children without special needs will slowly add foods to their diets during the course of development, but children with autism spectrum disorders, without intervention, will often remain selective eaters.

"We see children who continue to eat baby food or who won't try different textures," Williams said. "We even see children who fail to transition from bottle feeding."

Mayes said that many children with autism eat a narrow diet consisting primarily of grain products, like pasta and bread, and chicken nuggets. She said that because children with autism have sensory hypersensitivities and dislike change, they may not want to try new foods and will be sensitive to certain textures. They often eat only foods of a particular brand, color or shape.

The research also showed that most children with autism who had atypical eating behaviors had two or more types -- almost a quarter had three or more. Yet none of the children with other developmental disorders who did not have autism had three or more. According to Williams, this is a common clinical phenomenon, one that has prompted him and his colleagues to recommend some children for further evaluation.

"When we evaluate young children with multiple eating problems, we start to wonder if these children might also have the diagnosis of autism," Williams said. "In many cases, they eventually do receive this diagnosis."

The researchers evaluated the eating behaviors described in parent interviews of more than 2,000 children from two studies. They investigated the difference in the frequency of unusual eating behaviors between typical children and those with autism, attention deficit hyperactivity disorder and other disorders.

Williams said the study data show that atypical eating behaviors may help diagnostically distinguish autism from other disorders. Although children in both groups have unusual eating habits, those habits are seven times more common in children with autism than in children with other disorders, according to the study data.

"This study provided further evidence that these unusual feeding behaviors are the rule and not the exception for children with autism," Williams said.

Credit: 
Penn State

Exactly how fast is the universe expanding?

video: The collision of two neutron stars (GW170817) flung out an extraordinary fireball of material and energy that is allowing a Princeton-led team of astrophysicists to calculate the Hubble constant, the speed of the universe's expansion. They used a super-high-resolution radio 'movie' (left) that they compared to a computer model (right). To generate their 'movie,' the science team combined data from enough radio telescopes spread over a large enough region to generate an image with such high resolution that if it were an optical camera, it could see individual hairs on someone's head 6 miles away. The movie emphasizes observations taken 75 days and 230 days after the merger. The middle panel shows the radio afterglow light curve.

Image: 
Video by Ore Gottlieb and Ehud Nakar, Tel Aviv University

Exactly how fast is the universe expanding?

Scientists are still not completely sure, but a Princeton-led team of astrophysicists has used the neutron star merger detected in 2017 to come up with a more precise value for this figure, known as the Hubble constant. Their work appears in the current issue of the journal Nature Astronomy.

"The Hubble constant is one of the most fundamental pieces of information that describes the state of the universe in the past, present and future," said Kenta Hotokezaka, the Lyman Spitzer, Jr. Postdoctoral Fellow in Princeton's Department of Astrophysical Sciences. "So we'd like to know what its value is."

Currently, the two most successful techniques for estimating the Hubble constant are based on observations of either the cosmic microwave background or stars blowing themselves to pieces in the distant universe.

But those figures disagree: Measurements of exploding stars -- Type Ia supernovae -- suggest that the universe is expanding faster than is predicted by Planck observations of the cosmic microwave background.

"So either one of them is incorrect, or the models of the physics which underpin them are wrong," said Hotokezaka. "We'd like to know what is really happening in the universe, so we need a third, independent check."

He and his colleagues -- Princeton's NASA Sagan Postdoctoral Fellow Kento Masuda, Ore Gottlieb and Ehud Nakar from Tel Aviv University in Israel, Samaya Nissanke from the University of Amsterdam, Gregg Hallinan and Kunal Mooley from the California Institute of Technology, and Adam Deller from Swinburne University of Technology in Australia -- found that independent check by using the merger of two neutron stars.

Neutron star mergers are phenomenally energetic events in which two massive stars whip around each other hundreds of times per second before merging in an extraordinary collision that flings out a burst of gravitational waves and an enormous blast of material. In the case of the neutron star merger that was detected on Aug. 17, 2017, the two stars -- each the size of Manhattan and with almost twice the mass of the sun -- were moving at a significant fraction of the speed of light before they collided.

The gravitational wave burst from a neutron star merger makes a distinctive pattern known as a "standard siren." Based on the shape of the gravitational wave signal, astrophysicists can calculate how strong the gravitational waves should have been. They can then compare that to the measured strength of the signal to work out how far away the merger occurred.

But there's a catch -- this only works if they know how the merging stars were oriented with respect to Earth's telescopes. The gravitational wave data can't distinguish between mergers that were nearby and edge-on, distant and face-on, or something in between.

To separate those possibilities, the researchers used a super-high-resolution radio "movie" of the fireball of material that was left behind after the neutron stars merged. To make their movie, they combined data from radio telescopes spread across the world.

"The resolution of the radio images we made was so high that, if it were an optical camera, it could see individual hairs on someone's head 3 miles away," said Deller.

"By comparing the minuscule changes in the location and shape of this distant bullet of radio-emitting gas against several models, including one developed on supercomputers, we were able to determine the orientation of the merging neutron stars," said Nakar.

Using this, they calculated how far away the merging neutron stars were -- and then, by comparing that with how fast their host galaxy is rushing away from ours, they could measure the Hubble constant.
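The final step is simple arithmetic: the Hubble constant is the recession velocity of the host galaxy divided by the measured distance. A back-of-the-envelope sketch with illustrative round numbers (the velocity and distance below are stand-in values, not the quantities measured in the study):

```python
# H0 = v / d, in units of km/s per megaparsec.
# Illustrative round numbers only -- not the study's measured values.
recession_velocity_km_s = 3000.0   # how fast the host galaxy recedes from ours
distance_mpc = 41.0                # distance to the merger, in megaparsecs

H0 = recession_velocity_km_s / distance_mpc
print(f"H0 ~ {H0:.1f} km/s/Mpc")   # ~73.2 km/s/Mpc with these inputs
```

The hard part of the measurement is not this division but nailing down the distance, which is why constraining the merger's orientation mattered so much.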

After the 2017 neutron star merger (GW170817) was registered by nearly every astronomical instrument on the planet, astrophysicists calculated that the Hubble constant value was between 66 and 90 kilometers per second per megaparsec. By using tight constraints on the orientation of the collision, published last year by Mooley and several of the same co-authors, including Hotokezaka, the current group of collaborators were able to pin that estimate down further, to between 65.3 and 75.6 km/s/Mpc.

While that precision is "quite good," said Hotokezaka, it's still not good enough to distinguish between the Planck and Type Ia models. He and his colleagues estimate that to get that level of precision, they would need data from 15 more collisions like GW170817 -- with its helpful abundance of data up and down the entire electromagnetic spectrum -- or 50 to 100 collisions that are detected only with gravitational waves.
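The estimate that more events are needed follows from the standard statistical scaling: averaging N independent measurements shrinks the fractional uncertainty roughly as 1/sqrt(N). A rough sketch (the ~7% starting uncertainty and ~2% target below are illustrative assumptions, not figures quoted by the authors):

```python
import math

# Combining N independent events shrinks the fractional uncertainty
# roughly as 1/sqrt(N). Starting and target uncertainties below are
# illustrative assumptions, not numbers from the paper.
single_event_uncertainty = 0.07   # ~7% from one GW170817-like event
target_uncertainty = 0.02         # rough precision needed to separate the models

n_events = math.ceil((single_event_uncertainty / target_uncertainty) ** 2)
print(f"events needed: ~{n_events}")   # ~13 with these assumptions
```

With these assumed inputs the scaling lands in the same ballpark as the roughly 15 GW170817-like events the team estimates.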

"This is the first time that astronomers have been able to measure the Hubble constant by using a joint analysis of gravitational-wave signals and radio images," said Hotokezaka. "It is remarkable that only a single merger event allows us to measure the Hubble constant with high precision -- and this approach relies neither on the cosmological model (Planck) nor the cosmic-distance ladder (Type Ia)."

"A Hubble constant measurement from superluminal motion of the jet in GW170817" by K. Hotokezaka, E. Nakar, O. Gottlieb, S. Nissanke, K. Masuda, G. Hallinan, K. P. Mooley and A. T. Deller appears in the current issue of the journal Nature Astronomy (DOI: 10.1038/s41550-019-0820-1). The research was supported by Princeton University, the Israel Science Foundation, the Netherlands Organization for Scientific Research, the National Aeronautics and Space Administration, the National Science Foundation (AST-1654815), and the Australian Research Council (FT150100415).

Credit: 
Princeton University

Could vacuum physics be revealed by laser-driven microbubble?

image: Envisioned picture showing all of the main events of microbubble implosion, i.e., laser illumination, hot electron spread, implosion, and proton flash at the end.

Image: 
M. Murakami

Osaka, Japan -- A "vacuum" is generally thought to be nothing but empty space. But in fact, a vacuum is filled with "virtual particle-antiparticle pairs" of electrons and positrons that are continuously created and annihilated in unimaginably short time-scales.

The quest for a better understanding of vacuum physics will lead to the elucidation of fundamental questions in modern physics, integral to unravelling mysteries of the universe such as the Big Bang. However, to forcibly separate the virtual pairs using a laser's electric field and make them appear as real rather than virtual particles, the required laser intensity would be ten million times higher than what today's laser technology can achieve. This field intensity is the so-called "Schwinger limit", named half a century ago after the American Nobel laureate Julian Schwinger.

Scientists at Osaka University discovered a novel mechanism, which they refer to as microbubble implosion (MBI), in 2018. In MBI, super-high-energy hydrogen ions (relativistic protons) are emitted at the moment when micron-sized spherical bubbles in a hydride target, irradiated by ultraintense, ultrashort laser pulses, shrink to atomic size.

In this study, the group led by Masakatsu Murakami confirmed that during MBI, an ultrahigh electrostatic field close to the Schwinger field could be achieved because micron-sized bubbles embedded in a solid hydride target implode to have nanometer-sized diameters upon ionization.

From the 3D simulations carried out at the Osaka University Institute of Laser Engineering, they also found that the density at the bubble's maximum compression reaches several hundred thousand to one million times solid density. At this density, something no larger than a sugar cube would weigh a few hundred kilograms. The energy density at the bubble center was found to be about one million times higher than that of the sun. These astonishing numbers have been thought to be impossible to achieve on Earth. The research results were published in Physics of Plasmas.
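The sugar-cube comparison is straightforward arithmetic: take a cube-sized volume of ordinary solid matter and multiply its mass by the compression factor. A quick sanity check (the cube volume, solid density and compression factor below are generic illustrative values, not figures from the simulations):

```python
# Mass of a sugar-cube-sized volume of matter compressed to
# "several hundred thousand times solid density".
# Generic illustrative values, not numbers from the 3D simulations.
cube_volume_cm3 = 1.0          # a sugar cube is roughly 1 cm^3
solid_density_g_cm3 = 1.0      # ordinary solid hydride, of order 1 g/cm^3
compression_factor = 5e5       # several hundred thousand-fold compression

mass_g = cube_volume_cm3 * solid_density_g_cm3 * compression_factor
mass_kg = mass_g / 1000.0
print(f"mass of the cube: ~{mass_kg:.0f} kg")   # ~500 kg, i.e. a few hundred kg
```

So a compression factor of a few times 10^5 turns a gram-scale cube into a few hundred kilograms, consistent with the claim in the release.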

Credit: 
Osaka University

High-safety, flexible and scalable Zn//MnO2 rechargeable planar micro-batteries

image: Schematic of screen printing fabrication of printed Zn//MnO2 MBs and Optical photographs showing the stepwise printing fabrication of Zn//MnO2 MBs.
(a-d) Schematic of screen printing fabrication of printed Zn//MnO2 MBs: (a) the black PET substrate, (b) the printed graphene current collectors, (c) the printed MnO2 cathode, (d) the printed Zn anode. (e-h) Optical photographs showing the stepwise printing fabrication of Zn//MnO2 MBs: (e) the black PET substrate, (f) the graphene current collectors, (g) the printed MnO2 cathode and (h) the printed Zn anode on the interdigital graphene fingers. (i-k) Zn//MnO2 MBs printed onto the different substrates, including (i) cloth, (j) A4 paper, and (k) glass.

Image: 
©Science China Press

The rapid development of micro-scale electronics has stimulated demand for corresponding micro-scale power sources, especially micro-batteries (MBs). However, the complex manufacturing processes and poor flexibility of traditional stacked batteries have hindered their practical application.

Planar MBs have recently garnered great attention due to their simple miniaturization, facile serial/parallel integration and ability to work without separator membranes. Furthermore, the planar geometry offers an extremely short ion diffusion pathway and allows full integration of printed electronics on a single substrate. In addition, to avoid the safety issues posed by flammable organic electrolytes, aqueous electrolytes, which are intrinsically nonflammable, highly ionically conductive and nontoxic, are promising candidates for large-scale wearable and flexible MB applications. As a consequence, various printing techniques have been used to fabricate planar aqueous MBs. "In particular, screen printing can effectively control the precise pattern design with adjustable rheology of the inks, and is very promising for large-scale application," the authors said.

In a new article published in the Beijing-based National Science Review, Zhong-Shuai Wu at the Dalian Institute of Chemical Physics, Chinese Academy of Sciences, constructed aqueous rechargeable planar Zn//MnO2 batteries by a scalable and cost-effective screen-printing strategy. "The planar Zn//MnO2 micro-batteries, free of separators, were manufactured by directly printing zinc ink as the anode, γ-MnO2 ink as the cathode and high-quality graphene ink as metal-free current collectors, working in an environmentally benign neutral aqueous electrolyte of 2 M ZnSO4 and 0.5 M MnSO4," the authors stated. Zn//MnO2 MBs of diverse shapes were fabricated on different substrates, suggesting the potential for widespread applications.

The planar separator-free Zn//MnO2 MBs, tested in neutral aqueous electrolyte, deliver a high volumetric capacity of 19.3 mAh/cm3 (corresponding to 393 mAh/g) at 7.5 mA/cm3 and a notable volumetric energy density of 17.3 mWh/cm3, outperforming lithium thin-film batteries.
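The quoted capacity and energy density are linked by the usual relation E = Q × V_avg, so dividing one by the other gives the implied average discharge voltage. A rough consistency check (capacity and energy may be quoted at different discharge rates, so this is only an order-of-magnitude sanity check, not a value reported in the paper):

```python
# Rough consistency check: volumetric energy density is approximately
# volumetric capacity times average discharge voltage (E = Q * V_avg).
# Capacity and energy may be quoted at different rates, so treat the
# result as order-of-magnitude only.
capacity_mAh_cm3 = 19.3   # volumetric capacity from the release
energy_mWh_cm3 = 17.3     # volumetric energy density from the release

implied_avg_voltage = energy_mWh_cm3 / capacity_mAh_cm3
print(f"implied average voltage: ~{implied_avg_voltage:.2f} V")   # ~0.90 V
```

The two reported figures are therefore mutually consistent with a sub-1 V average discharge voltage under the stated test conditions.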

These promising results open numerous intriguing opportunities for intelligent, printed and miniaturized electronics. The work may also inspire scientists working in nanotechnology, chemistry, materials science and energy storage, and could have a significant impact both on the future development of planar micro-scale energy-storage devices and on research into graphene-based materials.

Credit: 
Science China Press