Culture

Large-scale replication study challenges key evidence for the pro-active reading brain

When listening to a speaker, we often feel that we know what the speaker will say next. How is this possible? It is assumed that our brain routinely uses clues within a sentence to estimate the probability of upcoming words. Activating information about a word before it appears helps the brain integrate its meaning with that of the sentence more rapidly once the word does appear.

"For over 10 years, language scientists and neuroscientists have been guided by a high impact study published in Nature Neuroscience showing that these predictions by the brain are very detailed and can even include the first sound of an upcoming word," explains Mante Nieuwland, cognitive neuroscientist at the Max Planck Institute for Psycholinguistics (MPI) and the University of Edinburgh. These findings had, however, not yet been explicitly replicated since 2005, when the study came out.

Today, a new paper published in eLife by a scientific team led by Nieuwland of the MPI in the Netherlands questions the replicability of those results. The study is the first large-scale, multi-laboratory replication effort for the field of cognitive neuroscience and shows that the predictive function of the human language system may operate differently than the field has come to believe.

Same question, state-of-the-art approach

"Inspired by recent demonstrations for the need for large subject-samples and more robust analyses in psychology and neuroscience research, we re-examined the research question of the original study. We did so by following the original methods and applying improved and current analysis methods," says Guillaume Rousselet from the University of Glasgow, co-author of the study. Furthermore, the researchers pre-registered their analyses, providing a time-stamped proof that their analysis was not tailored to achieve the reported results.

The team embarked on a massive brain imaging study: Across 9 UK laboratories (University of Birmingham, University of Bristol, University of Edinburgh, University of Glasgow, University of Kent, University College London, University of Oxford, University of Stirling, and University of York), 334 participants - 10 times the original sample size - read sentences that were presented one word at a time, while electrical brain activity was recorded at the scalp. Each sentence contained an expected or unexpected combination of an article and a noun (e.g., "The day was breezy so the boy went outside to fly a kite/an airplane").

Surprising nouns and articles

"We saw that unexpected nouns generated an increased brain response compared to expected nouns. Just like the original study," Nieuwland says. Nevertheless, this reaction, also called an enhanced N400 response, is not the core argument that the participants' brains actually anticipated the nouns. After all, it was generated after the nouns were read, and could mean that nouns like 'kite' are merely easier to process than nouns like 'airplane'.

The key evidence for prediction of a yet unseen noun originally came from the preceding articles. In English, the correct use of the article 'a' or 'an' depends on the first sound of the next word. Even though 'a' and 'an' do not differ in meaning, the 2005 study showed that unexpected articles also elicited an enhanced N400 response compared to expected articles, presumably because 'an' tells readers that the next word cannot be 'kite'. This supported the claim that has stood since 2005 - that readers can make predictions as precise as the first sound of upcoming words.

"Crucially, our findings now show that there is no convincing evidence for this claim. With the original analysis, we did not replicate this pattern for the articles. With our improved analysis, we also did not find an effect that was statistically reliable, although the observed pattern did go in the expected direction," according to Nieuwland.

"Of course, it may be that people do predict the sound of upcoming words, but that they do not reliably use the articles to change their prediction. This could be because an unexpected article does not rule out that the expected noun will eventually appear ('a' can precede 'kite' if they are separated by another word, like in 'an old kite'). Also, we have to consider this study only investigates the English language. Other research has shown very different findings in languages such as Spanish, Dutch and French, for which articles correspond to nouns in grammatical gender regardless of intervening words. "

Less straightforward than assumed

The authors caution that these new findings should not be interpreted as being against prediction more generally. "There is a larger body of behavioural and neuroscience work that supports a role of prediction in language processing, for example of the meaning of an upcoming word, although many of those other results in the existing literature, especially in neuroscience, still need to be replicated." However, these new findings show that the reading brain is perhaps not as pro-active as is often assumed, by demonstrating a potential limit to the detail in which it predicts.

Credit: 
Max Planck Institute for Psycholinguistics

Cohesive neighborhoods, less spanking result in fewer child welfare visits

ANN ARBOR--The child welfare system is more likely to intervene in households in "less neighborly" neighborhoods and in which parents spank their kids, a new study shows.

Researchers at the University of Michigan and Michigan State University conducted analyses of nearly 2,300 families from 20 large U.S. cities who responded to surveys and interviews. Participating families had a child who was born between 1998 and 2000.

They found that living in neighborhoods with strong social cohesion and trust--where neighbors are willing to help each other and generally get along--protects families against getting involved in the child welfare system.

In addition, Child Protective Services is less likely to intervene in households where kids are rarely spanked.

Other factors, such as poverty and mothers feeling depressed, also increase the odds of CPS involvement after controlling for neighborhood risk and spanking.

In the study, mothers reported the neighborhood conditions in which they lived, such as supportive relationships between neighbors, and whether they had spanked their 3-year-old child within the past month. The moms also reported contact with CPS when their child was 3 to 5 years old.

"Our findings suggest that promoting caring, neighborly relationships among residents that support the needs and challenges of families with young children can help ensure children's safety," said study co-author Andrew Grogan-Kaylor, U-M associate professor of social work.

About 57 percent of the 3-year-olds in the sample had been spanked by a parent or parental figure in the past month. CPS investigated 7.4 million children for suspected maltreatment during 2016, according to the U.S. Department of Health and Human Services.

Unlike previous research, which considered spanking and neighborhood conditions only separately as precursors of child maltreatment, the current study examined these factors simultaneously, said study lead author Julie Ma, assistant professor of social work at UM-Flint.

"Both the types of neighborhoods in which parents choose, or are forced, to raise their children and parents' decisions about whether they spank their children influence the chances of CPS involvement," she said. "Programs and policies should address strategies for building supportive resident interactions in the neighborhoods, as well as nonphysical child discipline to help reduce maltreatment."

Credit: 
University of Michigan

Later school start times really do improve sleep time

A new study in SLEEP, published by Oxford University Press, indicates that delaying school start times results in students getting more sleep, and feeling better, even within societies where trading sleep for academic success is common.

The study aimed to investigate the short- and longer-term impact of a 45-minute delay in school start time on the sleep and well-being of adolescents.

Singapore leads the world in the Programme for International Student Assessment rankings, which measures international scholastic performance in 15-year-olds. East Asian students live in a culture where the importance of academic success is deeply ingrained. This drive for academic achievement leads to high attainment in international academic assessments but has contributed to the curtailment of nocturnal sleep on school nights to well below the recommended eight to ten hours of sleep, putting students at risk of cognitive and psychological problems.

In Singapore, school typically starts around 7:30 AM, which is one hour earlier than the 8:30 AM or later start time recommended by the American Academy of Pediatrics, the American Medical Association, and the American Academy of Sleep Medicine. Sleep deprivation among Singaporean adolescents is rampant, and the average time in bed on school nights is six and a half hours.

In July 2016, an all-girls' secondary school in Singapore delayed its start time from 7:30 to 8:15 in the morning by restructuring its schedule in a way that did not delay school end time. Researchers investigated the impact of starting school later on students' sleep and well-being one month and nine months after the institution of the start time delay.

The sample consisted of 375 students in grades 7-10 from an all-girls' secondary school in Singapore that delayed its start time from 7:30 to 8:15 in the morning. Researchers assessed self-reports of sleep timing, sleepiness, and well-being (depressive symptoms and mood) before the school made the schedule change, and evaluated the measures again at approximately one and nine months after the delay. Total sleep time was also measured.

Later school start times have been shown to benefit sleep and well-being in Western cultures, but their usefulness in East Asian countries, where students are driven to trade sleep for academic success, is less clear. Most studies on later school start times have been conducted in Western countries. These studies have consistently found increased sleep duration on school nights with later start times. However, the sustainability of sleep habit improvement is not as well characterized.

Researchers wondered if students would continue to get more sleep if schools delayed their start times; the gains may not be sustained if students gradually delay their bedtime. For example, one study found that the sleep gained two months after a 45-minute delay in start time was no longer observed after another seven months, due to a delay in the sleep period. Delaying bedtimes, partly as a result of mounting academic workload, is a pressing reality in most East Asian households. Compounding this erosion of sleep time in East Asian societies is the resistance to changing the already packed school schedules. For example, recently, a secondary school in Hong Kong agreed to delay its start time, but only by 15 minutes. Nevertheless, a four-minute increase in time-in-bed on weekdays was found, together with gains in mental health, prosocial behavior and better attentiveness in class and peer relationships.

The results of this new study indicate that after one month, bedtimes on school nights were delayed by nine minutes while the times students got up were delayed by about 32 minutes, resulting in an increase in time in bed of 23 minutes.

Participants also reported lower levels of subjective sleepiness and improvement in well-being at both follow-ups. Notably, greater increase in sleep duration on school nights was associated with greater improvement in alertness and well-being.

Critically, with a later school start time, the percentage of participants whose self-reported sleeping time on weekdays was at least 8 hours--the amount generally considered appropriate for adolescents--increased from 6.9% to 16%. Total sleep time increased by about 10 minutes at the nine-month follow-up.

"Starting school later in East Asia is feasible and can have sustained benefits," said the paper's lead researcher, Michael Chee. "Our work extends the empirical evidence collected by colleagues in the West and argues strongly for disruption in practice and attitudes surrounding sleep and wellbeing in societies where these are believed to hinder rather than enhance societal advancement."

Credit: 
Oxford University Press USA

New method lets doctors quickly assess severity of brain injuries

A new way to rapidly assess levels of consciousness in people with head injuries could improve patient care.

The new score - based on the Glasgow Coma Scale - could also help doctors assess the health of the patient's central nervous system in cases of serious trauma or intensive care.

Using it could improve the way doctors around the world care for patients in a coma from brain injury.

The Glasgow Coma Scale (GCS) was created at the University of Glasgow and the city's Southern General Hospital in 1974.

The 13-point scale - covering the patient's ability to open their eyes, speak and move - has revolutionised the care of brain-injured patients worldwide.

The original GCS team joined forces with researchers at the University of Edinburgh to improve the scale by adding a simple score for pupil response.

Using health records from more than 15,000 patients, they showed that the new score, known as the GCS-Pupil (GCS-P), would have improved doctors' ability to predict a patient's condition in the six months following a brain injury.

A major advantage of the GCS-P is its simplicity, and it could easily be adopted in hospitals, allowing doctors to quickly assess prognosis, experts say.
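To illustrate just how simple such a combined score can be, here is a minimal Python sketch. It assumes the published GCS-P definition, in which the number of pupils that do not react to light (0, 1 or 2) is subtracted from the standard GCS total; the function names and the example patient are illustrative only and are not taken from the study data.

def gcs_total(eye, verbal, motor):
    # Standard Glasgow Coma Scale: eye (1-4) + verbal (1-5) + motor (1-6), total 3-15.
    return eye + verbal + motor

def gcs_pupil(gcs, nonreactive_pupils):
    # GCS-P as described in the published score: subtract the number of pupils
    # that fail to react to light (0, 1 or 2), giving a range of 1-15.
    # This sketch assumes that definition; it is not spelled out in this release.
    return gcs - nonreactive_pupils

# Hypothetical patient: eye opening to pain (2), incomprehensible sounds (2),
# withdrawal from pain (4), one unreactive pupil -> GCS 8, GCS-P 7.
print(gcs_pupil(gcs_total(2, 2, 4), nonreactive_pupils=1))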

There are almost 350,000 hospital admissions involving damage to the brain in the UK per year, equating to one admission every 90 seconds.

Dr Paul Brennan, who co-led the study from the University of Edinburgh's Centre for Clinical Brain Sciences, said: "The importance of the Glasgow Coma Scale to medicine cannot be overstated and our simple revision really improves its predictive ability and usefulness.

"Making major decisions about brain injured patients relies on quick assessments and the new method gives us rapid insights into the patient's condition. Our next step is to test the GCS-P more widely on large data sets from Europe and the US."

Professor Sir Graham Teasdale, Emeritus Professor of Neurosurgery at the University of Glasgow, who first developed the GCS and co-led the study, said: "This has been a very successful collaboration. It promises to add a new index to the language of clinical practice throughout the world. The GCS-P will be a platform for bringing together clinical information in a way that can be easily communicated and understood. "

Credit: 
University of Edinburgh

Study finds humans and others exposed to prenatal stress have high stress levels after birth

image: This is the overall weighted effect size and 95 percent highest posterior density intervals, and the independent influence of each moderator variable in explaining variation in effect size. Width of lines and size of points are proportional to the number of effect sizes in each category.

Image: 
Adrian V. Jaeggi

Vertebrate species, including humans, exposed to stress prenatally tend to have higher stress hormones after birth, according to a new Dartmouth-led study published in Scientific Reports. While previous research has reported examples of maternal stress experience predicting offspring stress hormones in different species, this study is the first to empirically demonstrate the impact of prenatal stress on offspring stress hormone levels using data from all known studies across vertebrates.

Through a meta-analysis of 114 results from a total of 39 observational and experimental studies across 14 vertebrate species, including birds, snakes, sheep and humans, the study examines the impact of prenatal exposure to maternal stress on offspring. The researchers analyzed the role of the hypothalamic-pituitary-adrenal (HPA) axis, the stress physiological system that is shared across all vertebrates and that ultimately results in the production of stress hormones known as "glucocorticoids." The HPA axis is the hormonal system responsible for mobilizing an animal's stress response. Offspring exposed prenatally to maternal stress were found to have higher stress hormone (glucocorticoid) levels after birth. This could reflect a biological adaptation with an evolutionary history, as higher stress hormone levels could increase an animal's chances of survival in a stressful environment.
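The study's own analysis is more sophisticated (the figure caption above refers to posterior density intervals, implying a Bayesian model), but the core idea of an "overall weighted effect size" can be sketched very simply: each study-level effect size is weighted by its precision (the inverse of its variance) before pooling. The short Python sketch below uses made-up numbers purely for illustration and is not the authors' analysis.

import math

def weighted_mean_effect(effect_sizes, variances):
    # Inverse-variance (fixed-effect) pooling: more precise studies get more weight.
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, effect_sizes)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled estimate
    return pooled, se

# Hypothetical study effect sizes (e.g., standardized differences) and their variances.
pooled, se = weighted_mean_effect([0.30, 0.45, 0.10, 0.52], [0.02, 0.05, 0.04, 0.08])
print(f"pooled effect = {pooled:.2f}, interval roughly +/- {1.96 * se:.2f}")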

In the present study, the researchers tested the strength of the effect of prenatal stress on offspring stress hormone levels across a range of characteristics. Remarkably, the effects of prenatal stress on offspring stress hormones were consistent across species, regardless of evolutionary relationships or factors, such as brain or body size. There were also no differences when considering offspring sex, age of the offspring at the time of assessment, or the timing of the stressor exposure prenatally or its severity.

Only two factors influenced the size of the effect. Experimental studies had a stronger effect than observational studies. In addition, studies that measured glucocorticoid recovery showed a greater association with prenatal stress than was observed at baseline or during peak glucocorticoid response.

"Animals, including humans, modify their stress hormones in response to their environment. Your stress response is set like a thermostat-- your body can amp up or down stress hormones in response to anticipated environmental conditions," explains lead author Zaneta Thayer, an assistant professor of anthropology at Dartmouth.

An animal's stress response tends to be activated by external factors, such as the sight of a predator or the availability of food. Higher stress hormone levels among offspring may help extend survival but come at a cost and may affect other physiological systems, such as reproduction. In humans, the mere anticipation of stress or just thinking about prior experiences of discrimination or trauma can activate a stress response. Overactive stress hormones can lead to chronic health problems in humans, including anxiety, depression and cardiovascular disease.

One of the studies included in the meta-analysis looked at how maternal stress hormones in pregnant snowshoe hares changed in relation to the abundance of their natural predators, lynxes, over a 10-year cycle. The research team found that in years when there were more lynxes, snowshoe hare offspring had higher stress hormone levels and more anti-predator behaviors.

"Our stress response is meant to be adaptive to acute stress, such as being chased by predators. However, humans' stress response is often triggered by social evaluative threats and is not serving the adaptive purpose that it was designed for," added Thayer. "This research confirms what other scientists have long speculated that there are trends across species when it comes to linking prenatal stress and offspring hormonal stress responses."

Prior work co-authored by Thayer has explored early origins of humans' health disparities and the impacts of maternal stress during pregnancy on offspring's postnatal stress hormone levels.

Credit: 
Dartmouth College

Vampire bats' bloody teamwork

Many animals consume blood as part of their diet, but blood is actually a pretty poor source of energy. Among mammals, only bats (order Chiroptera) include species that feed exclusively on blood.

So how do vampire bats manage to survive on such low-grade nourishment? A recently published article in Nature Ecology & Evolution provides part of the answer. The bats had to evolve in tandem with their microorganisms.

The challenges were clear. Blood consists of 78 per cent liquid. The remainder is 93 per cent proteins and only one per cent carbohydrates. Blood provides very little in the way of vitamins. On top of all that, a blood-based diet exposes these animals to blood-borne pathogens.

To find the answer, researchers had to look at the vampire bat's genome.

So now the genes of the common vampire bat (Desmodus rotundus) have been thoroughly investigated. But not only its genes.

Professor Tom Gilbert is the senior author of the Nature article and has collaborated with PhD student and first author Lisandra Zepeda Mendoza on this research.

Gilbert works at the Centre for GeoGenetics at the University of Copenhagen, and also holds a part-time position as an adjunct professor at the Norwegian University of Science and Technology's (NTNU) University Museum.

"Coping with this kind of diet requires one species to co-evolve with other species," says Gilbert.

But exactly which other species vampire bats have co-evolved with may not be immediately apparent.

Some vampire bat characteristics are easy to recognize. Vampire bats have developed specialized adaptations that enable them to access blood and then make use of it.

All three species of vampire bats are native to the Americas. Anyone who has watched animal programs or scary movies is probably familiar with some of their adaptive behaviors.

Everyone knows about the bats' razor-sharp teeth, and especially their striking incisors. The teeth are practical for penetrating the skin of their victims. The bats also have specialized cells, called thermoreceptors, that can detect heat and are useful for finding bare skin on a sleeping animal at night.

Substances in the bat's saliva prevent the blood in a wound from coagulating. The bat's inner adaptations are at least as interesting as the more obvious outer ones. Their kidneys are specially adapted to cope with high protein content. Their immune system helps deal with any pathogens.

However, none of these known adaptations explains how vampire bats have evolved to rely exclusively on blood for their nutrition. This is what the researchers behind the recent Nature article set out to investigate.

This development seems to require one species to develop in tandem with other species. The way this happens may alter the way we perceive evolution: it turns out that some of the other species we co-evolve with may be found within us.

According to the article, a diet this specialized requires a highly specific adaptation of the genome of the species itself. But it also requires a uniquely adapted microbiome.

A microbiome consists of the entire genetic material of all the microorganisms that live in our bodies, whether we're talking about viruses, bacteria or fungi.

You and I and everyone you know are full of other organisms, maybe around 100 trillion of them. The actual number is controversial and subject to debate - but a lot of organisms at any rate.

Some folks may feel disgust at this thought, but there's little to loathe about them. We are completely dependent on other organisms to survive, and the vast majority of them are useful, or at least don't pull any bad pranks that you would notice.

We have even co-evolved with several of these organisms and share evolutionary history with them. At least that's the way it works with vampire bats.

"Our results show that vampire bats became blood sippers after their own genome and microbiome co-evolved closely," the Nature article says.

The genes of the vampire bats thus evolved along with all the microbes in their bodies.

In other words, in order to understand vampire bats you have to look at the bat's own genes and all the genes in the microbiome as a whole. This is what researchers call the "hologenome" - the genes of the host plus all its symbiotic microbial guests.

The common vampire bat has a unique hologenome. The vampire bat's microbiome helps to compensate for the lack of vitamins and various fatty substances in what would otherwise be an imbalanced diet. Its microbes also help the body get rid of waste and maintain the cells' fluid balance through osmoregulation.

The article's researchers emphasize the value they found in studying both the host and its interaction with the microbiome as they tried to figure out the adaptations that underlie the vampire bat's radical diet.

The bat's ability to survive on a diet that would be inadequate for other mammals is in fact only made possible through the help of the microbiome.

"But the main finding probably applies to all animals in regards to their diet, whether we're talking about cows and grass, vultures and carrion or koalas and eucalyptus. We have to look at both the animal itself and the collective microbes to understand what's happening. Now we've arrived at a point where this is possible both technically and economically," says Professor Gilbert.

Credit: 
Norwegian University of Science and Technology

Tiny injectable sensor could provide unobtrusive, long-term alcohol monitoring

image: Alcohol monitoring chip is small enough to be implanted just under the surface of the skin.

Image: 
David Baillot/UC San Diego Jacobs School of Engineering

Engineers at the University of California San Diego have developed a miniature, ultra-low power injectable biosensor that could be used for continuous, long-term alcohol monitoring. The chip is small enough to be implanted in the body just beneath the surface of the skin and is powered wirelessly by a wearable device, such as a smartwatch or patch.

"The ultimate goal of this work is to develop a routine, unobtrusive alcohol and drug monitoring device for patients in substance abuse treatment programs," said Drew Hall, an electrical engineering professor at the UC San Diego Jacobs School of Engineering who led the project. Hall is also affiliated with the Center for Wireless Communications and the Center for Wearable Sensors, both at UC San Diego. Hall's team presented this work at the 2018 IEEE Custom Integrated Circuits Conference (CICC) on Apr. 10 in San Diego.

One of the challenges for patients in treatment programs is the lack of convenient tools for routine monitoring. Breathalyzers, currently the most common way to estimate blood alcohol levels, are clunky devices that require patient initiation and are not that accurate, Hall noted. A blood test is the most accurate method, but it needs to be performed by a trained technician. Tattoo-based alcohol sensors that can be worn on the skin are a promising new alternative, but they can be easily removed and are only single-use.

"A tiny injectable sensor--that can be administered in a clinic without surgery--could make it easier for patients to follow a prescribed course of monitoring for extended periods of time," Hall said.

The biosensor chip measures roughly one cubic millimeter in size and can be injected under the skin into interstitial fluid--the fluid that surrounds the body's cells. It contains a sensor coated with alcohol oxidase, an enzyme that selectively interacts with alcohol to generate a byproduct that can be detected electrochemically. The electrical signals are transmitted wirelessly to a nearby wearable device such as a smartwatch, which also wirelessly powers the chip. Two additional sensors on the chip measure background signals and pH levels; these are canceled out to make the alcohol reading more accurate.
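The release does not give the exact correction used on the chip, but canceling background and pH channels against the alcohol channel is a standard differential-measurement idea. The Python sketch below is a hypothetical illustration of that idea only; the variable names, the linear pH term and the numbers are assumptions, not details from the paper.

def corrected_alcohol_signal(alcohol_raw, background, ph_reading,
                             ph_reference=7.4, ph_sensitivity=0.0):
    # Subtract the background channel (common-mode interference) and, optionally,
    # a simple linear pH-drift term. Both corrections are illustrative and are
    # not the chip's actual calibration.
    differential = alcohol_raw - background
    ph_drift = ph_sensitivity * (ph_reading - ph_reference)
    return differential - ph_drift

# Hypothetical readings from the three on-chip sensors (arbitrary units).
print(corrected_alcohol_signal(alcohol_raw=1.85, background=0.40,
                               ph_reading=7.3, ph_sensitivity=0.05))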

The researchers designed the chip to consume as little power as possible--970 nanowatts total, which is roughly one million times less power than a smartphone consumes when making a phone call. "We don't want the chip to have a significant impact on the battery life of the wearable device. And since we're implanting this, we don't want a lot of heat being locally generated inside the body or a battery that is potentially toxic," Hall said.

One of the ways the chip operates on such ultra-low power is by transmitting data via a technique called backscattering. This occurs when a nearby device like a smartwatch sends radio frequency signals to the chip, and the chip sends data by modifying and reflecting those signals back to the smartwatch. The researchers also designed ultra-low power sensor readout circuits for the chip and minimized its measurement time to just three seconds, resulting in less power consumption.

The researchers tested the chip in vitro with a setup that mimicked an implanted environment. This involved mixtures of ethanol in diluted human serum underneath layers of pig skin.

For future studies, the researchers are planning to test the chip in live animals. Hall's group is working with CARI Therapeutics, a startup based in the Qualcomm Institute Innovation Space at UC San Diego, and Dr. Carla Marienfeld, an addiction psychiatrist at UC San Diego who specializes in treating individuals with substance abuse disorders, to optimize the chip for next generation rehab monitoring. Hall's group is developing versions of this chip that can monitor other molecules and drugs in the body.

"This is a proof-of-concept platform technology. We've shown that this chip can work for alcohol, but we envision creating others that can detect different substances of abuse and injecting a customized cocktail of them into a patient to provide long-term, personalized medical monitoring," Hall said.

Credit: 
University of California - San Diego

Scientists learn how to avoid a roadblock when reprogramming cells

image: Kazutoshi Takahashi (left) and Tim Rand (right), scientists in Shinya Yamanaka's laboratory at Gladstone, helped answer lingering questions about cellular reprogramming.

Image: 
Gladstone Institutes

SAN FRANCISCO, CA--April 10, 2018--Over a decade ago, Shinya Yamanaka and Kazutoshi Takahashi made a discovery that would revolutionize biomedical research and trigger the field of regenerative medicine. They learned how to reprogram human adult cells into cells that behave like embryonic stem cells. Scientists were shocked that something so complex could be done so simply, and they had thousands of questions.

The reprogrammed cells are known as induced pluripotent stem cells (iPSCs). Researchers can create iPSCs from a patient's blood or skin cells, and use these patient-specific cells to study diseases or even create new tissues that could be transplanted back into the patient as therapy.

Initially, Nobel Laureate and Gladstone Senior Investigator Yamanaka, MD, PhD, and Staff Research Investigator Takahashi, PhD, identified four genes--abbreviated as O, S, K, and M--that cause cells to transform into iPSCs. The genes O, S, and K were known to help the cells become pluripotent, which allows them to produce any other cell type in the body.

The role of gene M (short for MYC), however, was unclear. They knew that by adding MYC, they could reprogram cells 10 percent more efficiently. But they didn't know why.

Twelve years later, Yamanaka and Takahashi finally defined the role of MYC in this important reprogramming process, answering several lingering questions. Their findings are published today in the scientific journal Cell Reports.

They discovered that MYC helps cells get around a significant roadblock in the process. They also found that, in some instances, MYC isn't actually needed for adult cells to successfully transform into iPSCs.

The Power of Three Discoveries

To reprogram cells, scientists typically add four genes (O, S, K, and MYC) to a dish containing adult cells. This allows the cells to start multiplying, which is a distinctive feature of stem cells. But after three days, the cells suddenly encounter a roadblock and stop multiplying, or proliferating. Then, on day seven, the cells start multiplying again and go on to become iPSCs.

If the researchers don't add MYC to the dish, the cells go through the same process, but they never overcome the obstacle, so they cannot successfully convert into iPSCs.

"We realized that MYC seems to help cells get around this roadblock, and that this needs to happen for adult cells to turn into iPSCs, but we still didn't quite understand how MYC did that," explained Takahashi. "Interestingly, we were able to figure it out thanks to three discoveries that happened independently in the lab, while people were working on different things."

The first discovery helped them find an early indicator of a cell's potential to finish reprogramming. It also allowed them to easily identify when the roadblock would occur, providing a valuable time reference for the subsequent findings.

The second discovery stemmed from a separate project on a protein called LIN41. The scientists found that if they replaced MYC with LIN41 in the cocktail of genes involved in reprogramming--meaning if they used O, S, K and LIN41--they could convert adult cells into iPSCs with the same efficiency.

"This was strange because it meant that, contrary to what we believed, MYC isn't necessary for cells to reprogram efficiently," said Tim Rand, MD, PhD, staff scientist at Gladstone and a first author of the study. "It turns out that adding LIN41 altogether avoids the onset of the roadblock that prevents cells from converting into iPSCs."

The team found that when they use the combination of O, S, K, and LIN41, the adult cells don't stop proliferating after the third day. Instead, they continue to multiply as if nothing happened and successfully complete the reprogramming process. This is because LIN41 blocks another protein, called p21, which causes the roadblock.

The third discovery proved to be even more astonishing. It showed that, in a particular cell line, neither MYC nor LIN41 is needed to enhance reprogramming.

The scientists went through the same process using tumor-derived cells that continuously multiply. Then, they removed LIN41, and nothing happened. Puzzled, they tried to remove MYC and, once again, nothing changed.

"That result was very shocking to me," said Rand. "Given everything we thought we knew about MYC and LIN41 at the time, we couldn't comprehend how these genes were so beneficial in somatic cell reprogramming, but absolutely useless in tumor reprogramming. Eventually, when we realized how it fit in, it was such useful information. It made us realize that certain cell types can fortuitously accomplish the role of MYC and LIN41 during reprogramming--to disable the p21 response. If I could relive that day over again, I would make sure it was a big celebration."

Rand and the rest of the team realized that without p21, there is no roadblock, so LIN41 is not needed to avoid it. They also showed that MYC is mainly useful because it activates LIN41. So, without the p21 roadblock, MYC isn't needed either.

Bringing Clarity to a Complex Process

Through these multiple discoveries, the Gladstone scientists noticed that the reprogramming process involves many genes and proteins important for cancer biology. In fact, they believe the roadblock trying to prevent cells from multiplying is the same one that tries to prevent cancer from spreading.

"When cancer biologists add certain factors to a cell that should drive it toward cancer, the cell panics and, to protect itself, it stops multiplying," said Takahashi. "We think the same thing is happening here, because cells are reacting to reprogramming as if it were cancer. It's not that they're trying to block the cells from transforming into iPSCs, but they've simply never been exposed to this process before and don't know how to react."

The new study explains many important activities involved in cellular reprogramming, and debunks certain leading theories about the role of MYC in this process.

"For a long time now, the entire field was collecting data on MYC, LIN41, and other genes and proteins without knowing what most of it meant," said Yamanaka, who is also director of the Center for iPS Cell Research and Application (CiRA) at Kyoto University, and professor at UC San Francisco. "Our study finally allows us to clearly understand all the data and address questions about the roles and importance of many of these elements."

With a clearer picture of the reprogramming process in hand, the field of regenerative medicine can now build upon these findings to answer the next set of burning questions.

Credit: 
Gladstone Institutes

Solo medical practices outperform groups in treatment of cardiac disease

In a recently published article in the Annals of Family Medicine, Donna Shelley, MD, MPH, et al, aimed to describe small, independent primary care practices’ performance in meeting the Million Hearts ABCS measures (aspirin use, blood pressure control, cholesterol management, and smoking screening and counseling), as well as on a composite measure that captured the extent to which multiple clinical targets are achieved for patients with a history of atherosclerotic cardiovascular disease (ASCVD). They also explored relationships between practice characteristics and ABCS measures.

The article, entitled “Quality of Cardiovascular Disease Care in Small Urban Practices,” concludes that achieving targets for the ABCS measures varied considerably across practices; however, small practices were meeting or exceeding Million Hearts goals (i.e., 70 percent or greater). Practices were less likely to consistently meet clinical targets that apply to patients with a history of ASCVD risk factors. Greater emphasis is needed on providing support for small practices to address the complexity of managing patients with multiple risk factors for primary and secondary prevention of ASCVD.

“Quality of Cardiovascular Disease Care in Small Urban Practices,” by Donna Shelley, MD, MPH, et al, New York, New York
http://www.annfammed.org/content/16/Suppl_1/S21

Journal

The Annals of Family Medicine

Credit: 
American Academy of Family Physicians

Leadership and adaptive reserve are not associated with blood pressure control

In a recently published study in the Annals of Family Medicine, Kamal Henderson, MD, et al, assessed whether a practice’s adaptive reserve and high leadership capability in quality improvement are associated with population blood pressure control. The article, entitled “Organizational Leadership and Adaptive Reserve in Blood Pressure Control: The Heart Health NOW Study,” reveals that adaptive reserve (i.e., the ability of a practice to weather the process of change) and leadership capability in quality improvement implementation were not statistically associated with achieving top-quartile practice-level hypertension control at baseline in the Heart Health NOW project. The findings, however, may be limited by the lack of patient-related factors and a small sample size, which preclude strong conclusions.

Credit: 
American Academy of Family Physicians

Major disruptions are frequent in primary care

In primary care practices, sustainability of performance improvements and ability to deliver continuity of care to patients can be adversely affected by major disruptive events, such as relocations and changes in ownership, clinicians, and key staff. This is according to a recently published study in the Annals of Family Medicine entitled “The Alarming Rate of Major Disruptive Events in Primary Care Practices in Oklahoma,” in which James Mold, MD, MPH, et al, documented the rates of major disruptive events in a cohort of primary care practices in Oklahoma.

During a 2-year period, major disruptive events occurred at an alarming rate, adversely affecting quality improvement efforts. Most reported events involved losses of clinicians and staff. More research is needed to identify and address the root causes of these events.

“The Alarming Rate of Major Disruptive Events in Primary Care Practices in Oklahoma,” by James W. Mold, MD, MPH, et al, Oklahoma City, Oklahoma
http://www.annfammed.org/content/16/Suppl_1/S52

Journal

The Annals of Family Medicine

Credit: 
American Academy of Family Physicians

Overlapping mechanisms in HIV cognitive disorders and Alzheimer's disease

image: Aβ oligomers are elevated in the brains of HIV(+) cases. Paraffin-embedded tissue sections from hippocampus of HIV(-) and HIV(+) individuals were prepared for immunofluorescent analysis and visualized by laser confocal microscopy. Representative images are shown from hippocampal sections triple-labeled for Aβ oligomers (red), MAP2 (green), and nuclei (blue). Red and green colocalization appears yellow.

Image: 
Stern et al., JNeurosci (2018)

A protein involved in Alzheimer's disease (AD) may be a promising target for treating neurological disorders in human immunodeficiency virus (HIV) patients, suggests a study of rat neurons and postmortem human brain tissue published in JNeurosci. The research shows that the two conditions may damage neurons in similar ways.

Although HIV-associated neurocognitive disorders (HAND) and AD have symptoms in common, whether they also share underlying mechanisms of disease progression is controversial, because HAND patients do not exhibit the amyloid plaques that are characteristic of AD. To address this question, Kelly Jordan-Sciutto and colleagues investigated the role of a well-known AD protein -- β-site amyloid precursor protein cleaving enzyme 1 (BACE1) -- in HAND. The researchers found elevated levels of BACE1 and Aβ oligomers -- the compound thought to be responsible for neuronal damage in AD -- in postmortem brain tissue of HIV-positive humans. Treating rat neurons with HIV-infected white blood cells from healthy human donors revealed similar mechanisms of neurotoxicity.

Credit: 
Society for Neuroscience

Stop prioritizing the car to tackle childhood obesity, governments/planners urged

The dominance of the "windscreen perspective," whereby governments and planners quite literally view the world from the driving seat, has allowed car travel to become the "default choice," argue the authors.

Consequently, investment in road building far exceeds that for active travel--public transport, footpaths, and cycle lanes-- "resulting in an environment that often feels too risky for walking or cycling," they suggest.

The average length of a school journey has nearly doubled since the 1980s to just under 4 miles in 2013. But the age at which parents will allow their children to go to school by themselves has been steadily creeping up amid fears about road safety.

So they drive their children to school. But what is often not recognised is just how much air pollution children travelling by car are exposed to inside the vehicle under urban driving conditions, the authors point out.

Encouraging independent travel not only helps shed the pounds, but has knock-on social and mental health benefits, and it breaks the cycle of normalising car travel for future generations, they say.

They admit there is no single solution, but safe routes to school are needed. The UK could adopt the school travel initiatives pioneered by Germany, The Netherlands, and Denmark, they suggest.

And it could plough more cash into the Sustainable Travel Towns programme, already implemented in some parts of the UK.

This programme of town-wide measures, which aims to curb car use, has helped boost economic growth, cut carbon emissions and promote quality of life in those areas where it has been adopted, the authors point out.

"For a fraction of the road building programme cost, we could see not just safe routes to schools, but, even more importantly, safe routes wholesale across urban areas, they argue.

In an accompanying letter, sent to all four UK transport ministers--Chris Grayling in England; Humza Yousaf (Scotland); Ken Skates (Wales); and Karen Bradley (Northern Ireland)--the authors point to significant savings for the NHS, reductions in pollution levels, and the ingraining of sustainable travel behaviours among future generations if active travel were to be prioritised.

"The rhetoric of improving the environment in favour of children's active travel has been visible for at least two decades, but tangible changes have largely been absent from transport planning," they write.

"We suggest the time is right to redress the imbalance and give back to today's children many of the freedoms that older adults recall and benefited from in terms of the levels of independent mobility," they conclude.

Credit: 
BMJ Group

Blood flow is a major influence on tumor cell metastasis

video: This video shows blood flow tuning of tumor metastasis.

Image: 
Jacky Goetz

Scientists have long theorized that blood flow plays an integral role in cancer metastasis. Now, new research testing this long-held hypothesis in zebrafish and humans confirms that circulatory blood flow affects where circulating tumor cells ultimately arrest in the vasculature and exit into the body, where they can form a metastasis.

In a paper published April 9 in Developmental Cell, researchers from the French National Institute of Health and Medical Research (INSERM) found that in the zebrafish embryo model, labeled circulating tumor cells (CTCs) could be followed throughout the vasculature. The locations where the tumor cells arrested were closely correlated with blood flow velocities below 400-600 μm/s. The larger aim of the study was to visualize the impact of blood flow on important steps in metastasis--arrest of the CTCs, adhesion to the vasculature, and extravasation of the CTCs from the blood vessel.

"A long-standing idea in the field is that arrest is triggered when circulating tumor cells end up in capillaries with a very small diameter simply because of size constraints," says author Jacky G. Goetz, PhD, whose laboratory conducted the study. "This research shows that this position is not only driven by physical constraint but that blood flow has a strong impact on allowing the tumor cells to establish adhesion with the vessel wall. I think this is an important addition to understanding how and where tumor cells would eventually form metastases."

Researchers chose the zebrafish embryo model since its vasculature is highly stereotyped. "This made it much easier to document the position of all the tumor cells after they were injected," explains Goetz. The team compiled all of the images together and created heat maps of the position of the tumor cells in the vasculature.

The researchers also found that blood flow is essential for the process of extravasation, when tumor cells leave the circulatory blood vessel and cross the endothelial barrier at a new site to establish a secondary tumor. "When we did timelapse imaging in the zebrafish embryo, we found that endothelial cells appear to curl around the tumor cells that are arrested in the blood vessel," says Goetz. "Blood flow at this step is essential. Without flow, endothelial remodeling does not occur. You need a certain amount of flow to keep the endothelium active so that it can remodel around the tumor cell."

They further confirmed this observation in brain metastases in mice using intravital correlative microscopy, an imaging technique developed by the Goetz laboratory in collaboration with Y. Schwab (EMBL, Heidelberg), which combines imaging of living multicellular model systems with electron microscopy to provide details of dynamic or transient events in vivo.

The researchers next applied these findings to study brain metastases in 100 human patients with heterogeneous primary tumor locations. As in the zebrafish model, they mapped the position of the metastases and generated heat maps. "We were able to merge the brain metastases map to a perfusion map of a control patient and found that it nicely reproduced exactly what we did in the zebrafish, showing that metastases preferably develop in areas with low perfusion," says Goetz.

The researchers conclude that all of these findings show that blood flow at metastatic sites regulates where and how metastatic outgrowth develops. Looking ahead, the researchers plan on studying methods to inhibit the endothelial remodeling ability of the blood vessel to potentially impair extravasation and inhibit metastasis.

Credit: 
Cell Press

Binge-eating mice reveal obesity clues

image: Chocolate bar similar to the ones used in this study.

Image: 
CC0

Obesity is a growing issue in many countries, accelerated by easy access to calorie-dense foods that are pleasurable to eat (known as an 'obesogenic environment'). But while it's clear that eating too much leads to weight gain, little is known about the underlying behaviours that lead to overeating.

To mimic this obesogenic environment, the teams led by Mara Dierssen at CRG and Rafael Maldonado at UPF offered mice the option of a high-fat 'cafeteria' diet or a mixture of chopped-up commercial chocolate bars alongside their regular lab chow, before carrying out a detailed analysis of the animals' activity and feeding behaviour. Their results have been published in two back-to-back articles in the journal Addiction Biology.

Working together with Cedric Notredame (CRG) and Elena Martín-García (UPF), the scientists found that as well as becoming obese, the mice started very early to show the signs of addiction-like behaviour and binge-eating in response to these enticing foods.

For example, when offered chocolate for just one hour per day, the animals will compulsively 'binge', consuming as much chocolate in one hour as they would over a whole day if it was continually available. They also showed inflexible behaviours, similar to those seen in addiction, choosing to wait for chocolate while ignoring freely available standard chow. Yet, at the same time, the chocolate did not seem to satiate hunger as well as regular food.

The team found that animals on the high-fat or chocolate diet also changed their daily routines. They were more likely to eat during the daytime - mice are usually nocturnal and feed at night - and they ate shorter, more frequent 'snacks' rather than larger, longer-spaced meals.

A major problem in treating obesity is the high rate of relapse to abnormal food-taking habits after maintaining an energy balanced diet. The scientists evaluated this relapse and found that extended access to hypercaloric diets impairs the control of food seeking behaviour and has deleterious effects on learning, motivation and behavioural flexibility.

"Our results revealed that long-term exposure to hypercaloric diets impair the ability to control eating behaviour leading to negative effects on the cognitive processes responsible for a rational control of food intake" says Maldonado, head of the Neuropharmacology Laboratory at UPF.

"Obesity is not just a metabolic disease - it is a behavioural issue. People who are overweight or obese are usually told to eat less and move more, but this is too simplistic." explains Mara Dierssen, group leader of the Cellular and Systems Neurobiology laboratory at CRG. "We need to look at the whole process. By understanding the behaviours that lead to obesity and spotting the tell-tale signs early, we could find therapies or treatments that stop people from becoming overweight in the first place."

The scientists are now expanding their research to larger numbers of animals and they are also planning a study to look at addiction-like behaviours in obese people to see how well their results translate to humans.

"It is very hard to lose weight successfully, and many people end up trapped in a cycle of yo-yo dieting," Dierssen explains. "We need to focus on preventing obesity, and this study shows us that understanding and modifying behaviour could be the key", as Maldonado states "these studies reveal the major behavioural and cognitive changes promoted by hypercaloric food intake, which could be crucial for the repeated weight gain and the difficulties to an appropriate diet control".

Credit: 
Center for Genomic Regulation