Culture

Employee contract structures in startups can be determining factors of success

CATONSVILLE, MD, September 9, 2019 - Conventional wisdom in the startup community holds that with the right incentives, a venture can meet and exceed expectations, and a major component of those incentives is how contracts for founders and early employees are structured.

New research has found that when it comes to those contracts, it may be less about incentive, and more about identifying the right people to incentivize.

New research in the upcoming INFORMS journal Management Science suggests that identifying personality types is as important as, if not more important than, contract elements. The study, "Equity Contracts and Incentive Design in Startup Teams," was conducted by Evgeny Kagan of Johns Hopkins University and Stephen Leider and William Lovejoy, both of the University of Michigan, to examine the effects of different contract structures on effort and value in startups.

The most compelling finding of the research is that incentive clauses in contracts tend to work only when applied to individuals who would be high-performing team members regardless. Team members who may not be high-performers do not tend to perform any better when given incentives in an employment contract.

According to Kagan, this means that it could be best to focus on including certain performance-based clauses in contracts for people who seem more likely to respond in kind. At the same time, others who may not appear to be high performers at the outset may not warrant performance-based contracts.

"Equally-split contracts can encourage free-riding behavior. They are embraced by the least desirable collaborator types who protect themselves by not taking risks but still sharing equally in the proceeds," said Kagan, assistant professor at Johns Hopkins Carey Business School.

One suggestion the authors offer, given that it is personality types rather than incentives that matter most, is that founders may want to delay contracting. By doing so, they can learn about the personalities of the other team members and decide whether a strong incentive contract or a simple, equal-split contract is appropriate.

The findings of this research encourage investors to avoid startups with equal ownership splits between founders, especially when those contracts have been chosen early in the collaboration.

Credit: 
Institute for Operations Research and the Management Sciences

Brain cells that suppress drug cravings may be the secret to better addiction medicines

For the nearly 20 million U.S. adults who are addicted to drugs or alcohol, no effective medical treatment exists--despite plentiful scientific knowledge surrounding the factors that trigger relapse.

It's a quandary that prompted a research quest for Nobuyoshi Suto, PhD, of Scripps Research's Department of Neuroscience.

Rather than continue to dig for clues on what drives relapse among those who struggle with compulsive drug use, Suto and his team decided to take a different approach: They explored how the brain responds to environmental cues that suppress--not promote--drug cravings, specifically cravings for alcohol and cocaine, two of the most commonly abused drug classes.

By shedding new light on these poorly understood brain mechanisms, their findings may contribute to better medicines to treat addiction, Suto says. The research, supported by grants from NIH's National Institute on Drug Abuse and the National Institute on Alcohol Abuse and Alcoholism, appears in Nature Communications.

"Medications designed to counter brain processes that lead to relapse have seen limited success in patients, as have non-drug interventions such as cue-exposure therapy that seeks to help individuals deal with addiction triggers," Suto says. "We believed an alternate strategy would be beneficial, so we sought to explore what happens in the brain in the absence of triggers, when cravings are not driving behavior."

The study examined how nerve cells behaved in the brain's infralimbic cortex. This brain region is believed to be responsible for impulse control.

For their experiments, the scientists worked with male rats that were conditioned to be compulsive users of alcohol or cocaine. Suto and his team wanted to find out what happens in the brain when the rats received environmental cues (a citrus scent, in the case of this study) that drugs were not available. Those signals, known as "omission cues," were successful at suppressing all of the main factors that promote drug relapse.

The team then dug deeper into the underlying "anti-relapse" brain mechanisms, using a laboratory technique that would remove any ambiguity about what role the neurons play in shaping behavior.

"Our results conclusively establish that certain neurons that respond to omission cues act together as an ensemble to suppress drug relapse," Suto says.

Additional research will build on these findings.

"A medical breakthrough is needed in addiction treatment," Suto adds. "Our hope is that further studies of such neural ensembles--as well as the brain chemicals, genes and proteins unique to these ensembles--may improve addiction medicine by identifying new druggable targets for relapse prevention."

Credit: 
Scripps Research Institute

Overcoming resistance in pancreatic cancer

image: Hematoxylin and Eosin (H&E) and Masson's Trichrome staining of tumors in organoid mouse model of PDA.

Image: 
Tuveson lab/CSHL, 2019

Cold Spring Harbor, NY -- Cancer is relentless and resilient. When a drug blocks a cancer cell's main survival pathway, the cell avoids the obstacle by taking different pathways or detours to save itself. This tactic is called "developing resistance," and it's one of the key challenges researchers face when seeking effective therapeutics to combat pancreatic ductal adenocarcinoma (PDA).

A group of researchers at Cold Spring Harbor Laboratory (CSHL) has now found a way to tackle this problem and stop the growth of pancreatic tumors in mice. Their findings are published in the journal Clinical Cancer Research.

Pancreatic cancer has a five-year survival rate of only 8 percent. Professor David Tuveson's lab at CSHL is focused on identifying better treatment strategies to help prolong survival for patients, including new drugs that can be introduced into clinical trials.

More than 90 percent of pancreatic cancer patients carry a mutation in the cancer-causing gene KRAS, which controls cell growth and death. The KRAS oncogene is difficult to drug directly, so researchers are testing indirect routes to shutting it down. One approach targets the AKT and MAP-Kinase (MAPK) downstream signaling pathways that support KRAS.

"Some clinical trials have targeted these pathways, but high toxicity levels and therapeutic resistance development precluded further investigation of these regimens," said Youngkyu Park, a Research Investigator in the Tuveson lab. "Toxicity can occur when anti-tumor agents aren't malignancy-specific. That means they risk killing healthy cells as well."

The Tuveson lab encountered the problem of resistance pathways when it tried to barricade both the AKT and MAPK pathways in PDA.

To develop an effective cancer drug, the team created drug cocktails that block both the main pathways supporting pancreatic cancer cell growth and cancer cell-specific resistance pathways.

By culturing normal human cells and cancer cells in 3D organoid models and testing them concurrently, the team was able to distinguish particular signaling mechanisms that only affected pancreatic cancer cells. This allowed them to pinpoint the ERBB signaling pathway as the pancreatic cancer-specific resistance mechanism following AKT/MAPK blockade.

By inhibiting ERBB signaling in addition to MAPK signaling, the researchers observed pancreatic tumors shrink in an organoid mouse model of PDA.

"We hope this study will help other research groups to use the same methodological approach we use in the paper," said Mariano Ponz-Sarvisé, a former CSHL Clinical Fellow and an author on the study. "I believe that for some drugs, this approach can help find new avenues to overcome resistance."

Credit: 
Cold Spring Harbor Laboratory

Strong student-adult relationships lower suicide attempts in high schools

image: This is an illustration of a social network at a school, with each dot representing a student and connected lines representing friendships. Group one, with more interconnected friendships, has fewer cases of suicide attempts/ideation, while the more isolated group has more suicide attempts.

Image: 
University of Rochester Medical Center

High schools where students are more connected to peers and adult staff, and share strong relationships with the same adults, have lower rates of suicide attempts, according to a new study published in the Journal of Child Psychology and Psychiatry.

The study, “Peer-adult network structure and suicide attempts in 38 high schools: implications for network-informed suicide prevention,” surveyed 10,291 students from 38 high schools to determine social integration through the relationship network structure of each school.

Students were asked to name up to seven of their closest friends at their school. In a novel approach, students were also asked to name up to seven adults in their school they trust and feel comfortable talking to about personal matters. Researchers used the friendship and adult nominations submitted to build comprehensive social networks for each school.

Researchers used this data to determine whether differences in social networks between schools resulted in different rates of suicide attempts and suicidal ideation (thinking about or planning suicide). Their findings revealed the following:

- Rates of suicide attempts and ideation were higher in schools where students named fewer friends, friendship nominations were concentrated in fewer students, and students’ friends were less often friends with each other.
- Suicide attempts specifically were higher in schools where students were more isolated from adults, and where student nominations of adults were concentrated among fewer students (i.e., a few students had disproportionately more trusted adults than other students).
- A 10 percent increase in the share of students isolated from adults corresponded to a 20 percent increase in suicide attempts.
- Conversely, suicide attempts were lower in schools where students and their close friends shared strong bonds with the same adult, and where a smaller number of adults were nominated by a larger share of students.
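As a rough illustration of the school-level measures involved, here is a hypothetical Python sketch computing two of them from nomination data: the share of students isolated from adults, and how concentrated adult nominations are among students. The tiny dataset and the simple top-quartile concentration measure are invented for the example; the study's actual network statistics were more sophisticated.

```python
# Hypothetical sketch: two school-level measures computable from
# trusted-adult nominations. Data and measures are invented for
# illustration only.

nominations = {            # student -> adults they named (up to seven)
    "s1": ["coach", "ms_lee"],
    "s2": [],
    "s3": ["ms_lee"],
    "s4": ["coach", "ms_lee", "mr_diaz"],
    "s5": [],
    "s6": ["mr_diaz"],
    "s7": [],
    "s8": ["ms_lee"],
}

# Share of students isolated from adults (named no trusted adult).
isolated = sum(1 for adults in nominations.values() if not adults)
print(f"isolated from adults: {isolated / len(nominations):.0%}")

# Concentration: what share of all nominations comes from the top
# quarter of students, ranked by how many adults they named?
counts = sorted((len(a) for a in nominations.values()), reverse=True)
top_quarter = counts[: max(1, len(counts) // 4)]
print(f"top-quartile share of nominations: {sum(top_quarter) / sum(counts):.0%}")
```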

Schools in which many students name the same trusted adults “may reflect the presence of clearly identified, competent adults being connected to many students,” said the study.

This focus on social networks had been relatively unexplored in previous research on suicide, according to lead author Peter A. Wyman, PhD, professor in the Department of Psychiatry at the University of Rochester School of Medicine and Dentistry. “Most suicide prevention is centered on the high-risk individual,” Wyman said. “We wanted this study to provide us with new ways of thinking on how to intervene to strengthen protective relationships on a broader school-level, and even on a community level.”

The number of children and teens who have been brought to the emergency room for suicide attempts or suicidal ideation has nearly doubled in recent years, according to a recent study published by JAMA Pediatrics. There were 1.12 million emergency room visits for suicide attempts or suicidal ideation by children ages 5 to 18 years old in 2015, up from 580,000 in 2007.

In addition, suicide is the second leading cause of death among young people in the U.S. ages 10 to 18, and rates have been increasing by nearly 2 percent per year.

Wyman hopes these study results could potentially help schools develop more effective, comprehensive interventions. “Despite a great deal of effort, suicide rates continue to rise. This study identifies protective schoolwide network factors, such as cohesion between adolescents’ peer and adult networks. This network-informed perspective gives us some new concepts for suicide prevention,” he said. “Strengthening inter-generational cohesion so that more friendship groups share a trusted adult could make it easier for youth to close the circle through that connection if a friend is at-risk.”

Participating schools had wide differences in the percentage of students who nominated trusted adults. In the lowest ranked school, only 8.3 percent of students named a trusted adult, while 53.4 percent of students named a trusted adult in the highest ranked school. Authors of the study recommended looking at characteristics of school staffs, such as diversity and attitudes about youth, and the school leadership climate to better understand why these differences exist.

In addition, the study recommended developing strategies to strengthen protective social networks, including training student peer group leaders to promote positive social behaviors, and working to prepare responsive adults and connect those adults into student social groups.

“The time has come for our field to think more broadly about suicide prevention,” said Anthony R. Pisani, PhD, associate professor at the University of Rochester School of Medicine and Dentistry. “Individual risk factors, like depression, substance use or traumatic history, are important, but we also need to think about the health of the social ties and systems in which we are all interwoven.”

Wyman and Pisani have used these strategies to guide two intervention programs – Sources of Strength and Above the Influence of Vaping – that they have helped implement in 60 high schools and middle schools across New York state. Sources of Strength, which was developed in North Dakota in the 1990s, utilizes preventative, population-based approaches by identifying peer leaders of social groups throughout schools and preparing them to become positive influences on their friends’ coping behaviors. Above the Influence of Vaping is a substance abuse prevention program targeting middle schoolers, developed by Wyman and Pisani, that uses similar peer leader engagement methods combined with science-based peer-to-peer messaging.
Both programs have successfully worked to increase student connections with adults, and this study reinforces the need to strengthen those bonds, according to Wyman.

“One of the most important predictors of lower suicide attempt rates in this study was positive youth-adult connections widely spread across the school,” said Wyman. “We have to be thinking about the broader population to make sure more students are connected to adults prepared to support them.”

Journal

Journal of Child Psychology and Psychiatry

DOI

10.1111/jcpp.13102

Credit: 
University of Rochester Medical Center

Experience of being a minority puts US teens at higher risk of anxiety, depression

BOSTON - Puerto Rican teens growing up as minorities in the South Bronx are more likely to experience anxiety and depression than their peers growing up as a majority in Puerto Rico, even under similar conditions of poverty, says a new study in World Psychiatry. Researchers looked at nearly 2,000 Puerto Rican youth over two decades to understand how minority status and factors such as racism, poverty, violence and social support influence mental health. Although youth in Puerto Rico are poorer and face more homicides than young people living in the South Bronx, the experience of living as a minority group in the United States led to worse mental health outcomes.

"How others interact with you as a minority can affect your mental health and how you see yourself. The mere experience of growing up as a minority can elevate your psychiatric risks," says lead author Margarita Alegria, PhD, chief of the Disparities Research Unit at Massachusetts General Hospital (MGH). "Exposure to racism and discrimination and the perception of low social position are consequences of minority status that may lead to depression, anxiety and feeling like 'the other,'" she adds.

The Boricua Youth Study is the first large longitudinal study examining what puts minority youth at risk for depression and anxiety. The study was conducted by Alegria and colleagues at MGH, Harvard Medical School, Columbia University and the University of Puerto Rico. Researchers looked at 1,863 Puerto Rican youth ages 15-29 living in New York's South Bronx and San Juan, Puerto Rico to explore whether growing up as part of a minority group in disadvantaged neighborhoods puts young people at risk for depression and anxiety and what factors lead to that risk. They also interviewed 1,100 parents and caregivers in both places to get their perspectives.

The researchers examined four general categories of influences on mental health: environmental and social factors, cultural and minority stress, parent and peer relations, and family/individual vulnerability.

The key influences that put teens at risk for mood disorders included perceived discrimination (i.e., neighborhood discrimination, the stress of being a minority and unfair treatment) and cultural factors (i.e., weaker ethnic identity and intercultural conflict). The strength of childhood social support and of peer relationships also helped explain the differences in mental health outcomes between minority and majority youth.

As a minority group, youth in the South Bronx also face complex home dynamics that could affect their mental health. Families often provide Latino youth with a sense of identity and source of connection to their culture. Researchers found that intergenerational conflicts sometimes stemmed from minority youth assimilating to New York's cultural norms.

Compared to their peers in Puerto Rico, parents in the South Bronx reported more neighborhood discrimination, a lower level of family connection and more family cultural distress. Similarly, young people in the South Bronx reported weaker ethnic identity and lower levels of familism than their peers in Puerto Rico.

The authors note that the findings have implications for immigrant youth nationally, since "it is not individual risk but rather the environments and social context that could play a prominent role in the development of internalizing disorders."

Neighborhood-based interventions focused on building positive social relationships, like youth civic organizations and after-school programs, could be effective ways to combat anxiety and depression among minority youth, the authors add. Strong parental and peer relationships also offer these youth important buffering tools to combat the stress of discrimination and counter the negative social mirror that puts them at risk for internalizing experiences.

Credit: 
Massachusetts General Hospital

Acute periodontal disease bacteria love colon and dirt microbes

image: Bacteria behind acute periodontitis. Aggregatibacter actinomycetemcomitans most often lives peacefully in the mouth until circumstances lead it to become infectious. It forms flower-like colonies, here under a microscope sporting a colorful stain added by a researcher. Aa is gram-negative.

Image: 
Derren Ready (2012) CIL:38942, CIL. Dataset. https://doi.org/doi:10.7295/W9CIL38942 Creative Commons license

True or false? Bacteria living in the same space, like the mouth, have evolved collaborations so generous that they are not possible with outside bacteria. That was long held to be true, but in a new, large-scale study of microbial interactions, the resounding answer was "false."

Research led by the Georgia Institute of Technology found that common mouth bacteria responsible for acute periodontitis fared better overall when paired with bacteria and other microbes that live anywhere but the mouth, including some commonly found in the colon or in dirt. Bacteria from the oral microbiome, by contrast, generally shared food and assistance more stingily with gum infector Aggregatibacter actinomycetemcomitans, or Aa for short.

Like many bacteria known for the infections they can cause - Strep, for example - Aa often lives peacefully in the mouth, and certain circumstances turn it into an infector. The researchers and their sponsors at the National Institutes of Health would like to know more about how Aa interacts with other microbes to gain insights that may eventually help fight acute periodontitis and other ailments.

"Periodontitis is the most prevalent human infection on the planet after cavities," said Marvin Whiteley, a professor in Georgia Tech's School of Biological Sciences and the study's principal investigator. "Those bugs get into your bloodstream every day, and there has been a long, noted correlation between poor oral hygiene and prevalence of heart disease."

Unnatural pairing

The findings are surprising because bacteria in a microbiome have indeed evolved intricate interactions, making it seem logical that those interactions would stand out as uniquely generous. Some mouth microbes even have special docking sites to bind to their partners, and much previous research has focused tightly on their cooperation. The new study went broad.

"We asked a bigger question: How do microbes interact with bugs they co-evolved with as opposed to how they would interact with microbes they had hardly ever seen. We thought they would not interact well with the other bugs, but it was the opposite," Whiteley said.

The study's scale was massive. Researchers manipulated and tracked nearly all of Aa's roughly 2,100 genes using an emerging gene-tagging technology while pairing Aa with 25 other microbes -- about half from the mouth and half from other body areas or the environment.

They did not examine the mouth microbiome as a whole because multi-microbial synergies would have made interactions incalculable. Instead, the researchers paired Aa with one other bug at a time -- Aa plus mouth bacterium X, Aa plus colon bacterium Y, Aa plus dirt fungus Z, and so on.

"We wanted to see specifically which genes Aa needed to survive in each partnership and which ones it could do without because it was getting help from the partner," said Gina Lewin, a postdoctoral researcher in Whiteley's lab and the study's first author. They published their results in the Proceedings of the National Academy of Sciences.

Q & A

How could they tell that Aa was doing well or poorly with another microbe?

The researchers looked at which of Aa's genes were necessary for survival while it infected a mouse -- when Aa was the sole infector, when it partnered with a fellow mouth bacterium, and when it was paired with a microbe from the colon, dirt, or skin.

"When Aa was by itself, it needed a certain set of genes to survive - like for breathing oxygen," Lewin said. "It was striking that when Aa was with this or that microbe that it normally didn't live around, it no longer needed a lot of its own genes. The other microbe was giving Aa things that it needed, so it didn't have to make them itself."

"Interactions between usual neighbors -- other mouth bacteria -- looked more frugal," Whiteley said. "Aa needed a lot more of its own genes to survive around them, sometimes more than when it was by itself."

How did the emerging genetic marking method work?

To understand "transposon sequencing," picture a transposon as a DNA brick that cracks a gene, breaking its function. The brick also sticks to the gene and can be detected by DNA sequencing, thus tagging that malfunction.

Every Aa bacterium in a pile of 10,000 had a brick in a random gene. If Aa's partner bacterium, say, E. coli, picked up the slack for a broken function, Aa survived and multiplied even with the damaged gene, and researchers detected a higher number of bacteria containing the gene.

Aa surviving with more broken genes meant a partner microbe was giving it more assistance. Aa bacteria with broken genes that a partner could not compensate for were more likely to die, reducing their count.
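To make that counting logic concrete, here is a minimal, hypothetical Python sketch of the inference behind transposon sequencing. Every detail in it (the gene list, which genes are essential, which functions a partner can supply, the pool size) is invented for illustration; it models the bookkeeping of the method, not the lab protocol.

```python
import random
from collections import Counter

GENES = [f"gene_{i}" for i in range(100)]   # toy stand-in for Aa's ~2,100 genes
ESSENTIAL_ALONE = set(GENES[:30])           # genes Aa needs when growing alone
PARTNER_COMPENSATED = set(GENES[:20])       # functions a partner could supply

def grow_pool(pool, partner_present):
    """Keep a mutant only if its broken gene is dispensable in this condition."""
    survivors = []
    for broken_gene in pool:
        essential = broken_gene in ESSENTIAL_ALONE
        rescued = partner_present and broken_gene in PARTNER_COMPENSATED
        if not essential or rescued:
            survivors.append(broken_gene)
    return survivors

# 10,000 mutants, each carrying a transposon "brick" in one random gene.
pool = [random.choice(GENES) for _ in range(10_000)]

alone = Counter(grow_pool(pool, partner_present=False))
paired = Counter(grow_pool(pool, partner_present=True))

# Genes whose mutants die alone but survive with a partner mark the
# kinds of help the partner is providing.
rescued = [g for g in GENES if alone[g] == 0 and paired[g] > 0]
print(f"{len(rescued)} genes dispensable only when the partner helps")
```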

Does this mean the mouth microbiome does not have unique relationships?

It very likely does have them, but the study's results point to not all relationships being cooperative. Some microbiomes could have high fences and share sparsely.

"One friend or enemy may be driving your behavior, and other microbes may just be standing around," Lewin said.

Smoking, poor hygiene, or diabetes -- all associated with gum disease -- might be damaging defensive microbiomes and allowing outside bacteria to help Aa attack gum tissue. It's too early to know that, but Whiteley's lab wants to dig deeper, and the research could have implications for other microbiomes.

Credit: 
Georgia Institute of Technology

Success of gene therapy for a form of inherited blindness depends on timing

Nearly two decades ago, a gene therapy restored vision to Lancelot, a Briard dog who was born with a blinding disease. This ushered in a period of hope and progress for the field of gene therapy aimed at curing blindness, which culminated in the 2017 approval of a gene therapy that improved vision in people with Leber congenital amaurosis (LCA), a rare, inherited form of blindness closely related to the condition seen in Lancelot. It represents the first FDA-approved gene therapy for an inherited genetic disease.

The gene therapy, which provides a functional copy of the RPE65 gene, has improved vision in patients, allowing them to experience the world in a way they never would have otherwise. But questions remain about how long-lasting these improvements will be and whether the progressive degeneration of vision cells has been halted by the therapy.

In a new paper in the journal Molecular Therapy, researchers from the University of Pennsylvania turned back to canines to learn more about the factors that determine the outcome of gene therapy; this time, they treated dogs at more advanced stages of the disease, timepoints at which human patients are more likely to be treated. They discovered that dogs that were provided the therapy when more than 63% of their photoreceptor cells were still present but nonfunctional had great success. The effect of the treatment seemed lifelong, and there was an arresting of the progressive degeneration. But for those dogs that had lost more than half of their photoreceptor cells before receiving the treatment, the disease seemed to continue to progress, despite a short-term restoration of sight.

"Earlier work by our group and others had suggested that if you treated the disease at a time when the retina was degenerating, that degeneration continued, in people and in dogs," says Gustavo D. Aguirre of Penn's School of Veterinary Medicine. "This was in spite of short-term gains in vision. We wanted to follow up to get details about the extent of retinal degeneration that would still be compatible with a lasting effect."

Fortunately, the lab had access to data that would help answer that question.

Previous studies had revealed that treating dogs with the RPE65 mutation at a very young age led to lifelong improvements in vision and in retinal health. But humans with LCA, many of whom are already losing vision cells in the first decade of life, are less likely to receive gene therapy at such an early stage of disease. Questions and concerns about the longevity of treatment, in both dogs and human patients, were raised first in 2013. In 2015, studies in patients treated with gene therapy showed that photoreceptors continue to be lost in the treated area years afterward, even as patients continued to experience improved vision.

To learn more about how treatment could sustain the health of the retina when given at a later timepoint, Aguirre, Gardiner, and colleagues turned to affected dogs.

"We had imaging data from various timepoints before, during, and after treatment for the dogs," says Kristin Gardiner, lead author on the study and a staff veterinarian at Penn's University Laboratory Animal Resources group. Comparing "landmarks" on the dogs' eyes throughout these timepoints, using data from a specialized imaging test called optical coherence tomography and comparing it with retina histopathology data from treated and untreated animals, "you can estimate a thickness of different layers of the retina at the time of treatment," Gardiner says. The thickness of the outer nuclear layer is an indication of how many photoreceptor cells are still alive, and is thus a measure of the eye's health at a cellular level. The Penn Vet researchers teamed with co-corresponding author Artur Cideciyan of the Perelman School of Medicine's Scheie Eye Institute and colleagues to obtain precise data on this thickness from a number of different points in the retina.

When treatment was given at a point when dogs retained 63% or more of the normal photoreceptor cells, the therapy's effect was lasting.

"Treatment can be forever at this stage," says Aguirre.

But when dogs had fewer than 63% of the photoreceptor cells remaining at the time of treatment, the progressive degeneration continued in spite of the gene therapy.

"If you look at this stage superficially, the dogs are seeing; they look good," Aguirre says. "But if you look at the microtopography of their retina, they're not doing well."

Unfortunately, patients--both dog and human--can still be relatively young when they reach this threshold level, leading to concern that the improved vision patients experience after receiving gene therapy may not last their entire lives.

The researchers say that the finding underscores the importance of considering secondary therapies to go along with the gene therapy that is aimed at correcting the underlying genetic mutation. They're currently testing other therapies that prevent cell death.

One other observation that may trigger additional study is that eyes receiving the gene therapy showed progressive degeneration, but less of it, across the whole retina, not just in the area the gene therapy vector reached.

"We saw this slight global enhancement of the protective effect," says Gardiner. "We are currently pursuing this unexpected effect."

Credit: 
University of Pennsylvania

The diet-microbiome connection in inflammatory bowel disease

Much remains mysterious about the factors influencing human inflammatory bowel disease (IBD), but one aspect that has emerged as a key contributor is the gut microbiome, the collection of microorganisms dwelling in the intestines.

Diet is known to profoundly affect this microbial community, and special diets have been used as therapies for intestinal disorders including Crohn's disease in people. They're also commonly used in dogs, which can develop a chronic intestinal disease that mirrors many features of Crohn's.

In a new study published in the journal Microbiome, researchers from the University of Pennsylvania investigated the connection between a prescription diet, the gut microbiome, and a successful entry into disease remission in pet dogs receiving treatment at Penn Vet's Ryan Veterinary Hospital. They discovered key features of the microbiome and associated metabolic products that appeared only in dogs that entered disease remission. A type of bacteria that produces these compounds, known as secondary bile acids, alleviated disease in a mouse model. And comparing the impact of diet on the dog's microbiome with that seen during diet therapy in children with Crohn's, the study team found notable similarities.

"The bacteria in the gut are known to be a really important factor in tipping the scales toward disease," says Daniel Beiting, senior author on the work and an assistant professor in Penn's School of Veterinary Medicine. "And the environmental factor that seems to contribute the most to rapid changes in the microbiome is what you eat. Given that dogs' microbiomes are extremely similar to those of humans, we thought this was an intriguing model to ask, 'Could diet be impacting this disease through an impact on the microbiome?'"

To begin pursuing this question required treating a population of pet dogs with canine chronic enteropathy (CE), a chronic condition involving weight loss, gut inflammation, diarrhea, occasional vomiting, loss of appetite, and a relapsing-remitting course, just as seen in Crohn's disease. The study involved 53 dogs: 29 with CE being treated at Penn Vet's Ryan Veterinary Hospital, and 24 healthy controls.

Researchers collected stool samples at the outset of the study and at different times as the sick dogs began a prescription diet to treat their disease. Using advanced genetic sequencing techniques, the team developed a catalog of the microbes present in the stool, a stand-in for the animals' gut microbiome. They also collected information about the metabolic products present in the stool.

"That gives us a functional read-out of the microbiome," says Beiting. "It doesn't just tell us who is there but also what they're doing."

Twenty of the 29 sick dogs quickly entered remission. Together, the genomic and metabolite analyses revealed characteristic changes in these dogs. In particular, those that responded well to the diet tended to have an increase in metabolites known as secondary bile acids. These are produced when certain microbes in the gut consume the bile that is released by the liver.

One of these "good" microbes that can give rise to secondary bile acids was the bacterium Clostridium hiranonis, which the researchers found in greater numbers in dogs that went into remission. Dogs that responded well to the diet also had fewer harmful bacteria, such as Escherichia coli and Clostridium perfringens after starting treatment.

To learn more about what these apparent markers of remission were doing, the team took bacteria from the dogs--both when they were sick and after they had entered remission--and grew them in the lab.

"Having these organisms gave us the opportunity to test our hypothesis about what actually causes remission," says Shuai Wang, a postdoc at Penn Vet and the study's lead author.

Taking the secondary bile acids found to be associated with remission, the researchers applied them to the E. coli and C. perfringens grown from the sick dogs and found the bile acids inhibited their growth. They also gave C. hiranonis from the dogs to mice with a form of inflammatory bowel disorder to see if the bacteria could reduce disease in a different animal model.

"We observed a stabilization of secondary bile acid levels and reduced inflammation," Wang says.

"This allowed us to show that secondary bile acids and C. hiranonis aren't just biomarkers of remission," says Beiting, "they can actually effect change. Bile acids can block the growth of pathogens, and C. hiranonis can improve gut health in mice."

As a final step, the researchers looked to a dataset taken from children with Crohn's disease who were treated with a specialized liquid diet known as exclusive enteral nutrition. Youngsters who responded to the therapy had an increase in numbers of the bacteria species Clostridium scindens, which, like C. hiranonis, is a potent producer of secondary bile acids.

The authors say the findings offer hope for better dietary therapies for IBD, perhaps ones that deliver "good" bacteria such as C. scindens or C. hiranonis while suppressing disease-associated species.

"Similar environmental exposures of dogs and children make the canine IBD model an excellent model of pediatric inflammatory bowel disease," says Robert N. Baldassano, a study coauthor and pediatric gastroenterologist at Children's Hospital of Philadelphia. "This study has greatly improved our knowledge of pediatric IBD and will lead to new therapies for children suffering with this disease."

Credit: 
University of Pennsylvania

How we make decisions depends on how uncertain we are

A new Dartmouth study on how we use reward information for making choices shows that humans and monkeys adapt their decision-making strategies depending on how uncertain the available information is. The results illustrate that, in a simple gamble to obtain a reward, when the magnitude or amount of the reward is known but the probability of the reward is unknown and must be learned, both species switch strategy: instead of combining reward information in a multiplicative way (multiplying functions of reward probability and magnitude to obtain the so-called subjective value), they compare the attributes in an additive way to make a decision. The findings, published in Nature Human Behaviour, challenge one of the most fundamental assumptions in economics, neuroeconomics and choice theory: that decision-makers typically evaluate risky options in a multiplicative way. In fact, this applies only in the limited case when information about both the magnitude and the probability of the reward is clearly known.

"This is the first cross-species study using similar experimental design to show that both humans and monkeys change their strategy when they go from choice under risk (when reward probabilities are known) to choice under uncertainty (when reward probabilities are unknown and must be learned), from combining information in a multiplicative way to comparing information in an additive way," said senior author Alireza Soltani, an assistant professor of psychological and brain sciences at Dartmouth. "Comparing reward attributes may seem like comparing apples to oranges; however, when you compare different pieces of reward information rather than combine them, you become a more flexible decision-maker," he added.

The team of researchers from three universities found that when the probability of the reward must be learned (but the magnitude of the reward is provided), both humans and monkeys opt more often for bigger but riskier options as the environment becomes more uncertain, putting less weight on the probability and more weight on the magnitude of the reward. The team also examined neural activity in the monkeys' brains during the task and found a correlation between this behavioral adjustment and how prefrontal neurons represent reward information. Specifically, consistent with the behavior, neurons in the dorsolateral prefrontal cortex represented magnitude more strongly in a more uncertain environment, when more weight was put on magnitude.

To understand the findings, consider the following hypothetical scenario (not part of the actual methods used in the research). Pretend it's your lucky day where you could win money in a free sweepstakes. All you need to do is pick a ticket from one of two bowls: Bowl 1 contains 99 winning tickets each valued at $100 and 1 ticket with $0 value. Bowl 2 contains 50 winning tickets valued at $250 and 50 tickets with $0 value. Which bowl do you choose from? Most people will pick Bowl 1 because humans are risk averse. Bowl 1 offers a better combination of properties, even though Bowl 2 could be more lucrative. In order to decide which option to go with, you probably came up with a subjective value for each of the two bowls by multiplying the probability of winning and the subjective utility or desirability of the winning tickets.

Consider another scenario where you only know the dollar amount of the winning tickets in each bowl but don't know the probability of picking a winning ticket. However, you have been observing people who have been choosing tickets from the two bowls before you and have learned that Bowl 1 almost always gives $100 winning tickets but Bowl 2 gives $250 winning tickets only half the time. In this uncertain scenario, you probably choose the bowl that you think is better by comparing how often the two bowls have been awarding winning tickets relative to the amounts of winning tickets they award. In this scenario, as the decision-maker, you used an additive strategy because you compared reward information across the two options rather than trying to combine it.
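The two strategies can be made concrete with a small calculation based on the bowl scenarios. The sketch below is purely illustrative: the square-root utility and the attribute weights are assumptions chosen to reproduce the behavior described above, not parameters estimated in the study.

```python
# Illustrative comparison of multiplicative vs. additive valuation.
# Utility function and weights are assumptions, not fitted values.

bowls = {
    "Bowl 1": {"p_win": 0.99, "magnitude": 100},
    "Bowl 2": {"p_win": 0.50, "magnitude": 250},
}

def multiplicative_value(p_win, magnitude):
    # Choice under risk: multiply probability by a concave (risk-averse)
    # utility of the amount to get one subjective value per option.
    return p_win * magnitude ** 0.5

def additive_value(p_win, magnitude, w_p=0.4, w_m=0.6):
    # Choice under uncertainty: weigh and compare normalized attributes
    # instead of multiplying them. Putting more weight on magnitude
    # (w_m > w_p) mimics the riskier choices seen in uncertain settings.
    return w_p * p_win + w_m * magnitude / 250

for name, bowl in bowls.items():
    print(f"{name}: multiplicative={multiplicative_value(**bowl):.2f}, "
          f"additive={additive_value(**bowl):.2f}")

# Multiplicative favors Bowl 1 (9.90 vs 7.91): the safe, risk-averse pick.
# Additive with magnitude weighted heavily favors Bowl 2 (0.80 vs 0.64).
```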

For the actual study, a series of gambling tasks was administered on a computer, in which monkeys and human participants had to choose between two options. Humans (Dartmouth undergraduate students) were awarded a combination of points convertible to money and extra credit for a course, and monkeys (studied at the Yale School of Medicine and the University of Minnesota) were awarded drops of juice according to their choices and the outcomes of the gambles.

"Speaking more broadly, our results show that in an uncertain reward environment, which is the case most of the time, we may not construct the so-called subjective value as prescribed by normative models of choice, and that flexibility is more important than being rational or optimal," added Soltani.

Credit: 
Dartmouth College

Identity crisis for fossil beetle helps rewrite beetle family tree

image: Images and measurements of the fossil beetle that revealed it was a different kind of beetle than originally thought.

Image: 
Martin Fikáček et al.

There are more different kinds of beetle than just about any other kind of animal--scientists have described about 5,800 different species of mammals, compared with nearly 400,000 species of beetles. Of those 400,000 kinds of beetles, more than 64,000 species are members of the rove beetle family, Staphylinidae. These mostly small, earwig-looking insects are found all over the world, and they've been around since the time of the dinosaurs. But scientists are still figuring out exactly when rove beetles first evolved. A new study in Systematic Entomology suggests that the fossil beetle species believed to be the oldest rove beetle isn't a rove beetle at all, meaning the beetle family tree needs a rewrite.

The beetle at the center of this mix-up, about the size of Franklin D. Roosevelt's nose on the U.S. dime, is Leehermania prorova. When the fossils of Leehermania were first discovered in the 1990s along the Virginia and North Carolina border, they were believed to be the oldest rove beetles ever discovered--by about 50 million years.

Until 2012, the only public information on the fossils was two images, published in 1996 and 2005, but no formal description. Anyone who didn't have direct access to the fossils of the species could only make guesses about its placement in the tree of life based on those photos.

So, when a formal description of the beetle was finally published, beetle scientists around the world were excited to read it.

"When Leehermania was formally described, and more photos came out, we thought to ourselves 'that doesn't look quite right for a staphylinid,'" says Margaret Thayer, a scientist at the Field Museum in Chicago and one of the paper's nine authors. It didn't look like the rove beetles that Thayer has spent her career studying.

"I happened to be at the museum when I first read the paper, so I went and looked through the specimens in our collection to compare," said Alfred Newton, also a Field Museum scientist and paper author. His hunch was that this beetle might be more closely related to Hydroscaphidae, a living family of miniature insects known as skiff beetles, placed in a different suborder from rove beetles.

Across the Atlantic, Martin Fikáček recalled a similar feeling upon comparing the description and photos with the classification of Leehermania as a staphylinid. To Fikáček, a scientist at the National Museum in Prague, the beetle seemed to be a closer fit in the Myxophaga--the suborder that contains skiff beetles. Scientist Chenyang Cai at China's Nanjing Institute of Geology and Paleontology and several other authors came to the same conclusion.

One of the clues that Leehermania wasn't really a staphylinid was its mandibles--the pincer-like jaws. "Staphylinids all have exposed mandibles, from at least some angle," says Newton. "In Leehermania, what were originally interpreted as mandibles are actually maxillary palpi--a different mouthpart structure entirely. The mandibles aren't exposed here at all, at least from what we can see."

Another hallmark of staphylinid beetles is their somewhat club-shaped antennae, which start from a narrow base and widen toward the tip. In Leehermania, the antennae were also club-like, but the club narrowed toward the tip.

Given the hidden mandibles, distinct antennal shape, and other features, including "paratergites"--little plates on the sides of most staphylinid abdomens that are absent in Leehermania--and the shape of the female insects' genitalia, something wasn't adding up. Leehermania seemed to be a much better fit in the suborder Myxophaga than in Staphylinidae.

Thanks to the power of the internet, the scientists were able to collaborate freely and quickly across four continents. "The international collaboration that occurred here was really important to the success of the study," said Shûhei Yamamoto, a Field Museum scientist and paper author who studies Staphylinidae and other beetles.

As the group's hunch turned to a theory, then a study, then a formal analysis, the tests they ran showed Leehermania fitting nicely as a member of the beetle suborder Myxophaga, likely as a sister to the ancestors of today's skiff beetles. This discovery means that the rove beetle family isn't yet documented to be as old as scientists thought, but the skiff beetle family is now way older--Leehermania lived 226 million years ago, 100 million years before the next oldest fossil skiff beetle known.

Misclassification of extinct species happens all the time in science, for a variety of reasons.

For one, fossils can be extremely difficult to decipher. Since compression fossils like Leehermania are trapped in a sheet of rock, there is often only one viewing angle (though two in this case): a bird's-eye view called "dorsal," showing the top surface, and the "lateral," or side, view. Any information about the species has to be gathered from these limited perspectives, so some information on colors, textures, patterns, anatomical details, and of course life-cycle information may be impossible to retrieve. Analysis is even more challenging when your specimens are only 2-3 mm long.

Lack of comparative data also causes problems for researchers. Not only are many characteristics of the insects lost in fossils, but until 2011, the large amount of data used here to test Leehermania's placement in different families didn't exist.

"Our analysis made use of a huge data set of morphological characters of beetles gathered for the 'Beetle Tree of Life' [BToL] project," says Thayer. "That project was really crucial to our analysis and provided a framework upon which we were able to analyze Leehermania." Four authors of the new paper, including Thayer and Newton, were among the authors of the published version of the BToL morphology paper. DNA-based analyses published by the BtoL project and other researchers were also essential to the Leehermania analyses.

Testing and revising the placement of living things in the tree of life is like working on a huge sudoku puzzle with contributors from all over the world. You have methods to figure out where the numbers should go, but if they're incorrectly placed, you only know--eventually--based on their relationships to the surrounding numbers. If you carry on with the puzzle for too long with an incorrect placement, numbers filled in after the fact might also be incorrect. Revisiting Leehermania's classification was important to help other researchers avoid using the fossils incorrectly to date analyses of beetles as a whole or identify other beetles as staphylinids based on Leehermania.

For the staphylinid family, losing their oldest ancestor produces new questions about how the family evolved.

"The re-classification of Leehermania means that staphylinids are now 50 million years younger than we thought," says Fikáček. "But if staphylinids are so much younger, that means that this family evolved into many lineages much more rapidly than we thought they did." Of course, older staphylinidae fossils are likely to turn up in the future and new analyses will be needed.

At a time in the Earth's history when life was still recovering after a mass extinction, the appearance of Leehermania and Staphylinidae is a testament to how resilient and adaptable beetles can be to diverse, and often harsh, living conditions.

"Throughout history, beetles have survived conditions that other animals have not," says Fikáček. "As we study these insects, we might reveal some secret to evolutionary success that beetles possess."

Credit: 
Field Museum

Black sheep: Why some strains of the Epstein-Barr virus cause cancer

The Epstein-Barr virus (EBV) is very widespread. More than 90 percent of the world's population is infected - with very different consequences. Although the infection usually goes unnoticed, in some people it can cause glandular fever or various types of cancer. Researchers at the German Cancer Research Center (DKFZ) have now discovered why different virus strains cause very divergent courses of disease.

More than 90 percent of all people become infected with the Epstein-Barr virus during their lifetime. The infection usually remains undetected throughout their life. However, the virus can also cause diseases - with regional differences: Glandular fever (infectious mononucleosis) primarily occurs in Europe and North America and normally affects adolescents or young adults. In equatorial Africa, Burkitt lymphoma is associated with EBV infection. And in Taiwan, southern China and Southeast Asia, the virus often causes nasopharyngeal carcinomas, cancers of the nose and throat area. This is one of the most common types of cancer in young adults in these countries.

"Nasopharyngeal carcinomas are sometimes seen here too, but really very rarely," commented Henri-Jacques Delecluse from DKFZ. So how does EBV cause completely different diseases in different parts of the world? "One possible explanation is that different types of virus are responsible," Delecluse explained. "And we have now found evidence of precisely that." The DKFZ researchers are publishing their results in the journal Nature Microbiology.

In the laboratory, Delecluse and his team studied a virus strain that had previously been isolated from a nasopharyngeal carcinoma. M81, as this particular type of virus is called, has certain peculiarities. The researchers had previously discovered that M81 infects not only the immune system's B cells but also, very efficiently, epithelial cells of the nasal mucous membrane. In contrast, virus strains that cause glandular fever in Europe infect almost exclusively B cells. And although the virus strains common in Europe cause the infected B cells to multiply in a Petri dish, they do not produce any new virus particles, unlike M81.

As the DKFZ researchers discovered, one of the reasons for this different behavior is a genetic element called EBER2, of which there are many different variations. EBER2 is what is called a "non-coding RNA" (ncRNA), in other words a piece of RNA that does not contain a blueprint for protein molecules. M81 has an EBER2 variant that is particularly often found in EBV strains from nasopharyngeal carcinomas.

To find out how this variant affects the behavior of the virus, the DKFZ researchers used molecular biology tools to remove EBER2 from the M81 genome. "The virus was indeed no longer able to multiply in the infected cells," Delecluse noted. Even when an EBER2 element from a virus strain widespread in Europe was inserted in its place, the M81 virus remained unable to produce virus particles.

The researchers also discovered how EBER2 helps M81 multiply. "EBER2 from M81 stimulates the production of CXCL8, a cytokine that plays an important role in inflammation and carcinogenesis," Delecluse explained, adding that this was not only true of the infected cells themselves. "The EBER2 RNA is wrapped in little envelopes in the infected cell and transported to neighboring cells, which then also begin to produce CXCL8," he continued, explaining that this ultimately stimulated the virus to produce offspring.

"We have therefore finally found evidence that different types of virus can be responsible for different diseases," said Delecluse, emphasizing the significance of his results. "This finding is a strong argument for pressing on with vaccine research in order to develop protection against the most dangerous strains of EBV in future," he concluded. The vaccination against human papillomavirus (HPV), which can cause cervical cancer, already uses a similar principle.

Credit: 
German Cancer Research Center (Deutsches Krebsforschungszentrum, DKFZ)

Using a wearable device to exercise more? Add competition to improve results

While using a wearable device alone may not always be enough to motivate more exercise, adding fun and competition can be the catalyst needed to drive real results, according to a new study from researchers at Penn Medicine and Deloitte Consulting LLP. The two teams combined behavioral insights, gaming elements such as points and levels, and social elements like support, collaboration, or competition to generate significantly positive results in a workplace physical activity program. But when the study, called STEP UP, turned off the gaming elements, participants in the competition arm were the only ones who sustained higher levels of physical activity. Results were published today in JAMA Internal Medicine.

"Gamification and wearable devices are used commonly in workplace wellness programs and by digital health applications, but there is an opportunity to improve their impact on health behaviors by better incorporating behavioral insights and social incentives," said Mitesh Patel, MD, MBA, the director of Penn Medicine's Nudge Unit and an assistant professor of Medicine and Health Care Management. "We found that a behaviorally designed gamification program led to significant increases in physical activity compared to a control group that used wearable devices alone. During the nine-month trial, the average person in the competition arm walked about 100 miles more than the average person in control."

For six months, roughly 600 Deloitte employees from 40 U.S. states took part in a physical activity program. All participants were classified as overweight or obese, and each had daily, personalized step goals, with steps recorded via wearable devices that provided feedback to the participants. Four groups were formed: one in which participants had only their goals and the device, and three others with games tied to their goals.

The "gamified" groups could achieve points and different tiers, or "levels." Importantly, the games were designed to use principles from behavioral economics. This included having all participants sign a commitment contract, before beginning, pledging to strive for their daily goal, agreeing to have points allocated upfront lost -- instead of gained -- if goals were not met, and having a "fresh start" each week with a new set of points. Additionally, there were five levels to the game. Each participant started at the middle, which allowed for progression or regression based on goal achievement. All of these elements were adapted from a previous clinical trial that tested a similar approach among families.

Each gamified group was built around a social element. The support group participants chose a "sponsor" who received a weekly notification of whether the step goals were reached and could provide encouragement or motivation.

The collaboration group was split into teams of three. Each day, a member was randomly selected to represent the team and, if they reached their goal on the prior day, the whole team kept its points.

The competition group was also split into clusters of three who received a weekly leaderboard email showing their individual rankings compared to each other.

During the six-month intervention, the gamification-with-competition group increased its physical activity by 920 steps per day more than control, a significant difference. Support and collaboration also led to significant increases of 689 and 637 steps more per day than control, respectively. The real difference between the arms of the study emerged in the three months after the gamification was turned off. The competition arm was the only one of the three gamification arms with a lasting impact on its members, a 569 daily step increase compared to control. Both former collaboration and support participants averaged more steps than the control group, but neither difference was significant.

"Many wellness solutions and patient engagement applications are implemented without proper testing of whether or not they actually work," said Greg Szwartz, managing director and leader for the advanced analytics and predictive modeling group in life science with Deloitte Consulting LLP. "We partnered with the Penn Medicine Nudge Unit to conduct a rigorous clinical trial that would provide evidence on the most effective approach overall and how to tailor future interventions for each individual."

Key to the next steps of this research will be the data the researchers collected from each participant on a wide range of characteristics, including demographics, personality type, and social networks.

"Most interventions are designed as one-size-fits-all, in which a single intervention is deployed to a large population," said Patel. "Even if the program works on average, many participants may not benefit. Our next step will be to use data from this trial to develop behavioral profiles that could be used in the future to match the right intervention to the right person."

Credit: 
University of Pennsylvania School of Medicine

HIV significantly increases risk for irregular heartbeat

HIV infection significantly increases the risk of atrial fibrillation (AF) -- one of the most important causes of irregular heartbeat and a leading cause of stroke -- at a rate similar to or higher than that of known risk factors such as hypertension and diabetes, according to a study by researchers at UC San Francisco.

In a database review of nearly 17.3 million Californians, the researchers found that HIV infection was associated with an 80 percent higher risk of AF, compared with 89 percent for hypertension and 22 percent for diabetes. Findings appear Sept. 9, 2019, in the Journal of the American College of Cardiology (JACC).

"This is the first paper demonstrating that HIV is a risk factor for AF, and the potency of that risk is similar to other well-established AF risk factors," said senior author Gregory Marcus, MD, MAS, a UCSF Health cardiologist and associate chief of cardiology for research in the UCSF Division of Cardiology. "Because AF can be asymptomatic and stroke may be the first manifestation, it's important for caregivers to be aware of patients at heightened risk."

With effective antiretroviral therapy, the life expectancy of HIV-positive patients has increased. However, previous studies have shown that these patients are at increased risk of cardiovascular disease and sudden cardiac death, at least in part due to antiretroviral therapy. This is the first study to link HIV to irregular heartbeat as well.

Atrial fibrillation affects an estimated 2.2 million Americans, according to the National Stroke Association, and about 15 percent of people who have strokes have AF. The stroke association estimates that up to 80 percent of strokes among people with AF can be prevented.

In the JACC study, Marcus and his colleagues utilized the Healthcare Cost and Utilization Project database to identify 17,293,971 California residents at least 21 years old (18,242 with HIV) who received care in an outpatient surgery unit, inpatient hospital unit or emergency department from January 2005 to December 2011.

Over an average follow-up of 4.7 years, the data reflected 625,167 new AF diagnoses, 1,076 of them in HIV-positive patients. After adjusting for demographics, number of clinical visits and cardiovascular comorbidities, the researchers found that HIV-positive patients had an incidence of 18.2 AF diagnoses per 1,000 person-years, compared with 8.9 in patients without HIV.
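The person-years metric here is simply the number of new diagnoses divided by the total time all patients were under observation. Below is a minimal sketch with hypothetical counts (the study's published 18.2 vs. 8.9 figures come from its actual, adjusted data):

```python
def incidence_per_1000_py(new_cases: int, persons: int,
                          mean_followup_years: float) -> float:
    """New diagnoses per 1,000 person-years of observation."""
    person_years = persons * mean_followup_years
    return new_cases / person_years * 1000

# Hypothetical example: 1,000 new diagnoses among 20,000 people
# followed for an average of 4.7 years each.
print(round(incidence_per_1000_py(1000, 20_000, 4.7), 1))  # 10.6
```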

The association between HIV and AF risk was significantly higher in younger patients, African Americans and Hispanics, and those without hypertension, diabetes or alcohol abuse. The increased risk in African Americans was interesting, Marcus said, as previous studies have shown that whites are at significantly higher risk of AF.

"Physicians caring for HIV-infected patients should be aware of this strong relationship," said Marcus, who also holds the Endowed Professorship of AF research in the UCSF School of Medicine. "Increased awareness may help in recognizing the diagnosis and consequently result in more frequent prescription of appropriate therapies, such as anticoagulation, to reduce morbidity and mortality."

Microbial infection in HIV-positive patients may also influence AF risk. Future research should determine whether antiretroviral therapies are an effective treatment, Marcus said.

One in four adults over age 40 is at risk of developing AF, and nearly 6 million people in the U.S. are projected to have the condition by 2050.

Credit: 
University of California - San Francisco

Use of antibiotics in preemies has lasting, potentially harmful effects

Nearly all premature babies receive antibiotics in their first weeks of life to ward off or treat potentially deadly bacterial infections. Such drugs are lifesavers, but they also cause long-lasting collateral damage to the developing microbial communities in the babies' intestinal tracts, according to research from Washington University School of Medicine in St. Louis.

A year and a half after babies leave the neonatal intensive care unit (NICU), the consequences of early antibiotic exposure remain, the study showed. Compared with those of healthy full-term babies in the study who had not received antibiotics, the preemies' microbiomes contained more bacteria associated with disease, fewer species linked to good health, and more bacteria able to withstand antibiotics.

The findings, published Sept. 9 in Nature Microbiology, suggest that antibiotic use in preemies should be carefully tailored to minimize disruptions to the gut microbiome - and that doing so might reduce the risk of health problems later in life.

"The type of microbes most likely to survive antibiotic treatment are not the ones we typically associate with a healthy gut," said senior author Gautam Dantas, PhD, a professor of pathology and immunology, of molecular microbiology, and of biomedical engineering. "The makeup of your gut microbiome is pretty much set by age 3, and then it stays pretty stable. So if unhealthy microbes get a foothold early in life, they could stick around for a very long time. One or two rounds of antibiotics in the first couple weeks of life might still matter when you're 40."

Healthy gut microbiomes have been linked to reduced risk of a variety of immune and metabolic disorders, including inflammatory bowel disease, allergies, obesity and diabetes. Researchers already knew that antibiotics disrupt the intestinal microbial community in children and adults in ways that can be harmful. What they didn't know was how long the disruptions last.

To find out whether preemies' microbiomes recover over time, Dantas and colleagues - including first author Andrew Gasparrini, PhD, who was a graduate student at the time the study was conducted, and co-authors Phillip I. Tarr, MD, the Melvin E. Carnahan Professor of Pediatrics, and Barbara Warner, MD, director of the Division of Newborn Medicine - analyzed 437 fecal samples collected from 58 infants ranging in age from birth to 21 months. Forty-one of the infants were born about 2½ months premature, and the remainder were born at full term.

All of the preemies had been treated with antibiotics in the NICU. Nine had received just one course, and the other 32 each had been given an average of eight courses and spent about half their time in the NICU on antibiotics. None of the full-term babies had received antibiotics.

The researchers discovered that preemies who had been heavily treated with antibiotics carried significantly more drug-resistant bacteria in their gut microbiomes at 21 months of age than preemies who had received just one course of antibiotics, or full-term infants who had not received antibiotics. The presence of drug-resistant bacteria did not necessarily cause any immediate problems for the babies because most gut bacteria are harmless - as long as they stay in the gut. But gut microbes sometimes escape the intestine and travel to the bloodstream, urinary tract or other parts of the body. When they do, drug resistance can make the resulting infections very difficult to treat.

Moreover, by culturing bacteria from fecal samples taken eight to 10 months apart, the researchers discovered that the drug-resistant strains present in older babies were the same ones that had established themselves early on.

"They weren't just similar bugs, they were the same bugs, as best we could tell," Dantas said. "We had cleared an opening for these early invaders with antibiotics, and once they got in, they were not going to let anybody push them out. And while we didn't show that these specific bugs had caused disease in our kids, these are exactly the kind of bacteria that cause urinary tract and bloodstream infections and other problems. So you have a situation where potentially pathogenic microbes are getting established early in life and sticking around."

Further studies showed that all of the babies developed diverse microbiomes by 21 months of age - a good sign, since a lack of microbial diversity is associated with immune and metabolic disorders in children and adults. But heavily treated preemies developed diverse microbiomes more slowly than lightly treated preemies and full-term infants. Further, the makeup of the gut microbial communities differed, with heavily treated premature infants having fewer healthy groups of bacteria, such as Bifidobacteriaceae, and more unhealthy kinds, such as Proteobacteria.
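Microbial diversity in studies like this is commonly summarized with an index such as Shannon diversity; the paper's exact metric is not specified here, so the sketch below is purely illustrative of how such an index rewards even, species-rich communities.

```python
import math

def shannon_diversity(counts: list[int]) -> float:
    """Shannon index H = -sum(p * ln(p)) over taxa with nonzero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical abundance profiles: an even four-taxon community scores
# higher (more diverse) than one dominated by a single taxon.
print(round(shannon_diversity([25, 25, 25, 25]), 2))  # 1.39
print(round(shannon_diversity([85, 5, 5, 5]), 2))     # 0.59
```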

The findings already have led Warner, who cares for premature infants in the NICU at St. Louis Children's Hospital, and her fellow neonatologists to scale back their use of antibiotics.

"We're no longer saying, 'Let's just start them on antibiotics because it's better to be safe than sorry,'" Warner said. "Now we know there's a risk of selecting for organisms that can persist and create health risks later in childhood and in life. So we're being much more judicious about initiating antibiotic use, and when we do start babies on antibiotics, we take them off as soon as the bacteria are cleared. We still have to use antibiotics - there's no question that they save lives - but we've been able to reduce antibiotic use significantly with no increase in adverse outcomes for the children."

Credit: 
Washington University School of Medicine

Precious metal flecks could be catalyst for better cancer therapies

Tiny extracts of a precious metal used widely in industry could play a vital role in new cancer therapies.

Researchers have found a way to dispatch minute fragments of palladium - a key component in motor manufacture, electronics and the oil industry - inside cancerous cells.

Scientists have long known that the metal, used in catalytic converters to detoxify exhaust, could be used to aid cancer treatment but, until now, have been unable to deliver it to affected areas.

A molecular shuttle system that targets specific cancer cells has been created by a team at the University of Edinburgh and the Universidad de Zaragoza in Spain.

The new method, which exploits palladium's ability to accelerate - or catalyse - chemical reactions, mimics the process some viruses use to cross cell membranes and spread infection.

The team used bubble-like pouches that resemble the biological carriers known as exosomes, which transport essential proteins and genetic material between cells. Exosomes exit and enter cells, release their contents, and influence how the cells behave.

This targeted transport system, which some viruses also exploit to spread infection to other cells and tissues, inspired the team to investigate exosomes as shuttles for therapeutics.

The researchers have now shown that this complex communication network can be hijacked. The team created exosomes derived from lung cancer cells and cells associated with glioma - a tumour that occurs in the brain and spinal cord - and loaded them with palladium catalysts.

These artificial exosomes act as Trojan horses, carrying the catalysts - which work in tandem with an existing cancer drug - straight to primary tumours and metastatic cells.

Having proved the concept in laboratory tests, the researchers have now been granted a patent that gives them exclusive rights to trial palladium-based therapies in medicine.

The study was funded by the Engineering and Physical Sciences Research Council and the European Research Council, and has been published in the journal Nature Catalysis.

Professor Asier Unciti-Broceta, from the University of Edinburgh's CRUK Edinburgh Centre, said: "We have tricked exosomes naturally released by cancer cells into taking up a metal that will activate chemotherapy drugs just inside the cancer cells, which could leave healthy cells untouched."

Professor Jesús Santamaría, of the Universidad de Zaragoza, said: "This has the potential to be a very exciting technology. It could allow us to target the main tumour and metastatic cells, thus reducing the side effects of chemotherapy without compromising the treatment."

Credit: 
University of Edinburgh