
Exploring the risk of ALL in children with Down syndrome

image: Dr. Philip Lupo (left) and Dr. Karen Rabin, the corresponding authors of this work.

Image: 
Baylor College of Medicine

Acute lymphoblastic leukemia (ALL) is the most common childhood cancer. Children with trisomy 21 (Down syndrome) are 10 to 20 times more likely to develop ALL than children without Down syndrome. Historically, children with Down syndrome and ALL had more complications from treatment and poorer outcomes. However, outcomes are improving as we learn more about ALL in Down syndrome and how best to provide treatment and supportive care.

At Baylor College of Medicine, Dr. Karen R. Rabin and Dr. Philip J. Lupo have been investigating the genetic underpinnings of why there is a higher risk of ALL in Down syndrome.

"Children with Down syndrome stand out to me as a subpopulation within ALL that is still a little bit of a mystery. We don't understand why they have an increased risk of leukemia, although this has been recognized since the 1950s," said Rabin, associate professor of pediatric hematology and oncology and member of the Dan L Duncan Comprehensive Cancer Center at Baylor College of Medicine.

Although there have been a number of ideas to explain this mystery, the jury is still out. In this study, Rabin, Lupo and their colleagues uncovered new clues that hint at explanations for this unsolved mystery.

"We conducted a genome-wide association study (GWAS) that enabled us to look for genetic differences between children with Down syndrome who developed ALL and those who did not, differences that might explain the increased susceptibility to ALL in Down syndrome," said Rabin, who also is director of the Leukemia Program at Texas Children's Hospital. "The reasoning was that if certain gene variants appeared at a higher frequency in Down syndrome/ALL cases than in children with Down syndrome who did not have ALL, then we could infer that those genetic variants may be important for developing ALL."

It was a long project. To conduct a GWAS, the researchers needed to assemble a large number of cases and controls. It took several years, but they eventually put together about 500 cases (Down syndrome/ALL) and over 1,000 controls (Down syndrome/no ALL). They achieved these numbers thanks to collaborations with groups at other institutions.
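The case-control reasoning described above can be illustrated with a small sketch. One simple way to quantify such an association is an odds ratio from a 2x2 table of variant carriers among cases and controls; the counts below are invented for the example and are not study data (the actual analysis would use methods such as logistic regression with covariates).

```python
# Hypothetical illustration of case-control GWAS logic: compare how often a
# variant appears in cases (Down syndrome + ALL) versus controls
# (Down syndrome, no ALL) using an odds ratio. All counts are invented.

def odds_ratio(cases_with, cases_without, controls_with, controls_without):
    """Odds ratio from a 2x2 variant-by-outcome table."""
    return (cases_with * controls_without) / (cases_without * controls_with)

# Suppose a variant is carried by 150 of 500 cases and 180 of 1,000 controls.
or_value = odds_ratio(150, 350, 180, 820)
print(round(or_value, 2))  # ~1.95: carriers are almost twice as common in cases
```

An odds ratio near 1 would mean the variant is equally common in both groups; values well above 1, replicated across many samples, are what flag a variant as a candidate risk factor.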

The results

Previous genome-wide association studies had looked into the genetic variants that increase the risk of ALL in the general childhood population (children who do not have Down syndrome). These studies identified several genes associated with a higher risk of ALL.

"We found four genetic variants that were strongly associated with ALL risk in children with Down syndrome. While these genes have been previously identified in studies of ALL among children without Down syndrome, the effects were much stronger in our study," said Lupo, associate professor of pediatric hematology and oncology and member of the Dan L Duncan Comprehensive Cancer Center at Baylor. He also is the director of the Childhood Cancer Epidemiology and Prevention Program at Texas Children's Hospital.

The researchers took a closer look into two of these genes. They found, for instance, that children with Down syndrome carrying a particular variant of the CDKN2A gene have a 1.7 times higher risk of developing ALL than children without Down syndrome who carry the same variant.

"There is something about having the Down syndrome genetic background that changes the effect of that genetic variant," Rabin said.

The second gene that stood out to the researchers was IKZF1. They looked into the function of this gene, which is known to be involved in the development of B cells, a type of immune cell that typically transforms into leukemic cells in ALL.

They discovered new aspects of this gene that had not been described before. For instance, in the lab the researchers studied the effect of reducing the expression of the IKZF1 gene in cells derived from individuals with or without Down syndrome. They found that reducing IKZF1 expression resulted in significantly higher proliferation rates in Down syndrome cells than in non-Down syndrome cells. One characteristic of cancer cells is a higher proliferation rate; these results therefore suggest a mechanism by which changes in IKZF1 expression may contribute to developing ALL.

Although there is still much to learn, the researchers are optimistic.

"Our findings give us clues to explain why these genes seem to be important for causing leukemia in children with Down syndrome. Having a better understanding of this medical mystery can help us develop tests to identify children who have a higher risk of developing leukemia, and to uncover cellular pathways with the potential of becoming targets for treatment," Rabin said.

"Our findings will serve as the framework for future assessments that we hope will improve outcomes among these children, and have led to a new study funded by the NIH to further explore the role of genetics on ALL risk in children with Down syndrome," Lupo said.

Credit: 
Baylor College of Medicine

Symbiosis as a tripartite relationship

image: The three-dimensional representation of the sponge tissue illustrates the close contact of sponge cells (red) with the bacteria (turquoise) living in the sponge.

Image: 
© Martin T. Jahn, GEOMAR

Sponges form an extensive animal phylum with over 7,500 species worldwide, which occur in a wide range of habitats in the ocean. A special feature of this animal phylum is their ability to filter seawater, through which these organisms obtain their food. In doing so, certain sponge species can move up to 24,000 litres of water through their body per day. The surrounding seawater contains a wide range of viruses - on average, one millilitre of water contains 10 million viruses. The filter-feeding lifestyle of sponges, combined with the rich proliferation of viruses in the ocean, might therefore suggest that marine sponges have a viral composition similar to that of the surrounding water.

Researchers from the Collaborative Research Centre (CRC) 1182 "Origin and Function of Metaorganisms" at Kiel University (CAU) and the GEOMAR Helmholtz Centre for Ocean Research Kiel have now surprisingly shown that sponges possess a very specific viral sequence signature (i.e., virome), which is remarkably unique even for the individuals of a given species. Certain bacteriophages - i.e. viruses that attack bacteria - are further able to modulate the host immune system and thus protect bacterial symbionts from being digested. While viruses are typically known for their pathogenic properties, the new research findings now also demonstrate a positive influence of bacteriophages on the interaction of host organisms with bacteria. The results were obtained through international cooperation between three countries, including researchers at the universities of Würzburg, Barcelona and Utrecht. The study published today in the renowned journal Cell Host & Microbe thus sheds new light on the symbiosis between multicellular organisms and their microbial communities, which may be regulated by bacteriophages in a tripartite relationship.

An unexplored microcosm

In order to analyse the composition of the viral community of sponges, the researchers examined four different sponge species from a defined location in the Mediterranean Sea. In each case, they compared numerous individuals and different tissues of the same species with each other. "Contrary to our original assumption, each sponge individual has its own unique virome, even when living right next to each other. Therefore, no two sponges are alike with regard to their viral community," summarised Martin T. Jahn, a doctoral researcher at GEOMAR and early career researcher at the CRC 1182. "The composition of the virome is thus not primarily determined by the environment or the exposure of the tissue to the surrounding water, but is rather defined by internal factors," said the first author of the study, who collaborated with other early career researchers from four working groups at the CRC 1182.

Notably, the viruses discovered in sponges were largely unknown. "We have found almost 500 new genera of viruses in our samples," emphasised Jahn. "These viruses are completely new, and possibly occur only in sponges and nowhere else in nature." This order of magnitude shows that the study of viral diversity is only just beginning.

The animal host, bacteria and phages interact with each other

The observed differences between the viral communities of sponges and those of seawater raised the question of whether sponge viruses have specific functions. The research team investigated the viral gene inventories and discovered genes similar to those of multicellular organisms, where they are responsible for interactions of certain proteins. "This surprising result awakened our special interest," said Ute Hentschel Humeida, CRC 1182 member and professor of marine microbiology at GEOMAR. "We wanted to understand why the bacteriophages have a gene encoding a protein, which we would rather expect in multicellular organisms," continued Hentschel Humeida.

In order to investigate the role of this so-called ANKp protein, they examined its impact in a model system: they expressed the protein in the bacterium Escherichia coli and investigated its effect on certain scavenger cells (macrophages) that occur in the immune system of vertebrates. The result points to a central role for the ANKp protein: the scavenger cells destroyed significantly less of the E. coli expressing it. Strikingly, the protein apparently enables the bacteriophages to interact with the animal host by downregulating the host's immune response, thereby protecting the bacteria from being digested. The scientists therefore suggest that bacteriophages are part of a tripartite interaction of host organism, bacteria and bacteriophages, in which they provide mechanisms for maintaining symbiotic co-existence.

Extension of the symbiosis concept?

The researchers at the CRC 1182 interpret the new results as a novel and important contribution of bacteriophages to the symbioses of multicellular host organisms and their microbial partners. "We suspect that bacteriophages are major players in the interaction between multicellular host organisms - including humans - and bacteria," summarised Martin T. Jahn. "Viral proteins such as ANKp may even enable this interplay of hosts and bacteria in the first place, because they allow the bacteria to evade the immune system of the host," continued Jahn. "The fundamental concept of symbiosis can therefore be understood as an interaction between three parties," concluded Hentschel Humeida. In the future, Hentschel Humeida and team will further investigate this hypothesis, which is of central importance for metaorganism research, and confirm the functional participation of bacteriophages in host-microbe symbioses.

Credit: 
Kiel University

We are all mutants, more or less

SALT LAKE CITY - Everyone is a mutant but some are prone to diverge more than others, report scientists at University of Utah Health.

At birth, children typically have 70 new genetic mutations compared to their parents (out of the 6 billion letters that make up both parental copies of the DNA sequence). A new study published in eLife shows that this number varies dramatically, with some people being born with twice as many mutations as others, and that the characteristic runs in families.

That difference is based largely on two influences. One is the age of a child's parents. A child born to a father who is 35 years old will likely have more mutations than a sibling born to the same father at 25.

"The number of mutations we pass on to the next generation increases with parental age," said Thomas Sasani, lead author of the study and a graduate student in human genetics at U of U Health. Previous studies have demonstrated this phenomenon, and the new study confirms it.

Another difference is that the effects of parental age on mutation rates differ considerably among families -- much more than had been previously appreciated. In one family, a child may have two additional mutations compared to a sibling born when their parents were ten years younger. Two siblings born ten years apart to a different set of parents may vary by more than 30 mutations.
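The family-to-family spread described above can be sketched as a toy linear model: each family has its own baseline mutation count and its own per-year parental-age effect. The baselines and slopes below are invented for illustration only.

```python
# Toy model (invented numbers) of per-family differences in the parental-age
# effect on new mutations, as described in the study's findings.

def expected_mutations(paternal_age, baseline, per_year):
    """Expected new mutations in a child given paternal age at conception."""
    return baseline + per_year * paternal_age

# Family A has a weak age effect; family B a strong one.
diff_a = expected_mutations(35, 50, 0.2) - expected_mutations(25, 50, 0.2)
diff_b = expected_mutations(35, 20, 3.0) - expected_mutations(25, 20, 3.0)
# Siblings born 10 years apart differ by 2 mutations in family A
# but by 30 in family B, matching the range quoted in the text.
```

The point of the sketch is that the same 10-year age gap translates into very different mutation differences depending on the family-specific slope.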

"This shows that we as parents are not all equal in this regard," said Aaron Quinlan, PhD, senior author of the study. He is also a professor of human genetics at U of U Health and associate director of the Utah Center for Genetic Discovery. "Some of us pass on more mutations than others and this is an important source of genetic novelty and genetic disease."

Impacts of new mutations depend on where they land in our DNA, and on the passage of time. On occasion the genetic changes cause serious disease, but the majority occur in parts of our genetic code that don't have obvious effects on human health.

And even though new changes make up a small fraction of the overall DNA sequence, they add up with each subsequent generation. Increasing the so-called mutation load could potentially make individuals more susceptible to illness, said Sasani. It remains to be determined whether factors that impact the mutation rate increase the likelihood for certain diseases.

Although the majority of new mutations originally arise in fathers' sperm, not all do. One in five mutations comes from mothers' eggs, and increasing age does not drive as many new mutations in moms as it does in dads. Further, it's estimated that one in ten new mutations seen in children comes from neither parent. Instead, they arise anew in the embryo soon after fertilization.

The new insights were found by performing whole genome sequencing and genetic analysis on 603 individuals from 33 three-generation families from Utah, the largest study of its kind. The families were part of the Centre d'Etude du Polymorphisme Humain (CEPH) consortium, which was central to many key investigations that formed a modern understanding of human genetics. The large size of the Utah CEPH families, which had as many as 16 children over a span of 27 years, made them well-suited for this new investigation.

It's surprising that the Utah CEPH families have a large range in the number of mutations they accumulate, says Quinlan. That's because the families are similar in many ways. They are all of European ancestry, live within the same geographic region, and likely have similar lifestyles and environmental exposures.

"We don't know what's driving the variability," he says, but reasons that it stems from a combination of genetics, environment, and exposure to mutagens. Given that these influences differ widely across the globe, Quinlan hypothesizes that "variability in mutation rates worldwide must be much, much larger."

Credit: 
University of Utah Health

Commit a crime? Loved ones got your back

ANN ARBOR--Reading about a child abuse case or someone burglarizing homes often stirs feelings of disgust, anger and disbelief when it's learned the perpetrator's family or friends did nothing to stop it or report it to police.

But when it's your own family member or friend who committed the crime, you're less likely to do anything about it either, according to a new University of Michigan study.

The findings, published in Personality and Social Psychology Bulletin, indicate that people are more likely to protect those close to them when moral infractions are committed, particularly highly severe acts such as theft, blackmail and groping.

Regardless of gender, political orientation, morals or disgust by the offense, the tendency is to not sacrifice the relationship--even for the good of society. Researchers expressed surprise that people tend to become more protective of a loved one as the severity of the crime increases.

"We were really taken aback to see that most people predict that they will protect those close to them even in the face of heinous moral infractions," said Aaron Weidman, a psychology research fellow and the study's co-lead author.

Weidman and colleagues analyzed the responses from more than 2,800 people across 10 studies. They tested whether people are more likely to report that they'd protect those close to them (versus strangers) after imagining them commit immoral acts of theft and sexual harassment.

For example, participants were asked to imagine that a police officer asked them if they knew anything about an immoral act they had witnessed. They were more willing to lie (and thus break the law) to protect someone close to them, such as a family member or close friend.

On the other hand, if the perpetrator was a stranger, participants wanted the individual to be formally punished, possibly by turning them in to law enforcement or subjecting them to social ostracism.

To understand these results, the research team examined potential psychological explanations for this behavior. They found that many people justify their decision to protect those they know and love by reporting that they'd discipline the perpetrator on their own. By doing this, people maintain their self-image as morally upstanding individuals while preserving the close relationship, the researchers said.

"Loyalty is a powerful motivator that, under certain circumstances, can override other virtues like honesty," said Walter Sowden, the study's other lead author and a former U-M psychology doctoral student who is now an Army research psychologist.

The researchers also demonstrated how this pervasive bias to protect friends and loved ones could be attenuated--by instructing people to adopt a psychologically distanced perspective. In two experiments, they found that asking participants to reason about the most severe forms of moral transgression from a third-person perspective nudged them toward making the more ethical decision.

Credit: 
University of Michigan

What wolves' teeth reveal about their lives

image: Biologist Blaire Van Valkenburgh has spent more than three decades studying the skulls of large carnivores. Here she displays a replica of a saber-toothed cat skull. At left are the skulls of a spotted hyena (in white) and a dire wolf (the black skull).

Image: 
Christelle Snow/UCLA

UCLA evolutionary biologist Blaire Van Valkenburgh has spent more than three decades studying the skulls of many species of large carnivores -- including wolves, lions and tigers -- that lived from 50,000 years ago to the present. She reports today in the journal eLife the answer to a puzzling question.

Essential to the survival of these carnivores are their teeth, which are used for securing prey and chewing it, yet large numbers of these animals have broken teeth. Why is that, and what can we learn from it?

In the research, Van Valkenburgh reports a strong link between an increase in broken teeth and a decline in the amount of available food, as large carnivores work harder to catch dwindling numbers of prey, and eat more of it, down to the bones.

"Broken teeth cannot heal, so most of the time, carnivores are not going to chew on bones and risk breaking their teeth unless they have to," said Van Valkenburgh, a UCLA distinguished professor of ecology and evolutionary biology, who holds the Donald R. Dickey Chair in Vertebrate Biology.

For the new research, Van Valkenburgh studied the skulls of gray wolves -- 160 skulls of adult wolves housed in the Yellowstone Heritage and Research Center in Montana; 64 adult wolf skulls from Isle Royale National Park in Lake Superior that are housed at Michigan Technological University; and 94 skulls from Scandinavia, collected between 1998 and 2010, housed in the Swedish Royal Museum of Natural History in Stockholm. She compared these with the skulls of 223 wolves that died between 1874 and 1952, from Alaska, Texas, New Mexico, Idaho and Canada.

Yellowstone had no wolves, Van Valkenburgh said, between the 1920s and 1995, when 31 gray wolves were brought to the national park from British Columbia. About 100 wolves have lived in Yellowstone for more than a decade, she said.

In Yellowstone, more than 90% of the wolves' prey are elk. The ratio of elk to wolves has declined sharply, from more than 600-to-1 when wolves were brought back to the national park to about 100-to-1 more recently.

In the first 10 years after the reintroduction, the wolves did not break their teeth much and did not eat the elk completely, Van Valkenburgh reports. In the following 10 years, as the number of elk declined, the wolves ate more of the elk's body, and the number of broken teeth doubled, including the larger teeth wolves use when hunting and chewing.

The pattern was similar in the island park of Isle Royale. There, the wolves' prey are primarily adult moose, but moose numbers are low and their large size makes them difficult to capture and kill. Isle Royale wolves had high frequencies of broken and heavily worn teeth, reflecting the fact that they consumed about 90% of the bodies of the moose they killed.

Scandinavian wolves presented a different story. The ratio of moose to wolves is nearly 500-to-1 in Scandinavia and only 55-to-1 in Isle Royale, and, consistent with Van Valkenburgh's hypothesis, Scandinavian wolves consumed less of the moose they killed (about 70%) than Isle Royale wolves. Van Valkenburgh did not find many broken teeth among the Scandinavian wolves. "The wolves could find moose easily, not eat the bones, and move on," she said.

Van Valkenburgh believes her findings apply beyond gray wolves, which are well-studied, to other large carnivores, such as lions, tigers and bears.

Extremely high rates of broken teeth have been recorded for large carnivores -- such as lions, dire wolves and saber-toothed cats -- from the Pleistocene epoch, dating back tens of thousands of years, compared with their modern counterparts, Van Valkenburgh said. Rates of broken teeth from animals at the La Brea Tar Pits were two to four times higher than in modern animals, she and colleagues reported in the journal Science in the 1990s.

"Our new study suggests that the cause of this tooth fracture may have been more intense competition for food in the past than in present large carnivore communities," Van Valkenburgh said.

She and colleagues reported in 2015 that violent attacks by packs of some of the world's largest carnivores -- including lions much larger than those of today and saber-toothed cats -- went a long way toward shaping ecosystems during the Pleistocene.

In a 2016 article in the journal BioScience, Van Valkenburgh and more than 40 other wildlife experts wrote that preventing the extinction of lions, tigers, wolves, bears, elephants and the world's other largest mammals will require bold political action and financial commitments from nations worldwide.

Discussing the new study, she said, "We want to understand the factors that increase mortality in large carnivores that, in many cases, are near extinction. Getting good information on that is difficult. Studying tooth fracture is one way to do so, and can reveal changing levels of food stress in big carnivores."

Credit: 
University of California - Los Angeles

Tapeworms need to keep their head to regenerate

image: Tapeworm stem cells drive growth and regeneration

Image: 
Tania Rozario

Scientists have identified the stem cells that allow tapeworms to regenerate and found that their location in proximity to the head is essential, according to a new study in eLife.

These novel insights can help explain how tapeworms grow in their human and animal hosts and could be helpful in finding new ways to target these parasites.

Tapeworms are famous for the enormous lengths they reach and their ability to grow thousands of segments, called proglottids. During their normal life cycle, they shed large parts of their body and then regenerate to maintain a certain length. However, it has never been fully understood how they achieve this.

"We know that tapeworm regeneration is likely to involve stem cells, but up until now their potential to regenerate has never been comprehensively studied," explains lead author Tania Rozario of the Morgridge Institute for Research at the University of Wisconsin-Madison, US. "In this study, we explored which parts of the tapeworm are able to regenerate and how this regeneration is driven by stem cells."

The team used a toolbox of molecular techniques to answer these questions. First, they removed certain fragments of the worms and then grew them in the lab to determine which regions of the body can regenerate. This showed that neither the head nor posterior body alone can regenerate and that the neck portion is needed. Astonishingly, severing the head from the neck did not stop the tapeworm from continuing to grow, but the regeneration of new segments was inhibited. Only when the tapeworm head and neck were left intact could the tapeworm continuously regenerate segments.

Next, they tested whether the neck contains special stem cells that allow tapeworms to regenerate. They labelled rapidly multiplying cells in the worms and then studied their location in the body. They found that these cells exist throughout the whole body and not just in the neck. Further studies also revealed no evidence for stem cells that are unique to the neck.

These findings led the team to speculate that stem cells are found throughout the tapeworm but that signals operating only in the neck are necessary to activate them. To test this, they provided irradiated tapeworms (which would otherwise be destined to die) with donor cells from different parts of a healthy worm. These donor cells rescued the worms and allowed them to regenerate. When stem cells were removed from the donor cells, this rescue could not occur.

This proves that the rapidly growing cells identified in the study are bona fide stem cells that can establish and regenerate within another host worm. Moreover, stem cells from any part of the body could rescue the injured worms, which suggests that external factors, rather than features of stem cells in the neck, allow regeneration to occur.

"It appears that in tapeworms, location matters enormously," concludes senior author Phillip Newmark, Howard Hughes Medical Institute Investigator at the Morgridge Institute for Research. "The head and neck environments provide cues that control the ability of stem cells to regenerate segments, even though the stem cells involved in this process are not confined to either one of these areas of the body."

Credit: 
eLife

Outer hair cells regulate ear's sensitivity to sound

The ear's tiny outer hair cells adjust the sensitivity of neighbouring inner hair cells to sound levels rather than acting like an amplifier, suggests a new study published today in eLife.

The discovery in gerbils contributes to our understanding of the role that outer hair cells play in hearing. The findings could also be useful for developing better ways to protect these delicate cells from harm to prevent hearing loss.

Tiny cells with hair-like protrusions in the inner ear act like microphones by converting vibrations caused by sound into electrical signals that the brain interprets. These inner hair cells work alongside outer hair cells, whose role in hearing is sometimes debated.

"When outer hair cells are damaged, vibrations become much smaller than those in a healthy ear," explains lead author Anna Vavakou, a PhD student in the Department of Neuroscience at Erasmus Medical Center in Rotterdam, the Netherlands. "This has led scientists to believe that the outer hair cells actively amplify sounds, but there is a problem with this theory. The electrical properties of the outer hair cells would make them too sluggish to deal with the fast vibrations of high-pitched sounds, especially in animals with ultrasound hearing."

To better understand what the outer hair cells do, Vavakou and her colleagues used a new technology called optical coherence tomography vibrometry to measure the minute movements of outer hair cells in live gerbils in response to sounds. The cells were able to move fast enough to respond to tones up to about 2.5 kilohertz - or about halfway up the top octave of a piano keyboard. At higher frequencies, the team saw that these cells were less able to keep up with the vibrations.

This suggests that while gerbils have good ultrasonic hearing, their outer hair cells are not able to amplify these sounds, but they do accurately track variations in sound levels.

"Rather than amplifying sound, the cells seem to monitor sound level and regulate sensitivity accordingly," Vavakou says. "This is what engineers call automatic gain control, which is used in many devices like cell phones."
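The automatic gain control analogy above can be made concrete with a minimal sketch: the gain is turned down as the tracked signal level rises, so quiet and loud inputs both settle near the same output level. The smoothing constant and target level below are illustrative choices, not parameters from the study.

```python
# Minimal sketch of automatic gain control (AGC): track the input level
# with a running average and scale by a gain that shrinks as level grows.
# Parameters (target, attack) are illustrative.

def agc(samples, target=1.0, attack=0.1):
    """Return samples rescaled by a slowly adapting gain."""
    level, out = target, []
    for x in samples:
        # smoothly track the envelope of the input
        level = (1 - attack) * level + attack * abs(x)
        gain = target / max(level, 1e-9)  # louder input -> lower gain
        out.append(x * gain)
    return out

quiet = agc([0.1] * 50)[-1]
loud = agc([10.0] * 50)[-1]
# After settling, a quiet input and a 100x louder input both end up
# near the same output level - the "regulated sensitivity" in the text.
```

This is the same principle a phone's microphone chain uses: the system does not make the signal intrinsically louder so much as it continuously renormalizes it to a comfortable operating range.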

"A better understanding of what these outer hair cells do is critically important," explains senior author Marcel van der Heijden, who leads the Auditory Periphery Laboratory at the Erasmus University Medical Center. "Factors such as loud noise and certain drugs, including antibiotics, can easily damage these cells, which can in turn lead to poorer hearing. Figuring out their exact role could help guide efforts to prevent or even cure common forms of hearing loss."

Credit: 
eLife

Naming of new interstellar visitor: 2I/Borisov

image: The first-ever comet from beyond our Solar System, as imaged by the Gemini Observatory. The image of the newly discovered object, named 2I/Borisov, was obtained on the night of 9-10 September 2019 using the Gemini Multi-Object Spectrograph on the Gemini North Telescope on Hawaii's Mauna Kea.

Image: 
Gemini Observatory/NSF/AURA

On 30 August 2019 the amateur astronomer Gennady Borisov, from MARGO observatory, Crimea, discovered an object with a comet-like appearance. The object has a condensed coma, and more recently a short tail has been observed. Mr. Borisov made this discovery with a 0.65-metre telescope he built himself.

After a week of observations by amateur and professional astronomers all over the world, the IAU Minor Planet Center was able to compute a preliminary orbit, which suggested this object was interstellar -- only the second such object known to have passed through the Solar System.

The orbit is now sufficiently well known, and the object is unambiguously interstellar in origin; it has received its final designation as the second interstellar object, 2I. In this case, the IAU has decided to follow the tradition of naming cometary objects after their discoverers, so the object has been named 2I/Borisov.

Of the thousands of comets discovered so far, none has an orbit as hyperbolic as that of 2I/Borisov. This conclusion is independently supported by the NASA JPL Solar System Dynamics Group. Coming just two years after the discovery of the first interstellar object 1I/'Oumuamua, this new finding suggests that such objects may be sufficiently numerous to provide a new way of investigating processes in planetary systems beyond our own.
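The hyperbolic-orbit criterion mentioned above is a simple threshold on orbital eccentricity: bound orbits have e < 1, while e > 1 means the object is not gravitationally bound to the Sun. The sketch below uses the approximate published eccentricity of 2I/Borisov (~3.36) and of comet 1P/Halley (~0.967) purely as illustrative values.

```python
# Classify an orbit by its eccentricity: e < 1 is an ellipse (bound),
# e = 1 a parabola, and e > 1 a hyperbola that leaves the Solar System.

def orbit_type(eccentricity):
    if eccentricity < 1.0:
        return "elliptical (bound)"
    if eccentricity == 1.0:
        return "parabolic (marginally unbound)"
    return "hyperbolic (unbound)"

print(orbit_type(0.967))  # 1P/Halley: a bound, periodic comet
print(orbit_type(3.36))   # 2I/Borisov: strongly hyperbolic, interstellar
```

An eccentricity as large as Borisov's is far beyond what small gravitational perturbations could produce in a Solar System comet, which is why the orbit alone establishes its interstellar origin.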

2I/Borisov will make its closest approach to the Sun (reach its perihelion) on 7 December 2019, when it will be 2 astronomical units (AU) from the Sun and also 2 AU from Earth. By December and January it is expected that it will be at its brightest in the southern sky. It will then begin its outbound journey, eventually leaving the Solar System forever.

Astronomers are eagerly observing this object, which will be continuously observable for many months, a period longer than that of its predecessor, 1I/'Oumuamua. Astronomers are optimistic about their chances of studying this rare guest in great detail.

Estimating the size of a comet is difficult because the small cometary nucleus is embedded in the coma, but, based on the observed brightness, 2I/Borisov appears to be a few kilometres in diameter. One of the largest telescopes in the world, the 10.4-metre Gran Telescopio Canarias in the Canary Islands, has already obtained a spectrum of 2I/Borisov and found it to resemble those of typical cometary nuclei.

This new interstellar visitor raises intriguing questions: Why have interstellar objects not been discovered before? What is the expected rate of their appearance in the inner Solar System? How do such objects compare to similar bodies within the Solar System? Large telescopic surveys capable of scanning large fractions of the sky on a regular basis may help to answer these questions and more in the near future.

Credit: 
International Astronomical Union

Could we feed one million people living on Mars?

image: New Space is the only international peer-reviewed journal dedicated to academic, industry, and government contributions to space entrepreneurship and innovation.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, September 24, 2019--A provocative new study looks at the resource utilization and technological strategies that would be needed to make a Mars population of one million people food self-sufficient. A detailed model of population growth, caloric needs, land use, and potential food sources showed that food self-sufficiency could be achieved within 100 years. The study is published in New Space: The Journal of Space Entrepreneurship and Innovation, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers. Click here to read the full-text article free on the New Space website through October 24, 2019.

In the article entitled "Feeding One Million People on Mars," coauthors Kevin Cannon and Daniel Britt, University of Central Florida, Orlando, evaluated different food sources and quantitatively modeled the shifting balance between food supplied from Earth and food produced locally on Mars over time. The model is based on a diet composed of plants, insects, and cellular agriculture, which can produce "clean" meat and fish, algae, chicken-less eggs and cow-less milk. The study takes into account the energy, water, and other systems needed for food production. The researchers discuss the implications of their findings and present recommendations for future research.

"To meet the human right of survival, some minimum daily requirement for calories and nutrition will be a necessary activity for settlement on any moon or planet. Anything above these minimum requirements, however, could be a commercial activity," says Editor-in-Chief of New Space Ken Davidian, who has worked in the commercial space transportation industry for over 30 years. "It's not hard to imagine that coffee, or extra fruit, or any food item that exceeds the minimum requirements, would be a fungible item, if customers want to indulge themselves."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Wistar receives over $12M for clinical research on opioid use in HIV-infected people

image: (L-R) Dr. Luis Montaner with members of his lab at The Wistar Institute

Image: 
The Wistar Institute

PHILADELPHIA -- (September 24, 2019) -- The Wistar Institute was awarded two major grants totaling more than $12 million from the National Institute on Drug Abuse (NIDA), part of the National Institutes of Health, to fund an international multidisciplinary clinical research consortium spearheaded by Wistar's HIV Research Program. The consortium, including several partner institutions in the U.S. and abroad, will investigate the impact of opioid use disorder (OUD) and medications for opioid use disorder (MOUDs) on immune recovery in response to antiretroviral therapy (ART) in HIV-infected people.

"We have uncovered a potential link between substance abuse, HIV infection and MOUDs that may determine health outcomes only if the right medication is chosen," said study leader Luis J. Montaner, D.V.M., D.Phil., the Herbert Kean, M.D., Family Professor and director of the HIV-1 Immunopathogenesis Laboratory at Wistar's Vaccine & Immunotherapy Center.

Both HIV infection and chronic opioid exposure are associated with immune activation, which leads to T-cell depletion and progression to acquired immunodeficiency syndrome (AIDS).

OUD is commonly treated with drugs that either activate (agonists) or block (antagonists) the opioid receptor. "Yet, we have a very limited understanding of how the medications we use to treat OUD impact disease progression and the response to ART in people living with HIV," commented Montaner.

The overarching goal of this research is to investigate the role of opioid receptor involvement in modulating the levels of immune activation, and the effects of different classes of MOUDs, in people living with HIV. Effectively controlling immune activation after ART in persons taking MOUDs can directly impact health and mortality.

The NIDA support of this initiative will fund two clinical studies:

The first grant provides $8,373,891 over five years for an international trial conducted among the U.S., Vietnam and France, in collaboration with the Vietnam Ministry of Health, the Perelman School of Medicine at the University of Pennsylvania, the Institute of Applied Medicine and Epidemiology (a French-led initiative to expand access to HIV/hepatitis prevention and treatment services), and the Pasteur Institute.

The goal of this three-arm randomized trial, conducted in Vietnam and co-led by Montaner and David Metzger, Ph.D., a research professor and director of the HIV Prevention Research Division at the Perelman School of Medicine at the University of Pennsylvania, is to evaluate the impact of long-term opioid receptor stimulation or blockade with MOUDs on immune reconstitution in HIV-infected people who inject drugs and are initiating ART. Preliminary data suggest that chronic opioid receptor engagement by an opioid receptor agonist while on ART may result in increased immune activation and inflammation associated with increased levels of persistent HIV, when compared to a full opioid receptor antagonist. To test this hypothesis, the study will assess recovery outcomes and adherence to therapy 48 weeks after initiation of ART in 225 participants with OUD who receive either methadone (opioid receptor agonist), extended-release naltrexone (antagonist) or buprenorphine (partial agonist).

A second, complementary grant will provide $3,889,138 over five years for mechanistic studies in local persons living with HIV who are on ART and taking MOUDs. Collaborators on this research are the Perelman School of Medicine at the University of Pennsylvania, the Jonathan Lax Treatment Center, and the Icahn School of Medicine at Mount Sinai. The study will assess the preliminary observation that greater myeloid activation and HIV persistence are present in people receiving opioid receptor agonists than in people treated with the opioid receptor antagonist naltrexone.

Blood and tissue samples from individuals living with HIV who are receiving ART and treatment with different MOUDs will be used to study the mechanisms that regulate persistent immune activation and residual HIV expression.

"We expect the results of this major collaborative effort, which has its hub in Philadelphia, to have broad clinical implications in informing the best pharmacologic strategy for the management of opioid use disease in HIV-infected people starting ART," said Montaner. "This is directly relevant in light of the opioid epidemic ongoing in our nation and will help ensure that the right medications are used for both HIV and OUD, with the ultimate objective of saving lives in the future."

Credit: 
The Wistar Institute

The problem with promoting 'responsible dog ownership'

image: Dr Carri Westgarth and companion with dogs

Image: 
University of Liverpool

Dog welfare campaigns that tell people to be 'responsible owners' don't help to promote behaviour change, a new University of Liverpool report suggests.

Dog owners interviewed for a study published in Anthrozoös all considered themselves to be responsible owners, despite there being great variation in key aspects of their dog-owning behaviour.

"Policy and campaigning messages related to dog ownership and welfare tend to focus on the concept of being a responsible owner. However, while 'responsible dog ownership' has considerable appeal as a concept, how it is perceived and interpreted has not been studied in-depth," explains lead researcher Dr Carri Westgarth, a dog behaviour expert at the University of Liverpool.

In order to better understand beliefs and views about responsibility in dog ownership, the researchers carried out in-depth interviews with dog-owning households and shorter interviews with dog owners while walking their dogs or representing their breed at a dog show. The interviews focused on dog walking, an issue perceived to be a component of responsible dog ownership, as well as other aspects of campaign messages, such as dog fouling, aggression and neutering.

Dr Westgarth also reflected on her own experiences of walking her three dogs, and on her many conversations with other owners over the two-year study period.

Dr Westgarth said: "It's clear from our research that responsible dog ownership means different things to different people at different times. It emerges from a blurred intersection of the needs of dogs, owners, and others, where often the dog comes first.

"Dog owners do what they perceive to be best for their individual dog, even if this goes against general advice given such as how often dogs need walking or neutering campaigns.

"Yet this perception may be different from what others feel is best for that dog, or how people who are impacted by the dog want the dog and their owner to behave.

"Therefore, simply telling owners that they should 'be responsible' is of limited use as a message to promote behaviour change because they already believe that they are. Any educational messages for dog owners need to be specific about what they want owners to do and explain why that is in the best interest of the dog that they love so much."

The report authors say that further research is now required in order to understand the implications for wider aspects of responsible dog ownership practices.

Credit: 
University of Liverpool

Using light to speed up computation

image: Researchers in Japan have developed a type of processor called PAXEL, a device that can potentially bypass Moore's Law and increase the speed and efficiency of computing. In APL Photonics, the researchers looked at using light for the data transport step in integrated circuits, since photons are not subject to Moore's Law. Instead of integrated electronic circuits, much new development now involves photonic integrated circuits. The PAXEL accelerator takes this approach and uses power-efficient nanophotonics. This image shows the evolution and bottlenecks of electronic integrated circuits for digital computing, and cloud versus fog computing and use of PAXEL devices.

Image: 
Ken-ichi Kitayama

WASHINGTON, D.C., September 24, 2019 -- A group of researchers in Japan has developed a new type of processor known as PAXEL, a device that can potentially bypass Moore's Law and increase the speed and efficiency of computing. PAXEL, which stands for photonic accelerator, is placed at the front end of a digital computer and optimized to perform specific functions but with less power consumption than is needed for fully electronic devices.

Metal-oxide semiconductor field-effect transistors are the basis for most integrated electronic circuits, but they are limited by Moore's Law, which says the number of transistors on a microprocessor chip will double approximately every two years. There is an inherent limit to this, though, because as transistors shrink they approach scales at which the quantum mechanical nature of electrons comes into play.

It is possible to partially overcome the Moore's Law problem by using parallel processing, in which multiple processors carry out simultaneous computations. This approach does not work for every application, however.

In a paper in APL Photonics, from AIP Publishing, the researchers looked at another technique to use light for the data transport step in integrated circuits, since photons are not subject to Moore's Law. Instead of integrated electronic circuits, much new development now involves photonic integrated circuits (PICs). The PAXEL accelerator takes this approach and uses power-efficient nanophotonics, which are very small PICs.

Nanophotonics, such as those used in PAXEL, operate at the speed of light and can carry out computations in an analog fashion, with data mapped onto light intensity levels. Multiplications or additions are then performed by varying light intensity. The investigators considered different PAXEL architectures for a variety of uses including artificial neural networks, reservoir computing, pass-gate logic, decision-making and compressed sensing.
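As a rough illustration of the analog principle described above, a photonic multiply-accumulate can be modeled in software: input values are mapped onto light intensities, each channel is attenuated by a weight (for example, a modulator transmittance between 0 and 1), and a detector sums the attenuated intensities. This is a conceptual sketch, not the PAXEL hardware; all names and the transmittance encoding are illustrative assumptions.

```python
def photonic_mac(inputs, weights):
    """Model an analog multiply-accumulate:
    inputs  -> light intensities on parallel channels,
    weights -> modulator transmittances (0..1) attenuating each channel,
    return  -> detector output, the sum of attenuated intensities."""
    if any(not 0.0 <= w <= 1.0 for w in weights):
        raise ValueError("a transmittance must lie in [0, 1]")
    return sum(x * w for x, w in zip(inputs, weights))

# A dot product computed "optically": each multiply is an attenuation,
# the addition happens when the beams land on one detector.
print(photonic_mac([1.0, 0.5, 2.0], [0.5, 1.0, 0.25]))  # 1.5
```

Because the multiplications and the summation happen in the optical domain, such an accelerator avoids the transistor-switching bottleneck for this class of operations, which is what makes it attractive for neural-network-style workloads.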

One particularly interesting application of PAXEL is in so-called fog computing. This is like cloud computing but uses computational resources (servers) near the "ground" where the originating event occurs. A compact PAXEL attached to a tablet or other hand-held device could detect signals and transmit the information through a 5G wireless link to nearby fog computing resources for data analysis.

Applications of this new technology are expected in a wide array of areas including medical and veterinary point-of-care testing, diagnostics, drug and food testing, and biodefense. As more of our household and business devices are connected through the web, better computing capacity, including data transport with higher energy efficiency, will be needed. Advances such as PAXEL are expected to help meet these needs.

Credit: 
American Institute of Physics

Large-scale enhanced recovery program improves outcomes for bariatric surgery patients

CHICAGO (September 24, 2019): A large-scale implementation of a protocol to improve recovery of patients after weight-loss operations was found to reduce rates of extended hospitalization by almost half at 36 participating accredited bariatric surgery centers nationwide, according to a study published online ahead of print in the current issue of the journal Surgery for Obesity and Related Diseases. The initiative, titled ENERGY--for Employing Enhanced Recovery Goals in Bariatric Surgery--compared outcomes of 8,946 bariatric operations performed before implementation of the protocol, known as an enhanced recovery program (ERP), with 9,102 performed after.

For this study, the ERP comprised 26 different process measures aimed at improving outcomes after weight-loss operations. "The key finding of this study is that the more adherent a program was to all of the process measures of the protocol, the greater the reduction their patients experienced in their extended length of stay," said lead author Stacy A. Brethauer, MD, FACS, professor of surgery at The Ohio State University, Columbus. The researchers defined extended length of stay (LOS) as any hospitalization of more than four days after the operation. Before the ERP, 8.1 percent of operations resulted in an extended LOS; after ERP, the rate declined to 4.5 percent. "This result was accomplished without increasing readmission rates," Dr. Brethauer said. At centers that complied with 23 or more of the 26 ERP process measures, the rate of extended LOS was 2.3 percent vs. 5.4 percent at those that complied with 19 or 20.
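The "almost half" headline claim can be checked directly from the reported rates: a drop from 8.1 percent to 4.5 percent of operations with an extended LOS is a relative reduction of roughly 44 percent. A quick check:

```python
before, after = 0.081, 0.045  # extended-LOS rates before and after the ERP
relative_reduction = (before - after) / before
print(f"{relative_reduction:.1%}")  # 44.4%
```

The same arithmetic applied to the high- vs. lower-adherence centers (2.3 percent vs. 5.4 percent) gives an even larger relative difference, which is the study's adherence finding.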

ERPs have been around for nearly two decades, first adopted in the United States by colorectal and orthopedic surgery and anesthesiology units in hospitals. The goal is to maintain a patient's normal physiological state as much as possible throughout the operation and recovery process, Dr. Brethauer explained. "This goal is accomplished by allowing patients to arrive for surgery in a physiologically 'fed' state after drinking a carbohydrate drink two hours prior to surgery, minimizing fluid overload during and after surgery, maintaining tight blood sugar control, implementing opioid-sparing multimodal pain management strategies, and minimizing emotional and physical stress that can accompany a major operation," Dr. Brethauer said. The protocol also eliminates the use of drains and urinary catheters, encourages early mobilization after operations, and the use of regional anesthetic blocks and non-opioids as first- and second-line pain management treatments.

"Patient education is critical to a successful enhanced recovery program," he said. "Setting expectations and describing the opioid-sparing pain management strategies to patients before their operations is important and helps patients understand their role in their recovery." In this study population, 87 percent of patients in ERP had no Foley catheter placed, 81 percent received oral liquids within eight hours of the operation, and 84 percent were up and moving within eight hours of their procedure.

As for pain management, 82 percent of patients received a regional block or lidocaine drip during surgery, 79 percent received either acetaminophen (Tylenol®) in combination with another non-narcotic pain medication such as celecoxib (Celebrex®) or gabapentin as their primary pain medication after surgery, and 25 percent received no opioid medications after they left the recovery room. "Routine use of patient-controlled analgesia with opioids was not allowed," Dr. Brethauer noted. "All of these measures improve recovery by reducing nausea, postoperative ileus (a temporary slowing of bowel function) and other opioid-related adverse events. Also, as fewer patients are exposed to opioids after bariatric operations, there will likely be fewer patients who become addicted and continue to use opioids beyond their recovery period."

The Metabolic and Bariatric Surgery Accreditation and Quality Improvement Program (MBSAQIP) launched this quality improvement project to implement a prescriptive ERP. The researchers invited 80 MBSAQIP centers identified as outliers for extended LOS in the MBSAQIP database and 36 enrolled after reviewing the protocol and the commitment required. "Implementation of the protocol required multiple stakeholders--anesthesia, nursing, pharmacy, administration, surgical team--at each site to commit to the protocol," Dr. Brethauer said.

As a result of the study, MBSAQIP is encouraging all of its accredited centers to adopt enhanced recovery protocols into their surgical practice, Dr. Brethauer said. "Enhanced recovery can and should be implemented on a large scale in bariatric surgery with the goal of decreasing variations in care, eliminating practices that are not evidence-based, and improving clinical outcomes," he said. The published study includes the ENERGY protocol as an appendix. Furthermore, MBSAQIP is developing an implementation toolkit for its accredited centers to use in furthering ERP efforts.

The next national quality improvement project will focus specifically on opioid prescribing after bariatric surgery, Dr. Brethauer added. "This project will involve many more centers, and opioid-sparing strategies in the hospital and at discharge will be implemented and measured for one year in hopes of decreasing opioid exposure to our patients and minimizing the number of opioid prescriptions out in the community," he said.

Credit: 
American College of Surgeons

Modest improvements in diets of US adults but still too much sugar, saturated fat

Bottom Line: U.S. adults made modest improvements to their diets in recent years but still consume too many low-quality carbohydrates and too much saturated fat, based on an analysis of nationally representative survey data. The study included data from nearly 44,000 adults who reported their dietary intake in a 24-hour period. Researchers report a decline in the consumption of low-quality carbohydrates (primarily added sugar) and increases in high-quality carbohydrates (primarily whole grains), plant protein (primarily whole grains and nuts) and polyunsaturated fatty acids from 1999 to 2016. However, intake of low-quality carbohydrates and saturated fat remained high. There was slight improvement in overall diet quality as assessed by a measure of adherence to key recommendations in dietary guidelines. A limitation of the study is its use of self-reported dietary data.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Zhilei Shan, M.D., Ph.D., Harvard T. H. Chan School of Public Health, Boston, Fang Fang Zhang, M.D., Ph.D., Tufts University, Boston, and coauthors

(doi:10.1001/jama.2019.13771)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

USPSTF recommendation on screening for asymptomatic bacteriuria in adults

Bottom Line: The U.S. Preventive Services Task Force (USPSTF) recommends screening people who are pregnant for asymptomatic bacteriuria (bacteria in the urine without signs or symptoms of a urinary tract infection) using urine culture and not screening other adults. The condition is present in an estimated 2% to 10% of pregnant women and is associated with pyelonephritis, a kidney infection that is a common reason for hospitalization in pregnant women. The USPSTF routinely makes recommendations about the effectiveness of preventive care services and this statement updates its 2008 recommendation.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jama.2019.13069)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network