Culture

WSU grizzly research reveals remarkable genetic regulation during hibernation

image: Dr. Lynne Nelson (left foreground), and Dr. Charlie Robbins (left side back row) conduct a cardiac ultrasound on a groggy bear during hibernation in past research done at Washington State University's Bear Research, Education, and Conservation Center. Nelson's work contributed to the understanding of a grizzly's unique physiology.

Image: 
Photo by Henry Moore Jr. BCU/WSU

PULLMAN, Wash.--Being a human couch potato can greatly increase fat accumulation, hasten the onset of Type II diabetes symptoms, cause detrimental changes in blood chemistry and cardiovascular function, and, eventually, bring about one's death.

Large hibernators such as bears, however, have evolved to adapt to and reverse the similar metabolic stressors they face each year before and during hibernation, becoming essentially immune to these ill effects.

New RNA sequencing-based genetic research conducted at Washington State University's Bear Research, Education, and Conservation Center shows grizzlies express a larger number of genes in preparation for and during hibernation to cope with such stressors than any other species studied.

This gene-switching superlative holds true even when one corrects for the different sample sizes used in other hibernation studies.

The work was conducted in Pullman, Washington, home of the only university-based captive grizzly bear population in the world. It was published Sept. 13, in Communications Biology, a Springer Nature publication. The WSU scientists biopsied muscle, liver, and fat tissues for the study.

It begins with the wonders of hibernation

For centuries, people have been fascinated with the various species known to hibernate. Science fiction writers describe fantastic space journeys in which humans are placed in hibernation states. Medically induced comas carry humans through extraordinary trauma or disease, organs are cooled for storage and transport, and scientists continue to wonder whether hibernation could be induced as a therapeutic tool. A wide variety of species have been studied, including those that 'hibernate' in the warmer months and ones that hibernate in winter, whose body temperature can sometimes drop to near freezing. But not bears. Bears appear to be more like humans.

Unlike what some assume is a sleep equivalent, hibernation is a very specialized metabolic state that varies by species and the environment where they hibernate. The mechanism for hibernation in all species studied is controlled by their gene expression.

"Bears and other hibernators have sleep and wake cycles, but these differ in both the type of sleep and the frequency with which they occur," said Professor Heiko Jansen, lead author of the paper.

With grizzlies, the observable details of hibernation are astounding. For nearly five months of hibernation, grizzlies maintain only a slightly lowered body temperature and essentially do not eat, urinate, or defecate. They do, however, give birth and produce milk, and all the while they do not lose significant bone or muscle mass. In metabolic terms, during hibernation they are the ultimate recyclers of the waste products that mammals usually have to eliminate or suffer toxicity from as they build up.

Studies done with humans show that even as little as 24 hours of fasting and confinement to a bed can lead to measurable changes in blood glucose and chemistry as well as bone and muscle loss. Grizzlies, though, maintain a near normal blood glucose level throughout hibernation. By turning down their sensitivity to insulin, bears can conserve the glucose they produce.

Grizzly cardiac studies conducted previously at WSU show that during hibernation bears' heart rates can slow down to as low as 5 beats per minute, with 12 to 15 being average, while the consistency of their blood resembles thick gravy. Their heart saves energy by essentially confining its pumping to two of its four chambers. Yet, startle a hibernating grizzly, and it can rev its heart up to 100 beats per minute in mere seconds.

Grizzlies have developed this unique set of adaptations in order to survive harsh winters when food is scarce. By hibernating through that period, they expend little energy and survive, give birth, and nurse until food becomes abundant again.

Gene expression in hibernating bears is remarkable

While other studies in other species have looked at gene expression in tissues before and during hibernation, the work until now had never been done with grizzlies. The WSU results, while somewhat expected, far exceeded the level of differential gene expression seen before.

"The number of differentially expressed genes is striking," said WSU Associate Professor Joanna Kelley.

By sequencing RNA, the team looked at hyperphagia [pronounced HY-per-fay-gee-uh] and subsequent hibernation across six bears. Hyperphagia is the period right before hibernation when bears begin eating to excess in order to store energy as fat. Bears at this time of year would be considered morbidly obese by human standards.

Among the discoveries was that all three tissues studied showed dynamic gene expression changes during hibernation. Perhaps more importantly, the researchers found a subset of genes that made the same changes at the same time in all three tissues.

Fat is the tissue that fuels hibernation and probably orchestrates the sparing of other tissues. But, despite the calorie intake and fat accumulation, bears do not suffer the same negative effects people do. Furthermore, they reverse the process by switching genes on and off based upon the season.

Fasting a bear during the active season as if it is time to hibernate does not make the same genes switch on and off like it would in late fall. Feed a bear in hibernation, and the genes can't be fooled then either; hibernation continues.

"Many people assumed that as bears go through hyperphagia, fat just sort of accumulates and sits there as a fuel reservoir," explained Jansen. "In fact, our studies have shown that the fatty tissue is far from inert. Fat is actually very metabolically active, driven by the expression of over 1,000 unique genes in fat during hibernation as compared to the level of expression seen during normal seasonal activity.

"What fat's entire role is in the overall hyperphagia and hibernation process remains an exciting area to continue to explore."

Jansen went on to explain that during the active period and subsequent hyperphagia, gene expression varied among the tissues studied. While many genes in fat were being differentially expressed, no genes were differentially expressed in muscle tissue, and only three were in liver tissue.

Differential gene expression also means genes may be upregulated or downregulated depending on the gene, sort of like a panel of light switches being on or off. Of the genes expressed in fatty tissue, more than 2000 were upregulated and about 1800 were downregulated in hibernation compared to the active season.
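As a toy illustration of the light-switch analogy (this is not the authors' analysis pipeline, and the gene names and fold-change values below are hypothetical), an up/down call can be sketched as a simple threshold on the log2 fold change between seasons:

```python
# Toy illustration: classify genes as upregulated or downregulated from
# hypothetical log2 fold changes (hibernation vs. active season), like a
# panel of switches read off at once.

def classify_genes(log2_fold_changes, threshold=1.0):
    """Return dicts of up- and downregulated genes.

    A gene is called 'up' if its log2 fold change is at least +threshold
    (at least a 2x increase when threshold=1.0) and 'down' if it is at
    most -threshold. Everything in between is considered unchanged.
    """
    up = {g: fc for g, fc in log2_fold_changes.items() if fc >= threshold}
    down = {g: fc for g, fc in log2_fold_changes.items() if fc <= -threshold}
    return up, down

# Hypothetical fat-tissue measurements, hibernation relative to active season.
fat_tissue = {"PDK4": 2.3, "LEP": -1.8, "UCP1": 0.4, "PPARG": 1.2, "INSR": -1.1}
up, down = classify_genes(fat_tissue)
print(sorted(up))    # genes switched 'on' in hibernation
print(sorted(down))  # genes switched 'off' in hibernation
```

Real RNA-seq analyses also test each fold change for statistical significance before calling a gene differentially expressed; the threshold here stands in for that whole step.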

"Seeing the same sets of genes in different tissues being upregulated or downregulated at the same time, in the way that best serves the animal, suggests there may be a common control mechanism for the process, making this more a case of 'on-demand' regulation," Kelley said.

Credit: 
Washington State University

Shape-shifting robots built from smarticles could navigate Army operations

image: Five identical "smarticles" -- smart active particles -- interact with one another in an enclosure. By nudging each other, the group -- dubbed a "supersmarticle" -- can move in random ways. The research could lead to robotic systems capable of changing their shapes, modalities and functions.

Image: 
Rob Felt, Georgia Tech

RESEARCH TRIANGLE PARK, N.C. -- A U.S. Army project took a new approach to developing robots -- researchers built robots entirely from smaller robots known as smarticles, unlocking the principles of a potentially new locomotion technique.

Researchers at Georgia Institute of Technology and Northwestern University published their findings in the journal Science Robotics.

The research could lead to robotic systems capable of changing their shapes, modalities and functions, said Sam Stanton, program manager, complex dynamics and systems at the Army Research Office, an element of U.S. Army Combat Capabilities Development Command's Army Research Laboratory, the Army's corporate research laboratory.

"For example, as envisioned by the Army Functional Concept for Maneuver, a robotic swarm may someday be capable of moving to a river and then autonomously forming a structure to span the gap," he said.

The 3D-printed smarticles -- short for smart active particles -- can do just one thing: flap their two arms. But when five of these smarticles are confined in a circle, they begin to nudge one another, forming a robophysical system known as a "supersmarticle" that can move by itself. Adding a light or sound sensor allows the supersmarticle to move in response to the stimulus -- and even be controlled well enough to navigate a maze.

The notion of making robots from smaller robots -- and taking advantage of the group capabilities that arise by combining individuals -- could provide mechanically based control over very small robots. Ultimately, the emergent behavior of the group could provide a new locomotion and control approach for small robots that could potentially change shapes.

"These are very rudimentary robots whose behavior is dominated by mechanics and the laws of physics," said Dan Goldman, a Dunn Family Professor in the School of Physics at the Georgia Institute of Technology and the project's principal investigator. "We are not looking to put sophisticated control, sensing and computation on them all. As robots become smaller and smaller, we'll have to use mechanics and physics principles to control them because they won't have the level of computation and sensing we would need for conventional control."

The foundation for the research came from an unlikely source: a study of construction staples. By pouring these heavy-duty staples into a container with removable sides, former doctoral student Nick Gravish -- now a faculty member at the University of California San Diego -- created structures that would stand by themselves after the container's walls were removed.

Shaking the staple towers eventually caused them to collapse, but the observations led to a realization that simple entangling of mechanical objects could create structures with capabilities well beyond those of the individual components.

"Dan Goldman's research is identifying physical principles that may prove essential for engineering emergent behavior in future robot collectives as well as new understanding of fundamental tradeoffs in system performance, responsiveness, uncertainty, resiliency and adaptivity," Stanton said.

The researchers used a 3D printer to create battery-powered smarticles, which have motors, simple sensors and limited computing power. The devices can change their location only when they interact with other devices while enclosed by a ring.

"Even though no individual robot could move on its own, the cloud composed of multiple robots could move as it pushed itself apart and shrink as it pulled itself together," Goldman said. "If you put a ring around the cloud of little robots, they start kicking each other around and the larger ring -- what we call a supersmarticle -- moves around randomly."

The researchers noticed that if one small robot stopped moving, perhaps because its battery died, the group of smarticles would begin moving in the direction of that stalled robot. The researchers learned they could control the movement by adding photo sensors that halt a robot's arm flapping when a strong beam of light hits it.

"If you angle the flashlight just right, you can highlight the robot you want to be inactive, and that causes the ring to lurch toward or away from it, even though no robots are programmed to move toward the light," Goldman said. "That allowed steering of the ensemble in a very rudimentary, stochastic way."
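The drift toward a stalled robot can be captured in a toy model (our sketch, not the researchers' simulation): each active smarticle delivers small random kicks whose average direction points away from its own position on the ring, so the kicks cancel when all five are active, and silencing one robot leaves a net drift toward it.

```python
import math
import random

def simulate_drift(inactive_index, steps=5000, kick=0.01, seed=1):
    """Toy supersmarticle: return the ring's net displacement (x, y).

    Five robots sit at equal angles on a unit circle. Each active robot
    pushes the ring slightly away from itself, plus isotropic noise; the
    stalled robot contributes nothing, so the pushes no longer cancel.
    """
    rng = random.Random(seed)
    angles = [2 * math.pi * k / 5 for k in range(5)]
    x = y = 0.0
    for _ in range(steps):
        for k, a in enumerate(angles):
            if k == inactive_index:
                continue  # dead battery: no kick from this robot
            x += -kick * math.cos(a) + rng.gauss(0, kick)
            y += -kick * math.sin(a) + rng.gauss(0, kick)
    return x, y

# Stall the robot at angle 0 (the +x direction): the ring drifts toward it.
x, y = simulate_drift(0)
print(x > abs(y))  # True: drift is predominantly along +x, toward the stalled robot
```

Because the five position vectors sum to zero, removing one robot's push leaves a residual pointing straight at it, which is exactly the lurch toward the inactive robot the researchers observed.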

In future work, Goldman envisions more complex interactions that use the simple sensing and movement capabilities of the smarticles. "People have been interested in making a certain kind of swarm robots that are composed of other robots," he said. "These structures could be reconfigured on demand to meet specific needs by tweaking their geometry."

Swarming formations of robotic systems could be used to enhance situational awareness and mission-command capabilities for small Army units in difficult-to-maneuver environments like cities, forests, caves or other rugged terrain.

Credit: 
U.S. Army Research Laboratory

Study shows pre-disaster collaboration key to community resilience

New Orleans, LA - LSU Health New Orleans-led research reports that the key to improving community resiliency following disasters is a dynamic partnership between community-based organizations and public health agencies established pre-disaster. The results are published in the American Journal of Public Health.

"Promoting community resilience to disasters has recently become a national public health priority," notes Benjamin Springgate, MD, MPH, Chief of Community and Population Medicine at LSU Health New Orleans. "This is especially important in South Louisiana, at risk for both natural and man-made disasters, and its vulnerable populations."

The Community Resilience Learning Collaborative and Research Network (C-LEARN) is a multiphase study examining opportunities to improve community resilience to the threats of disaster in South Louisiana. Although community and faith-based organizations are trusted and often fill vital roles when local, state or federal response to disasters is delayed or inadequate, members of these organizations feel that local health authorities do not include them in pre-disaster planning.

During Phase I of the study, the researchers interviewed 48 participants from 12 parishes who were employees or volunteers at community-based organizations focused on health, social services or community development. Participants represented 47 agencies that provide primary care, housing and homelessness services, social services and advocacy, faith-based services including spiritual, social and cultural needs, consulting, funding and education. Key themes included maintaining continuous, effective communication and year-round network building; forging pre-disaster strategic partnerships; providing appropriate education and training; and building an integrated system that enables rapid disaster response.

"Many of those we interviewed do not specialize in disaster management, yet their firsthand experience in disaster response after hurricanes Katrina and Rita, the BP Oil Disaster and the 2016 Great Flood in Baton Rouge offered invaluable insights," adds Dr. Springgate, who is also the principal investigator of C-LEARN.

One of the new insights participants revealed was that preventive coordination of community members, faith-based organizations, nonprofits, academic institutions, hospitals, police, public health services, neighborhood associations and government agencies contributes to planning and response systems that react to disasters quickly, equitably and effectively.

The authors conclude, "Results of this study indicate that to most effectively bolster community resilience in disaster-prone areas, community-based organizations and public health agencies must maintain continuous, effective communication and year-round network building, participate in partnerships before a disaster strikes, provide appropriate education and training, and contribute to building an integrated system that enables rapid disaster response."

"By strengthening interagency relationships between sectors, we are now conducting Phase II of our Community Resilience Learning Collaborative and Research Network study testing whether agencies are better equipped to support each other and address their communities' diverse needs," says Springgate.

Credit: 
Louisiana State University Health Sciences Center

Unlock your smartphone with earbuds

image: A University at Buffalo-led research team is developing EarEcho, a biometric tool that uses modified wireless earbuds to authenticate smartphone users via the unique geometry of their ear canal.

Image: 
University at Buffalo

BUFFALO, N.Y. -- Visit a public space. Chances are you'll see people wearing earbuds or earphones.

The pervasiveness of this old-meets-new technology, especially on college campuses, intrigued University at Buffalo computer scientist Zhanpeng Jin.

"We have so many students walking around with speakers in their ears. It led me to wonder what else we could do with them," says Jin, PhD, associate professor in the Department of Computer Science and Engineering in the UB School of Engineering and Applied Sciences.

That curiosity has led to EarEcho, a biometric tool a research team led by Jin is developing that uses modified wireless earbuds to authenticate smartphone users via the unique geometry of their ear canal.

A prototype of the system, described in this month's Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, a journal published quarterly by the Association for Computing Machinery, proved roughly 95% effective.

UB's Technology Transfer office has filed a provisional patent application for the technology.

How EarEcho works

The team built the prototype with off-the-shelf products, including a pair of in-ear earphones and a tiny microphone. Researchers developed acoustic signal processing techniques to limit noise interference, and models to share information between EarEcho's components.

When a sound is played into someone's ear, the sound propagates through and is reflected and absorbed by the ear canal -- all of which produce a unique signature that can be recorded by the microphone.

"It doesn't matter what the sound is, everyone's ears are different and we can show that in the audio recording," says Jin. "This uniqueness can lead to a new way of confirming the identity of the user, equivalent to fingerprinting."

The information gathered by the microphone is sent by the earbuds' Bluetooth connection to the smartphone where it is analyzed.

To test the device, 20 subjects listened to audio samples that included a variety of speech, music and other content. The team conducted tests in different environmental settings (on the street, in a shopping mall, etc.) and with the subjects in different positions (sitting, standing, head tilted, etc.).

EarEcho proved roughly 95% effective when given 1 second to authenticate the subjects. The score improved to 97.5% when it continued to monitor the subject in 3-second windows.
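The continuous-monitoring idea can be sketched as follows (a hypothetical illustration; the article does not detail EarEcho's actual signal processing or models, and the feature vectors below are made up): treat each one-second recording as a feature vector, score it against the enrolled owner's template, and average the scores over consecutive windows before deciding.

```python
import math

# Hypothetical sketch of template matching for ear-canal echo signatures.

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def authenticate(windows, template, threshold=0.9):
    """Accept if the average similarity across windows clears the threshold."""
    scores = [cosine(w, template) for w in windows]
    return sum(scores) / len(scores) >= threshold

template = [0.9, 0.1, 0.4, 0.2]              # enrolled ear-canal signature
owner    = [[0.88, 0.12, 0.41, 0.19],        # three 1-second windows
            [0.91, 0.09, 0.38, 0.22],
            [0.87, 0.13, 0.42, 0.18]]
stranger = [[0.1, 0.9, 0.2, 0.7]] * 3

print(authenticate(owner, template))    # True
print(authenticate(stranger, template)) # False
```

Averaging over several windows is one plausible reason a longer monitoring window raises accuracy: single-window noise gets smoothed out before the accept/reject decision.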

How EarEcho can be used

Theoretically, users could rely on EarEcho to unlock their smartphones, thereby reducing the need for passcodes, fingerprints, facial recognition and other biometrics.

But Jin sees its greatest potential use in continuously monitoring a smartphone user. EarEcho, which works when users are listening to their earbuds, is a passive system, meaning users need not take any action, such as submitting a fingerprint or voice command, for it to work, he says.

Such a system, he argues, is ideal for situations where users are required to verify their identity such as making mobile payments. It also could eliminate the need to re-enter passcodes or fingerprints when a phone locks up after not being used.

"Think about that," says Jin, "just by wearing the earphones, which many people already do, you wouldn't have to do anything to unlock your phone."

Credit: 
University at Buffalo

Learning to read boosts the visual brain

image: Illiterate people in India learning to read

Image: 
Falk Huettig

Reading is a recent invention in the history of human culture--too recent for dedicated brain networks to have evolved specifically for it. How, then, do we accomplish this remarkable feat? As we learn to read, a brain region known as the 'visual word form area' (VWFA) becomes sensitive to script (letters or characters). However, some have claimed that the development of this area takes up (and thus detrimentally affects) space that is otherwise available for processing culturally relevant objects such as faces, houses or tools.

An international research team led by Falk Huettig (MPI and Radboud University Nijmegen) and Alexis Hervais-Adelman (MPI and University of Zurich) set out to test the effect of reading on the brain's visual system. The team scanned the brains of over ninety adults living in a remote part of Northern India with varying degrees of literacy (from people unable to read to skilled readers), using functional Magnetic Resonance Imaging (fMRI). While in the scanner, participants saw sentences, letters, and other visual categories such as faces.

If learning to read leads to 'competition' with other visual areas in the brain, readers should have different brain activation patterns from non-readers--and not just for letters, but also for faces, tools, or houses. 'Recycling' of brain networks when learning to read has previously been thought to negatively affect evolutionary old functions such as face processing. Huettig and Hervais-Adelman, however, hypothesised that reading, rather than negatively affecting brain responses to non-orthographic (non-letter) objects, may, conversely, result in increased brain responses to visual stimuli in general.

"When we learn to read, we exploit the brain's capacity to form category-selective patches in visual brain areas. These arise in the same cortical territory as specialisations for other categories that are important to people, such as faces and houses. A long-standing question has been whether learning to read is detrimental to those other categories, given that there is limited space in the brain", explains Alexis Hervais-Adelman.

Reading-induced recycling did not detrimentally affect brain areas for faces, houses, or tools--neither in location nor size. Strikingly, the brain activation for letters and faces was more similar in readers than in non-readers, particularly in the left hemisphere (the left ventral temporal lobe).

"Far from cannibalising the territory of its neighbours, the visual word form area (VWFA) is rather overlaid upon these, remaining responsive to other visual categories", explains Falk Huettig. "Thus learning to read is good for you", he concludes. "It sharpens visual brain responses beyond reading and has a general positive impact on your visual system".

Credit: 
Max Planck Institute for Psycholinguistics

How sleepless nights compromise the health of your gut

It is well known that individuals who work night-shifts, or travel often across different time zones, have a higher tendency to become overweight and suffer from gut inflammation. The underlying cause for this robust phenomenon has been the subject of many studies that tried to relate physiological processes with the activity of the brain's circadian clock, which is generated in response to the daylight cycle.

Now, the group of Henrique Veiga-Fernandes, at the Champalimaud Centre for the Unknown in Lisbon, Portugal, discovered that the function of a group of immune cells, which are known to be strong contributors to gut health, is directly controlled by the brain's circadian clock. Their findings were published today in the scientific journal Nature.

"Sleep deprivation, or altered sleep habits, can have dramatic health consequences, resulting in a range of diseases that frequently have an immune component, such as bowel inflammatory conditions", says Veiga-Fernandes, the principal investigator who led the study. "To understand why this happens, we started by asking whether immune cells in the gut are influenced by the circadian clock."

The big clock and the little clock

Almost all cells in the body have an internal genetic machinery that follows the circadian rhythm through the expression of what are commonly known as "clock genes". The clock genes work like little clocks that inform cells of the time of day, thereby helping the organs and systems the cells make up anticipate what is going to happen, for instance whether it is time to eat or sleep.

Even though these cell clocks are autonomous, they still need to be synchronised in order to make sure that "everyone is on the same page". "The cells inside the body don't have direct information about external light, which means that individual cell clocks can be off", Veiga-Fernandes explains. "The job of the brain's clock, which receives direct information about daylight, is to synchronise all of these little clocks inside the body so that all systems are in synch, which is absolutely crucial for our wellbeing".

Among the variety of immune cells that are present in the intestine, the team discovered that Type 3 Innate Lymphoid Cells (ILC3s) were particularly susceptible to perturbations of their clock genes. "These cells fulfill important functions in the gut: they fight infection, control the integrity of the gut epithelium and instruct lipid absorption", explains Veiga-Fernandes. "When we disrupted their clocks, we found that the number of ILC3s in the gut was significantly reduced. This resulted in severe inflammation, breaching of the gut barrier, and increased fat accumulation."

These robust results drove the team to investigate why the number of ILC3s in the gut is so strongly affected by the brain's circadian clock. The answer to this question ended up being the missing link they were searching for.

It's all about being in the right place at the right time

When the team analysed how disrupting the brain's circadian clock influenced the expression of different genes in ILC3s, they found that it resulted in a very specific problem: the molecular zip-code was missing! It so happens that in order to localise to the intestine, ILC3s need to express a protein on their membrane that works as a molecular zip-code. This 'tag' instructs ILC3s, which are transient residents in the gut, where to migrate. In the absence of the brain's circadian inputs, ILC3s failed to express this tag, which meant they were unable to reach their destination.

According to Veiga-Fernandes, these results are very exciting, because they clarify why gut health becomes compromised in individuals who are routinely active during the night. "This mechanism is a beautiful example of evolutionary adaptation", says Veiga-Fernandes. "During the day's active period, which is when you feed, the brain's circadian clock reduces the activity of ILC3s in order to promote healthy lipid metabolism. But then, the gut could be damaged during feeding. So after the feeding period is over, the brain's circadian clock instructs ILC3s to come back into the gut, where they are now needed to fight against invaders and promote regeneration of the epithelium."

"It comes as no surprise then", he continues, "that people who work at night can suffer from inflammatory intestinal disorders. It has all to do with the fact that this specific neuro-immune axis is so well-regulated by the brain's clock that any changes in our habits have an immediate impact on these important, ancient immune cells."

This study joins a series of groundbreaking discoveries produced by Veiga-Fernandes and his team, all drawing new links between the immune and nervous systems. "The concept that the nervous system can coordinate the function of the immune system is entirely novel. It has been a very inspiring journey; the more we learn about this link, the more we understand how important it is for our wellbeing and we are looking forward to seeing what we will find next", he concludes.

Credit: 
Champalimaud Centre for the Unknown

A promising HIV vaccine shows signs of cross-protective benefits

video: The journey towards an effective HIV vaccine. This material relates to a paper that appeared in the Sep. 18, 2019, issue of Science Translational Medicine, published by AAAS. The paper, by G.E. Gray at University of the Witwatersrand in Johannesburg, South Africa, and colleagues was titled, "Immune correlates of the Thai RV144 HIV vaccine regimen in South Africa."

Image: 
©South African Medical Research Council, Produced by JP Crouch for Blue Pear Visuals

One of the most successful candidate HIV vaccines to date - initially tested in Thailand, where it had modest effects - showed surprisingly strong efficacy when evaluated in a South African cohort, where a different strain of HIV is known to circulate. The research hints that the RV144 vaccine regimen could provide protection against multiple strains of HIV, whose genetic diversity is a challenge for vaccine strategies.

Scientists have attempted to create a vaccine for HIV that can provide long-lasting protection, but most candidates have failed to provide substantial benefits in early clinical trials. The RV144 vaccine granted modest protection against the clade B HIV subtype in a clinical trial in Thailand. However, it was unclear whether RV144 could provide benefits to people living in regions such as South Africa that are dominated by different clades of the virus.

To investigate, Glenda Gray and colleagues compared immunological data from the RV144 study to an analysis of 100 HIV-negative South Africans who were given the same vaccine in a phase 1b trial. Surprisingly, they discovered that the RV144 regimen stimulated even stronger immune responses in the South Africans while being well tolerated. Specifically, the vaccine elicited CD4+ T cell and anti-HIV antibody responses that are associated with protection against HIV, irrespective of sex, age or locale. The authors also observed that immune responses waned over time in both studies, suggesting that additional booster doses could help maintain the vaccine's efficacy. More research is needed to see if the vaccine grants protection from infection, but the findings indicate that RV144 could be more adaptable across endemic regions than previously thought.

Credit: 
American Association for the Advancement of Science (AAAS)

Wilderness areas halve extinction risk

image: The area surrounding the Madidi National Park in the Bolivian Amazon has been identified as a vital 'at risk' wilderness area.


The global conservation community has been urged to adopt a specific target to protect the world's remaining wilderness areas to prevent large scale loss of at-risk species.

A University of Queensland and CSIRO study has found that wilderness areas - where human impact is minimal or absent - halve the global risk of species extinction.

UQ Centre for Biodiversity and Conservation Science Director Professor James Watson said vital wilderness areas could not be restored so urgent action was needed to ensure these areas were marked for conservation and remained protected.

"Wilderness areas have decreased by more than three million square kilometres - half the size of Australia - since the 1990s," Professor Watson said.

"Once these wilderness areas are gone, they are lost forever."

CSIRO researcher and UQ Adjunct Fellow Dr Moreno Di Marco said wilderness areas acted as a buffer against extinction risk, and the risk of species loss was more than twice as high for biological communities found outside wilderness areas.

"This new research has identified the importance of wilderness areas in hosting highly unique biological communities and representing the only remaining natural habitats for species that have suffered losses elsewhere," he said.

Vital 'at risk' wilderness areas include parts of Arnhem Land, areas surrounding the Madidi National Park in the Bolivian Amazon, partially protected forests in Southern British Columbia, and surrounding savannah areas within the Zemongo Reserve in the Central African Republic.

The researchers used new global biodiversity modelling infrastructure developed at CSIRO, integrated with the latest wilderness map developed by UQ, the University of Northern British Columbia and the Wildlife Conservation Society.

The study provided fine-scale estimates of probability of species loss around the globe.

Professor Watson said that beyond saving biodiversity, Earth's remaining intact ecosystems are also critical for abating climate change, regulating essential biogeochemical and water cycles, and ensuring the retention of long-term bio-cultural connections of indigenous communities.

Credit: 
University of Queensland

Brain tumors form synapses with healthy neurons, Stanford-led study finds

Scientists at the Stanford University School of Medicine have shown for the first time that severe brain cancers integrate into the brain's wiring.

The tumors, called high-grade gliomas, form synapses that hijack electrical signals from healthy nerve cells to drive their own growth. Experiments demonstrated that interrupting these signals with an existing anti-epilepsy drug greatly reduced the cancers' growth in human tumors in mice, providing the first evidence for a possible new way to treat gliomas.

A paper describing the findings will be published online Sept. 18 in Nature.

"One of the most lethal aspects of high-grade gliomas is that the cancer cells diffusely invade normal brain tissue so that the tumor and the healthy brain tissue are knitted together," said senior author Michelle Monje, MD, PhD, associate professor of neurology and neurological sciences. The discovery helps explain why gliomas are so intractable, she added. "This is such an insidious group of tumors. They're actually integrating into the brain."

The study's lead author is postdoctoral scholar Humsa Venkatesh, PhD.

Discovering that tumors wire themselves into the brain was "unsettling," Monje said. Still, she said she is optimistic about what the knowledge means for glioma patients. Several drugs already exist for treating electrical-signaling disorders such as epilepsy, and these may prove useful for gliomas, she said. "There is real hopefulness to this discovery," she said. "We've been missing this entire aspect of the disease. Now we have a whole new avenue to explore, one that could complement existing therapeutic approaches."

How the tumors grow

High-grade gliomas form synapses with healthy neurons that transmit electrical signals to the cancerous tissue, the study found. The tumors also contain cell-to-cell electrical connections known as gap junctions. Together, the two types of connections allow electrical signals from healthy nerve cells to be conducted into and amplified within the tumors.

High-grade gliomas include glioblastoma, a brain tumor seen in adults that has a five-year survival rate of 5%; diffuse intrinsic pontine glioma, a pediatric brain tumor with a five-year survival rate below 1%; and other diagnoses such as pediatric glioblastoma and diffuse midline gliomas occurring in the spinal cord and thalamus. Studies published by Monje's team in 2015 and 2017 indicated that high-grade gliomas use normal brain activity to drive their growth.

To learn how this worked, the scientists first analyzed the gene expression of thousands of individual cancer cells biopsied from newly diagnosed glioma patients. The cancer cells strongly increased the expression of genes involved in forming synapses.

The researchers then used electron microscopy, a technique that can reveal tiny details of cell anatomy, to show that structures that look like synapses exist between neurons and glioma cells. To confirm that these synapses indeed connect healthy neurons and malignant glioma cells, the scientists studied mice with cells from human gliomas implanted in their brains. After the glioma tumors had become established, the researchers used antibodies that bound to fluorescent markers expressed by the cancer cells to confirm that the synapses terminate on malignant cells. "We saw very clear neuron-to-glioma synaptic structures," Monje said.

Using brain tissue from mice with human gliomas, the researchers measured the transmission of electrical signals into and through the tumors. They recorded two types of electrical signals: brief signals lasting four to five milliseconds, which are transmitted across a synaptic junction from a healthy neuron to a cancer cell by way of neurotransmitter molecules; and sustained electrical signals lasting one to two seconds that reflect electrical current propagated by a flux of potassium ions across the tumor cells' membranes. The potassium currents are caused by signals from neurons and are amplified by gap junctions that connect the cancer cells in an electrically coupled network.

The scientists also conducted experiments using a dye to visualize the gap-junction-connected cells, and used drugs capable of blocking gap junctions to confirm that this type of junction existed between the tumor cells and mediated their electrical coupling. Further experiments measuring changes in calcium levels confirmed that the tumor cells are electrically coupled via gap junctions.

"The live calcium imaging made it strikingly clear that this cancer is an electrically active tissue," said Venkatesh, the lead author. "It was startling to see that in cancer tissue."

The researchers showed that about 5-10% of glioma cells receive synaptic signals, and about 40% exhibit prolonged potassium currents amplified by gap-junction interconnections; in all, about half of all tumor cells have some type of electrical response to signals from healthy neurons.
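The back-of-the-envelope arithmetic behind that "about half" figure can be sketched as follows; the values are midpoints of the ranges reported above, and the assumption that the two cell populations barely overlap is ours, not the study's:

```python
# Midpoints of the fractions reported in the study (illustrative only).
synaptic = 0.075   # ~5-10% of glioma cells receive direct synaptic input
potassium = 0.40   # ~40% show prolonged, gap-junction-amplified K+ currents

# If the two populations overlap little, the electrically responsive
# fraction is roughly their sum - consistent with "about half".
responsive = synaptic + potassium
print(f"electrically responsive: ~{responsive:.0%}")
```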

Possible drug therapies

In humans whose brain electrical activity was measured before surgery to remove glioblastoma tumors, and in mice with human gliomas, the researchers saw hyper-excitability of healthy neurons near the tumors, a finding that could help explain why human glioma patients are prone to seizures.

Using optogenetic techniques, which relied on laser light to activate the cancer cells in mice implanted with human gliomas, the researchers demonstrated that increasing electrical signals into the tumors caused more tumor growth. Proliferation of the tumors was largely prevented when glioma cells expressed a gene that blocked transmission of the electrical signals.

Existing drugs that block electrical currents also reduced growth of high-grade gliomas, the research found. A seizure medication called perampanel, which blocks activity of neurotransmitter receptors on the receiving end of a synapse, reduced proliferation of pediatric gliomas implanted into mice by 50%. Meclofenamate, a drug that blocks the action of gap junctions, resulted in a similar decrease in tumor proliferation.

Monje's team plans to continue investigating whether blocking electrical signaling within tumors could help people with high-grade gliomas. "It's a really hopeful new direction, and as a clinician I'm quite excited about it," she said.

Credit: 
Stanford Medicine

Gigantic asteroid collision boosted biodiversity on Earth

image: This is an illustration of an asteroid collision.

Image: 
Don Davis

An international study led by researchers from Lund University in Sweden has found that a collision in the asteroid belt 470 million years ago caused drastic changes to life on Earth. The breakup of a major asteroid filled the entire inner solar system with enormous amounts of dust, leading to a unique ice age and, subsequently, to higher levels of biodiversity. The unexpected discovery could be relevant for tackling global warming if we fail to reduce carbon dioxide emissions.

In the last few decades, researchers have begun to understand that the evolution of life on Earth also depends on astronomical events. One example is the Cretaceous-Paleogene impact of a 10 km asteroid, which wiped out the dinosaurs almost instantaneously.

For the first time, scientists can now present another example of how an extraterrestrial event shaped life on Earth: 470 million years ago, a 150 km asteroid between Jupiter and Mars broke apart, and the dust spread through the solar system.

The blocking effect of the dust partially stopped sunlight from reaching Earth and an ice age began. The climate changed from being more or less homogeneous to being divided into climate zones - from Arctic conditions at the poles to tropical conditions at the equator.

The high diversity among invertebrates came as an adaptation to the new climate, triggered by the exploded asteroid.

"It is analogous to standing the middle of your living room and smashing a vacuum cleaner bag, only at a much larger scale", explains Birger Schmitz, professor of geology at Lund University and the leader of the study.

An important method that led to the discovery was measuring extraterrestrial helium incorporated in the petrified sea floor sediments at Kinnekulle in southern Sweden. On its way to Earth, the dust was enriched with helium when bombarded by the solar wind.

"This result was completely unexpected. We have during the last 25 years leaned against very different hypotheses in terms of what happened. It wasn't until we got the last helium measurements that everything fell into place" says Birger Schmitz.

Global warming continues as a consequence of carbon dioxide emissions and the temperature rise is greatest at high latitudes. According to the Intergovernmental Panel on Climate Change, we are approaching a situation that is reminiscent of the conditions that prevailed prior to the asteroid collision 470 million years ago.

Over the last decade or so, researchers have discussed different artificial methods of cooling the Earth in case of a major climate catastrophe. Modelers have shown that it would be possible to place asteroids, much like satellites, in orbits around Earth in such a way that they continuously liberate fine dust and hence partly block the warming sunlight.

"Our results show for the first time that such dust at times has cooled Earth dramatically. Our studies can give a more detailed, empirical based understanding of how this works, and this in turn can be used to evaluate if model simulations are realistic", concludes Birger Schmitz.

Credit: 
Lund University

Babies' gut bacteria affected by delivery method, Baby Biome project shows

Babies born vaginally have different gut bacteria - their microbiome - than those delivered by Caesarean, research has shown. Scientists from the Wellcome Sanger Institute, UCL, the University of Birmingham and their collaborators discovered that whereas vaginally born babies got most of their gut bacteria from their mother, babies born via caesarean did not, and instead had more bacteria associated with hospital environments in their guts.

The exact role of the baby's gut bacteria is unclear, and it is not known whether these differences at birth will have any effect on later health. The researchers found that the differences in gut bacteria between vaginally born and caesarean-delivered babies had largely evened out by one year of age, but large follow-up studies are needed to determine whether the early differences influence health outcomes. Experts from the Royal College of Obstetricians and Gynaecologists say that these findings should not deter women from having a caesarean birth.

Published in Nature today (18th Sept), this largest-ever study of neonatal microbiomes also revealed that the microbiome of vaginally delivered newborns came not from the mother's vaginal bacteria, but from the mother's gut. This calls into question the controversial practice of swabbing babies born via caesarean with the mother's vaginal bacteria. Understanding how the birth process shapes the baby's microbiome will enable future research into bacterial therapies.

The gut microbiome is a complex ecosystem of millions of microbes, and is thought to be important for the development of the immune system. Lack of exposure to the right microbes in early childhood has been implicated in autoimmune diseases such as asthma, allergies and diabetes. However, it is not fully understood how important the initial gut microbiome is to the baby's immune system development and health, how a baby's microbiome develops, or what happens to it with different modes of birth.

To understand more about the development of the microbiome, and if the delivery method affected this, researchers studied 1,679 samples of gut bacteria from nearly 600 healthy babies and 175 mothers. Faecal samples were taken from babies aged four, seven or 21 days old, who had been born in UK hospitals by vaginal delivery or caesarean. Some babies were also followed up later, up to one year of age.

Using DNA sequencing and genomics analysis, the researchers could see which bacteria were present and found there was a significant difference between the two delivery methods. They discovered that vaginally delivered babies had many more health-associated (commensal) bacteria from their mothers, than babies who were born by caesarean.

Dr Trevor Lawley, a senior author on the paper from the Wellcome Sanger Institute, said: "This is the largest genomic investigation of newborn babies' microbiomes to date. We discovered that the mode of delivery had a great impact on the gut bacteria of newborn babies, with transmission of bacteria from mother to baby occurring during vaginal birth. Further understanding of which species of bacteria help create a healthy baby microbiome could enable us to create bacterial therapies."

Previous limited studies had suggested that vaginal bacteria were swallowed by the baby on its way down the birth canal. However, this large-scale study found babies had very few of their mother's vaginal bacteria in their guts, with no difference between babies born vaginally or by caesarean.

During birth, the baby will come into contact with bacteria from the mother's gut. The study discovered it was the mother's gut bacteria that made up much of the microbiome in the vaginally delivered babies. Babies born via caesarean had many fewer of these bacteria. This study therefore found no evidence to support controversial 'vaginal swabbing' practices, which could transfer dangerous bacteria to the baby.

In place of some of the mother's bacteria, the babies born via caesarean had more bacteria that are typically acquired in hospitals, and were more likely to have antimicrobial resistance. The researchers isolated, grew and sequenced the genomes of more than 800 of these potentially pathogenic bacteria, confirming that they were the same as strains causing bloodstream infections in UK hospitals. Although these bacteria don't usually cause disease while in the gut, they can cause infections if they get into the wrong place or if the immune system fails.

Dr Nigel Field, a senior author on the paper from UCL, said: "Our study showed that as the babies grow and take in bacteria when they feed and from everything around them, their gut microbiomes become more similar to each other. After they have been weaned, the microbiome differences between babies born via caesarean and delivered vaginally have mainly evened out. We don't yet know whether the initial differences we found will have any health implications."

Dr Alison Wright, Consultant Obstetrician and Vice President of The Royal College of Obstetricians and Gynaecologists said: "In many cases, a Caesarean is a life-saving procedure, and can be the right choice for a woman and her baby. The exact role of the microbiome in the newborn and what factors can change it are still uncertain, so we don't think this study should deter women from having a caesarean. This study shows that more research is required to improve our understanding of this important area."

All women who have a caesarean are now offered antibiotics before the delivery to help prevent the mother developing postoperative infections, meaning that the baby also receives a dose of antibiotics via the placenta. This could also cause some of the microbiome differences seen between the two birth methods.

Principal Investigator of the Baby Biome Study, Professor Peter Brocklehurst, of the University of Birmingham, said: "The first weeks of life are a critical window of development of the baby's immune system, but we know very little about it. We urgently need to follow up this study, looking at these babies as they grow to see if early differences in the microbiome lead to any health issues. Further studies will help us understand the role of gut bacteria in early life and could help us develop therapeutics to create a healthy microbiome."

Credit: 
Wellcome Trust Sanger Institute

Shifting the focus of climate-change strategies may benefit younger generations

Strategies to limit climate change that focus on warming in the next couple of decades would leave less of a burden for future generations.

Research led by Imperial College London and the International Institute for Applied Systems Analysis (IIASA), Austria, suggests a new underpinning logic for strategies that seek to limit climate change. Their new proposal is published today in Nature.

Most strategies seek to limit climate change by the year 2100. The strategies may include tactics such as deployment of new renewable technologies, removing carbon from the atmosphere (through planting trees or new technologies), or mandating energy efficiency targets.

However, by focusing on the year 2100, these strategies are inconsistent with the Paris Agreement climate goal - to keep warming below 2°C, and ideally below 1.5°C, at any time in the future.

Strategies that focus on the year 2100 could allow potentially dangerous warming to happen in the short term - in the next couple of decades - and then rely on removing carbon dioxide from the atmosphere in later decades to reach the overall targets by 2100.

These strategies place a burden of investment on later generations, and also rely on carbon removal technologies being widely available, which is in no way certain and thus a risky approach.

Instead, the team suggests climate change strategies should consider when maximum warming will occur, what that level of warming should be, and whether warming is stabilised afterwards, or efforts are made to slowly reverse it.

The researchers suggest it is more sensible, and fairer, to limit warming faster before 2050 and rely less on unproven technologies and investment by future generations - or at least make these intergenerational value judgments explicit when designing climate change strategies.

Lead researcher Dr Joeri Rogelj, from the Grantham Institute at Imperial and the IIASA, said: "When climate-change strategies were first proposed, more than 20 years ago, the planet had only warmed about 0.5°C, so there was time for a long, smooth transition to energy systems and economies that kept warming below 2°C by 2100.

"Now, however, we are at around 1°C warming and science of the last decade has shown that 2°C cannot be considered a safe limit. The need to stabilise warming more quickly is paramount, and therefore we suggest a focus on reaching net zero carbon emissions as a key milestone of any climate strategy.

"Turning the focus from the far future to the next decades, where push will come to shove in terms of adequate climate action, will help us reach the Paris Agreement goals without placing undue burden on future generations."

Net zero carbon emissions is achieved when a region (such as a city or country) balances the carbon it emits with the carbon it removes - often by planting trees or deploying technologies that capture and store carbon underground.

The research team suggests this benchmark should be the focus of climate change efforts in the short term, to limit warming that occurs in the next couple of decades and until it is stabilised.

From net zero carbon, countries could then decide their strategy based on how much they need to further reduce their global warming contributions through added carbon removal.

Dr Rogelj said: "Shifting the focus to more short-term warming will underpin the next assessments by the Intergovernmental Panel on Climate Change (IPCC), and we hope it will also help policymakers formulate realistic strategies.

"Policymakers want to know how and when we can reach net zero carbon, and our new logic for strategies could make these questions answerable."

Credit: 
Imperial College London

Undervalued wilderness areas can cut extinction risk in half

image: The research showed some wilderness areas, such as areas surrounding Madidi National Park in the Bolivian Amazon, play an extraordinary role in their respective regional contexts, where their loss would drastically reduce the probability of persistence of biodiversity.

Image: 
Rob Wallace/WCS

Wilderness areas, long known for intrinsic conservation value, are far more valuable for biodiversity than previously believed, and if conserved, will cut the world's extinction risk in half, according to a new study published in the journal Nature.

Wilderness areas - where human impact has been absent or minimal - are dwindling. The latest maps show over 3 million square kilometres (1.15 million square miles) of wilderness destroyed since the 1990s (an area the size of India), and that less than 20 percent of the world can still be called wilderness. Many of these areas are found outside of national parks and other protected areas. Until now, the direct benefits of wilderness for stopping species extinction were largely unknown.

The scientists took advantage of "BILBI", a new global biodiversity modelling infrastructure developed at CSIRO that provides fine-scale estimates of the probability of species loss around the globe, and integrated it with the latest human footprint map generated by the University of Queensland (UQ), the University of Northern British Columbia and WCS. The collaboration demonstrated that many wilderness areas today are critical to preventing the loss of terrestrial species in many parts of the world.

"Wilderness areas clearly act as a buffer against extinction risk, the risk of species loss is over twice as high for biological communities found outside wilderness areas. But wilderness habitat makes an even larger contribution, as some species can occur both inside and outside wilderness; this habitat is essential to support the persistence of many species that otherwise live in degraded environmental conditions," said Moreno Di Marco of CSIRO Land and Water, the lead author of the study.

The research showed the differing roles of wilderness around the world, with some areas playing an extraordinary role in their respective regional contexts, where their loss would drastically reduce the probability of persistence of biodiversity. Examples of such areas include parts of the Arnhem Land in Australia (covered by several indigenous protected areas), areas surrounding Madidi National Park in the Bolivian Amazon, forests in southern British Columbia (which are only partly protected), and savannah areas inside and outside the Zemongo reserve in the Central African Republic.

Said the paper's senior author, James Watson of WCS and UQ: "This research provides the evidence for how essential it is that the global conservation community specifically target protecting Earth's remaining wilderness. These places are being decimated and, beyond being crucial for saving biodiversity, they are essential for abating climate change, for regulating biogeochemical and water cycles, and for ensuring the long-term bio-cultural connections of indigenous communities."

The authors argue that a strategic expansion of the global protected-area estate is needed to preserve the irreplaceable wilderness areas most at risk, alongside national land-use legislation and the enforcement of business standards that stop industrial footprints within intact ecosystems. The value of wilderness in the international biodiversity agenda can no longer be understated if nations are truly committed to achieving the Sustainable Development Goals.

WCS has protected the planet's most critical natural strongholds for over a century, leading with an effective science-driven model that is shaped and championed by communities.

Credit: 
Wildlife Conservation Society

Planned roads would be 'dagger in the heart' for Borneo's forests and wildlife

Malaysia's plans to create a Pan-Borneo Highway will severely degrade one of the world's most environmentally imperilled regions, says a research team from Australia and Malaysia.

"This network of highways will cut through some of the last expanses of intact forest in Borneo, greatly increasing pressures from loggers, poachers, farmers and oil-palm plantations," said Professor Bill Laurance, project leader from James Cook University in Australia.

"This would be a nightmare for endangered species such as the Bornean orangutan, clouded leopard and dwarf elephant," said Professor Laurance.

The study focused on new planned highways in the Malaysian state of Sabah, in the north of Borneo, the world's third-largest island.

"Some of the planned highways are relatively benign, but several are flat-out dangerous," said Dr Sean Sloan, lead author of the study and also from James Cook University. "The worst roads, in southern Sabah, would chop up and isolate Sabah's forests from the rest of those in Borneo."

"Slicing up the forests is toxic for large animals, such as elephants, bearded pigs and sloth bears, that must migrate seasonally to find enough food or otherwise face starvation," said Professor Laurance.

The new roads would also bisect protected areas in northern Borneo, making them vulnerable to illegal poachers and encroachers, say the researchers.

"Borneo already has many 'empty forests' as a result of severe poaching," said co-author Dr Mohammed Alamgir from James Cook University. "The poachers follow the roads into forests and then use snares and automatic rifles to kill the larger animals."

Road planners in Sabah have suggested that wildlife could use bridge-underpasses beneath roads to move and migrate, but the research team says this is unrealistic.

"When you build a new road you typically get a lot of forest destruction and fires, along with poaching, and that means vulnerable wildlife will largely avoid the area," said Dr Alamgir.

"Relying on underpasses to reduce road impacts is like trying to treat cancer with a band-aid," he said.

"This is the opposite of sustainable development," said Professor Laurance. "Clearly, several of the planned roads would be like plunging a dagger into the heart of Borneo's endangered forests and wildlife."

Credit: 
James Cook University

Rethinking scenario logic for climate policy

Current scenarios used to inform climate policy have a weakness in that they typically focus on reaching specific climate goals in 2100 - an approach which may encourage risky pathways that could have long-term negative effects. A new IIASA-led study presents a novel scenario framework that focuses on capping global warming at a maximum level with either temperature stabilization or reversal thereafter.

Scenarios can be seen as stories of possible futures that allow the description of factors that are difficult to quantify, such as those that influence the interconnected energy-economy-environment system. They are useful in terms of providing a way to explore how the future could evolve, and how today's decisions could affect longer-term systemic outcomes. The resulting information is commonly used by policymakers to inform decisions around issues like the measures we should take to limit global warming to well-below 2°C and preferably even 1.5°C relative to preindustrial levels, as stipulated in the Paris Agreement.

According to the authors of the study published in Nature, however, the vast majority of these scenarios do not actually provide answers to these questions. They explain that while the scientific community has been creating scenarios that try to hit a climate target in 2100, it has done so without considering how much warming we would be committing to over the coming decades, or whether that climate target was temporarily exceeded. This contrasts strongly with the near-term climate change impacts and decisions that policymakers and society actually care about. The dissonance became clear to the authors through their involvement as lead authors in the scientific assessments of the Intergovernmental Panel on Climate Change. The current approach has also produced a large literature of pathways that favor risky strategies with little emissions reduction in the near term, along with a strong reliance on CO2 removal in the second half of the century. In their study, the researchers resolve this issue and present a new scenario logic in which these important societal value judgments have to be explicitly decided on.

"Once we became aware that virtually all scenarios that are used to inform climate policy are biased towards risky pathways that for no good reason put a disproportionally large burden on younger generations, we started looking for a new logic that could resolve the issue. We were looking for a new way of designing long-term climate change mitigation scenarios that are more closely aligned with the intentions of the UN Paris Agreement on Climate Change," explains Joeri Rogelj, a senior researcher with the IIASA Energy Program and lead author of the study.

The study draws on insights from physical science to propose a new simple mitigation scenario framework that focuses on capping global warming at a specific maximum level with either temperature stabilization or reversal thereafter. It makes intergenerational trade-offs regarding the timing and stringency of mitigation action an explicit design criterion and provides a framework in which future CO2 removal deployment can be explored independently of variations in desired climate outcomes in light of social, technological, or ethical concerns. In this regard, the authors focus on three key issues: the time by which global CO2 emissions become net zero, the total amount of CO2 emitted until then, and the amount of CO2 that is annually removed from the atmosphere by human activities in the far future.

The authors note, however, that achieving net zero CO2 emissions is not yet sufficient to meet the emission reduction requirements spelled out in the Paris Agreement, which calls for a balance between sinks and sources of all greenhouse gases. The study's proposed scenario logic will allow modelers to translate geophysical and political science insights into a quantitative framework, and defines how models that simulate the energy-economy-environment system can be used to determine climate change mitigation scenarios in line with current policy discussions. The staged design of the new scenario framework also allows researchers to explore mitigation investment decisions at various points in time, where choices at one time could influence the possibilities available at others.

"This new logic illustrates in a much clearer way how the choices we make as a society about emissions reductions in the next two to three decades determine the maximum level of warming, as well as our reliance on CO2 removal to return global warming to safer levels in the longer term. Interestingly, this logic also shows that once a transformation to a carbon neutral society is achieved, annual investments in the energy sector are pretty much equal to those expected for a world in which we didn't tackle climate change at all. Additional climate change mitigation investments thus play a role in the next decades, but don't have to be a continuous burden on future generations," concludes Rogelj.

Credit: 
International Institute for Applied Systems Analysis