Sense of smell, pollution and neurological disease connection explored

image: Schematic of the mid-sagittal plane of the mouse skull and brain. The pathway of cerebrospinal fluid flow is shown by blue arrows, flowing around the brain and spinal cord. Cerebrospinal fluid drains from the brain and into the nasal cavity through a bony plate known as the cribriform plate.

Image: 
Patrick Drew, Penn State

A consensus is building that air pollution can cause neurological diseases such as Alzheimer's disease and Parkinson's disease, but how fine, sooty particles cause problems in the brain remains an open question. Now a team of Penn State researchers, working with mice, has found a possible mechanism, though more research is still needed.

The researchers examined how cerebrospinal fluid, the liquid that surrounds the brain and spinal cord, exits through the nose, and what happens when that outflow is blocked.

"There has been a lot of interest in understanding cerebrospinal fluid movement in the last 5 years," said Patrick Drew, Huck Distinguished Associate Professor of Engineering Science and Mechanics, Neurosurgery and Biomedical Engineering. "More and more it is realized that it does not just cushion the brain, but may also transfer stuff out of the brain and spinal column area."

The question, however, is how the cerebrospinal fluid -- or CSF -- leaves the enclosed area of the brain and spinal column, and where it goes. A search of older scientific papers showed that some scientists had speculated that one exit pathway was through the nose.

"I was trying to label cerebrospinal fluid with a dye for another experiment," said Jordan N. Norwood, graduate student in cellular and developmental biology and Drew's student. "We started seeing this dyed cerebrospinal fluid drain out through the nose."

More research into old scientific papers showed that not only had others suggested that the cerebrospinal fluid left through the nose, but that there was a connection to the sense of smell. The researchers also found that there is a long-held connection between loss of smell and the early beginnings of such neurological diseases as Alzheimer's disease and Parkinson's disease.

Using chemical ablation, the researchers destroyed the olfactory sensory nerves that pass through the mouse's cribriform plate. Destroying these nerves eliminated the sense of smell, but it also stopped the outflow of cerebrospinal fluid.

"The mice seem normal after we used zinc sulfate to ablate the nerves in the nose," said Drew.

Because the flow of fluid from the nose stopped, the researchers checked to see if the pressure around the brain and in the spinal cord increased.

"Animals and people are constantly making CSF so if it doesn't go out, pressure will go up," said Drew. "But we found that the pressure did not increase after the flow from the nose stopped."

The researchers believe that other pathways may increase their flow of CSF to compensate for what would normally go out through the nose. These could include pathways around the brain that drain into the lymphatic system.

Another possibility is that the production of CSF decreases in response to stoppage of CSF flow through the nose.

The researchers suggest in a recent issue of eLife, "that damage to olfactory sensory neurons (such as from air pollution) could contribute to altered CSF turnover and flow, providing a potential mechanism for neurological disease." They also state that "reduced CSF turnover may be a contributing factor to the buildup of toxic metabolites and proteins that cause neurodegenerative disorders."

Both the effects of pollution and the effects of reduced CSF turnover might explain the origin of some of these diseases.

"By ablating the neurons, we were able to disrupt and disable flow in the nose," said Norwood. "People in areas with heavy air pollution may be breathing stuff that does the same thing as our experiments.

"Next we would like to collaborate with a lab in the Materials Research Institute that is working with soot or jet fuel particles to see if we get the same effect," she added.

Credit: 
Penn State

Researchers discover genetic mutation behind serious skull disorder

CORVALLIS, Ore. - A collaboration led by scientists at Oregon State University, the University of Oxford in the United Kingdom and Erasmus University in The Netherlands has identified a new genetic mutation behind the premature fusing of the bony plates that make up the skull.

The findings are a key step toward preventing a serious cranial condition that affects roughly one child in 2,250, and also toward understanding how the protein the gene encodes works in the development and function of other organ systems such as skin, teeth and the immune system.

In the skull, when one or more of the fibrous joints, called skull sutures, between cranial bones close too soon - a condition known as craniosynostosis - the resulting early plate fusion disrupts proper growth of the skull and brain.

Pressure inside the cranium can lead to a variety of medical problems including impaired vision, respiration and mental function, as well as abnormal head shape. Males are affected at slightly higher rates, and most cases are termed "sporadic" - meaning they occur by chance.

"As an individual grows, sutures are supposed to close gradually, with complete fusion taking place in the third decade of life," said Oregon State researcher Mark Leid. "Proper suture formation, maintenance and ossification require an exquisitely choreographed balance - stem cells and their progeny need to proliferate and differentiate at just the right time."

Leid, professor and interim dean of the OSU College of Pharmacy, and scientists Stephen Twigg of Oxford and Irene Mathijssen of Erasmus University in Rotterdam performed whole-genome sequencing on a male craniosynostosis patient and found a mutation in a gene known as BCL11B.

Neither of the patient's parents had symptoms of craniosynostosis, a family history of the condition, or carried the mutation, which generated a single amino acid change in the BCL11B protein.

The international research group proved that the human patient's mutation was causative for craniosynostosis by utilizing a mouse model harboring the same mutation. Like the human patient, the genetically modified mouse exhibited craniosynostosis at birth.

"Our data demonstrate that the identified amino acid substitution caused craniosynostosis in the patient we studied," Leid said. "The mouse model that we created should be useful in dissecting the mechanisms behind the role of the BCL11B protein in keeping sutures open, as well as the role of the protein in the development and function of other organ systems."

Credit: 
Oregon State University

Fairtrade benefits rural workers in Africa, but not the poorest of the poor

image: This is a farmworker in the cocoa sector of Cote d'Ivoire.

Image: 
J Sellare

A new study from the University of Göttingen and international partners has analysed the effects of Fairtrade certification on poor rural workers in Africa. The results show that Fairtrade improves the situation of employees in agricultural cooperatives, but not of workers in the smallholder farm sector, who are often particularly disadvantaged. The study was published in "Nature Sustainability".

When consumers of cocoa, coffee and other tropical goods decide to purchase products with the Fairtrade label, they pay a certain premium, expecting to help improve socioeconomic conditions in developing countries. The study authors wanted to know whether Fairtrade really benefits poor rural workers in Africa. For the study, they collected representative data from 1,000 cocoa farmers and workers in 50 different cooperatives in Cote d'Ivoire. Cote d'Ivoire, in West Africa, is the largest cocoa producer and exporter worldwide.

"Previous studies had analysed the effects of Fairtrade on smallholder farmers, ignoring that these farmers also employ agricultural workers for crop cultivation and harvesting", says Matin Qaim, an agricultural economist at the University of Göttingen. "Workers in the small farm sector constitute a large group. They are often neglected by development initiatives, although they typically belong to the poorest of the poor", he adds.

Fairtrade requires minimum wages and fair labour conditions for workers and employees in certified value chains. "These conditions are met for the employees in cocoa cooperatives. At the cooperative level, Fairtrade requirements are regularly monitored", says Eva-Marie Meemken from Cornell University in the USA. "However, our data show no effects on the livelihoods of farmworkers, even though the farmers themselves benefit from Fairtrade certification. Monitoring the wages and labour conditions on thousands of small farms is costly and therefore rarely done. But it doesn't work without monitoring", Meemken states. "Better solutions have to be found in order to implement the fairness model more comprehensively."

Credit: 
University of Göttingen

Scissors get stuck -- another way bacteria use CRISPR/Cas9

In biotech these days, CRISPR/Cas9 is a hot topic, because of its utility as a precise gene editing tool. Before humans repurposed it, CRISPR/Cas9 was a sort of internal immune system bacteria use to defend themselves against phages, or viruses that infect bacteria, by slicing up the phages' DNA.

Scientists at Emory University School of Medicine and the Max Planck Unit for the Science of Pathogens have found that the "scissors" component of CRISPR/Cas9 sometimes gets stuck.

Cas9, an enzyme that cuts DNA, can also block gene activity without doing any cutting. In the pathogenic bacterium Francisella novicida, Cas9 regulates genes that need to be shut off for the bacteria to cause disease.

The results were published June 27 in Molecular Cell.

Emory microbiologist David Weiss, PhD, and colleagues had identified Cas9 several years ago when looking for genes that regulated F. novicida's virulence. F. novicida is a close relative of the bacterium that causes tularemia, and it grows inside mammalian cells. For the current paper elucidating why Cas9 is important for virulence, his lab teamed up with researchers in Germany led by Emmanuelle Charpentier, PhD, whose work on CRISPR/Cas9 led to its use as a gene editing tool.

The researchers found that in F. novicida, Cas9 regulates just four genes, all of which must be turned off so that the bacteria can cause disease. In its DNA scissors/phage defense role, Cas9 is guided by an RNA that is complementary to the target. When Cas9 is acting to block gene activity, Cas9 uses a different RNA guide sequence, which doesn't allow the scissors to cut because of its shorter length.

In other types of bacteria, Cas9 also appears to be important for the ability to cause disease.

"These findings raise the possibility that turning genes on and off may be a broad function of Cas9 in diverse bacteria," says graduate student Hannah Ratner, the first author of the paper. "A question raised by this study is whether the ability of Cas9 to repress transcription can help explain the vast number of unidentified Cas9 targets."

In addition, the researchers were able to re-engineer Cas9 to repress a new target, a gene that makes the bacteria resistant to a last-line antibiotic, re-sensitizing the bacteria to antibiotic treatment.

"The programmability of the same protein for multiple different functions highlights and expands the incredible versatility of Cas9 for genome engineering applications," Ratner says.

Credit: 
Emory Health Sciences

Hubble captures the galaxy's biggest ongoing stellar fireworks show

video: In the mid-1800s, mariners sailing the southern seas navigated at night by a brilliant star in the constellation Carina. The star, named Eta Carinae, was the second brightest star in the sky for more than a decade. Those mariners could hardly have imagined that by the mid-1860s the brilliant orb would no longer be visible. Eta Carinae was enveloped by a cloud of dust ejected during a violent outburst named "The Great Eruption."

Watch on YouTube: https://www.youtube.com/watch?v=eVI0rZFPEpo

Download in HD: https://svs.gsfc.nasa.gov/13244

Image: 
NASA's Goddard Space Flight Center

Imagine slow-motion fireworks that started exploding 170 years ago and are still continuing. This type of firework is not launched into Earth's atmosphere, but rather into space by a doomed super-massive star, called Eta Carinae, the largest member of a double-star system. A new view from NASA's Hubble Space Telescope, which includes ultraviolet light, shows the star's hot, expanding gases glowing in red, white and blue. Eta Carinae resides 7,500 light-years away.

The celestial outburst takes the shape of a pair of ballooning lobes of dust and gas and other filaments that were blown out from the petulant star. The star may have initially weighed more than 150 Suns. For decades, astronomers have speculated about whether it is on the brink of total destruction.

The fireworks started in the 1840s when Eta Carinae went through a titanic outburst, called the Great Eruption, making it the second-brightest star visible in the sky for over a decade. Eta Carinae, in fact, was so bright that for a time it became an important navigational star for mariners in the southern seas.

The star has faded since that eruption and is now barely visible to the unaided eye. But the fireworks aren't over yet because Eta Carinae still survives. Astronomers have used almost every instrument on Hubble over the past 25 years to study the rambunctious star.

Using Hubble's Wide Field Camera 3 to map the ultraviolet-light glow of magnesium embedded in warm gas (shown in blue), astronomers were surprised to discover the gas in places they had not seen it before.

Scientists have long known that the outer material thrown off in the 1840s eruption has been heated by shock waves after crashing into the doomed star's previously ejected material. In the new images, the team had expected to find light from magnesium coming from the same complicated array of filaments as seen in the glowing nitrogen (shown in red). Instead, a completely new luminous magnesium structure was found in the space between the dusty bipolar bubbles and the outer shock-heated nitrogen-rich filaments.

"We've discovered a large amount of warm gas that was ejected in the Great Eruption but hasn't yet collided with the other material surrounding Eta Carinae," explained Nathan Smith of Steward Observatory at the University of Arizona in Tucson, Arizona, lead investigator of the Hubble program. "Most of the emission is located where we expected to find an empty cavity. This extra material is fast, and it 'ups the ante' in terms of the total energy for an already powerful stellar blast."

The newly revealed gas is important for understanding how the eruption began, because it represents the fast and energetic ejection of material that may have been expelled by the star shortly before the expulsion of the bipolar lobes. Astronomers need more observations to measure exactly how fast the material is moving and when it was ejected.

The streaks visible in the blue region outside the lower-left lobe are a striking feature in the image. These streaks are created when the star's light rays poke through the dust clumps scattered along the bubble's surface. Wherever the ultraviolet light strikes the dense dust, it leaves a long, thin shadow that extends beyond the lobe into the surrounding gas. "The pattern of light and shadow is reminiscent of sunbeams that we see in our atmosphere when sunlight streams past the edge of a cloud, though the physical mechanism creating Eta Carinae's light is different," noted team member Jon Morse of BoldlyGo Institute in New York.

This technique of searching in ultraviolet light for warm gas could be used to study other stars and gaseous nebulas, the researchers say.

"We had used Hubble for decades to study Eta Carinae in visible and infrared light, and we thought we had a pretty full accounting of its ejected debris. But this new ultraviolet-light image looks astonishingly different, revealing gas we did not see in other visible-light or infrared images," Smith said. "We're excited by the prospect that this type of ultraviolet magnesium emission may also expose previously hidden gas in other types of objects that eject material, such as protostars or other dying stars. Only Hubble can take these kinds of pictures."

Eta Carinae has had a violent history, prone to chaotic eruptions that blast parts of itself into space like an interstellar geyser. One explanation for the monster star's antics is that the convulsions were caused by a complex interplay of as many as three stars, all gravitationally bound in one system. In this scenario, the most massive member would have swallowed one of the stars, igniting the massive Great Eruption of the mid-1800s. Evidence for that event lies in the huge, expanding bipolar lobes of hot gas surrounding the system.

A fortuitous trick of nature also allowed astronomers in a previous Hubble study to analyze the Great Eruption in detail. Some of the light from the eruption took an indirect path to Earth and is just arriving now. The wayward light was heading away from our planet when it bounced off dust clouds lingering far from the turbulent stars and was rerouted to Earth, an effect called a "light echo."

The stellar behemoth will eventually reach its fireworks show finale when it explodes as a supernova. This may have already happened, although the geyser of light from such a brilliant blast hasn't yet reached Earth.

Credit: 
NASA/Goddard Space Flight Center

Brain network evaluates robot likeability

image: Images of robots shown to participants, from least human-like (left) to most human-like (right).

Image: 
Rosenthal-von der Pütten et al., JNeurosci 2019

Researchers have identified a network of brain regions that work together to determine if a robot is a worthy social partner, according to a new study published in JNeurosci.

Using functional magnetic resonance imaging, Astrid Rosenthal-von der Pütten, Fabien Grabenhorst and colleagues evaluated brain activity in the prefrontal cortex and amygdala as human participants scored images of robots on their likability, familiarity, and human-likeness. The participants also chose the robot from which they would prefer to receive a gift, a measure of the robots' social value.

Participants preferred more lifelike robots, but disliked the ones that appeared "too human," including artificially altered humans. Activity in the ventromedial prefrontal cortex followed the same pattern, increasing with more lifelike robots before dropping sharply for the most human-like ones. The scientists concluded that each of several brain regions had a unique role in assessing the images, and the combination of their inputs determined whether or not a robot was likeable.

This direct representation of a psychological pattern in the brain provides insight into how people respond to and assess artificial social partners. The findings may also apply to the evaluation of human social partners.

Credit: 
Society for Neuroscience

Stem cell stimulation improves stroke recovery

image: Transplanted stem cells grow into healthy neurons after stimulation.

Image: 
Ping Yu et al., JNeurosci 2019

Stem cell stimulation shows promise as a potential noninvasive stroke treatment, according to research in mice published in JNeurosci. If extended to humans, this technique could greatly improve patients' quality of life.

Ling Wei, Shan Ping Yu, and colleagues at Emory University injected neural stem cells into the brains of mice after a stroke and activated the cells through nasal administration of a protein. The stem cells activated by this new, noninvasive technique called optochemogenetics grew healthier and formed more connections compared to the stem cells that did not receive stimulation. Additionally, the mice that received both stem cells and stimulation displayed the most recovery, with some behaviors returning to pre-stroke levels.

The combination of stem cell injection and stimulation increased the likelihood of a successful stroke recovery in mice. Instead of just injecting stem cells in the damaged area of the brain, following up with stimulation creates an ideal environment for the cells to develop and form connections with surrounding neurons.

Credit: 
Society for Neuroscience

Sleep readies synapses for learning

image: Reconstructions of some of the dendritic spines used in the study.

Image: 
Spano et al., JNeurosci 2019

Synapses in the hippocampus are larger and stronger after sleep deprivation, according to new research in mice published in JNeurosci. Overall, this study supports the idea that sleep may universally weaken synapses that are strengthened from learning, allowing for new learning to occur after waking.

Sleep is thought to recalibrate synaptic strength after a day of learning, allowing for new learning to take place the next day. Chiara Cirelli and colleagues at the University of Wisconsin-Madison examined how synapses in the hippocampus, a structure involved in learning, changed following sleep and sleep deprivation in mice.

Consistent with previous studies in the cortex, the researchers observed that synapses were larger, and therefore stronger, after the mice were awake for six to seven hours compared to after they were asleep for the same amount of time. Additionally, the researchers found that the synapses were strongest when the mice were forced to stay awake and interact with new stimuli, compared to mice that stayed awake on their own. This is consistent with the hippocampus' role in learning, and suggests that synaptic changes take place when learning occurs, not merely from being awake.

Credit: 
Society for Neuroscience

Scientists track the source of the 'Uncanny Valley' in the brain

Scientists have identified mechanisms in the human brain that could help explain the phenomenon of the 'Uncanny Valley' - the unsettling feeling we get from robots and virtual agents that are too human-like. They have also shown that some people respond more adversely to human-like agents than others.

As technology improves, so too does our ability to create life-like artificial agents, such as robots and computer graphics - but this can be a double-edged sword.

"Resembling the human shape or behaviour can be both an advantage and a drawback," explains Professor Astrid Rosenthal-von der Pütten, Chair for Individual and Technology at RWTH Aachen University. "The likeability of an artificial agent increases the more human-like it becomes, but only up to a point: sometimes people seem not to like it when the robot or computer graphic becomes too human-like."

This phenomenon was first described in 1970 by robotics professor Masahiro Mori, who coined an expression in Japanese that went on to be translated as the 'Uncanny Valley'.

Now, in a series of experiments reported in the Journal of Neuroscience, neuroscientists and psychologists in the UK and Germany have identified mechanisms within the brain that they say help explain how this phenomenon occurs - and may even suggest ways to help developers improve how people respond.

"For a neuroscientist, the 'Uncanny Valley' is an interesting phenomenon," explains Dr Fabian Grabenhorst, a Sir Henry Dale Fellow and Lecturer in the Department of Physiology, Development and Neuroscience at the University of Cambridge. "It implies a neural mechanism that first judges how close a given sensory input, such as the image of a robot, lies to the boundary of what we perceive as a human or non-human agent. This information would then be used by a separate valuation system to determine the agent's likeability."

To investigate these mechanisms, the researchers studied brain patterns in 21 healthy individuals during two different tests using functional magnetic resonance imaging (fMRI), which measures changes in blood flow within the brain as a proxy for how active different regions are.

In the first test, participants were shown a number of images that included humans, artificial humans, android robots, humanoid robots and mechanoid robots, and were asked to rate them in terms of likeability and human-likeness.

Then, in a second test, the participants were asked to decide which of these agents they would trust to select a personal gift for them, a gift that a human would like. Here, the researchers found that participants generally preferred gifts from humans or from the more human-like artificial agents - except those that were closest to the human/non-human boundary, in keeping with the Uncanny Valley phenomenon.

By measuring brain activity during these tasks, the researchers were able to identify which brain regions were involved in creating the sense of the Uncanny Valley. They traced this back to brain circuits that are important in processing and evaluating social cues, such as facial expressions.

Some of the brain areas close to the visual cortex, which deciphers visual images, tracked how human-like the images were, by changing their activity the more human-like an artificial agent became - in a sense, creating a spectrum of 'human-likeness'.

Along the midline of the frontal lobe, where the left and right brain hemispheres meet, there is a wall of neural tissue known as the medial prefrontal cortex. In previous studies, the researchers have shown that this brain region contains a generic valuation system that judges all kinds of stimuli; for example, they showed previously that this brain area signals the reward value of pleasant high-fat milkshakes and also of social stimuli such as pleasant touch.

In the present study, two distinct parts of the medial prefrontal cortex were important for the Uncanny Valley. One part converted the human-likeness signal into a 'human detection' signal, with activity in this region over-emphasising the boundary between human and non-human stimuli - reacting most strongly to human agents and much less to artificial agents.

The second part, the ventromedial prefrontal cortex (VMPFC), integrated this signal with a likeability evaluation to produce a distinct activity pattern that closely matched the Uncanny Valley response.

"We were surprised to see that the ventromedial prefrontal cortex responded to artificial agents precisely in the manner predicted by the Uncanny Valley hypothesis, with stronger responses to more human-like agents but then showing a dip in activity close to the human/non-human boundary - the characteristic 'valley'," says Dr Grabenhorst.

The same brain areas were active when participants made decisions about whether to accept a gift from a robot by signalling the evaluations that guided participants' choices. One further region - the amygdala, which is responsible for emotional responses - was particularly active when participants rejected gifts from the human-like, but not human, artificial agents. The amygdala's 'rejection signal' was strongest in participants who were more likely to refuse gifts from artificial agents.

The results could have implications for the design of more likable artificial agents. Dr Grabenhorst explains: "We know that valuation signals in these brain regions can be changed through social experience. So, if you experience that an artificial agent makes the right choices for you - such as choosing the best gift - then your ventromedial prefrontal cortex might respond more favourably to this new social partner."

"This is the first study to show individual differences in the strength of the Uncanny Valley effect, meaning that some individuals react more sensitively than others to human-like artificial agents," says Professor Rosenthal-von der Pütten. "This means there is no one robot design that fits - or scares - all users. In my view, smart robot behaviour is of great importance, because users will abandon robots that do not prove to be smart and useful."

Credit: 
University of Cambridge

Glowing brain cells illuminate stroke recovery research

image: Gaussia luciferase (Gluc) is fused to the ChR protein. ChR can be activated by blue light or by light emitted by Gluc when binding to its substrate coelenterazine (CTZ). YFP = yellow fluorescent protein.

Image: 
Shan Ping Yu

A promising strategy for helping stroke patients recover, transplanting neural progenitor cells to restore lost functions, asks a lot of those cells. They're supposed to know how to integrate into a mature (but damaged) brain. The cells need help.

To provide that help, researchers at Emory University School of Medicine have developed an "optochemogenetics" approach that modifies a widely used neuroscience tool. The stimulation that transplanted cells need to flourish comes from light, generated within the brain. In a mouse model of stroke, neural progenitor cells received light stimulation, which promoted functional recovery in the mice.

The results are scheduled for publication in Journal of Neuroscience.

Neuroscience aficionados may be familiar with optogenetics, which allows scientists to study the brain conveniently, activating or inhibiting groups of neurons at the flip of a switch. (Mice with fiber optic cables attached to their heads can be found in many labs.) Emory investigators led by Shan Ping Yu, MD, PhD, and Ling Wei, MD, wanted to remove the cable: that is, figure out how to selectively stimulate brain cells non-invasively.

They teamed up with Jack Tung, PhD, Ken Berglund, PhD and Robert Gross, MD, who had created "luminopsins", engineered proteins that are both light-sensitive and generate their own light when provided with a chemical called CTZ (coelenterazine). The protein components come from Volvox algae and from Gaussia princeps, a fingernail-sized crustacean that lives in the deep ocean.

Yu and Wei were looking for ways to coax neural progenitor cells - capable of multiplying and differentiating into mature neurons - to survive in the brain after the destruction of a stroke. They were working with a mouse model, in which the sensory and motor regions on one side of the brain are damaged.

"It is not sufficient to put the cells into the damaged brain and then not take care of them," Yu says. "If we expect progenitor cells to differentiate and become functional neurons, the cells have to receive stimulation that mimics the kind of activity they will see in the brain. They also need growth factors and a supportive environment."

Yu and Wei are in Emory University School of Medicine's Department of Anesthesiology, while Tung, Berglund and Gross are in the Department of Neurosurgery and the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory.

In experiments described in the current paper, scientists introduced genes encoding luminopsins into induced pluripotent stem cells, which were cultured to form neural progenitor cells. The neural progenitor cells were delivered into the brains of mice a week after stroke. CTZ, which emits light when acted upon by luminopsins, was then provided intranasally twice a day for two weeks. Intranasal delivery bypasses the blood-brain barrier and repeated administrations are clinically feasible, Yu points out. Bioluminescence could be detected in the cell graft area and was visible for around one hour after CTZ administration.

CTZ promoted an array of positive effects in the progenitor cells: more survival and intact axons, more connections within the brain and better responses to electrical stimulation. It also promoted recovery of function in the affected limb in the mice. The mice were tested in activities such as reaching and grasping food pellets or removing adhesive dots from their paws. In young mice, CTZ and progenitor cells together could restore use of the stroke-affected limb back to normal levels, and even in older mice, they produced partial recovery of function.

When asked about clinical prospects, Yu said that optochemogenetics represents a significant advance compared with its constituent technologies.

"Optogenetics is a fantastic technical tool, but it presents some barriers to clinical implementation," he says. "You have the invasive fiber optic light delivery, and the limited distance of light diffusion, especially on the larger scale of the human brain."

Delivering cells into the brain and making them glow are complex procedures, but they offer scientists some flexibility when designing experiments: direct light application, which can be turned on and off quickly, or the steady support of CTZ stimulation. Luminopsins can provide "the capabilities for the cells to be activated either in a way mimicking neuronal activities of fast activation or manipulations of the channel/cell in clinical treatments," the authors write.

Yu and his colleagues are also testing their approach for the delivery of neural progenitor cells in the context of traumatic brain injury.

Credit: 
Emory Health Sciences

Pharmaconutrition: Modern drug design for functional studies

image: Model of the bitter receptor TAS2R14 with one of its activators (ligands), flufenamic acid.

Image: 
Dr. Antonella Di Pizio / Leibniz-LSB@TUM

Antonella Di Pizio and Maik Behrens of the Leibniz-Institute for Food Systems Biology at the Technical University of Munich, together with their cooperation partners, have developed highly effective activators for the bitter receptor TAS2R14 in a German-Israeli research project. The new substances are used to investigate the as yet unknown physiological functions of the receptor, for example, in the human immune system.

The team of scientists published their results in the journal Cellular and Molecular Life Sciences (Di Pizio et al., 2019; DOI: 10.1007/s00018-019-03194-2).

Bitter (taste) receptor with health effect?

It has only been known for approximately 15 years that humans detect bitterness with the aid of 25 different receptor variants. TAS2R14 is one of these. However, unlike most of the other bitter receptors, it detects a broad spectrum of bitter substances. In addition to secondary plant substances, such as caffeine, its activators also include medications. However, the bitter receptor is not just relevant for taste perception. Recent findings indicate that it serves other physiological functions that are important for our health. It is found on lung and testicle cells and plays a role in the innate immune response.

Old drug substance as the basis for modern drug design

Specifically investigating the manifold functions of TAS2R14 in different organs and tissues requires, among other things, highly effective (potent) activators of the receptor. Using a structure-based, computer-aided modeling approach, the German-Israeli team of scientists has now succeeded for the first time in synthesizing three such highly potent substances. The starting point for the drug design was the drug flufenamic acid. This well-known active ingredient is a non-steroidal anti-inflammatory drug and is contained in muscle and joint salves. It has anti-inflammatory and analgesic effects because it blocks enzymes that promote the release of prostaglandins.

"We chose this active ingredient as the basis for our investigations because it stimulates the receptor even in the most minute concentrations. This means that approximately eight millionths of a gram of the substance per liter are already sufficient for this purpose," explains bioinformatician and lead author, Antonella Di Pizio. The new derivatives are extremely potent activators, more effective than the known drug, and in the future they will be used as tools in functional studies.

A new research area "pharmaconutrition" with a systems biology approach

"Due to the many new findings, we no longer regard bitter substances exclusively as pure flavoring components, but also as medically effective nutritional components," says molecular biologist Maik Behrens. "Likewise, today bitter receptors must no longer be viewed as just sensors that warn us of potentially toxic substances before swallowing." To research the correlations between bitter substances, bitter receptors and the human organism a new, far-reaching systems biology approach is required, the biologist continued. The Leibniz Institute is pursuing this approach by combining basic molecular research with the latest methods of bioinformatics and high-throughput technologies.

Credit: 
Leibniz-Institut für Lebensmittel-Systembiologie an der TU München

Alcohol and pregnancy policies: Birth outcomes & prenatal care use by race

In the U.S., state policies pertaining to alcohol use during pregnancy have been in effect for more than 40 years.

These policies include:

Mandatory warning signs

Priority access to substance abuse treatment for pregnant women

Requirements to report evidence of alcohol use during pregnancy to law enforcement or child welfare agencies -- or to a health authority for the purposes of data gathering and treatment

Laws that define alcohol use during pregnancy as child abuse/child neglect

Laws that limit toxicological tests as evidence in criminal prosecutions of fetal or child harm

Involuntary commitment of pregnant women to treatment or to protective custody.

Previous research has found that some of these policies increase adverse birth outcomes and decrease prenatal care use.

This research examines whether effects of alcohol/pregnancy policies vary by race.

The authors examine 1972-2015 Vital Statistics data and policy data. The dataset includes more than 150 million singleton births. Outcomes are preterm birth (PTB), low birthweight (LBW), and prenatal care use.

Results show that the effect of alcohol/pregnancy policies varied by race for preterm birth, varied in a few cases for low birthweight, and generally did not vary for prenatal care use.

For White women, most policies had adverse effects on PTB and/or LBW, including policies intended to support pregnant women who use or abuse alcohol: mandatory warning signs laws, priority access to substance abuse treatment for pregnant women and for pregnant women with children, laws that limit toxicological tests as evidence of fetal or child harm, reporting requirements for data gathering and treatment purposes, and prohibitions against criminal prosecution. One policy that is punitive toward pregnant women, child abuse/neglect laws, was also associated with adverse effects.

For Black women, four policies had beneficial effects on PTB. Two are supportive of women: mandatory warning signs laws and reporting requirements for data and treatment purposes. The other two are punitive: civil commitment laws and laws requiring reporting to child protective services.

The authors conclude that the effect of alcohol/pregnancy policies on birth outcomes varies by race. Future research should explore why some policies appear to have opposite effects for White and Black women.

Credit: 
Pacific Institute for Research and Evaluation

New metalloenzyme-based system allows selective targeting of cancer cells

RIKEN researchers have developed a promising method to deliver a drug to cancer cells without affecting surrounding tissues, involving a clever combination of an "artificial metalloenzyme" that protects a metal catalyst, and a sugar chain that guides the metalloenzyme to the desired cells.

In the field of organic synthetic chemistry, many metal catalysts have been developed with the capacity to synthesize molecules such as drugs and functional materials. Recently, researchers have begun to focus on chemical reactions in living bodies catalyzed by transition metals -- elements belonging to groups 3 to 11 of the periodic table. However, they have run into difficulties: transition metal catalysts are easily "quenched," meaning they are inactivated by substances such as antioxidants, so it has been difficult to get them to perform chemical reactions in actual organisms.

The international research team, including Chief Scientist Katsunori Tanaka of the RIKEN Cluster for Pioneering Research and RIKEN Baton Zone Program and Special Postdoctoral Researcher Kenward Vong, developed an artificial "metalloenzyme" that contains a metal ion and protects the ion from being quenched, making it possible for the chemical reaction to take place in vivo. The metal ion in this case was ruthenium, which catalyzes the conversion of a "pro-drug" into umbelliprenin, a plant-derived compound known to have anti-cancer activity. Further, by attaching a sugar "delivery tag" to the surface of the artificial metalloenzyme, they were able to target it specifically to the cancer cells where the drug was needed.

To perform the work, the group worked with a protein called human serum albumin, which is abundant in the human body. The researchers introduced a ruthenium catalyst into the hydrophobic "pocket" inside the protein. They found that in vitro, the ruthenium was able to carry out chemical reactions. "We were pleasantly surprised," says Tanaka, who led the group, "that our newly developed metalloenzyme worked well in the presence of glutathione, an antioxidant that is abundant in actual cells and can inactivate ruthenium. This told us that the ruthenium catalyst is well protected from hydrophilic components such as glutathione in the hydrophobic pocket of the albumin molecule, while hydrophobic compounds can come in contact with the catalyst within the pocket and undergo catalysis."

After determining that the catalysis would work, the researchers modified the surface of the albumin, attaching sugar chains that allowed it to be transported to specific cells of interest. Target cells are recognized by the pattern of sugar chains. Doing this, they successfully delivered the catalyst to cancer cells, and used it to produce umbelliprenin, which they determined actually had cytotoxic effects on the cancer cells.

"We confirmed that the method we developed can be applied to metal-catalyzed reactions using other catalysts such as gold, and the artificial metalloenzyme could be generally used in vivo," adds Tanaka. "If transition metal catalysis can be performed on specific organs or diseased cells in the body, it will allow us to rapidly and stably synthesize drugs there, minimizing side effects. Our findings could become a key in the fight against such diseases. Furthermore, we can consider using other natural compounds, which show strong anti-cancer activity but have not been used so far. We have opened a door to a new era where we can synthesize and activate natural chemical compounds in actual organisms."

Credit: 
RIKEN

NCI study finds increased risk of cancer death following treatment for hyperthyroidism

Findings from a study of patients who received radioactive iodine (RAI) treatment for hyperthyroidism show an association between the dose of treatment and long-term risk of death from solid cancers, including breast cancer. The study, led by researchers at the National Cancer Institute (NCI), part of the National Institutes of Health, was published July 1, 2019 in JAMA Internal Medicine.

"We identified a clear dose-response relationship between this widely used treatment and long-term risk of death from solid cancer, including breast cancer, in the largest cohort study to date of patients treated for hyperthyroidism," said Cari Kitahara, Ph.D., of NCI's Division of Cancer Epidemiology and Genetics, lead author of the study. "We estimated that for every 1,000 patients treated currently using a standard range of doses, about 20 to 30 additional solid cancer deaths would occur as a result of the radiation exposure."

RAI, which has been used widely in the United States for the treatment of hyperthyroidism since the 1940s, is one of three commonly used treatments for hyperthyroidism. The other two are anti-thyroid drugs, which have been rising in popularity, and surgical treatment, which is used least often.

The new findings are from a long-term follow-up study of a large cohort of people with hyperthyroidism (mainly Graves' disease) who were treated with radiation between 1946 and 1964, the Cooperative Thyrotoxicosis Therapy Follow-up Study. In the new analysis -- which included nearly 19,000 people from the original cohort, all of whom had received RAI and none of whom had had cancer at study entry -- the researchers used a novel, comprehensive method of estimating radiation doses to each organ or tissue. Most of the radiation is absorbed by the thyroid gland, but other organs like the breast and stomach are also exposed during treatment.

The researchers observed positive dose-response relationships between the dose absorbed by an organ and mortality from cancer at that site. The relationship was statistically significant for female breast cancer, for which every 100 milligray (mGy) of dose was associated with a 12% increase in the relative risk of breast cancer mortality, and for all other solid tumors considered together, for which the relative risk of mortality increased by 5% per 100 mGy.
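Those per-dose figures describe a linear dose-response relationship. A minimal sketch of that arithmetic, assuming a simple linear model of the kind the study reports (the function and the 200 mGy example dose are illustrative, not the study's actual dosimetry):

```python
def relative_risk(dose_mgy: float, err_per_100_mgy: float) -> float:
    """Relative risk of cancer mortality under a linear dose-response
    model: baseline risk (1.0) plus the excess relative risk per
    100 mGy, scaled by the organ-absorbed dose."""
    return 1.0 + err_per_100_mgy * (dose_mgy / 100.0)

# Hypothetical 200 mGy organ dose, using the reported slopes
# (12% per 100 mGy for breast, 5% per 100 mGy for other solid cancers):
rr_breast = relative_risk(200, 0.12)   # -> 1.24
rr_solid = relative_risk(200, 0.05)    # -> 1.10
```

Under such a model, the excess risk scales in proportion to the organ dose, which is why the dosimetry for each organ, not just the thyroid, matters for estimating long-term mortality.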

Based on these findings, the researchers estimated that for every 1,000 patients aged 40 years with hyperthyroidism who were treated with the radiation doses typical of current treatment, a lifetime excess of 19 to 32 radiation-attributable solid cancer deaths would be expected.

According to the researchers, in the United States, about 1.2% of the population has hyperthyroidism, and women are much more likely to develop the condition than men. Therefore, the findings for breast cancer mortality are particularly relevant for the large population of women treated for hyperthyroidism, Dr. Kitahara said.

"We found the increased risks of death from solid cancer overall and from breast cancer more specifically to be modest, but RAI is still a widely used treatment for hyperthyroidism," she said. "It's important for patients and their physicians to discuss the risks and benefits of each available treatment option. The results of our study may contribute to these discussions."

The researchers wrote that additional research is needed to more comprehensively assess the risk-benefit ratio of radiation versus other available treatment options for hyperthyroidism. Furthermore, because the types of anti-thyroid drugs administered to patients in the original cohort were different from those prescribed more recently, the researchers wrote that more studies are needed to evaluate long-term health effects of current anti-thyroid drugs, including in comparison to RAI treatment.

Credit: 
NIH/National Cancer Institute

Building up an appetite for a new kind of grub

image: Food preparation using cricket garnish and ground cricket.
Photo was taken by Guiomar Melgar-Lalanne and Alan-Javier Hernández-Álvarez during an 'Insect Technological Venue' at the University of Veracruz with the Chef Mario Melgarejo

Image: 
Guiomar Melgar-Lalanne and Alan-Javier Hernández-Álvarez, courtesy of Chef Mario Melgarejo

Edible insects could be a key ingredient to avoiding a global food crisis, according to a new report, but there are significant barriers to overcome before they are part of the mainstream.

The rapidly changing climate and an expanding global population are serious risks for worldwide food security. Edible insects have a high nutritional value and a significantly lower carbon footprint than meat production, making them a viable option as a sustainable source of protein. Despite this, edible insect cultivation remains rare in Western countries, where eating insects is still considered unusual.

In a new study, researchers from the University of Leeds and University of Veracruz in Mexico have reviewed current insect farming methods, processing technologies and commercialisation techniques, as well as current perceptions towards entomophagy - the practice of eating insects.

Their report is published in the journal Comprehensive Reviews in Food Science and Food Safety and reviews research collected from around the world. It highlights that the benefits of increasing insect consumption have been widely explored, but not the technological and processing approaches that could help achieve this goal. The researchers emphasise that focusing commercialisation and processing techniques on the preferences of the younger generation is the best way to normalise edible insects.

Study author Dr Alan-Javier Hernández-Álvarez from the School of Food Science and Nutrition at Leeds said: "Edible insects are fascinating. Although humans have eaten insects throughout history, and approximately two billion people around the globe regularly eat them today, research on the subject is relatively new.

"Edible insects could be the solution to the problem of how to meet the growing global demand for food in a sustainable way.

"The 'ick factor' remains one of the biggest barriers to edible insects becoming the norm. Eating behaviour is shaped largely during early childhood and in Western countries, eating insects, especially in whole and recognisable forms, remains something seen mostly on TV shows.

"In some European countries consumers, particularly young adults, have shown interest in new food products that use insects in un-recognisable form, such as flour or powder used in cookies or energy drinks. Developing efficient large-scale processing technologies that can develop insects powders could go a long way to helping introduce insects as a common source of protein and nutrients."

Study author Dr Guiomar Melgar-Lalanne from the University of Veracruz said: "In Western countries it is the younger generation that shows more willingness to try new food products, including edible insects. The 'foodies boom' and the rise of veganism and flexitarianism have opened the door to alternative food sources.

"However, in some countries where insects have been part of a culinary tradition, such as Mexico, Nigeria and Botswana, negative perceptions of eating insects have taken root. In these countries the younger population is rejecting insects as a food source, associating it with poverty and a provincial mind set.

"In Mexico for example, where insect markets are increasingly popular among tourists, self-consumption, harvesting and farming are declining in rural areas. Despite the growing demand for edible insects in urban areas, harvesting and indoor farming is limited because farmers associate insects with poverty and do not see it as a potential source of income.

"Promoting insects as an environmentally sustainable protein source appeals to the current attitudes in the younger generation. Another successful strategy involves serving insects as snacks between meals, which would increase inclusion of insects in daily diets. These types of snacks are increasing in popularity in the global market.

"But if edible insects are to become a common food source current farming techniques and technologies could struggle with the demand and need to be expanded."

Compared to meat production, insect farming uses much smaller amounts of land, water and feed, and it is possible to cultivate them in urban areas. Insect farming also produces far fewer greenhouse gases.

However, more development is needed in large-scale insect farming. Increasing demand could create a bottleneck in producing edible insects in an economically efficient, safe and sustainable manner. The lack of availability creates accessibility issues and therefore reduces opportunities for increasing trade. There is a significant need for a technological leap from wild harvesting to indoor farming.

Improvements to edible insect farming and processing techniques could also open the door for increasing the use of insects for other purposes. Chitin extracted from certain insect exoskeletons has the potential for use in food preservation. It also has a number of industrial applications such as surgical thread and as a binder used in glue.

Dr Hernández-Álvarez added: "Food is only the tip of the iceberg for insects' sustainable potential.

"Refining extraction technologies could make insects a feasible and sustainable option for replacing some currently available functional ingredients. These aspects should be a focus of future research and technological development."

Credit: 
University of Leeds