
U of T researchers engineer antibodies that unlock body's regenerative potential

image: University of Toronto researchers Stephane Angers and Sachdev Sidhu have created antibodies that could one day stimulate tissue in the body to repair itself, as described in a study published online in eLife, an open-access journal.

Image: 
Steve Southon, Leslie Dan Faculty of Pharmacy, University of Toronto

Our body makes antibodies to fight infections. But synthetic versions of these molecules could hold the key to stimulating the body's ability to regenerate.

The findings come from a decade-long collaboration between the teams of Sachdev Sidhu, a professor in the Donnelly Centre for Cellular and Biomolecular Research, and Stephane Angers, Associate Dean of Research in the Leslie Dan Faculty of Pharmacy, which have been creating synthetic antibodies for diverse applications.

Antibodies are increasingly being developed into drugs thanks to their ability to bind and affect the function of other proteins in cells. Because they are encoded by genes, antibodies can be created in the lab using genome and protein engineering technologies.

Now Sidhu and Angers' teams have created antibodies that could one day stimulate tissue in the body to repair itself, as described in a study published online in eLife, an open-access journal. A newly launched Toronto startup, AntlerA Therapeutics, will turn the antibodies into drug-like molecules for regenerative medicine.

The work was done in collaboration with the Toronto Recombinant Antibody Centre, co-founded by Sidhu in the Donnelly Centre, which has created a massive catalog of synthetic antibodies for research and drug discovery.

"We are developing new molecules that have never been seen before and whose potential for regenerative medicine is enormous," says Sidhu, also a professor in U of T's Department of Molecular Genetics and co-founder of AntlerA.

"By capitalizing on the momentum of stem cells research and regenerative medicine that already exists in Toronto, we are ideally situated to commercialize these molecules."

Both Sidhu and Angers are members of the recently launched Precision Medicine Initiative (PRiME) at U of T, which seeks to accelerate treatments targeting the biological underpinnings of an individual's disease. The team recently obtained support from the Medicine by Design program to continue developing the antibodies, called FLAgs (described below), and to extend the technology to other growth factors.

Engineering new molecules

The antibodies were engineered to mimic key growth factors, proteins called Wnt (pronounced "wint") that normally instruct stem cells -- cells that can turn into any cell type in the body -- to form tissue in the embryo. Wnt proteins also activate stem cells for tissue repair following injury in adults, while mistakes in Wnt signaling can lead to cancer.

Scientists have long sought to co-opt Wnt as a tool for activating tissue regeneration. But these efforts were stymied by the molecules' complicated chemistry -- Wnt proteins are attached to fat molecules, or lipids, which makes their isolation in active form difficult.

"People have been trying for decades to purify Wnt proteins and make drugs out of them," says Sidhu. "Drug development would require further engineering of Wnt proteins. But Wnt are difficult to purify, let alone engineer--therefore they unlikely become drugs."

The associated lipids also prevent Wnt proteins from dissolving in water, making them unsuitable as medicines because they cannot be injected.

That's why the researchers decided to design antibodies that behave like Wnt proteins -- binding to and activating the two classes of Wnt receptors, Frizzled and LRP5/6, on the surface of cells -- but that are also water soluble and therefore easier to work with.

Called FLAgs, for Frizzled and LRP5/6 Agonists, the antibodies can be designed to replicate any one of the hundreds of possible Wnt-receptor combinations (humans have 19 different Wnt proteins that can activate 10 Frizzled receptors and eight co-receptors, including LRP5/6).

To generate FLAgs, Yuyong Tao, a postdoctoral fellow in Sidhu's lab, came up with a new molecular configuration that does not exist in nature. Whereas natural antibodies have two binding sites, allowing them to bind to two targets, FLAgs have four, which means that a single molecule can recognize multiple receptors at the same time and mimic how Wnt proteins act in the body.

Stimulating tissue self-repair

When added to cell culture, FLAgs were able to substitute for Wnt proteins -- a hard-to-source but necessary ingredient in culture medium -- and stimulate the formation of stem cell-derived intestinal organoids, three-dimensional balls of tissue that resemble the small intestine.

"These 3D organoids hold great potential for research and drug discovery but to grow them you need a source of Wnt proteins to activate stem cells," says Angers, whose team presented the findings earlier this month at an eminent Gordon conference in the U.S. "Now we have a defined protein, which we can easily obtain in large amounts and which can support the growth of organoids from various tissues."

"This is going to be really important and transformative for a lot people in the field," he says.

Most strikingly, when injected into mice, the FLAgs were able to activate gut stem cells, showing that the antibodies are stable and functional inside the body. The finding raises hopes that FLAgs could be used as a treatment for inflammatory bowel disease and other ailments, regenerating the intestinal lining when it is damaged. Other FLAg variants show promising results in lung, liver and bone regeneration, as well as potential for treating eye disease.

AntlerA has already attracted investment to develop FLAgs into cutting-edge therapeutics and is actively working on treatments for vision loss and bowel diseases. The startup's name was inspired by FLAgs' geometry, which resembles the antlers of deer and moose -- among the fastest-regenerating organs in the animal kingdom.

"The type of discovery we report in our study was possible with a convergence of expertise," says Angers, co-founder of AntlerA. "Thanks to the close collaboration and proximity between our labs, we were able to apply protein engineering to activate a critical stem cell signaling pathway with the ultimate goal to develop regenerative medicine promoting the repair of diverse tissues in the body."

Credit: 
University of Toronto - Leslie Dan Faculty of Pharmacy

A new signaling pathway for mTOR-dependent cell growth

image: Left: Protein kinase N (PKN) is active and inhibits PI3KC2β (green). Right: PKN is missing, causing PI3KC2β to change its localization in the cell and become active.

Image: 
Alexander Wallroth, FMP

The activation of mTOR complex 1 in the cell is central to many vital processes in the body, such as cell growth and metabolism. Overactivity of this signalling pathway can result in diseases such as diabetic insulin resistance and cancer. A team led by the scientist Volker Haucke (Leibniz-Forschungsinstitut für Molekulare Pharmakologie (FMP) and Freie Universität Berlin) has now discovered how inactivation of a certain lipid kinase promotes mTOR complex 1 activity, and may therefore constitute a new point of attack for the treatment of diabetes and cancer. The results have just been published in the renowned journal Nature Cell Biology.

The signaling pathways in somatic cells are highly complex, and specific mechanisms can only be triggered if several "switches" are "flipped" in a fixed sequence. However, given the high number of substances and substance complexes involved in cellular signal transmission, finding these "switches" and identifying their roles is a real challenge. The question of how mTOR complex 1 can be deactivated in the cell had also long been unresolved. Even so, FMP researchers were able to shed light on this "switch" as early as 2017, revealing that a certain lipid kinase (PI3KC2β) acts as a natural brake on mTOR and ensures that mTOR complex 1 is switched off when, for example, certain hormonal signals such as insulin are absent.

Now, a team led by FMP researcher Alexander Wallroth from Volker Haucke's research group has been scrutinizing how this lipid kinase is regulated. "We manipulated the lipid kinase in various ways and looked at the effects these manipulations had on mTOR activity and on cell growth," says Wallroth. This work has allowed the researchers to uncover a mechanism for inactivating the PI3KC2β lipid kinase. Another kinase plays a key role here: protein kinase N (PKN), which renders the PI3KC2β lipid kinase inactive, thereby indirectly activating mTOR. Protein kinase N is regulated by growth factors that stimulate mTOR complex 2 at the cell membrane - the second protein complex in which mTOR is present in the cell. This, in turn, activates PKN, which ultimately inactivates the lipid kinase.

"In doing so, we revealed two further components prone to pharmacological attack", explains Alexander Wallroth. Successfully inhibiting PKN activates the lipid kinase PI3KC2ß and ends up inhibiting mTOR-dependent cell growth. Conversely, if the signaling pathway is activated by growth factors, mTOR complex 2, and, finally, PKN, the lipid kinase remains inactive and the mTOR complex 1 is able to drive cell growth. Although inhibitors capable of inhibiting PKN have already been identified, their lack of specificity and tendency to block many other vital cellular processes currently precludes their use in living tissue.

"The most fascinating finding was the discovery of a cell biological signaling pathway that links mTor complexes 1 and 2, for example switching off 2 also impacts on 1," says Alexander Wallroth. In their previous work, the researchers were able to show that the lipid kinase PI3KC2ß, when activated, acts directly on mTor complex 1. Similarly, if PKN is activated by mTor complex 2, thereby inactivating the lipid kinase, the activity of mTor complex 1 is also affected. Up to now, little was known about mTor complex 2 compared to complex 1. The new results show that mTor complex 2 has a decisive impact on the activity of the important complex 1. "This paves the way for further and more detailed research into medical intervention to encompass various diseases such as insulin resistance or cancer," emphasizes Alexander Wallroth.

Credit: 
Forschungsverbund Berlin

High fat diet during pregnancy slows learning in offspring, rat study suggests

image: Tamashiro and Cordner

Image: 
Johns Hopkins Medicine

In a bid to further explore how a mother-to-be's diet might affect her offspring's brain health, Johns Hopkins Medicine researchers have found that pregnant and nursing rats fed high fat diets have offspring that grow up to be slower than expected learners and that have persistently abnormal levels of the components needed for healthy brain development and metabolism.

In the experiments, pregnant rats were allowed to repeatedly overeat a diet similar in fat content to typical fast food meals. Although the study was performed on animals, the researchers say their findings -- described in the August issue of Experimental Neurology -- likely apply in some measure to other mammals, including humans, and they add to evidence that unhealthy diets may damage a fetus's developing brain in specific ways.

Because so much of mammalian brain biology and metabolism is similar, the research "may well hold warnings for people that high fat diets during pregnancy are a concern," says Kellie Tamashiro, Ph.D., M.S., associate professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine. "Pregnant moms who may not have access to fresh foods or healthy diets may have their unborn children's brain development suffer, and there's an opportunity to intervene to change their offspring's learning trajectory to a better one."

Tamashiro's laboratory studies how stress, diet and the immune system during pregnancy can contribute to neuropsychiatric and metabolic diseases such as diabetes. In the current study, which focused on the impact of a high fat diet on the developing brain, the researchers fed pregnant rats food pellets in which 60% of calories came from fat, throughout pregnancy and until their pups were weaned -- a period of three weeks. The typical American and Western European diet gets about 45% of calories from fat, so the researchers likened the rats' high fat food to a fast food diet.

"The pregnant rats had free access to a very high fat diet where they were eating as much as they wanted and they did overeat, just like we do in Western society," says Zachary Cordner, M.D., Ph.D., chief resident of psychiatry at The Johns Hopkins Hospital and the study's lead author.

When the offspring finished weaning, at about 21 days after birth, they were fed a normal rat chow diet, with 20% of calories from fat, for the next three months. At about 4 months of age, the adult rats were evaluated for their learning and memory abilities using a maze consisting of a round platform with 20 holes around the perimeter, only one of which led to an exit.

The researchers note that rats don't like wide-open spaces, because such an environment exposes them to predators in the wild, so they instinctively seek to exit the open platform. Normal rats consistently take three to four tries to learn where the maze exit is, but rats from mothers fed the high fat diets took up to nine tries to learn the location.

A week after learning the exit location, the rats were tested again. Those born to mothers on a normal diet remembered the maze and only took about five seconds to find the exit, but the rats whose mothers had a high fat diet took about 20 seconds on average.

In a second set of experiments, the team exploited the fact that normally, rats are curious and like to check out new objects in their environment. When they familiarized rats with Lego blocks and then swapped one of the known blocks with a different one the next day, the rats typically spent more time exploring the new one. But rats from mothers who were fed the high fat diets spent just as much time around the familiar object as they did the new one, suggesting they didn't recognize the object as new.

"The rats from moms fed high fat diets were slow learners," says Cordner. "They were able to learn as well as the normal rats, but it took them longer to do it."

To figure out what might account for the slow learning, the researchers compared the levels of gene products in the brains of normal rats with those in rats whose mothers were fed high fat diets during pregnancy and nursing. They focused on the part of the brain that is vital to learning and memory, examining it both when the pups had just finished weaning and a few months later when the rats were adults.

The rats whose mothers were fed high fat diets had lower levels of insulin receptors, leptin receptors and glucose transporter 1 than did rats from mothers fed normal diets.

Insulin helps regulate blood sugar. In rats and all mammals, the insulin receptor detects insulin and initiates the process to help get sugar out of the blood -- using the glucose transporter -- and into the body's cells for energy. Leptin is a hormone that suppresses hunger, and it binds to the leptin receptor to regulate body weight and metabolism.

"The roles of genes involved in energy metabolism affect learning and memory too, and this role changes over time," says Cordner. "Initially, these genes are involved in the formation of the fetal brain, and later on in life they are involved in learning and memory, in addition to energy metabolism.

"Our findings build on what we already know about the role of these hormones and nutrients on brain health, and clearly support the idea that a high fat diet does impact neuropsychiatric risk that carries over into adulthood, most likely by interfering with how genes are regulated and expressed," adds Cordner.

Credit: 
Johns Hopkins Medicine

How bees live with bacteria

image: A solitary bee leaves an artificial nest. The individual breeding chambers are separated and each contains only one larva. This prevents direct contact with sisters or mothers.

Image: 
(Photo: Alexander Keller / University of Wuerzburg)

An apple plantation in spring. The trees are in full bloom. But to ensure that they also bear fruit in autumn, workers have to do painstaking manual work for weeks: each individual flower is pollinated by hand with brushes - because there are no bees left to do the job. Not a nice vision of the future. But in some regions of China this is already a reality. And the disappearance of bees is being reported all over the world.

The exact reason for bee mortality is not known. Pesticides from agriculture, destruction of habitats, pathogens - several factors probably interact. A research group at Julius-Maximilians-Universität (JMU) Würzburg in Bavaria, Germany, is now focusing on another factor: the bacteria that live in and with bees. Many of them are important for the health of bees. If they suffer, so do the bees.

Many relationships between bees and bacteria

The intestines of honeybees, for example, contain bacteria that help digest food and stimulate the immune system of the insects. The beehive also contains useful microbes, some of which produce antibiotics, thus preventing the spread of harmful fungi.

"Most research in this field is devoted to social bees, especially the western honey bee Apis mellifera," says Dr. Alexander Keller of the JMU Biocenter. Solitary bees would have experienced only little attention. These are pollinators of great ecological importance for the environment and agriculture. More than 90 percent of the 17,500 bee species known worldwide are solitary bees.

Honeybees only suitable as a model to a limited extent

According to Keller, considerably more research is needed to better understand the relationship between solitary bees and microbes and thus perhaps better combat bee mortality. Many species of solitary bees are threatened or already extinct.

Until now, research has been based on the assumption that the knowledge gained from honey bees can be transferred to solitary bees. However, this is only possible to a very limited extent, despite some fundamental similarities. This is the conclusion reached by the JMU research group in a review article published in the journal "Trends in Microbiology". It summarises the current state of research on bees and their microbial associations.

The central finding is that solitary bees are much more strongly influenced by environmental factors and human-induced changes than socially organised bees when establishing their relationships with microbes. The consequences of climate change, agricultural changes and habitat degradation, for example, have not yet been clarified and require research specifically adapted to solitary bees.

Alexander Keller and his team are currently working with the JMU Chair of Zoology III (Animal Ecology and Tropical Biology) and international partners to investigate the landscape ecological factors that influence the microbial associations of solitary bees.

Credit: 
University of Würzburg

New biosensor provides insight into the stress behaviour of plants

image: Microscopy images (FRET method) of three different regions of a root showing changes in phosphatidic acid under salt stress over time (from left to right).

Image: 
W. Li et al./Nature Plants

Tiny signalling molecules play important roles in many processes in living organisms. However, the exact function of these substances is often still unknown, which is why scientists are constantly on the lookout for new methods with which to investigate them further. Researchers at the Universities of Münster (Germany) and Nanjing (China) have now developed such a method for an important messenger substance in plants called phosphatidic acid.

This lipid takes on different roles in the organism: it influences the flexibility and bending of cell membranes, regulates the plant's metabolism and also serves as a signalling substance that regulates the localization or activity of proteins. However, researchers had not been able to determine which part of the cell's phosphatidic acid pool functions in metabolism and which part serves as a signalling substance. A biosensor developed by the German and Chinese scientists has now changed this: by incorporating the sensor into plants, they were able to track the activity of phosphatidic acid in space and time for the first time.

"Our approach enables us to elucidate the dynamics of phosphatidic acid more precisely, especially in plants under stress," says co-author Prof. Jörg Kudla from the University of Münster. A plant is stressed, for example, when exposed to dry or salty soils. The measurements obtained with the new method could in future help to breed plants that are more resilient to adverse environmental conditions. The study was published in the journal Nature Plants.

Background and methods:

Until now, scientists had only been able to measure the occurrence of phosphatidic acid biochemically, which yields only the total amount of the lipid in a certain tissue or organism. It remained unclear in which cells, or in which parts of those cells, the substance was active and why its concentration changed.

The newly developed biosensor is based on the principle of fluorescence resonance energy transfer (FRET). The sensor is a plasma membrane-targeted fusion protein consisting of a phosphatidic acid-binding domain placed between two different fluorescent proteins, which fluoresce blue and yellow when stimulated by light. Binding of phosphatidic acid to the sensor changes its conformation, and this results in a change in the colour of the emitted light. The new sensor is therefore called "PAleon", derived from PA, the abbreviation for phosphatidic acid, and "chameleon". The scientists measure these signals using modern microscopy methods.
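For readers curious how such a ratiometric read-out is quantified, here is a minimal sketch of the arithmetic, assuming hypothetical two-channel microscope images, background offsets and thresholds (the published analysis pipeline is more sophisticated):

```python
import numpy as np

def fret_ratio_map(yellow, blue, background=100.0, min_signal=50.0):
    """Pixel-wise FRET ratio (acceptor/donor) from two image channels.

    yellow, blue : 2D intensity arrays (acceptor and donor channels).
    background   : assumed camera offset subtracted from both channels.
    min_signal   : donor pixels below this level are masked as unreliable.
    """
    y = yellow.astype(float) - background
    b = blue.astype(float) - background
    ratio = np.full(y.shape, np.nan)
    valid = b > min_signal             # only form the ratio where donor signal is real
    ratio[valid] = y[valid] / b[valid]
    return ratio                       # higher ratio = more phosphatidic acid bound

# Hypothetical 64x64 "root tip" images before and after salt stress
rng = np.random.default_rng(0)
blue = rng.normal(600, 20, (64, 64))
yellow_rest = rng.normal(500, 20, (64, 64))   # low FRET at rest
yellow_salt = rng.normal(800, 20, (64, 64))   # higher FRET under salt stress
print(np.nanmean(fret_ratio_map(yellow_rest, blue)))  # about 0.8
print(np.nanmean(fret_ratio_map(yellow_salt, blue)))  # about 1.4
```

Because the ratio of the two colours, rather than either absolute intensity, carries the signal, the read-out is robust to differences in sensor expression level from cell to cell.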

In this way, the researchers looked at the roots and guard cells of Arabidopsis plants. They observed different regions in the roots and exposed them to different stimuli, such as increased salt concentration of the growth media. The colours under the microscope showed how and where the distribution and concentration of phosphatidic acid changed.

In their investigations the researchers observed that, as salt stress for the plants increased, the concentration of phosphatidic acid in the roots also increased. In addition, the researchers were able to determine the locations in the roots where the changes took place. "Since the sensor only detects the so-called bioactive amount, we can conclude that the changes in phosphatidic acid concentration and distribution which we detected are due to its function as a signalling substance," said Jörg Kudla. It should be remembered that phosphatidic acid can also play a role in metabolic processes and movements of the cell membrane.

With their new method, the scientists have already discovered that the activity of a certain protein that synthesizes phosphatidic acid, phospholipase D, is important for plants to adapt to salt stress. In addition, the substance is evidently closely related to the pH value of the cell. "Our method has already provided us with fundamental new insights into the mechanisms of salt tolerance in plants," says Prof. Wenhua Zhang, lead author of the study at Nanjing University in China.

The procedure should be applicable in most plant organisms and also transferable to animal cells and organisms. In their next steps, the scientists intend to use the sensor in other cell and tissue regions and also to investigate the functional interactions of phosphatidic acid with other messenger substances.

Credit: 
University of Münster

Scientists advance search for memory's molecular roots

image: A model of the CaMKII protein shows multiple domains that allow it to bind actin filaments in the dendrites of neurons into bundles, giving the dendrites their shape. Researchers at Rice University, the University of Houston and the University of Texas Health Science Center at Houston believe the complex is key in the formation of long-term memory.

Image: 
Wolynes Research Lab/Rice University

HOUSTON - (Aug. 26, 2019) - A new piece of a difficult puzzle -- the nature of memory -- fell into place this week with a hint at how brain cells change structure when they learn something.

Interactions between three moving parts -- a binding protein, a structural protein and calcium -- are part of the process by which electrical signals enter neural cells and remodel the molecular structures thought to enable cognition and the storage of memories.

Colleagues from Rice University, the University of Houston (UH) and The University of Texas Health Science Center at Houston (UTHealth) combined theories, simulations and experiments to determine how a central binding protein -- calcium-calmodulin-dependent kinase II (CaMKII) -- binds and unbinds from the cytoskeleton of a neuron.

The team's report in the Proceedings of the National Academy of Sciences gives the first clear details of how the binding sites of CaMKII act to align actin filaments -- the structural protein -- into long, rigid bundles. The bundles serve as the supporting skeletons of dendritic spines, spiky protrusions that receive chemical messages through synapses from other neurons.

Peter Wolynes, a theoretical physicist at Rice, joined an ongoing collaboration by UH physicist Margaret Cheung and UTHealth neurobiologist Neal Waxham that aimed to understand how signals make their way through dendrites, the branches on nerve cells that transmit information between cells.

Finding the complete structure of CaMKII has proven too complex for X-ray crystallography, although parts of its structure were known. When combined with the actin that makes up the cytoskeleton, the complex also became the largest protein assembly Wolynes and his team have analyzed via their protein-structure prediction program, AWSEM.

When they were done, the structure predicted by the computer was a remarkable match for two-dimensional electron microscope images by Waxham and his group that clearly show parallel actin filaments are held together, ladder-like, by rungs of CaMKII.

"There definitely are preliminary chemical steps involving the enzyme activity of CaMKII before you get to this stage; therefore, we don't have a completely clear picture of how to put everything together," Wolynes said. "But it's clear the assembly of the complex is the key step where chemistry turns into a larger-scale structure that can hold a memory."

CaMKII is uniquely suited to interact with actin, the most abundant protein in eukaryotic cells and one that has special abilities in neurons, where it not only has to give thousands of dendrites (in each of billions of neurons) their resting forms but also must give them a level of plasticity to adapt to a constant barrage of signals.

Actin molecules self-assemble into long, twisting filaments. The hydrophobic pockets between these molecules are perfectly configured to bind CaMKII, a large protein with multiple parts, or domains. These domains lock in to three consecutive binding sites on the filament, and the twists put binding sites at regular intervals to keep the proteins from piling up.

CaMKII's "association" domain is a six-fold subunit that also binds to adjacent filaments to form actin bundles, the backbones of dendritic spines that give these protrusions their shapes.

These bundles remain rigid if the dendrite contains little calcium. But when calcium ions enter through the synapse, they combine with calmodulin proteins, allowing them to bind to another part of CaMKII, the floppy regulatory domain. That triggers the dissociation of one domain of CaMKII from the filament, followed by the rest of the protein, opening a short window of time during which the bundles can reconfigure.

"When enough calcium comes in, the activated calmodulin breaks up these structures, but only for a while," Wolynes said. "Then the cytoskeleton reforms. During that time, the dendritic spine can take on a different shape that might be bigger."

"We know calcium brings information into the cell," Cheung added. "But how nerve cells know what to do with it really depends on how this protein encodes information. One part of our work is to connect that on a molecular level and then project how these simple geometrical rules develop larger microscale structures."

The team's calculations showed the association domain is responsible for about 40% of the protein's binding strength to actin. A linker domain adds another 40% and the crucial regulatory domain provides the final 20% -- a sensible strategy, since the regulatory domain is on the lookout for incoming calcium-calmodulins that can unzip the entire protein from the filament.

The project came together through Rice's Center for Theoretical Biological Physics (CTBP), of which Wolynes is co-director and Cheung a senior scientist. Their association goes back to when both were at the University of California, San Diego, he as a professor and she as a graduate student of Rice physicist José Onuchic, also a CTBP co-director. Wolynes also served on her thesis review panel, she said.

Cheung was aware of previous work by Wolynes and his Rice group that suggested actin stabilizes prion-like fibers thought to encode memories in neurons and decided it was a good match for her research with Waxham to see how calcium activates CaMKII.

"This is one of the most interesting problems in neuroscience: How do short-term chemical changes lead to something long term, like memory?" Waxham said. "I think one of the most interesting contributions we make is to capture how the system takes changes that happen in milliseconds to seconds and builds something that can outlive the initial signal."

The puzzle is far from complete, Wolynes said. "The earlier work by Margaret and Neal was about the initiation of memory events," he said of his colleagues' study of calmodulin. "Our prion paper was on the preservation of memory, at the end of the learning process. And actin's in the middle. There may be many other things in the middle, too.

"These big-picture questions are interesting to a lot of people," he said. "This is a key element of the problem, but it's clearly not the end of the story."

Credit: 
Rice University

Spontaneous brain fluctuations influence risk-taking

Minute-to-minute fluctuations in human brain activity, linked to changing levels of dopamine, impact whether we make risky decisions, finds a new UCL study.

The findings, published in the Proceedings of the National Academy of Sciences (PNAS), could explain why humans are inconsistent and sometimes irrational.

"Experts have long struggled to explain why people are so erratic, making one decision one day and the opposite decision another day. We know that the brain is constantly active, even when we aren't doing anything, so we wondered if this background activity affects our decision-making," said the study's co-lead author, Dr Tobias Hauser (UCL Queen Square Institute of Neurology).

"It appears that our inconsistent behaviour is partly explained by what our brain is doing when we are doing nothing."

The researchers focused on people in a state of rest (awake but not doing anything). At rest, the human brain remains active, with strong fluctuations in activity that remain unexplained.

For the study, 43 people completed a gambling task while in an MRI scanner. They were asked to choose between a safe option (gaining a small amount of money) and a risky option (gambling to try to get a larger amount of money). If they chose the risky option and lost, they would receive nothing.

The researchers monitored brain activity in the dopaminergic midbrain, the area of the human brain containing most of its dopamine neurons. Dopamine is a neurotransmitter known to play a role in risky decision-making. Whenever activity in that brain area was either very high or very low, the study participants were asked to choose between a risky and a safe option.
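As an illustration of that trigger logic only (a simplified sketch with simulated data and assumed thresholds, not the study's real-time fMRI pipeline), trials would be presented whenever the ongoing signal crosses an upper or lower bound:

```python
import numpy as np

# Simulated stand-in for slow fluctuations in midbrain activity
rng = np.random.default_rng(1)
signal = np.cumsum(rng.normal(0, 1, 2000))        # random-walk "activity" trace
signal = (signal - signal.mean()) / signal.std()  # z-score the run

LOW, HIGH = -1.5, 1.5                             # assumed trigger thresholds
triggers = np.flatnonzero((signal < LOW) | (signal > HIGH))
state = np.where(signal[triggers] > HIGH, "high", "low")
print(f"{triggers.size} trigger points: "
      f"{(state == 'low').sum()} low-state, {(state == 'high').sum()} high-state")
```

Presenting choices only at these extremes lets the comparison focus on how the brain's spontaneous state, rather than the task itself, shapes the decision.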

The researchers, based at the Max Planck UCL Centre for Computational Psychiatry & Ageing Research and the Wellcome Centre for Human Neuroimaging, UCL, found that when this brain area was in a state of low activity before participants were presented with their options (while they were still lying idle in the scanner), they were more likely to choose the risky option than when their brains were in a state of high activity.

Assessing the impact of these brain fluctuations, the researchers say the effect size is comparable to that of other known factors affecting risk-taking behaviour, such as drugs that influence the neurotransmitter dopamine and are routinely taken by people with Parkinson's disease. The effect is also similar in size to that of ageing: being young is associated with greater risk-taking than being elderly.

"Our brains may have evolved to have spontaneous fluctuations in a key brain area for decision making because it makes us more unpredictable and better able to cope with a changing world," explained senior author Dr Robb Rutledge (UCL Queen Square Institute of Neurology).

The researchers aim to continue their work to find out whether variations in background brain activity have other impacts, and whether they could be related to medical conditions, in which case the findings could eventually inform treatment approaches, such as for pathological gambling.

"Our findings underscore the importance of taking time when making important decisions, as you might make a different decision if you just wait a few minutes," said co-lead author PhD student Benjamin Chew (UCL Queen Square Institute of Neurology).

Credit: 
University College London

New rider data shows how public transit reduces greenhouse gas and pollutant emissions

Public transit has long been an answer for people looking to leave their car at home and reduce their air pollution emissions. But now, with better rider tracking tools, the University of Utah and the Utah Transit Authority can better answer the question: How much does public transit reduce pollution emissions?

In a paper published in Environmental Research Communications, University of Utah researchers Daniel Mendoza, Martin Buchert and John Lin used tap-on tap-off rider data to quantify the emissions saved by buses and commuter rail lines, and also project how much additional emissions could be saved by upgrading the bus and rail fleet. The study was conducted in cooperation with the Utah Transit Authority and the Utah Department of Environmental Quality, Division of Air Quality.

High-resolution rider data

Mendoza and his colleagues are certainly not the first to ask how much pollution public transit can save. But a couple of recent technological advances have enabled them to answer the question in unprecedented detail.

The first is the advance of tap-on tap-off farecards that provide anonymized data on where those riders who have electronic passes enter and exit public transit. Approximately half of UTA's passengers use an electronic fare medium. "Now we can truly quantify trips in both time and space," Mendoza says. "We accounted for all of the 2016 passenger miles by scaling the farecard data, and we know which trips farecard holders make on buses, light rail and commuter rail."

The second is the General Transit Feed Specification system. It's the data source that supplies Google Maps with transit information to help users find the bus or train they need. With that data source, the researchers could track where and how often UTA's buses and trains run.

So, with high-resolution data on the movement of both vehicles and passengers, the researchers could paint a nearly comprehensive picture of public transit along the Wasatch Front.

Balancing emissions

With that data, the researchers could quantify the emissions produced and the miles traveled by the transit systems (TRAX light rail uses electricity generated outside the Wasatch Front, so those emissions aren't in Salt Lake's air) and balance them against the miles traveled by passengers and the estimated amount of car travel avoided by riding transit.
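As a rough illustration of that balancing logic (not the study's actual model, and with made-up emission factors), the net effect of a single bus trip can be sketched like this:

```python
# Toy emissions balance for one bus trip (placeholder factors, NOT values from the study).
CAR_NOX_G_PER_MILE = 0.4   # grams NOx per avoided car mile (assumed)
BUS_NOX_G_PER_MILE = 5.0   # grams NOx per bus mile (assumed, older diesel bus)

def net_nox_grams(bus_miles, car_miles_avoided):
    """Negative result means the trip reduced emissions overall."""
    return bus_miles * BUS_NOX_G_PER_MILE - car_miles_avoided * CAR_NOX_G_PER_MILE

print(net_nox_grams(10, 400))  # rush hour: 50 - 160 = -110 g, a net reduction
print(net_nox_grams(10, 60))   # late evening: 50 - 24 = +26 g, bus emits more than it saves
```

The same trip can therefore be a net win or a net loss depending on how many car miles its riders actually replace, which is exactly the pattern the researchers found.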

On weekdays during rush hours, and in densely populated areas, the balance was clearly on the side of reduced emissions. "That tapers off significantly during the evening hours, on the outskirts of the city, and definitely during the weekends," Mendoza says. In those situations, the number of passengers and how far they rode transit did not offset certain criteria pollutant emissions. (Criteria pollutants are six common air pollutants that the EPA sets standards for through the Clean Air Act.)

For transit to improve its regional reduction in emissions, particularly of PM2.5 and NOx, the following strategies, alone or in combination, could be employed: more daily riders per trip, more clean-fuel buses and train cars, and/or fewer low-ridership trips.

What-ifs

The current study looks at the bus and train fleets as they are now, with some UTA buses around 20 years old and FrontRunner trains whose locomotives are rated Tier 0+ on a 0-4 scale of emissions cleanliness (Tier 4 is the cleanest; UTA is scheduled to receive funds programmed through the Metropolitan Planning Organizations to upgrade FrontRunner locomotives to Tier 2+). So Mendoza and his colleagues envisioned the future.

"What if we upgrade all these buses, some of them from 1996 or so?" Mendoza says. "They emit a significantly larger amount than the newer buses, which are 2013 and newer."

What if, they asked, UTA upgraded their buses to only 2010 models and newer, fueled by either natural gas or clean diesel? And what if the FrontRunner engines were upgraded to Tier 3?

Emissions of some pollutants would drop by 50%, and some by up to 75%, they found.

"Now, with this information, UTA can go to stakeholders and funding agencies and say, 'Look, we've done this analysis," Mendoza says. "This is how much less we can pollute.'"

Mendoza adds that taking transit offers additional benefits besides reducing air pollution. Taking transit gives riders time to read, work or listen while traveling. How does Mendoza know? He's a dedicated transit rider. "I always get to where I need to go pretty much on time and completely unstressed," he says. "I almost never drive."

Credit: 
University of Utah

Social, executive brain functions crucial for communication

image: An example image set shown to the participants in the experiment.

Image: 
Healey et al., eNeuro 2019

Impairments in social and executive brain functions hinder effective communication, according to research in patients with dementia recently published in eNeuro.

Non-language brain regions are thought to be critical for effective language functions, due to the complex, social nature of communication. Frontotemporal dementia affects social and executive brain functions but does not cause speech impairment, allowing researchers to study the role of non-language functions in communication.

Meghan Healey and colleagues at the University of Pennsylvania showed patients with dementia and healthy participants illustrations of an object near, and then on, a bookshelf. The participants chose descriptive words from a multiple-choice list in order to communicate to an imaginary partner, who was said to be colorblind in some rounds, which object had moved.

The patients selected descriptions that were either overdetailed or too vague more often than the healthy participants, even though they had comparable scores on simple language tests. The healthy participants performed best with a colorblind partner, since they knew to avoid color descriptors and chose other adjectives. The patients, on the other hand, were not sensitive to the needs of their conversational partner. These findings demonstrate that social and executive functions must be integrated during language processing for successful communication.

Credit: 
Society for Neuroscience

Honeybee brain development may enhance waggle dance communication

image: Honeybee brain anatomy, with a reconstruction of a DL-Int-1 neuron (left).

Image: 
Kumaraswamy et al., eNeuro 2019

Changes in a vibration-sensitive neuron may equip forager honeybees for waggle dance communication, according to research recently published in eNeuro.

Forager honeybees share information about the location and value of food sources by moving their bodies from side to side and beating their wings. The observing bees interpret the waggle dance through sensory organs that send the information to vibration-sensitive neurons, including DL-Int-1.

After developing from an egg, young adult honeybees emerge from their cell and begin learning their social position. The honeybees that become foragers learn the waggle dance, which may require further brain development. To explore this, Ajayrama Kumaraswamy and colleagues at Ludwig-Maximilians-Universität München, Fukuoka University, and University of Hyogo recorded the electrical activity of DL-Int-1 neurons in young adult and forager honeybees and then created computer simulations and three-dimensional models of the neurons.

In specific regions of the DL-Int-1 neurons, the older bees had less dense branching than the younger bees. Additionally, the neurons in older bees demonstrated enhanced signaling and more precise connections to other brain regions. These findings suggest that important adaptations occur in the honeybee brain during the transition to the forager role, allowing the bees to communicate effectively via the waggle dance.

Credit: 
Society for Neuroscience

Japanese trees synchronize allergic pollen release over immense distances

image: The in-phase matrix visualizes annual changes in phase synchronization across the Japanese islands. Created from open public data from the Ministry of the Environment's pollen observation system "Hanako-san".

Image: 
Kenshi Sakai, TUAT

Tokyo, Japan - Complaints of allergic rhinitis (hayfever) are common worldwide, affecting around 17% of the Japanese population in spring and summer (around 20 million people). In Japan, the main tree species causing hayfever are Japanese cedar and Japanese cypress, with a combined land area of over 7 million hectares. Their pollen is dispersed between February and May and causes a range of symptoms from itchy eyes and runny noses through to severe respiratory disorders.

Management of these symptoms depends on timely and accurate pollen forecasting. It is known that pollen from these trees follows alternating annual cycles of ON (abundant pollen) and OFF (lower amounts) years, but it was not known how pollen dispersal synchronizes across Japan, as studies to date have been based only on time-limited, local data.

In a study published on August 7th in Scientific Reports, researchers from Tokyo University of Agriculture and Technology (TUAT) made use of publicly available data from the Japanese Ministry of the Environment's "Hanako-san" pollen observation system to investigate annual fluctuations in pollen distribution at 120 locations for 14 years and to clarify the spatial synchronization across the Japanese islands.

This research is the first of its kind to study this synchronicity at the national scale, and it draws on a wide range of academic fields such as nonlinear physics, ecology, and environmental science. "We used a new 'in-phase and out-of-phase' approach as an update to the previous correlation-based methods," says co-author of the study Kenshi Sakai. "We found that the phase synchronizations within and between regions were strong in eastern regions of Japan, such as Hokkaido and Kanto, but the synchronization in western regions was much weaker."
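The published analysis is more involved, but the core idea of an in-phase matrix can be sketched with hypothetical data: two sites count as in phase in a given year if their annual pollen counts change in the same direction (both flipping toward ON, or both toward OFF).

```python
import numpy as np

def in_phase_matrix(counts):
    """counts: (n_sites, n_years) array of annual pollen totals (hypothetical data).

    Returns an (n_sites, n_sites, n_years - 1) boolean array whose entry [i, j, t]
    is True when sites i and j changed in the same direction from year t to t + 1.
    """
    direction = np.sign(np.diff(counts, axis=1))   # +1 = toward ON, -1 = toward OFF
    return direction[:, None, :] == direction[None, :, :]

counts = np.array([
    [900, 100, 950, 120, 880],   # site A: alternating ON/OFF years
    [800, 150, 900, 110, 860],   # site B: in phase with A
    [120, 850, 100, 900, 130],   # site C: out of phase with A and B
])
m = in_phase_matrix(counts)
print(m[0, 1])   # [ True  True  True  True] -> A and B perfectly in phase
print(m[0, 2])   # [False False False False] -> A and C perfectly out of phase
```

Aggregating such pairwise comparisons across all 120 sites and 14 years yields the kind of matrix shown in the image above.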

Remarkably, all 120 sites in Japan behaved perfectly in phase in 2009-2010, even over distances of more than 1,600 km. In dramatic contrast, almost perfect desynchronization occurred in 2015-2016. "We still don't know what causes these regional differences in pollen dispersal behavior," says lead author Akira Ishibashi. "It could be related to correlations in environmental variability at the locations of the different tree populations, known as the Moran effect."

Further research will be needed to pinpoint the specific mechanisms behind the observed synchronicity, but the findings will be useful for numerical forecasting of pollen abundance, which is crucial to clinicians and individuals suffering from hayfever.

"These findings and our proposed technique with an in-phase matrix have other applications outside of improving the prediction accuracy of allergenic pollen," says Sakai. "The new technique could also be used to quantitatively explain other environmental changes throughout Japan."

Credit: 
Tokyo University of Agriculture and Technology

Researchers use AI to plot green route to nylon

image: Miguel Modestino, professor of chemical and biomolecular engineering, and Daniela Blanco, a Ph.D. student, pose with a reactor for organic electrochemistry.

Image: 
The NYU Tandon School of Engineering

BROOKLYN, New York, Monday, August 26, 2019 - The chemical and allied industries face such challenges as ready access to reliable energy supplies, waste reduction, water conservation, and energy efficiency. Organic electrosynthesis - an electricity-driven, energy-efficient process that can easily integrate with renewable energy sources - could help solve them.

A team at the NYU Tandon School of Engineering reported that in its search to develop an innovative, environmentally friendly process to make adiponitrile (ADN) - the main precursor to nylon 6,6 - it found a way to greatly improve the efficiency of organic electrosynthesis. The researchers credited their success in part to what they believe is the first use of artificial intelligence to optimize an electrochemical process.

Miguel Modestino, a professor of chemical and biomolecular engineering, and doctoral student Daniela Blanco tweaked how electrical current is delivered to catalytic electrodes and then applied artificial intelligence (AI) to further optimize the reaction. By doing so, they achieved a 30% improvement in ADN production.

The findings, detailed in the Proceedings of the National Academy of Sciences (PNAS), could have major implications since the team targets one of the largest organic electrosynthesis processes in the chemical industry: the electrohydrodimerization of acrylonitrile (AN) to ADN.

Demand for ADN is high and growing: the market for nylon is expected to increase 4% annually through 2023. Only one company currently uses a Monsanto-invented electrochemical process to make ADN; the lion's share of the nylon precursor is made via a toxic, energy-intensive thermal hydrocyanation of butadiene. By contrast, electrosynthesis of ADN is a green, efficient chemical process that uses water-based electrolytes and can be directly coupled with renewable electricity sources such as wind or sunlight.

The standard electrosynthetic process for ADN employs an "always on" direct electrical current delivered to the electrocatalytic site. But the NYU Tandon researchers found that a direct current did not maximize output of ADN and generated a great deal of the unwanted byproduct propionitrile (PN). They decided to engineer a system that delivers an intermittent current to constantly replenish reagent concentration at the electrocatalytic site (a phenomenon called mass transport) and improve ADN output.

The pair supplied an artificial neural network with data from 16 different experimental cases of pulse times.
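The release does not specify the network's architecture, but the general workflow (fit a small regression model to a handful of pulse-time experiments, then query it densely across the pulse-time range) might look like the following sketch, with illustrative stand-in data:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative stand-in for the 16 experiments: each case is a pulse schedule
# (on-time, off-time in seconds) with a measured ADN conversion efficiency.
rng = np.random.default_rng(42)
X = rng.uniform(0.1, 2.0, size=(16, 2))              # [on_time, off_time]
y = 0.5 + 0.2 * np.sin(3 * X[:, 0]) - 0.1 * X[:, 1]  # fake "measured" efficiencies

# A small fully connected network serves as a surrogate model of the cell
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(X, y)

# Querying the surrogate over a dense grid replaces many physical experiments
grid = np.array([[on, off] for on in np.linspace(0.1, 2.0, 50)
                           for off in np.linspace(0.1, 2.0, 50)])
best = grid[np.argmax(model.predict(grid))]
print(f"Predicted best pulse: on={best[0]:.2f} s, off={best[1]:.2f} s")
```

The payoff of such a surrogate is that predictions are essentially free, so the promising corners of the pulse-time space can be mapped before running any further lab experiments.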

"By analyzing electrochemical pulse techniques with AI, we were able to visualize ADN conversion efficiency across a range of pulse times without having to do more than a few physical experiments," said Modestino. "This innovative, integrated approach led to an unprecedented 30% improvement in ADN production and a 325% increase in the ratio of ADN to PN, mostly due a large decrease in production of the latter."

Blanco explained that this technique could advance industry adoption of more sustainable processes. That is exactly what she and a former student in Modestino's laboratory envisioned when they founded a green-chemistry startup company, Sunthetics, to commercialize a sustainable nylon production process based on their research.

"We wanted to show with this new research that we can make the ADN electrochemical process more competitive," she said. "Currently only 30% of global ADN output employs electrosynthesis; the rest of production involves processing over an energy- and oil-intensive catalytic reactor," she said.

The next step for the team will be to use this AI approach to accelerate their research endeavors. "Instead of using a classical research model involving lengthy experimental campaigns, AI tools can help us predict experimental outcomes. To the best of our knowledge, this is the first time AI has been used to optimize an electrochemical process," Modestino said.

Credit: 
NYU Tandon School of Engineering

Canadian children's diet quality during school hours improves over 11-year period

image: This is a chart showing mean School-HEI sub-scores for Canadian school-aged children between 9 a.m. and 2 p.m., in 2004 and 2015.

Image: 
UBC Media Relations

Surveys taken 11 years apart show a 13 per cent improvement in the quality of foods consumed by Canadian children during school hours, a new UBC study has found.

"It's essential to look at what foods children are eating at school and how their diets have changed over time to identify challenges and opportunities to promote healthier diets among Canadian children," said Claire Tugault-Lafleur, a postdoctoral research fellow at UBC who was lead author of the study published today in Public Health Nutrition. "Because we had observed dietary differences across age groups and provinces in 2004, we also wanted to see whether children's diets at school changed the same way for everyone over time."

Researchers examined data from Canadian Community Health Surveys conducted in 2004 and 2015, involving over 7,000 children across Canada between the ages of six and 17. Respondents provided information about the foods and beverages they had consumed in the previous 24 hours.

The study looked at the nutritional profile of foods consumed during school hours (9 a.m. to 2 p.m.), and during the entire school day, using a holistic measure of dietary quality known as the Canadian Healthy Eating Index (C-HEI). This index assigns a score based on 11 key components of a healthy diet.
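As a toy illustration of how such a composite index works (the component names, targets and point values below are simplified assumptions, not the actual C-HEI specification), each component is scored against a target and the sub-scores are summed:

```python
# Toy composite diet-quality score (NOT the real C-HEI components or weights).
# Each component earns points in proportion to a target, capped at a maximum.
components = {
    # name: (observed servings, target servings, max points) -- all hypothetical
    "whole fruit":                  (1.0, 2.0, 10),
    "dark green/orange vegetables": (0.5, 2.0, 10),
    "whole grains":                 (1.5, 3.0, 10),
    "milk and alternatives":        (1.0, 2.0, 10),
}

def component_score(observed, target, max_points):
    """Award points proportionally, capped at the component maximum."""
    return min(observed / target, 1.0) * max_points

total = sum(component_score(*v) for v in components.values())
print(f"Toy score: {total:.1f} / 40")   # 5.0 + 2.5 + 5.0 + 5.0 = 17.5
```

Breaking the total into per-component sub-scores is what lets the researchers point to specific weak spots, as they do in the results below.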

The overall quality of children's diets during school hours improved by 13 per cent from what had been reported about a decade earlier. Increased fruit and vegetable intake, and fewer calories from minimally nutritious foods (particularly sugar-sweetened beverages and salty packaged snacks), were responsible for the improvement in total dietary quality scores. Still, as of 2015, the average school-hour diet needed substantial improvement for children to meet the recommendations in Canada's Food Guide.

In both survey years, the lowest school-hour HEI sub-scores were reported for dark green and orange vegetables, whole fruit, whole grains, and milk and alternatives (see Figure 1).

In both 2004 and 2015, researchers found significant differences in school-hour diet quality across age groups. Younger children reported higher school-hour diet quality scores than older youth in both survey years. Researchers also found some differences across provinces, although all provinces saw improvements over time.

Food insecurity--defined as inadequate or insecure access to food due to financial constraints--appeared to have a greater impact on children's diet quality at school in 2015. A worrisome finding was that in 2015 (but not in 2004), children living in food-insecure households had slightly but statistically significantly lower diet quality scores than children living in food-secure households.

Improving children's eating habits through policies and programs is becoming a priority in Canada. In 2017, the Government of Canada launched a consultation process to develop a national food policy with a focus on improving health and access to affordable food. Canada is the only G7 country without a national school food program, and there is an ongoing petition to implement an adequately funded national cost-shared universal healthy school food program.

"Interventions which help ensure that all Canadians can afford nutritious meals for their children have potential of helping Canadian children move closer towards national dietary recommendations," said Tugault-Lafleur.

Associate professor Jennifer Black and professor emeritus Susan Barr of UBC's food, nutrition and health program in the faculty of land and food systems oversaw Tugault-Lafleur's research and co-authored the study.

Credit: 
University of British Columbia

From cradle to grave: postnatal overnutrition linked to aging

image: Diabetes increases the risk of serious health complications.

Image: 
Image by Mary Pahlke from Pixabay

Researchers at Baylor College of Medicine have found a new answer to an old question: how can overnutrition during infancy lead to long-lasting health problems such as diabetes? The report, published today in the journal Environmental Epigenetics, focuses on the pancreatic Islets of Langerhans, which produce insulin and other hormones.

Islets of mice that were overnourished during the first 21 days after birth (their 'infancy') tended to gain DNA methylation tags (chemical modifications that alter gene expression) early in life, while mice that were not overnourished showed similar changes much later in life.

"It's been known for several decades that mice that are overnourished during the suckling period remain overweight and will be prone to disease for their entire lives. Particularly, they have problems regulating their blood sugar levels," said corresponding author Dr. Robert A. Waterland, professor of pediatrics - nutrition at the USDA/ARS Children's Nutrition Research Center at Baylor College of Medicine and Texas Children's Hospital and of molecular and human genetics at Baylor.

Previous studies also have shown that patients with Type-2 diabetes have altered DNA methylation, the addition of methyl chemical groups, in their insulin-producing pancreatic islets. These alterations have been linked to islet malfunction and the onset of diabetes, but how they occur remains a mystery.

Looking to shed light on this important topic, Waterland and his colleagues investigated whether early postnatal overnutrition could alter epigenetic development in pancreatic islets.

Epigenetics refers to molecular mechanisms that determine which genes will be turned on or off in different cell types. Think of one's DNA as the computer hardware, and epigenetics as the software that determines what the computer can do. Epigenetics works by adding or removing chemical tags on genes to mark those that should be used. DNA methylation is one of the better studied tags and plays an important role in development.

The researchers worked with two groups of mice: one was overnourished during infancy, and the other was not, serving as the control group. "Adjusting litter size during the suckling period provides a natural means to overnourish mouse pups," said Waterland, who is a member of the Dan L Duncan Comprehensive Cancer Center at Baylor. "Normal-size litters have about 10 mice and served as our control group. The overnourished group came from moms whose litters were reduced to only four pups each. These pups get an 'all you can eat buffet' and become overweight by the time of weaning."

But weight was not the only difference between the two groups of pups. The researchers applied genome-scale DNA methylation profiling to islets of overnourished and control mice at both 21 days (weaning) and 180 days after birth (considered middle-age for mice).

The results revealed that islets from control mice tended to gain DNA methylation as they aged. Compared to controls, however, islets of overnourished mice showed increases in DNA methylation already at weaning. Unexpectedly, there was substantial overlap between the DNA methylation profile of middle-aged controls and that of the much younger, 21-day-old overnourished mice.
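Conceptually, that overlap analysis boils down to comparing the sets of genomic regions that gained methylation in each group; here is a toy sketch with entirely hypothetical region IDs:

```python
# Toy sketch of the overlap comparison (region IDs are hypothetical, not study data).
gained_in_aged_controls = {"chr1:100", "chr2:250", "chr5:900", "chr7:410", "chr9:120"}
gained_in_young_overfed = {"chr1:100", "chr2:250", "chr5:900", "chr11:330"}

overlap = gained_in_aged_controls & gained_in_young_overfed
print(f"{len(overlap)} of {len(gained_in_young_overfed)} regions that gained methylation "
      f"in young overnourished islets also gained it in middle-aged control islets")
```

The larger that intersection, the more the young overnourished islets resemble islets that are simply old, which is the basis for the accelerated-aging interpretation below.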

"By the age of weaning, islets of overnourished mice show an epigenetic profile resembling that of much older mice," Waterland said. "Our interpretation is that postnatal overnutrition causes accelerated epigenetic aging in the islets. Since the ability to regulate blood sugar declines with age, this premature epigenetic aging may help explain how overnutrition during infancy increases the risk of diabetes later in life."

Diabetes is a serious, pervasive health concern worldwide. According to a 2017 report from the Centers for Disease Control and Prevention, 9.3 percent of the U.S. population - about 30 million people - are afflicted with the condition, which increases the risk of serious health complications including premature death, vision loss, heart disease, stroke, kidney failure and amputation of toes, feet or legs.

"In these days of escalating pediatric overnutrition and obesity, we urgently need to understand the adverse consequences of overnutrition in human infancy. I believe that optimizing nutrition during these critical periods of development will prove to be an effective approach to prevent adult disease," Waterland said.

Credit: 
Baylor College of Medicine

Elderly have poor prognosis after recovery in long-term acute care hospitals

While long-term acute care hospitals (LTACHs) are designed to help patients recover and regain independence, fewer than one in five older adults transferred to such facilities were alive five years later, a prognosis worse than that of terminal illnesses such as advanced cancer, according to research at UC San Francisco and The University of Texas Southwestern Medical Center.

In a study of 14,072 hospital patients admitted to an LTACH, the researchers found that the average patient spent 65.6 percent of his or her remaining life in a hospital or inpatient setting, and more than a third (36.9 percent) died in one, never returning home. Only 16 percent ever enrolled in hospice, for an average of 10 days, a rate far lower than among Medicare beneficiaries not cared for in LTACHs, indicating a potentially missed opportunity to improve care at the end of life.

The researchers said the results shed light on more realistic expectations for patients entering LTACHs and could give patients and their physicians the information they need in choosing care consistent with their values and preferences. Findings appear Aug. 26, 2019, in Journal of the American Geriatrics Society (JAGS).

"Understanding the clinical course after LTACH admission can inform goals of care discussions, planning for care at the end of life and prioritizing health care needs," said lead author Anil Makam, MD, MAS, assistant professor of medicine at UCSF. "It also may lead some patients to shift from intensive life-sustaining and rehabilitative treatment to hospice care, with a focus on managing their symptoms and improving the quality of their remaining life."

LTACHs provide extended, complex, post-acute care to more than 120,000 Medicare beneficiaries annually. They differ from acute-care hospitals and skilled-nursing facilities by focusing on treating patients who require extended inpatient care, typically for three to five weeks after initial hospitalization.

In the JAGS study, Makam and his UT Southwestern colleagues reviewed a portion of national Medicare data from 2009 to 2013 on hospitalized patients at least 65 years old who were transferred to an LTACH to recover. They examined patient mortality, recovery (60 consecutive days without inpatient care), time spent in an inpatient facility after LTACH admission, receipt of an artificial life-prolonging procedure such as a feeding tube or tracheostomy, and palliative care physician consultation.

Of 14,072 qualifying patients, 40 percent were admitted to an LTACH for a respiratory diagnosis. The average survival was 8.3 months, with one- and five-year survival rates of 45 percent and 18 percent, respectively. Almost 53 percent of patients never achieved 60-day recovery, the researchers said.

By comparison, the five-year survival rates for the most common adult cancers are breast, 75 percent; prostate, 69 percent; lung, 13 percent; colorectal, 51 percent; and bladder, 74 percent.

The study found that the youngest group (65-69 years old) and those with a musculoskeletal diagnosis had the most favorable prognosis, with an average survival of 17.3 months among the 65-69 age group and 25.9 months for those with musculoskeletal conditions. Patients 85 years or older had the worst prognosis, with an average survival of 4 months, and spent an average of 97.7 percent of the remainder of their lives as inpatients. Those with a primary respiratory diagnosis were close behind, with an average survival of 5.3 months and 88.8 percent of their remaining lives as inpatients.

During their hospital and subsequent LTACH stays, 30.9 percent of patients received an artificial life-prolonging procedure, with feeding tubes being the most common (22.3 percent). A startlingly low 1 percent received a consultation from a palliative care physician, and just 3.2 percent from a geriatrician, leading the researchers to advocate further study of these patients' unmet palliative care needs.

"As we didn't interview or survey individuals, we don't know if they desired life-sustaining or intensive care, were informed of their prognosis, received palliative care from their primary physician in the LTACH, or the extent of their symptom burden or quality of life," Makam said. "But, in many ways, LTACHs are a more ideal setting for palliative care interventions than acute care hospitals given their much longer length of stay, higher concentration of very ill patients and less focus on diagnostic evaluation."

To prevent unnecessary harm and avoid additional burdensome care in these patients, the researchers also recommend that clinicians strongly consider waiving preventive care other than vaccination, such as treatments for asymptomatic conditions or efforts to modify risk factors aimed at preventing adverse outcomes in the distant future.

Credit: 
University of California - San Francisco