
Newly identified meningeal lymphatic vessels answer key questions about brain clearance

image: The brain's waste is drained by the CSF and exits the brain through the mLVs at the skull base (basal mLVs). The basal mLVs connect to the lymphatic system through a hole in the skull (skull foramen), and they have abundant lymphatic vessel branches with finger-like protrusions. Valves within the vessel structure of the basal meningeal lymphatic vessels allow the lymph to flow in only one direction. In particular, the basal meningeal lymphatic vessels are anatomically located in close proximity to the CSF and have structures favoring the absorption and drainage of the CSF. After draining from the basal meningeal lymphatic vessels, the CSF is cleared outside the central nervous system into the deep cervical lymph nodes. Figure B is a schematic diagram showing that the basal mLVs undergo severe deformation and that their functionality is impaired with age.

Image: 
IBS

Just see what happens when your neighborhood's waste disposal system is out of service. Not only do the piles of trash stink, but they can genuinely hinder the area's normal functioning. The same is true when the brain's waste management is on the blink. The buildup of toxic proteins in the brain causes massive damage to the nerves, leading to cognitive dysfunction and an increased probability of developing neurodegenerative disorders such as Alzheimer's disease. Though the brain drains its waste via the cerebrospinal fluid (CSF), little has been understood about the exact route of the brain's cleansing mechanism.

Scientists led by Dr. Gou Young Koh at the Center for Vascular Research within the Institute for Basic Science (IBS) at the Korea Advanced Institute of Science and Technology (KAIST), Daejeon, South Korea have reported the basal side of the skull as the major route, a so-called "hotspot," for CSF drainage. They found that basal meningeal lymphatic vessels (mLVs) function as the main plumbing pipes for CSF. They confirmed that macromolecules in the CSF mainly run through the basal mLVs. Notably, the team also revealed that the brain's major drainage system, specifically the basal mLVs, is impaired with aging. Their findings were reported in the journal Nature on July 24.

Throughout our body, excess fluids and waste products are removed from tissues via lymphatic vessels. It was only recently discovered that the brain also has a lymphatic drainage system. mLVs are thought to carry waste from the brain tissue fluid and the CSF down to the deep cervical lymph nodes for disposal. Still, scientists are left with one perplexing question -- where is the main exit for the CSF? Though mLVs in the upper part of the skull (dorsal meningeal lymphatic vessels) were reported as the brain's clearance pathways in 2014, no substantial drainage mechanism was observed in that section.

"As a hidden exit for CSF, we looked into the mLVs trapped within complex structures at the base of the skull," says Dr. Ji Hoon Ahn, the first author of this study. The researchers used several techniques to characterize the basal mLVs in detail. They used a genetically engineered lymphatic-reporter mouse model to visualize mLVs under a fluorescence microscope. By carefully examining the mice's skulls, they found distinctive features of basal mLVs that make them suitable for CSF uptake and drainage. Just like typical functional lymphatic vessels, the basal mLVs were found to have abundant lymphatic vessel branches with finger-like protrusions. Additionally, valves inside the basal mLVs allow the flow to go in only one direction. In particular, they found that the basal mLVs are located close to the CSF. Dr. Hyunsoo Cho, the first author of this study, explains, "All up, it seemed a solid case that basal mLVs are the brain's main clearance pathways."

The researchers verified that these specialized morphologic characteristics of the basal mLVs indeed facilitate CSF uptake and drainage. Using CSF contrast-enhanced magnetic resonance imaging in a rat model, they found that CSF is drained preferentially through the basal mLVs. They also utilized a lymphatic-reporter mouse model and discovered that fluorescence-tagged tracer injected into the brain itself or the CSF is cleared mainly through the basal mLVs. Jun-Hee Kim, the first author of this study, notes, "We literally saw the brain clearance mechanism utilizing the basal outflow route to exit the skull."

It has long been suggested that CSF turnover and drainage decline with ageing. However, alterations of mLVs associated with ageing are poorly understood. In this study, the researchers observed changes in mLVs in young (3-month-old) and aged (24- to 27-month-old) mice. They found that the structure of the basal mLVs and their lymphatic valves becomes severely flawed in aged mice, hampering CSF clearance. The corresponding author of this study, Dr. Koh, says, "By characterizing the precise route for fluids leaving the brain, this study improves our understanding of how waste is cleared from the brain. Our findings also provide further insights into the role of impaired CSF clearance in the development of age-related neurodegenerative diseases."

Many current therapies for Alzheimer's disease target abnormally accumulated proteins, such as beta-amyloid. By mapping out a precise route for the brain's waste clearance system, this study may help find ways to improve the brain's cleansing function. Such a breakthrough could become a powerful strategy for eliminating the buildup of aging-related toxic proteins. "It definitely warrants more extensive investigation of mLVs in patients with age-related neurodegenerative disease such as Alzheimer's disease prior to clinical investigation," adds Dr. Koh.

Credit: 
Institute for Basic Science

How random tweaks in timing can lead to new game theory strategies

Most game theory models don't reflect the relentlessly random timing of the real world. In some models, players receive information at the same time and act simultaneously. Others include randomness in when information is shared or actions are taken, but that randomness occurs at discrete steps.

But that's not the way the world works, notes economist Justin Grana at the RAND Corporation in Washington, D.C., a former postdoctoral scholar at the Santa Fe Institute (SFI). Competing companies make decisions based on when they receive information, as well as what that information is. Timing can make or break a decision, and randomness evolves continuously, not step-wise.

"We don't know when things will happen," says Grana. "The environment changes according to some kind of random processes, and we have to act at random times. You might get information, and that information causes you to act - or it might cause delays in what you do."
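The contrast Grana describes can be made concrete with a toy sketch (not the paper's model): discrete-step models space actions at fixed ticks, while a continuous-time setting draws action times from a Poisson process, with exponentially distributed waits between events. The rate parameter below is purely hypothetical.

```python
import random

random.seed(0)

RATE = 2.0  # hypothetical: on average, two actions per unit of time

def poisson_action_times(n, rate=RATE):
    """Sample n action times with exponentially distributed gaps between them."""
    t, times = 0.0, []
    for _ in range(n):
        t += random.expovariate(rate)  # random waiting time until the next action
        times.append(t)
    return times

discrete = list(range(1, 6))          # discrete-step timing: evenly spaced ticks
continuous = poisson_action_times(5)  # continuous-time timing: irregular arrivals
print(discrete)
print([round(t, 2) for t in continuous])
```

Each run of the continuous version produces a different, irregular schedule, which is exactly the kind of timing uncertainty the models in the paper build in.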

In a new paper in the Berkeley Electronic Journal of Theoretical Economics, Grana and his collaborators investigate game theory models that address what happens when players receive information or act at random times, determined as part of a continuous time evolution.

"We wanted to introduce the uncertainty of timing in these scenarios," says Grana. He developed the models with physicist David Wolpert of SFI and economist James Bono.

In the new work, the researchers looked at models of Bertrand competition -- scenarios that predict how consumers will respond when sellers set the price of a good. They wanted to know under what circumstances random time fluctuations could lead to collusion, a specific kind of cooperation in which two parties may share information if both benefit.

Imagine two gas stations, for example, facing each other across the same remote exit of the same lonely interstate. The owners buy gas at the same price. Knowing that customers will choose the cheaper option, one owner may lower her price, prompting her competitor to lower his, and so on until neither station can make excessive profits. In the interest of keeping both businesses alive, the two may instead decide to keep prices high and share the customers equally.
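The undercutting spiral in the gas station story can be sketched as a simple loop. This is purely illustrative, not the continuous-time model in the paper, and the starting prices and wholesale cost (in cents) are made up.

```python
def undercut_until_cost(p_a, p_b, cost=200, step=1):
    """Two sellers alternately undercut each other by `step` cents
    until neither can go lower without selling below cost.
    Returns the final prices and the number of undercutting rounds."""
    rounds = 0
    while min(p_a, p_b) - step >= cost:
        if p_a >= p_b:          # station A undercuts station B
            p_a = p_b - step
        else:                   # station B undercuts station A
            p_b = p_a - step
        rounds += 1
    return p_a, p_b, rounds

# Both stations start at $2.50 per gallon; gas costs them $2.00 wholesale.
print(undercut_until_cost(250, 250))  # → (201, 200, 50)
```

After 50 rounds of penny undercuts, prices sit at the wholesale cost and profit margins have vanished, which is why the owners might prefer the collusive arrangement described above.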

The model could help identify, for example, at what rate the station owners would need to have new information about demand in order to sustain this collusive structure, says Grana. It could predict how fluctuation in that timing could influence the strategic decisions of the players involved.

The model is part of an emerging research interest in a variety of fields -- ranging from economics to engineering to air traffic control -- that focuses on how asynchronous events can influence game theory strategies. Although it's too early to see whether real-world data line up with the predictions of such an abstract model, Grana says this exploratory work suggests that small tweaks in timing can make a big difference in decision-making.

"Those changes are rich enough to show that it's worthwhile to explore loosening our assumptions about timing," he says.

Credit: 
Santa Fe Institute

Found: Fastest eclipsing binary, a valuable target for gravitational wave studies

video: Artist's animation depicting the eclipsing binary ZTF J1539+5027, which is composed of two extremely dense objects (white dwarfs) that orbit each other roughly every seven minutes. One second of time in the animation represents two minutes of real time. The smaller white dwarf is slightly larger than Earth and is the more massive of the two, with about 60% the mass of the sun. Its companion is larger but less massive, with only about 20% the mass of the sun. The orbital separation of these objects is shrinking by about 26 centimeters per day due to the emission of gravitational waves, depicted in green near the end of the movie.

Image: 
Caltech/IPAC

Observations made with a new instrument developed for use at the 2.1-meter (84-inch) telescope at the National Science Foundation’s Kitt Peak National Observatory have led to the discovery of the fastest eclipsing white dwarf binary yet known. Clocking in with an orbital period of only 6.91 minutes, the rapidly orbiting stars are expected to be one of the strongest sources of gravitational waves detectable with LISA, the future space-based gravitational wave detector.

The Dense "Afterlives" of Stars

After expanding into a red giant at the end of its life, a star like the Sun will eventually evolve into a dense white dwarf, an object with a mass like that of the Sun squashed down to a size comparable to Earth. Similarly, as binary stars evolve, they can engulf their companion in the red giant phase and spiral close together, eventually leaving behind a close white dwarf binary. White dwarf binaries with very tight orbits are expected to be strong sources of gravitational wave radiation. Although anticipated to be relatively common, such systems have proven elusive, with only a few identified to date.

Record-setting White Dwarf Binary

A new survey of the night sky, currently underway at Palomar Observatory and Kitt Peak National Observatory, is changing this situation.

Each night, Caltech’s Zwicky Transient Facility (ZTF), a survey that uses the 48-inch telescope at Palomar Observatory, scans the sky for objects that move, blink, or otherwise vary in brightness. Promising candidates are followed up with a new instrument, the Kitt Peak 84-inch Electron Multiplying Demonstrator (KPED), at the Kitt Peak 2.1-meter telescope to identify short period eclipsing binaries. KPED is designed to measure with speed and sensitivity the changing brightness of celestial sources.

This approach has led to the discovery of ZTF J1539+5027 (or J1539 for short), a white dwarf eclipsing binary with the shortest period known to date, a mere 6.91 minutes. The stars orbit so close together that the entire system could fit within the diameter of the planet Saturn.

"As the dimmer star passes in front of the brighter one, it blocks most of the light, resulting in the seven-minute blinking pattern we see in the ZTF data," explains Caltech graduate student Kevin Burdge, lead author of the paper reporting the discovery, which appears in today's issue of the journal Nature.

A Strong Source of Gravitational Waves

Closely orbiting white dwarfs are predicted to spiral together closer and faster, as the system loses energy by emitting gravitational waves. J1539's orbit is so tight that its orbital period is predicted to become measurably shorter after only a few years. Burdge's team was able to confirm the prediction from general relativity of a shrinking orbit, by comparing their new results with archival data acquired over the past ten years.

J1539 is a rare gem. It is one of only a few known sources of gravitational waves--ripples in space and time--that will be detected by the future European space mission LISA (Laser Interferometer Space Antenna), which is expected to launch in 2034. LISA, in which NASA plays a role, will be similar to the National Science Foundation's ground-based LIGO (Laser Interferometer Gravitational-wave Observatory), which made history in 2015 by making the first direct detection of gravitational waves from a pair of colliding black holes. LISA will detect gravitational waves from space at lower frequencies. J1539 is well matched to LISA; the 4.8 mHz gravitational wave frequency of J1539 is close to the peak of LISA's sensitivity.
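As a quick sanity check on the quoted figure: the dominant gravitational-wave frequency of a circular binary is twice the orbital frequency, because the quadrupole emission pattern repeats every half orbit. J1539's 6.91-minute period therefore maps onto LISA's band as follows.

```python
# Back-of-the-envelope check of J1539's gravitational-wave frequency.
P_orb_s = 6.91 * 60.0   # orbital period in seconds
f_orb = 1.0 / P_orb_s   # orbital frequency in Hz
f_gw = 2.0 * f_orb      # dominant gravitational-wave frequency in Hz

print(f"orbital frequency: {f_orb * 1e3:.2f} mHz")             # ≈ 2.41 mHz
print(f"gravitational-wave frequency: {f_gw * 1e3:.2f} mHz")   # ≈ 4.82 mHz
```

The result, about 4.8 mHz, matches the figure in the text and lands near the most sensitive part of LISA's planned frequency range.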

Discoveries Continue for Historic Telescope

Kitt Peak’s 2.1-meter telescope, the second major telescope to be constructed at the site, has been in continuous operation since 1964. Its history includes many important discoveries in astrophysics, such as the Lyman-alpha forest in quasar spectra, the first gravitational lens by a galaxy, the first pulsating white dwarf, and the first comprehensive study of the binary frequency of stars like the Sun. The latest result continues its venerable track record.

Lori Allen, Director of Kitt Peak National Observatory and Acting Director of NOAO says, “We’re thrilled to see that our 2.1-meter telescope, now more than 50 years old, remains a powerful platform for discovery.”

“These wonderful observations are further proof that cutting-edge science can be done on modest-sized telescopes like the 2.1-meter in the modern era,” adds Chris Davis, NSF Program Officer for NOAO.

More Thrills Ahead!

As remarkable as it is, J1539 was discovered with only a small portion of the data expected from ZTF. It was found in the ZTF team's initial analysis of 10 million sources, whereas the project will eventually study more than a billion stars.

"Only months after coming online, ZTF astronomers have detected white dwarfs orbiting each other at a record pace," says NSF Assistant Director for Mathematical and Physical Sciences, Anne Kinney. "It's a discovery that will greatly improve our understanding of these systems, and it's a taste of surprises yet to come."

Credit: 
Association of Universities for Research in Astronomy (AURA)

A new framework to study congenital heart defects

image: Gladstone scientists Deepak Srivastava (left), Yvanka De Soysa (center), and Casey Gifford (right) publish a complete catalog of the cells involved in heart development.

Image: 
Photo: Gladstone Institutes

Each year, the 9 months of dreams and anticipation shared by millions of parents-to-be turn to despair and fright when they learn their child has a birth defect, an often-devastating event affecting one out of 20 children born worldwide. The formation of our organs, limbs, and face is the result of carefully choreographed movement and behavior by millions of cells, much like dancers in a troupe. If even a few cells don't get to the right position and do their job correctly, the end result is a birth defect. Yet how each individual cell knows what to do at precisely the right time and place has largely been a mystery.

In a new study published in the scientific journal Nature, a team of researchers at the Gladstone Institutes, in collaboration with the Luxembourg Centre for Systems Biomedicine (LCSB) of the University of Luxembourg, reveal for the first time the full spectrum of cells that come together to make a heart at the earliest stages of embryo formation. They also uncovered how the cells are controlled, and how a mutation in just one gene can have catastrophic consequences by affecting a tiny group of cells that make up the organ.

Congenital heart defects are the most common and most lethal human birth defect. Thanks to the advent of a powerful new technology known as single-cell RNA sequencing, the researchers were finally able to discern the role of tens of thousands of individual cells during the formation of the heart, which is essential to determine how genetic mutations cause disease.

"With genome sequencing, we can now more easily find genetic variants that we think are contributing to a disease," said Gladstone President and Senior Investigator Deepak Srivastava, MD, who led the study. "The big challenge is figuring out the specific cell type in which this variant is functioning and how those cells are impacted. This has been particularly difficult for birth defects, given that genetic variants affect only a small subset of the cells in the organ. With single-cell technologies, we can finally begin to unravel the mechanisms behind defects for which we know the genetic cause."

The catalog that Srivastava and his team compiled contains all the genes that are active during different stages of heart development and identifies the cells in which they can be found. It represents the first step in making the connection between a genetic variant and a specific cell type.

"This can tell us, among many other things, which subset of cells are performing critical functions in specific regions of the heart and which are contributing to the underlying cause of a disease associated with genetic mutations," explained Yvanka De Soysa, a graduate student in Srivastava's laboratory and first author of the study.

A Rich Source of Data on Heart Development

To complete the repository, the researchers studied nearly 40,000 individual cardiac cells from a mouse model of heart development. The technology that made this study possible is single-cell RNA sequencing. This sophisticated method, which has only been commercially available for the past 3 years in its current form, enabled the scientists to capture data about thousands of individual cells at once.

"This sequencing technique allowed us to see all the different types of cells present at various stages of heart development and helped us identify which genes are activated and suppressed along the way," said Casey A. Gifford, PhD, a staff scientist at Gladstone who is a senior author on the paper. "We were not only able to uncover the existence of unknown cell types, but we also gained a better understanding of the function and behavior of individual cells--information we could never access before."

Once they identified the numerous types of cells involved in heart development, the team wanted to learn how these diverse cell types are generated. To do so, they teamed up with computational biologists at the LCSB who specialize in using data from single-cell RNA sequencing to uncover the molecular drivers of different cell types.

"Our group has a long history of developing computational models to understand cell conversion," explained Antonio Del Sol, head of the Computational Biology group at the LCSB and Ikerbasque Research Professor at the Research Center CIC bioGUNE in Bilbao, Spain. "We have the expertise to study whole networks of genes that control cell identity. When we joined the project, we applied our method to predict--without any kind of prior knowledge--which molecular factors govern the fate of these different cardiac cells."

A Discovery 20 Years in the Making

The computational analysis predicted the genes involved in generating specific cell types in the heart, which shed light on those cells' function. The analysis also pointed to one major player, a gene called Hand2 that can control the activity of thousands of other genes, and which Srivastava discovered and named over two decades ago.

Then, as a young researcher, Srivastava spent years investigating the role of this gene and master regulator. He eventually found that it is one of the most important genes for the formation of the heart. But about 10 years ago, in trying to discover how this gene actually affects heart cells that make up the organ, his work reached an impasse because the scientific tools to pursue the research didn't exist. Today, his efforts have finally been revived thanks to new technology.

By applying single-cell RNA sequencing, he and his collaborators were able to get a much more detailed and complete picture of how the loss of Hand2 causes different cell populations to become dysregulated.

Mice lacking the gene Hand2 fail to form the right ventricle chamber, which pumps blood to the lungs. Surprisingly, the new prediction made by the Luxembourg researchers suggested that Hand2 is not required for instructing cells to become right ventricular cells, but that it is critical in forming the cells of the outflow tract, the structure where the major outgoing blood vessels of the heart arise.

"This didn't make sense based on previous findings," said De Soysa. "However, we found that, in fact, Hand2 has very distinct functions for different cell types."

The computational prediction turned out to be correct. The team discovered that hearts without the Hand2 gene never made cells of the outflow tract, but did make right ventricular cells. In the choreography of the heart, it is not enough for a cell to be made, it must also get to the right place relative to the other "dancers." Without Hand2, right ventricle cells were created but stuck at their origin, failing to move into the developing heart.

"Our collaborative findings made us change the way we think about heart formation, and showed how disruption of cell fate, migration, or survival of a few cells can cause a heart defect," De Soysa added.

A Hopeful Future for the Treatment of Congenital Heart Disease

The study has revealed the mechanisms by which relatively small populations of cells are affected during development and lead to defects in the formation of the heart. It also represents a discovery that could not have been possible without single-cell RNA sequencing technology.

"Single-cell technologies can inform us about how organs form in ways we couldn't understand before and can provide the underlying cause of disease associated with genetic variations," said Gifford. "We revealed subtle differences in very, very small subsets of cells that actually have catastrophic consequences and could easily have been overlooked in the past. This is the first step toward devising new therapies."

Significantly, the new catalog of cardiac cells can now serve scientists and physicians interested in various aspects of heart development. With knowledge of the types of cells involved in normal and abnormal formation of the heart, the scientific community can begin to design strategies to correct genetic variants that cause congenital heart disease.

These findings could also guide therapeutic approaches to help both newborns and the growing adult population with congenital heart disease.

"With surgical interventions, we've become very good at keeping most kids with heart defects alive," said Srivastava, who is also a pediatric cardiologist at UCSF Benioff Children's Hospital and professor of pediatrics at UC San Francisco. "The result is that we have nearly 2.5 million survivors of congenital heart disease in the United States today."

When children with a birth defect are fortunate enough to survive, the same genetic condition that caused the developmental problem can lead to ongoing difficulties with maintaining a healthy heart over a lifetime.

"We're beginning to see the long-term consequences in adults, and right now, we don't really have any way to treat them," Srivastava added. "My hope is that if we can understand the genetic causes and the cell types affected, we could potentially intervene soon after birth to prevent the worsening of their state over time."

For Srivastava, the holy grail would be to get such a clear picture of the mechanisms involved in causing congenital heart defects that they could develop preventive strategies for people who are genetically at risk.

"Folic acid is the best paradigm--expectant mothers now take higher levels of this vitamin and can successfully prevent nearly two thirds of cases of spina bifida," he said. "The ultimate goal is to create similar public health measures that could reduce the overall incidence of birth defects through prevention. But first, we have to know where and how to intervene."

Credit: 
Gladstone Institutes

Volcanoes shaped the climate before humankind

The volcanoes in the tropics went crazy between 1808 and 1835: Not only did Tambora erupt in Indonesia during this short period of time, but there were also four other large eruptions. This unusual series of volcanic eruptions caused long-lasting droughts in Africa and contributed to the last advance of Alpine glaciers during the Little Ice Age.

"Frequent volcanic eruptions caused an actual gear shift in the global climate system," says Stefan Brönnimann, head of the international research team that discovered the effects of the series of eruptions on the oceans and thus on atmospheric circulation. Brönnimann is Professor of Climatology at the University of Bern and a member of the Oeschger Centre for Climate Research. The team's research has been published in the journal Nature Geoscience.

Less rain in Africa and India, more rain and snow in Europe

For their investigations, the researchers analyzed new climate reconstructions that include atmospheric circulation and compared the results to observation-based data. Model simulations finally helped to pin down the role of the oceans in climate change in the early 19th century and showed that they could not recover from the effects of the sequence of eruptions for several decades. The consequences: the persistent weakening of the African and Indian monsoon systems and a shift of atmospheric circulation over the Atlantic-European sector. This led to an increase in low-pressure systems crossing Central Europe.

The last glacier advance in the Alps from the 1820s to the 1850s, depicted in paintings and even old photographs, is a consequence of increased precipitation due to the altered circulation in combination with low temperatures. However, global temperature increased again from the late 19th century onward. The Little Ice Age was eventually superseded by a first phase of global warming, culminating in the 1940s and with a significant manmade contribution.

Important for the definition of "pre-industrial climate"

The new Bern study not only explains the global climate of the early 19th century, but is also relevant for the present. "Given the large climatic changes seen in the early 19th century, it is difficult to define a pre-industrial climate," explains lead author Stefan Brönnimann, "a notion to which all our climate targets refer." And this has consequences for the climate targets set by policymakers, who want to limit the global temperature increase to between 1.5 and 2 degrees Celsius at most. Depending on the reference period, the climate has already warmed much more significantly than assumed in climate discussions. The reason: Today's climate is usually compared with an 1850-1900 reference period to quantify current warming. Seen in this light, the average global temperature has increased by 1 degree. "1850 to 1900 is certainly a good choice, but compared to the first half of the 19th century, when it was significantly cooler due to frequent volcanic eruptions, the temperature increase is already around 1.2 degrees," Stefan Brönnimann points out.

Credit: 
University of Bern

Study reveals top tools for pinpointing genetic drivers of disease

image: Dr. Daniel Cameron and Professor Tony Papenfuss led the study.

Image: 
Walter and Eliza Hall Institute of Medical Research

Published in Nature Communications, the study is the largest of its kind and was led by Walter and Eliza Hall Institute computational biologists Professor Tony Papenfuss, Dr Daniel Cameron and Mr Leon Di Stefano.

The new study identifies the world's top genomic rearrangement detection tools, providing summaries of their performance and recommendations for use. Dr Cameron said the study could ultimately help clinicians determine the best treatments for their patients.

"Basically, you have to understand what is going wrong before you can work out how to fix the problem. In the context of cancer for instance, an understanding of the genetic mutations driving tumour growth could help oncologists determine the most appropriate treatment for their patients," he said.

To determine the best genomic rearrangement detection methods, the researchers comprehensively tested 12 of the most widely used tools to see which ones could accurately identify the differences between a patient's genetic information and the standard human reference genome. The findings revealed that a tool called GRIDSS, developed by Professor Papenfuss and Dr Cameron, was one of the best performing options, most accurately able to detect DNA rearrangements.

Dr Cameron said the study would not have been possible without the Institute's high-performance computing resource.

"Over the course of two years, we tested 12 of the most popular genomic rearrangement detection tools, generating more than 50 terabytes of data, to determine which tools perform well, and when they perform badly. Without these computing resources, we estimate the study would have taken us more than ten years," he said.

The Institute's Theme Leader for Computational Biology Professor Papenfuss said computational methods were required, more than ever before, for making sense of vast and complex datasets being generated from research.

"Computational studies like this one keep the field up to date with best practice approaches for data analysis. This particular study provides a comprehensive resource for users of genomic rearrangement detection methods, as well as developers in the field. It will also help to direct the next iteration of genomic rearrangement tool development at the Institute," he said.

As new experimental techniques and DNA sequencing machines become available, the very nature of the data they generate is changing. Professor Papenfuss said that older analysis tools, while heavily cited and widely used, could lead to erroneous interpretations if used on data produced by the latest DNA sequencing machines. "This is why it is so important for researchers to find the right match between the analysis tool and dataset at hand," he said.

Credit: 
Walter and Eliza Hall Institute

Researchers develop new technology for multiple sclerosis diagnosis and treatment

image: Dinesh Sivakolundu, Dr. Bart Rypma and Dr. Darin T. Okuda.

Image: 
Center for BrainHealth

DALLAS (July 19, 2019) - Researchers at the Center for BrainHealth®, part of The University of Texas at Dallas, in collaboration with a team from UT Southwestern, have developed technology for a novel diagnostic method for multiple sclerosis (MS). The new approach has the potential to determine which damaged regions in an MS patient's brain have the capacity to heal themselves, and which do not.

The researchers examined brain scans from 23 patients, encompassing 109 different lesions. Some lesions showed increased levels of surrounding oxygen, a biomarker known to correlate with the capacity for healing. The researchers then created 3D images of the lesions using a new, patent-pending technology tool, which revealed that the metabolically active lesions are more spherical with a rough surface, whereas the metabolically inactive lesions are irregular in shape and have a smooth surface. The results are published in the Journal of Neuroimaging (May 2019).

"This diagnostic method represents a significant advance in our field, considering the new MS drugs being developed to heal damaged areas of the brain," said Dinesh Sivakolundu, the study's lead author and a doctoral student and teaching assistant working in Dr. Bart Rypma's lab at the Center for BrainHealth. "Using our new technology, we could potentially determine which patients would benefit from such new drugs and which patients would not."

The lab of Dr. Darin T. Okuda, a neurologist and MS expert with UT Southwestern Medical Center, looked at the 3D lesion images and studied lesion phenotypes to uncover the relevant biomarker: a blood oxygen level dependent (BOLD) slope that compares the amount of oxygen available at the injury site to that of its surroundings.
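As a rough illustration of the biomarker described above, a slope comparing signal at the injury site to its surroundings can be estimated with ordinary least squares. This is only a sketch of the general idea; the function name, the millimeter sampling points and the signal values are hypothetical, not the study's actual analysis pipeline.

```python
def bold_slope(distances_mm, bold_signal):
    """Ordinary least-squares slope of BOLD signal against distance
    from the lesion core: a sketch of the kind of perilesional
    gradient the article describes, not the study's method."""
    n = len(distances_mm)
    mx = sum(distances_mm) / n
    my = sum(bold_signal) / n
    num = sum((x - mx) * (y - my) for x, y in zip(distances_mm, bold_signal))
    den = sum((x - mx) ** 2 for x in distances_mm)
    return num / den

# Hypothetical readings: signal rising away from a metabolically
# inactive core gives a positive slope.
print(round(bold_slope([0, 1, 2, 3], [0.8, 0.9, 1.0, 1.1]), 3))  # -> 0.1
```

A positive slope here would indicate more oxygen available in the surroundings than at the injury site itself.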

As a result of the collaboration, the researchers were able to study MS physiology within and around lesions and its relationship to lesion morphology.

"Our new technology has the potential to be a game-changer in the treatment of MS by helping doctors be more precise in their treatment plans," added Sivakolundu.

Credit: 
Center for BrainHealth

Dangers of the blame game

When David Dao was forcibly removed from a United Airlines flight in 2017 after declining to give up his seat for United employees, there was immediate public outrage against the airline. But quickly after the event, news spread that Dao had used his medical license to trade prescription drugs for sex. Online reports implied that Dao, rather than United Airlines, was to blame because he was viewed as a bad person.

This tendency to shift blame to victims based on assumptions about their character can have serious consequences in today's free market, where consumers have more power than ever to hold companies accountable for faulty products or services. In a new study, researchers discovered which factors lead people to assign blame to a victim and the repercussions of this judgement. The study, which was led by marketing assistant professor Brandon Reich of Portland State University, was recently published online in the Journal of Consumer Psychology.

In one experiment, researchers summarized the United Airlines incident for participants. One group heard that Dao had traded prescription drugs for sex while the second group did not learn about this detail. Then the participants were told that a lawsuit against the airline required a certain number of signatures to proceed, and they were invited to sign the petition. The results showed that participants who were informed about Dao's past moral transgressions were significantly less likely to sign the petition against United Airlines: 65 percent signed, compared with 84 percent of those who had not heard about the transgression.
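To gauge the size of that gap, a standard two-proportion z-test can be sketched as follows. The 84 and 65 percent signing rates come from the article, but the group sizes (n = 100 each) are hypothetical, since the article does not report them, so the resulting z-score is illustrative only.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-test for the difference in signing rates,
    using the pooled-proportion standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Signing rates from the study; the group sizes (100 per condition)
# are assumptions, not figures reported in the article.
z = two_proportion_z(0.84, 100, 0.65, 100)
print(round(z, 2))  # -> 3.08, well past the conventional 1.96 cutoff
```

With samples of this assumed size, a difference of that magnitude would be statistically significant at conventional thresholds.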

"The evidence suggests that introducing the morality of the victim, which is irrelevant to the harm they suffered from a faulty product or service, leads to changes in our thinking about who is to blame and who should be held responsible," says Troy Campbell, PhD, an assistant professor of marketing at the University of Oregon who was involved in the study. "Individuals, groups and society can be harmed by irrational victim blaming, so we need to more fully understand this psychological tendency."

In a second experiment, participants read about a scenario in which a bank employee noticed that he had an extra $200 in his register due to another employee's error. In one group, participants learned that the bank employee kept the money for himself, while the second group learned that the employee told his manager about the discrepancy. The next day, the bank employee's hot coffee spilled all over his lap due to faulty threading in his travel mug. Then the participants answered questions about the scenario, and they were more likely to blame the employee for the spilled coffee when he had kept the $200 for himself. They were also less likely to write a negative review about the travel mug brand or verbally discourage others from using the brand.

Campbell hopes these findings will increase awareness among consumers as they decide when to take action to protect the public. Consumers may have fodder to blame victims of food poisoning, malfunctioning cars, faulty technology products or many other product or service failures. Moral judgements about the victims of sexual or racial harassment can also impact public responses to incidents.

"The more we blame a victim, the less likely we are to take action," says Campbell. "If people are aware of how irrelevant information can influence their behavior, my hope is that they will be empowered to make rational decisions that enable the free market to work more effectively."

Credit: 
Society for Consumer Psychology

Lobster organs and reflexes damaged by marine seismic surveys

image: This is a seismic air gun test during experiments in Storm Bay, Tasmania.

Image: 
Rob McCauley

A new study of the impact on marine life of seismic air guns, used in geological surveys of the seafloor, has found that the sensory organs and righting reflexes of rock lobster can be damaged by exposure to air gun signals.

Published in the journal Proceedings of the Royal Society B, the research by scientists from IMAS and the Centre for Marine Science and Technology at Curtin University is the latest in a series of studies they have conducted into how seismic surveys affect marine animals.

The study was funded by the Australian Government through the Fisheries Research and Development Corporation (FRDC), Origin Energy, and the Victorian Government's CarbonNet Project.

Lead author Dr Ryan Day said researchers exposed rock lobster to seismic air gun noise during field tests in Tasmania's Storm Bay and examined the effects on a key sensory organ, the statocyst, and the lobsters' reflexes.

"While the impact of air guns on whales and fishes has been relatively well-studied, the effects on marine invertebrates such as lobsters, crabs and squid remain poorly understood," Dr Day said.

"We chose to study the impact on rock lobster because they are a high value fishery and an important part of global marine ecosystems.

"Previous studies have shown that the statocyst, a sensory organ on a lobster's head, is critical in controlling their righting reflex, enabling them to remain coordinated and evade predators.

"After exposing lobsters to the equivalent of a commercial air gun signal at a range of 100-150 metres, our study found that the animals suffered significant and lasting damage to their statocyst and righting reflexes.

"The damage was incurred at the time of exposure and persisted for at least one year - surprisingly, even after the exposed lobsters moulted," Dr Day said.

The study's Principal Investigator, Associate Professor Jayson Semmens, said that while the ecological impacts of the damage were not evaluated, the impairment would likely affect a lobster's ability to function in the wild.

"This study adds to a growing body of research that shows marine invertebrates can suffer physiological impacts and changes to their reflexes in response to anthropogenic noise such as seismic surveys," Associate Professor Semmens said.

"In recent years our research team has also looked at the impact of seismic surveys on lobster embryos, scallops and zooplankton.

"Such studies are important to enable government, industry and the community to make informed decisions about how such activities can best be conducted while minimising negative outcomes for fisheries and ecosystems globally," he said.

Credit: 
University of Tasmania

Is your favorite brand authentic?

In the modern media world, consumers are constantly bombarded with advertisements claiming that products are "luxury," "European," created with "old-world traditions and craftsmanship" and more, but how do people know if these descriptions are true? The name Haagen-Dazs evokes a premium, imported brand image, but the company's original brand name was Senator Frozen Foods.

Being authentic is trendy today, and researchers have discovered one of the critical factors that influences consumers to believe a brand is in fact authentic. The investigators found that information about a founder's motivation for creating a company has a powerful effect on whether consumers deem a brand authentic, which in turn influences judgments about the quality of the product. The findings were recently published online in the Journal of Consumer Psychology.

In one experiment, participants reviewed product information about Sweet Things granola, and half of the group read that the brand was founded by a young woman named Kelly who was known among her friends for her homemade granola. She decided to make a living selling the product she loved making, which the researchers labeled as the "intrinsic motivation" condition. The other group read that the brand was created as an extension of a larger company that already made gourmet snack foods. The company wanted to expand its market, which the researchers labeled "extrinsic motivation." The study results showed that participants in the intrinsic motivation condition believed the granola brand was more authentic.

In a follow-up experiment, participants saw a list of the ingredients in Sweet Things granola, including rolled oats, nuts, chocolate and dried fruit, as well as possible uses, such as breakfast or a snack while hiking. Then they rated the expected quality of the product. Next, participants read about the origins of the company, and one group read the intrinsic motivation story while the other group read the extrinsic motivation story. Then they re-rated the quality of the granola.

The results showed that participants who had read the story about the founder named Kelly, who had demonstrated intrinsic motivation, re-rated the quality of the product significantly higher than those who had read the extrinsic motivation story.

The researchers discovered that even for products that are generally disliked, the same trend applied. They shared different stories about the origins of a cigarette brand, and the group in the intrinsic motivation condition read that the owner was a nightclub manager who had been hand-rolling cigarettes using unique tobacco blends for himself and decided to start a business based on this hobby. The extrinsic motivation participants read that the nightclub manager noticed that blended cigarettes were becoming more popular and he capitalized on this market knowledge by starting a business. As expected, the people in the intrinsic motivation condition rated the brand as more authentic and higher in quality than those in the extrinsic motivation condition.

"The findings suggest that people draw many inferences about a company based on their beliefs about its authenticity," says study author Melissa Cinelli, an associate professor of marketing at the University of Mississippi. "A company's storytelling strategy can shift opinions."

Consumers' tendency to make assumptions about authenticity can also serve as a word of caution to marketers, she says. If consumers, for example, believe that a granola brand is authentically all-natural, but they notice high-fructose corn syrup as an ingredient, they could feel betrayed if they considered this ingredient to be artificial. This could prompt consumers to discourage others from using the brand.

Now the researchers hope to investigate how perceptions about brand authenticity influence buying decisions. "If inferences about authenticity lead to judgments that products are higher quality, then people may be more willing to buy products and also pay more for them," says Cinelli.

Credit: 
Society for Consumer Psychology

Penn engineers' 'LADL' uses light to serve up on-demand genome folding

image: A modification of CRISPR/Cas9 allowed researchers to home in on the desired sequences of DNA on either end of the loop they wanted to form. If those sequences could be engineered to seek one another out and snap together under the other necessary conditions, the loop could be formed on demand.

Image: 
University of Pennsylvania

Every cell in your body has a copy of your genome, tightly coiled and packed into its nucleus. Since every copy is effectively identical, the difference between cell types and their biological functions comes down to which, how and when the individual genes in the genome are expressed, or translated into proteins.

Scientists are increasingly understanding the role that genome folding plays in this process. The way in which that linear sequence of genes is packed into the nucleus determines which genes come into physical contact with each other, which in turn influences gene expression.

Jennifer Phillips-Cremins, assistant professor in Penn Engineering's Department of Bioengineering, is a pioneer in this field, known as "3-D Epigenetics." She and her colleagues have now demonstrated a new technique for quickly creating specific folding patterns on demand, using light as a trigger.

The technique, known as LADL or light-activated dynamic looping, combines aspects of two other powerful biotechnological tools: CRISPR/Cas9 and optogenetics. By using the former to target the ends of a specific genome fold, or loop, and then using the latter to snap the ends together like a magnet, the researchers can temporarily create loops between exact genomic segments in a matter of hours.

The ability to make these genome folds, and undo them, on such a short timeframe makes LADL a promising tool for studying 3D-epigenetic mechanisms in more detail. With previous research from the Phillips-Cremins lab implicating these mechanisms in a variety of neurodevelopmental diseases, they hope LADL will eventually play a role in future studies, or even treatments.

Alongside Phillips-Cremins, lab members Ji Hun Kim and Mayuri Rege led the study, and Jacqueline Valeri, Aryeh Metzger, Katelyn R. Titus, Thomas G. Gilgenast, Wanfeng Gong and Jonathan A. Beagan contributed to it. They collaborated with associate professor of Bioengineering Arjun Raj and Margaret C. Dunagin, a member of his lab.

The study was published in the journal Nature Methods.

"In recent years," Phillips-Cremins says, "scientists in our fields have overcome technical and experimental challenges in order to create ultra-high resolution maps of how the DNA folds into intricate 3D patterns within the nucleus. Although we are now capable of visualizing the topological structures, such as loops, there is a critical gap in knowledge in how genome structure configurations contribute to genome function."

In order to conduct experiments on these relationships, researchers studying these 3D patterns were in need of tools that could manipulate specific loops on command. Beyond the intrinsic physical challenges -- putting two distant parts of the linear genome in physical contact is quite literally like threading a needle with a thread that is only a few atoms thick -- such a technique would need to be rapid and reversible, and to work on the target regions with a minimum of disturbance to neighboring sequences.

The advent of CRISPR/Cas9 solved the targeting problem. A modification of the gene editing tool allowed researchers to home in on the desired sequences of DNA on either end of the loop they wanted to form. If those sequences could be engineered to seek one another out and snap together under the other necessary conditions, the loop could be formed on demand.

Cremins Lab members then sought out biological mechanisms that could bind the ends of the loops together, and found an ideal one in the toolkit of optogenetics. The proteins CIB1 and CRY2, found in Arabidopsis, a flowering plant that's a common model organism for geneticists, are known to bind together when exposed to blue light.

"Once we turn the light on, these mechanisms begin working in a matter of milliseconds and make loops within four hours," says Rege. "And when we turn the light off, the proteins disassociate, meaning that we expect the loop to fall apart."
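The timescales Rege describes can be pictured with a toy first-order kinetic model: the looped fraction rises toward 1 while the blue light is on and decays back once it is off. The one-hour time constant, and the model itself, are illustrative assumptions rather than measurements from the study.

```python
import math

def looped_fraction(t_hours, tau_hours=1.0, light_on=True):
    """Toy first-order kinetics for light-induced looping.

    With light on, the looped fraction approaches 1 exponentially, so
    most loops have formed within the ~4 hours the article cites; with
    light off, existing loops decay. tau_hours is an assumed time
    constant, not a value reported by the researchers."""
    if light_on:
        return 1.0 - math.exp(-t_hours / tau_hours)
    return math.exp(-t_hours / tau_hours)

print(round(looped_fraction(4.0), 2))                  # -> 0.98 (lit, looped)
print(round(looped_fraction(4.0, light_on=False), 2))  # -> 0.02 (dark, released)
```

The point of the sketch is the reversibility: the same exponential governs formation under illumination and dissociation in the dark.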

"There are tens of thousands of DNA loops formed in a cell," Kim says. "Some are formed slowly, but many are fast, occurring within the span of a second. If we want to study those faster looping mechanisms, we need tools that can act on comparable time scales."

Fast-acting folding mechanisms also have an advantage in that they lead to fewer perturbations of the surrounding genome, reducing the potential for unintended effects that would add noise to an experiment's results.

The researchers tested LADL's ability to create the desired loops using their high-definition 3D genome mapping techniques. With the help of Arjun Raj, an expert in measuring the activity of transcriptional RNA sequences, they also were able to demonstrate that the newly created loops were impacting gene expression.

The promise of the field of 3D-epigenetics is in investigating the relationships between these long-range loops and mechanisms that determine the timing and quantity of the proteins they code for. Being able to engineer those loops means researchers will be able to mimic those mechanisms in experimental conditions, making LADL a critical tool for studying the role of genome folding in a variety of diseases and disorders.

"It is critical to understand the genome structure-function relationship on short timescales because the spatiotemporal regulation of gene expression is essential to faithful human development and because genes are often mis-expressed in human disease," Phillips-Cremins says. "The engineering of genome topology with light opens up new possibilities to understanding the cause-and-effect of this relationship. Moreover we anticipate that, over the long term, the use of light will allow us to target specific human tissues and even to control looping in specific neuron subtypes in the brain."

Credit: 
University of Pennsylvania

Study highlights the benefits of a US salt reduction strategy to US food industry

New research, published in The Milbank Quarterly, highlights the potential health and economic impact of the United States (US) Food and Drug Administration's (FDA) proposed voluntary salt policy on workers in the US food industry.

Excess salt consumption is associated with a higher risk of cardiovascular disease (CVD). Globally, more than 1.5 million CVD-related deaths every year can be attributed to excess dietary salt intake.

The World Health Organization has recommended sodium reduction as a "best buy" to prevent cardiovascular disease. In 2016, the US Food and Drug Administration (FDA) set voluntary targets for the food industry to reduce sodium in processed foods.

In a previous study, researchers from the University of Liverpool, Imperial College London and the Friedman School of Nutrition Science and Policy at Tufts, with collaborators in the Food-PRICE project, found that for the whole US population the optimal reformulation scenario (100% compliance with the 10-year FDA targets) could prevent approximately 450,000 CVD cases, gain 2 million quality-adjusted life years (QALYs) and produce discounted cost savings of approximately $40 billion over a 20-year period.

Despite this, Congress has temporarily blocked the FDA from implementing voluntary industry targets for sodium reduction in processed foods, the implementation of which could cost the industry around $16 billion over 10 years.

This new study examined the impact of the policy on the food industry itself to determine the cost effectiveness of meeting these draft sodium targets.

The team modelled the health and economic impact of meeting the two-year and 10-year FDA targets, from the perspective of people working in the food system itself, over 20 years, from 2017 to 2036.

They found that the benefits of implementing the FDA voluntary sodium targets extend to food companies and food system workers, and that the value of CVD-related health gains and cost savings together is greater than the government and industry costs of reformulation.

The researchers found that achieving long-term sodium reduction targets could produce 20-year health gains of approximately 180,000 QALYs and produce health cost savings of approximately $5.2 billion.
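A back-of-envelope check, using the figures quoted in this article plus an assumed valuation of $100,000 per QALY (a common benchmark; the article does not state which value the authors used), shows why the benefits can exceed the roughly $16 billion reformulation cost mentioned earlier:

```python
# Rough benefit-vs-cost arithmetic. The QALY gain, health cost savings
# and industry cost are taken from the article; the dollar value placed
# on a QALY is an assumption for illustration.
qaly_gain = 180_000            # 20-year QALYs from the long-term targets
value_per_qaly = 100_000       # assumed willingness-to-pay, USD
health_cost_savings = 5.2e9    # USD, from the article
industry_cost = 16e9           # 10-year reformulation cost, USD

total_benefit = qaly_gain * value_per_qaly + health_cost_savings
print(total_benefit > industry_cost)  # -> True: benefits exceed costs
```

Under these assumptions the monetized health gains alone ($18 billion) already exceed the reformulation cost, before adding the direct health cost savings.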

Because many health benefits may occur in individuals over age 65 years or the uninsured, these health savings would be shared among individuals, industry, and government.

Brendan Collins, Public Health Economist (who co-led the study with Dr Chris Kypridemos), University of Liverpool, said: "Excess dietary salt kills people. Salt reduction has therefore been recommended by the World Health Organisation as a "best buy". Around three quarters of salt is hidden in packaged foods before we buy them. That makes it very hard for people to cut their intake."

"We have shown that reducing the salt hidden in processed foods would have huge economic benefits because fewer people get high blood pressure which leads to strokes and heart attacks. That remains true even when just looking at people working in the food industry itself, because they cost their company less in healthcare, and they can continue working and looking after their families for longer."

Professor Simon Capewell, co-author, University of Liverpool, said: "The message for our own UK Government is also clear. A laissez faire approach will kill or maim thousands of people. We therefore welcome the new Prevention Green Paper focus on salt. Reactivating the previously successful FSA approach would prevent thousands of deaths, and powerfully assist the NHS, UK employers and the UK economy".

Credit: 
University of Liverpool

Pottery related to an unknown culture found in Ecuador

image: This is a shard of an ancient ceramic vessel from the insufficiently studied San Pedro complex, found at the Real Alto site in Ecuador.

Image: 
FEFU press office

Archaeologists from Far Eastern Federal University (FEFU), the Institute of Archeology and Ethnography SB RAS (Russia), Escuela Superior Politécnica del Litoral (ESPOL) (Ecuador) and Tohoku University (Japan) have found shards of ceramic vessels in the cultural deposits of the early periods of the Real Alto site. The finds date back to 4640-4460 BC, a period that borders on that of Valdivia, one of the oldest pottery-bearing cultures in North and South America. A related article is published in Antiquity.

During excavations at the Real Alto site (Ecuador), the Russian scientists found fragments of ceramic vessels at depths of 75 cm to 1 meter. They belong to the insufficiently studied San Pedro complex. Radiocarbon analysis by mass spectrometry showed that the pottery dates back to 4640-4460 BC, a period that borders on, or coincides with, the first stages of the Valdivia culture, to which the world-famous ceramic figurines, a kind of symbol of Ecuador, belong. At the same time, the San Pedro fragments differ from Valdivian pottery in their decorative composition and the way it was applied.

The San Pedro shards correspond to fragments retrieved from Real Alto and other excavation sites in the 1970s and 1980s but never attributed to a particular culture. The researchers thus gained additional evidence for a new archaeological culture of the Formative period, one that existed and developed simultaneously with Valdivia on the Pacific coast of Ecuador.

'The mass emergence of pottery was a kind of technical breakthrough associated with many aspects of human life and the level of economic development in different parts of the globe. Ceramic vessels belonging to different cultures that developed simultaneously confirm that our ancestors evolved in terms of cultural diversity. It is curious that, despite the different vectors of human development, in a technological sense we were moving in the same direction,' said Alexander Popov, head of the Russian archaeological expedition to Ecuador and director of the Educational and Scientific Museum of FEFU's School of Arts and Humanities.

According to the scientist, at the next stage of excavation the research team will look for additional artifacts of the new culture. Such finds could help determine more precisely the conditions under which the culture developed.

The researchers believe that pottery fragments from an even more archaic time, and thus an even older cultural layer, may be found in Ecuador. That would help establish whether pottery was invented in South America at the same time as in other cultures around the globe, or whether it was imported. This information will help researchers understand the processes of parallel development of peoples on different sides of the Pacific Ocean and, more generally, the multi-vector development of human communities.

FEFU researchers are searching for common patterns and local variations in the development of human civilization on opposite sides of the Pacific Ocean -- in South America and East Asia. The scientists compare how ancient humans adapted to environmental changes that influenced the economic, domestic and other aspects of their lives.

Previously, FEFU archaeologists in Ecuador found ancient human remains dating back 6,000 to 10,000 years. The excavations were carried out in Atahualpa canton; the finds belong to the Las Vegas archaeological culture of the Stone Age.

Credit: 
Far Eastern Federal University

Quenching scientific curiosity with single-molecule imaging

video: Quenching scientific curiosity with single-molecule imaging.

Image: 
KAUST 2019

A single-molecule imaging technique, called protein-induced fluorescence enhancement (PIFE), has gained traction in recent years as a popular tool for observing DNA-protein interactions with nanometer precision. Yet, according to a new KAUST study, research laboratories have not been using the technique to its fullest potential.

The PIFE assay is predicated on the idea that DNA tagged with a fluorescent dye will glow brighter when proteins are bound in the vicinity. In many instances, this is true--which has led many scientists to adopt PIFE over other more labor-intensive techniques that rely on dual labeling of proteins and DNA.

But Samir Hamdan's graduate students Fahad Rashid, Manal Zaher and Vlad-Stefan Raducanu realized that protein binding to DNA-dye complexes could sometimes have the opposite effect as well. Instead of enhancing the fluorescent signal, protein interactions can sometimes dampen the glow, depending on certain properties of the system.

Hamdan credits the curiosity of his students for making this observation and detailing how it works. Inspiration from Rashid's previous work led the team to the phenomenon they call protein-induced fluorescence quenching (PIFQ). And as Rashid explains, "We set out to better define the conditions that lead to fluorescent booms or busts."

Through a combination of experimental and computational analyses, the KAUST team showed that the initial fluorescence state of the DNA-dye complex determines whether PIFE or PIFQ will result after protein binding. Without this knowledge, the likelihood of either event becomes equivalent to a coin toss, which can jeopardize the mechanistic interpretation of laboratory results.

"When insight into this initial state is gleaned from fluorescence and structural work, the anticipation of either effect becomes experimentally feasible," Raducanu explains.

Factors such as DNA sequence and dye position could tip the balance toward PIFE or PIFQ; the KAUST team got so good at interpreting the molecular code that they could accurately predict which would happen simply by measuring how these parameters influence the initial fluorescence state of the DNA-dye system.
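The team's central finding can be caricatured as a one-line decision rule: the initial fluorescence state of the DNA-dye complex determines the direction of the modulation. The 0-to-1 brightness scale and the 0.5 threshold below are purely illustrative assumptions, not parameters from the paper.

```python
def predict_modulation(initial_brightness, threshold=0.5):
    """Toy rule: a dim initial DNA-dye state leaves headroom for
    enhancement on protein binding (PIFE); a bright initial state
    tends instead to be quenched (PIFQ). Scale and threshold are
    illustrative, not values from the KAUST study."""
    return "PIFE" if initial_brightness < threshold else "PIFQ"

print(predict_modulation(0.2))  # dim starting state -> PIFE
print(predict_modulation(0.9))  # bright starting state -> PIFQ
```

In practice the initial state would itself be inferred from fluorescence and structural measurements of the DNA-dye system, as the researchers describe.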

"We turned every measurement into a game," Zaher says, "and we are happy to say that our hypothesis predicted the outcome more than 90 percent of the time!"

These novel insights should dramatically expand the reach and experimental promise of this powerful single-molecule imaging tool, predicts Raducanu. "By introducing PIFQ, we offer researchers in the field the possibility to address several biological questions where PIFE might not have been witnessed," he says.

Scientists may also opt to combine PIFE and PIFQ to decipher multistep and multiprotein processes with just a single DNA-dye construct.

"Taking into consideration the context-dependent nature of fluorescence modulation in the DNA-dye system opens the door to many possibilities in experimental design that could be tailored to researchers' needs," Zaher says.

"We now anticipate that interpretation of data and attribution of molecular events from single-molecule data will become easier and more precise," Rashid adds.

Credit: 
King Abdullah University of Science & Technology (KAUST)

Toward a CERN next generation circular collider

Back in January, CERN released a conceptual report outlining preliminary designs for a Future Circular Collider (FCC) which, if built, would be the most powerful particle collider in the world. Earlier this month, attendees of the 2019 FCC week in Brussels got a first look at what this could entail with the release of the four-volume FCC study conceptual design report (CDR).

In a special ceremony on the first day of the conference, Christian Caron (Executive Editor for the European Physical Journal (EPJ) at Springer Nature) handed over the four volumes to Fabiola Gianotti (CERN's Director General), Frédérick Bordry (CERN's Director of Accelerators & Technologies) and Michael Benedikt (FCC study leader).

Commenting on the publication of the report, FCC study leader Michael Benedikt remarked:

"The FCC design report is the outcome of the common effort of more than 1350 contributors from 34 countries including academic and industrial partners. I would like to thank each and every participant for helping to develop a global vision and preparing the construction of this unique future accelerator facility, which will serve the worldwide high-energy physics community throughout the 21st century. Together, we will continue reviewing the experimental challenges and exploiting opportunities for technological breakthroughs towards the realization of these machines."

Speaking after the ceremony, Christian Caron further commented:

"The development of a new research facility for particle physics that could push the energy and precision/intensity frontiers and unearth some of the mysteries of nature is incredibly exciting. As one of the leading research publishers, the team here at Springer Nature is incredibly proud to have partnered with CERN on the publication and publicizing of landmark projects and achievements at the frontier of high-energy physics and instrumentation. We are looking forward to a close collaboration in the forthcoming years as preparatory work picks up further pace."

The four volumes of the FCC CDR demonstrate the technical feasibility and identify the physics opportunities offered by the different collider options that lie at the core of the FCC study. Moreover, they point to key areas for future technological R&D (research and development) that would guarantee the efficient realization and operation of such a new research infrastructure for fundamental physics. The combination of a high-luminosity lepton collider (FCC-ee) as a first step, followed by an energy-frontier proton collider (FCC-hh), will offer unprecedented precision in studying nature at its most fundamental scales through a diverse physics programme spanning 70 years.

Last but not least, the design report also addresses the significant socio-economic impact of investment in such large-scale scientific tools, and how their international and innovative nature can act as a vital motor for economic and societal development on a broader scale.

The FCC design report informs the ongoing update of the European Strategy for Particle Physics and helps shape a global vision for high-energy physics research beyond the LHC. Today, the FCC is a worldwide collaboration of more than 150 universities, research institutes and industrial partners actively dedicated to the future of high-performance particle colliders and pursuing R&D on innovative technologies that could boost the efficiency of future particle accelerators. The broad range of scientific and technological disciplines, as well as the cultural diversity, that some 1350 contributors brought to the FCC design report considerably strengthens the collaborative spirit and integrative approach of the project, allowing in particular for unique perspectives that accelerated the progress of the study.

The first volume covering the "Physics Opportunities" was published in EPJC while the three volumes on "The Lepton Machine - FCC-ee", "The Hadron Machine - FCC-hh" and the "High-Energy LHC - HELHC" were published in EPJ Special Topics.

Springer Nature is a leading research, educational and professional publisher, providing quality content to our communities through a range of innovative platforms, products and services. Every day, around the globe, our imprints, books, journals and resources reach millions of people - helping researchers, students, teachers and professionals to discover, learn and achieve more. Through our family of brands, we aim to serve and support the research, education and professional communities by putting them at the heart of all we do, delivering the highest possible standards in content and technology, and helping shape the future of publishing for their benefit and for society overall. Visit: springernature.com/group and follow @SpringerNature.

CERN is the European Organization for Nuclear Research, known as CERN. Established in 1954, it is one of the world's largest and most respected centers for scientific research. The organization is based in a northwest suburb of Geneva on the Franco-Swiss border and has 23 member states. https://home.cern/

Credit: 
Springer