
Can pomegranate juice protect the infant brain?

Boston, MA -- When it comes to protecting the newborn brain, taking steps to mitigate risk before birth may be critical. Some newborns, such as those with intrauterine growth restriction (IUGR), are at heightened risk, and being able to intervene before birth may prevent the often-devastating effects of brain injury. In ongoing investigations, clinical researchers from Brigham and Women's Hospital are exploring whether pomegranate juice intake during pregnancy can have a protective effect. In a paper appearing in PLOS One, the team presents its preliminary findings from a clinical trial of expectant mothers whose babies were diagnosed with IUGR. The exploratory study, supported by National Institutes of Health grants, The Foundation for Barnes-Jewish Hospital and an unrestricted gift from POM Wonderful, shows promise, with evidence of better brain development and brain connectivity in infants born to mothers who consumed pomegranate juice daily. A second, larger clinical trial is currently underway at the Brigham to validate these findings.

"Our study provides preliminary evidence suggesting potential protective effects for newborns exposed to pomegranate juice while in utero," said senior author Terrie Inder, MBChB, chair of the Department of Pediatric Newborn Medicine at the Brigham. "These findings warrant continued investigation into the potential neuroprotective effects of polyphenols in at-risk newborns, such as those with hypoxic-ischemic injury."

In cases of IUGR, a baby in the womb is measuring small for its gestational age, often because of issues with the placenta, which brings oxygen and nutrients to the growing fetus. One out of every 10 babies is considered to have IUGR. The process of birth itself can further decrease blood flow or oxygen to the baby, including to the baby's brain. If this is very severe, it can result in a condition known as hypoxic-ischemic injury, which contributes to almost one-quarter of newborn deaths worldwide.

Polyphenols, which include tannic acid and ellagitannins, are part of a class of antioxidants found in many foods and beverages, including nuts, berries, red wine and teas. Pomegranate juice is a particularly rich source of these molecules. Polyphenols are known to cross the blood-brain barrier, and studies in animal models have demonstrated protective effects against neurodegenerative diseases. To date, no clinical studies had evaluated the potential effects of giving pregnant women pomegranate juice to protect the brains of at-risk newborns.

The current randomized, controlled, double-blinded study enrolled 78 mothers from the Barnes-Jewish Hospital obstetric clinic in St. Louis whose babies had been diagnosed with IUGR at 24-34 weeks' gestation. Women were randomized to receive 8 ounces of pomegranate juice daily or a taste- and calorie-matched placebo that was polyphenol-free, and drank the juice daily from enrollment until delivery. The team measured several aspects of brain development and injury, including infant brain macrostructure, microstructural organization and functional connectivity.

While the team did not observe differences in brain macrostructure, they did find regional differences in white matter microstructure and functional connectivity.

"These measures tell us about how the brain is developing functionally," said Inder. "We saw no difference in brain growth and baby growth, but we did see improvement in cabling network and brain development measured by synchronous blood flow and visual development of the brain."

The authors note that the findings warrant the need for a larger, rigorously designed clinical trial to allow continued investigation into the potential neuroprotective effects of polyphenols. Such a study is now underway at the Brigham.

"We plan to continue investigating these exciting findings," said Inder. "While the preliminary evidence shows promise, additional study and replication is needed."

Credit: 
Brigham and Women's Hospital

Highest-resolution human brain 'parts list' to date lays road map to better treatments

image: Allen Institute Scientist Rebecca Hodge examines a slice of human brain. This is one of several brain samples used in a comparison study of human and mouse brain cell types.

Image: 
Allen Institute

A new study from the Allen Institute for Brain Science has written the most detailed "parts list" of the human brain to date. This categorization of our brain cell types lays the groundwork to improve our understanding of our own brains and to dramatically change how we treat human brain diseases and disorders.

The study, published today in the journal Nature, captures for the first time an ultra-high-resolution comparison between human and mouse brain cell types, showing that the majority of our brain cells have a counterpart in the mouse brain, counterparts that have been maintained over approximately 75 million years of evolutionary distance. The researchers studied the genes switched on in one region of the human brain, cell by single cell, to home in on our similarities and our differences as compared to mice.

"Just as we can use our genes to build our family trees or find long-lost relatives with services like ancestry.com or 23andme, we're letting the genes tell us the story of our brains and their evolution," said Ed Lein, Ph.D., Investigator at the Allen Institute for Brain Science, a division of the Allen Institute, and senior author on the study. "And in the same way that police were able to use information in those genetic databases to track down the Golden State Killer, this new high-resolution view of our brains provides a baseline to find the cells that go wrong in disease."

The new comparison between mouse and human brain cell types opens doors for scientists to extrapolate knowledge gained over decades of rodent neuroscience research to our own brains. The cell type alignment also highlights key differences between us and the rodent that could explain why so many psychiatric drugs developed in mouse studies don't work in humans.

"The parts list of the human brain is key. That list allows us to understand what's different between mice and human beings and what's similar. The ultimate impact of this understanding will be better treatments for mental illnesses," said Joshua Gordon, M.D., Ph.D., Director of the National Institute of Mental Health. "The eventual goal will be to develop the complete parts list of the entire human brain. It's a hugely ambitious goal, but this paper shows us that it's a doable undertaking."

Serotonin receptors, the proteins that allow our neurons to react to the neurotransmitter serotonin and which play a starring role in appetite, mood, memory, sleep and many other important brain functions, differ markedly between mouse and human brain cells, the study found. Both we and mice have these proteins, but they are found on different kinds of neurons in mice than in humans. Those differences could explain why it has been so difficult for clinical researchers to develop new therapies for depression and other disorders related to serotonin: a drug that acts on a serotonin receptor could affect a mouse very differently than it would a person.

Are we more complex than a mouse?

In some ways, the finding that the 75 human brain cell types identified in the study have a loose match in the mouse brain parallels the Human Genome Project's revelation nearly 20 years ago that humans and mice have almost the same number of genes.

"Many people would assume that the human brain is more complex than the mouse brain," Lein said. "It's somewhat of a surprise that at least in terms of cellular diversity, that doesn't seem to be the case. But now that we have a way to make a true apples-to-apples comparison, we start to see many differences in how the genes are used, what the cells look like and the proportions of different cell types in the brain."

The researchers used gene expression -- the set of genes switched on in a given cell -- to sort nearly 16,000 cells from the human middle temporal gyrus, a part of the temporal lobe of the brain. The study relied on post-mortem tissue from people who had donated their brains to science and on tissue donated by epilepsy patients who had undergone brain surgery. The Allen Institute researchers are now working on studies of brain cell types that encompass more areas of the brain.

A similar Allen Institute study published last year provided a dataset of mouse cells for comparison, revealing that while our brains and our brain cells may look and act differently from those of the mouse, at the level of their gene expression, cell types line up between the two species.

If it sounds confusing that our brain cells can be both the same but different, think of the diversity of mammalian limbs. The same genes are used to build our hand, a bat's wing and a whale's flipper, but these structures look and function in very distinct ways. In addition to the serotonin receptor finding, the study revealed large changes in gene expression that relate to how neurons connect, meaning that our brain circuitry may turn out to be very different from that of the mouse.

"The bottom line is there are great similarities and differences between our brain and that of the mouse," said Christof Koch, Ph.D., Chief Scientist and President of the Allen Institute for Brain Science and one of the study authors. "One of these tells us that there is great evolutionary continuity, and the other tells us that we are unique. If you want to cure human brain diseases, you have to understand the uniqueness of the human brain."

Credit: 
Allen Institute

Scientists discover why brown fat is good for people's health

image: People have a few grams of brown fat that is activated when the body is cold. Brown fat uses sugar, fat and amino acids from the blood to generate heat. Left: Brown fat is not activated. Right: Cold conditions activate the brown fat as shown by the orange color on both shoulders and the neck.

Image: 
Labros Sidossis/Rutgers University

Rutgers and other scientists have discovered how brown fat, also known as brown adipose tissue, may help protect against obesity and diabetes. Their study in the journal Nature adds to our knowledge about the role of brown fat in human health and could lead to new medications for treating obesity and type 2 diabetes.

Brown fat is considered a heat organ. People have a few grams of it in areas including the neck, collarbone, kidneys and spinal cord. When activated by cool temperatures, brown fat uses sugar and fat from the blood to generate heat in the body.

The study found that brown fat could also help the body filter and remove branched-chain amino acids (BCAAs) from the blood. BCAAs (leucine, isoleucine and valine) are found in foods like eggs, meat, fish, chicken and milk, but also in supplements used by some athletes and people who want to build muscle mass.

In normal concentrations in the blood, these amino acids are essential for good health. In excessive amounts, they're linked to diabetes and obesity. The researchers found that people with little or no brown fat have reduced ability to clear BCAAs from their blood, and that may lead to the development of obesity and diabetes.

The study also solved a 20-plus year mystery about brown fat: how BCAAs enter the mitochondria that generate energy and heat in cells. The scientists discovered that a novel protein (called SLC25A44) controls the rate at which brown fat clears the amino acids from the blood and uses them to produce energy and heat.

"Our study explains the paradox that BCAA supplements can potentially benefit those with active brown fat, such as healthy people, but can be detrimental to others, including the elderly, obese and people with diabetes," said co-author Labros S. Sidossis, a Distinguished Professor who chairs the Department of Kinesiology and Health in the School of Arts and Sciences at Rutgers University-New Brunswick. He is also a professor in the Department of Medicine at Rutgers Robert Wood Johnson Medical School in Rutgers Biomedical and Health Sciences.

Researchers next need to determine whether uptake of BCAAs by brown fat can be controlled by environmental factors - such as exposure to mildly cold temperatures (65 degrees Fahrenheit) or consumption of spicy foods - or by drugs. This could improve blood sugar levels that are linked to diabetes and obesity, Sidossis said.

Credit: 
Rutgers University

Researcher hones model to forecast dusty conditions months in advance

image: Bing Pu, assistant professor of geography & atmospheric science at KU, has developed a long-range dust-prediction method her team used to accurately predict dustiness in the southwestern and central United States.

Image: 
NOAA

LAWRENCE -- Southwestern Kansas in the 1930s saw some of the worst dust storms ever recorded in the U.S., when apocalyptic clouds of heavy dust terrified and even killed people, livestock and wildlife.

Long ago, farmers phased out the kinds of practices that brought about the Dust Bowl, but dust can still harm health, agriculture and transportation while exacerbating environmental problems. Indeed, dust storms may increase as climate change causes drier conditions. (The National Oceanic and Atmospheric Administration reports that windblown dust storms increased 240% between 1990 and 2011 in the southwestern United States.)

Today, a researcher at the University of Kansas has developed an advanced technique for forecasting dusty conditions months before they occur, promising transportation managers, climatologists and people suffering health issues much more time to prepare for dusty conditions. By contrast, common methods of predicting dust in the air only give a few days of advance warning.

Bing Pu, assistant professor of geography & atmospheric science at KU, is lead author of a new paper in Geophysical Research Letters detailing a long-range dust-prediction method her team used to accurately predict dustiness in the southwestern and central United States.

"We use a statistical model constrained by observational data and the output of a state-of-the-art dynamic seasonal prediction model driven by observational information on Dec. 1," Pu said. "We found using our method, we actually can give a skillful prediction for the dustiness in springtime, one of the dustiest seasons in the U.S., over the Southwestern and Great Plains regions -- two of the dustiest areas in the U.S."

Pu and her colleagues, Paul Ginoux and Sarah Kapnick of the NOAA Geophysical Fluid Dynamics Laboratory, and Xiaosong Yang of NOAA and the University Corporation for Atmospheric Research, were able to predict "variance," or days when there was more or less dust in the air than average.

"Over the southwestern U.S., our model captured the variance of the dustiness over the time period from 2004 to 2016 by about 63%," Pu said. "Over the Great Plains, about 71% of the variance is explained."
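The "variance explained" figures Pu cites are a standard forecast-skill metric: the fraction of year-to-year variance in observed dustiness that the prediction accounts for (R²). A minimal illustrative sketch, using made-up dust indices rather than the study's data:

```python
# Illustrative only: "variance explained" is computed here as R^2,
# i.e. 1 - (residual sum of squares / total sum of squares).
# The dust indices below are invented for the sketch.

def variance_explained(observed, predicted):
    """Fraction of observed variance captured by the prediction (R^2)."""
    n = len(observed)
    mean_obs = sum(observed) / n
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    return 1.0 - ss_res / ss_tot

# Hypothetical springtime dust indices, 2004-2016 (13 years)
observed  = [1.2, 0.9, 1.5, 1.1, 0.8, 1.4, 1.6, 1.0, 1.3, 0.7, 1.1, 1.5, 0.9]
predicted = [1.1, 1.0, 1.4, 1.2, 0.9, 1.3, 1.5, 1.1, 1.2, 0.8, 1.0, 1.4, 1.0]

print(round(variance_explained(observed, predicted), 2))
```

A value of 1.0 would mean a perfect forecast; the 63% and 71% reported for the two regions indicate that most, but not all, of the year-to-year swings in dustiness were anticipated.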

Pu said factors influencing the amount of dust in the air include surface winds, precipitation and how bare the landscape is. These kinds of data were incorporated as key variables into the prediction model.

According to Pu and her collaborators, high levels of airborne dust can affect individual people, transport systems and agricultural production.

"Small dust particles are very easily taken into your breathing system and then could cause lung diseases like asthma -- and some studies suggest there might be some connection with lung cancers," Pu said. "There's a study finding dust storms are related to valley fever in Arizona as fungi can attach to dust particles. And when there's a severe dust storm, visibility is reduced so it can increase car accidents on the highways. In 2013, there were severe dust storms in western Kansas that reduced visibility and caused problems for local traffic. In Arizona, when there's a strong dust storm usually called a 'haboob,' the dust wall goes up to a few kilometers high, and this can affect airports -- airports have to close due to the dust storms. Fortunately, these storms move quickly and dissipate after a few hours."

Beyond human safety, Pu's team details in the study how high dust levels can affect the environment as a whole.

"Dust particles absorb and scatter both solar and terrestrial radiation, thus affecting the local radiative budget and regional hydroclimate," they wrote. "For instance, dust is found to amplify severe droughts in the United States by increasing atmospheric stability, to modulate the North American monsoon by heating the lower troposphere, and to accelerate snow melting and perturb runoff over the Upper Colorado River Basin by its deposition on snow."

Pu said she hopes an organization or government agency will someday run the model she has developed and issue seasonal dust predictions months in advance, especially when the potential for high levels of dust causes concern.

"Traffic systems and human health would benefit most from this long-term prediction ability about dust and air quality," she said. "I think it would be great if an institute would try to give regular predictions of dustiness variations that could be helpful for airports or road traffic or transportation managers. Facilities could plan for times when there could be a lot of dust in the local area. It could even affect the plans of local farmers."

For the time being, Pu aims to continue to refine the dust-prediction model to include atypical weather influences and human activity that could contribute to dust patterns.

"We want to continue to understand what other factors haven't been explored in the seasonal variation of the dust," she said. "For instance, those large-scale factors such as the El Niño-Southern Oscillation, and also anthropogenic factors, how people's influence through agriculture or construction projects, might affect dust emission in the future. Of course, we want to also keep collaborating with people at NOAA GFDL to give dust predictions."

Credit: 
University of Kansas

Patient charges mean young people visit doctor less

image: This is Naimi Johansson, PhD student in health economics, Sahlgrenska Academy, University of Gothenburg.

Image: 
Photo by Cecilia Hedstrom

When young adults pass the age limit for paying patient co-payments, or out-of-pocket prices, their medical consultations in primary care decrease by 7 percent, a study shows. The groups affected most are women and low-income earners.

"It's interesting that, despite relatively low charges, we find a fall in the number of physician visits," says Naimi Johansson. A PhD student in health economics, Johansson is the first author of the research study, published in the European Journal of Health Economics.

This study assessed how primary care medical consultations in the Västra Götaland region of southwest Sweden change when young adults leave cost-free care: starting on their 20th birthday, they must pay a co-payment of SEK 100 for every visit.

The study was based on an analysis of register data from the Västra Götaland region and Statistics Sweden, including particulars of some 73,000 individuals aged 18-22.

The results show that 19-year-old women visit primary care facilities, on average, 1.41 times a year, and that most of these visits are made a relatively short time before their 20th birthday. As 20-year-olds, they visit 1.32 times a year on average. Precisely at the transition date, a marked fall takes place in the group, with 9.2 percent fewer consultations.

Men conform to the same pattern but with a lower frequency: 0.86 primary care physician visits annually as 19-year-olds and 0.81 as 20-year-olds. The change at the 20-year point, minus 3.5 percent, is not statistically significant.

The study confirms previous research in the field showing that patient charges reduce people's use of health care to some degree. Nevertheless, the study is unusually clear in demonstrating how price sensitivity in health care can be linked to income level.

The current study refers to four income categories. Once they have to pay the co-payment, women with the lowest incomes prove to be the group whose frequency of visits to primary care doctors falls most: by 14.0 percent. For men in the lowest income category, the decrease is 9.8 percent.

"It's important to point out that, in this research, we haven't been able to study what types of visits decline after the 20th birthday. Patient out-of-pocket prices exist, to some extent, to discourage visits that are relatively unnecessary," Johansson says.

Her research at Sahlgrenska Academy, University of Gothenburg, focuses on exactly how patient out-of-pocket prices affect use of health care in Sweden and how health care utilization differs among geographical areas.

Patient out-of-pocket prices are part of the health care system, but regulations and levels vary from one region to another in Sweden. Given the great volume of resources devoted to health care, Johansson argues, it is vital to understand how the design of the health care system affects how people use it and, by extension, their health.

Credit: 
University of Gothenburg

Spaceflight consistently affects the gut

image: The study analyzed data collected as far back as 2011, from the last American space shuttle launch shown here.

Image: 
NASA

EVANSTON, Ill. -- A new Northwestern University study has found that spaceflight -- whether aboard a space shuttle or the International Space Station (ISS) -- has a consistent effect on the gut microbiome.

The Northwestern researchers developed a novel analytical tool to compare microbiome data from mice as far back as 2011. Called STARMAPS (Similarity Test for Accordant and Reproducible Microbiome Abundance Patterns), the tool indicates that spaceflight causes a specific, consistent change in the abundance, ratios and diversity of bacteria in the gut.

Perhaps most surprisingly, the team also used STARMAPS to compare spaceflight data to data collected from Earth-based studies on the effects of radiation on the gut. They effectively ruled out space radiation as the cause of changes in the microbiome during spaceflight.

"Radiation definitely has an effect on the gut microbiome," said Northwestern's Martha Vitaterna, who led the study. "But those effects do not look like what we saw in spaceflight."

The study was published last week in the journal Microbiome. Vitaterna is a research professor in neurobiology at Northwestern's Weinberg College of Arts and Sciences. Peng Jiang, research assistant professor in neurobiology at Weinberg, was the paper's first author.

Vitaterna and longtime collaborator Fred W. Turek, also from Northwestern, led the microbiome section of NASA's Twins Study, which compared physiological changes in astronaut Scott Kelly to those in his Earth-bound twin, Mark. Although Turek and Vitaterna found that a year in space affected Scott Kelly's gut microbiome, a single case did not provide enough data to draw general conclusions about the effects of spaceflight on the human body.

"If we are going to send humans to Mars or on long missions to the moon, it is essential to understand the effects of long-term exposure of the space environment on us -- and on the trillions of bacteria traveling with us," said Turek, the Charles and Emma Morrison Professor of Neurobiology in Weinberg, who co-authored the paper. "While we have studied the effects of a year in space on Scott Kelly's microbiota, we need to use mice in larger numbers to establish the effects of space."

Necessity yields tools

NASA has studied the effects of microgravity on mice's biological processes for many years. In fall 2014, it delivered its first group of mice to the ISS for a 37-day stay. Since that experiment (called Rodent Research-1), NASA has delivered seven more groups of mice to the ISS, including the Northwestern team's Rodent Research-7 experiment.

Vitaterna and Jiang started with samples from Rodent Research-1, which included the spaceflight group plus a matched ground control group, a baseline group and a laboratory group housed in a conventional mouse facility for the same duration as the trip. They also looked at mouse samples from the final American space shuttle mission, STS-135, which launched in 2011.

Researchers have struggled to crunch all the data because of the sheer amount of it. There are hundreds of different bacterial species in the gut, and different individuals may have vastly different gut bacterial communities at the start of an experiment. This makes it challenging to detect when there is a consistent response.

"There wasn't a statistical approach for doing this work," Vitaterna said. "The tools didn't exist, so we invented them. It's a classic case of how necessity is the mother of invention."

"We knew that spaceflight affects the microbiome, so we could have looked at this anecdotally," Jiang said. "But there are a lot of limitations to that. We needed a more comprehensive, high-level view. Then we could say that microbiome changes are comparable among multiple spaceflights."

Mapping the microbiome

STARMAPS provides a new method to put all the data from different experiments into the same multi-dimensional space. Then users can more readily see patterns where different types of bacteria become more or less abundant under different conditions.
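The general idea behind such cross-experiment comparisons can be illustrated with a simplified sketch. This is not the published STARMAPS code, and the taxa and numbers are invented: each experiment is summarized as a vector of per-taxon abundance changes (treatment versus control), and two experiments are then compared by asking whether their vectors point in the same direction.

```python
import math

# Simplified illustration of the comparison idea (not the actual
# STARMAPS implementation): summarize each experiment as a vector of
# hypothetical log2 fold changes per bacterial group, then compare the
# direction of change across experiments with cosine similarity.

taxa = ["Firmicutes", "Bacteroidetes", "Proteobacteria", "Actinobacteria"]

shuttle_sts135  = [0.8, -0.6, 0.3, -0.2]   # flight vs. ground (made up)
iss_rr1         = [0.7, -0.5, 0.4, -0.1]   # flight vs. ground (made up)
radiation_study = [-0.4, 0.2, 0.9, 0.5]    # irradiated vs. control (made up)

def cosine_similarity(a, b):
    """+1: same direction of change; 0: unrelated; -1: opposite."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

print(round(cosine_similarity(shuttle_sts135, iss_rr1), 2))        # high: consistent shift
print(round(cosine_similarity(shuttle_sts135, radiation_study), 2))  # low: different pattern
```

In this toy version, the two spaceflight experiments shift the community in nearly the same direction while the radiation experiment does not, mirroring the paper's conclusion that the spaceflight signature is consistent and distinct from radiation effects.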

Using this tool, the Northwestern researchers immediately noticed that the microbiomes of the spaceflight and ground control mice looked very different from those of the other two groups. The ground control group lives in an environment simulator that directly matches the spaceflight habitats: the gas composition, temperature and diet are exactly the same.

"We found that habitat has a big impact," Vitaterna said.

Although the researchers did not use STARMAPS to analyze samples collected from astronaut Scott Kelly during his year in space, they did notice that his data fit the same, consistent pattern caused by spaceflight.

"Some of the high-level changes are similar," Jiang said. "We saw the ratios of the same major types of bacteria change in the same direction and a slight increase in overall diversity. That is consistent."

Ruling out radiation

But, still, the spaceflight and ground control mice's microbiomes were quite different from each other, pointing to a distinct effect of spaceflight itself. To find what might cause that effect, the researchers dug deeper.

One possibility was that exposure to radiation in space might cause the microbiome shift. Jiang found multiple Earth-based studies on the effects of radiation on the mouse microbiome and analyzed them with STARMAPS. They discovered that the microbiome shifts caused by spaceflight versus radiation did not match.

Vitaterna and Jiang believe microgravity might cause the spaceflight effect, but they agree more work needs to be done. They are currently processing samples from Rodent Research-7 and hope that data will hold more clues. Vitaterna said the diversity and ratios of gut bacteria during spaceflight look most similar to stress.

"Understanding the factors that can reduce this kind of microbiome change would be useful information to have -- for offsetting the effects of stress on Earth," she said. "Understanding what genetic factors contribute to differences in bacterial strains will be useful for developing countermeasures that can protect your microbiome during stressful periods."

Credit: 
Northwestern University

Hush, baby -- the dog is whimpering!

We are all familiar with the sounds of a cat or dog vying for human attention, and for pet-owners, these sounds are particularly evocative. Dog sounds are especially sad to both cat and dog owners, who actually rate a whimpering dog as sounding as sad as a crying baby.

These results are reported in a new study, "Pawsitively sad: pet-owners are more sensitive to negative emotion in animal distress vocalizations," by Associate Professor Christine Parsons of the Interacting Minds Centre at the Department of Clinical Medicine, Aarhus University, Denmark. She is the first author of the article, just published in the journal Royal Society Open Science.

"Pet ownership is associated with greater sensitivity to pet distress sounds, and it may be part of the reason why we are willing to spend large amounts of time and resources on our domestic companions. It might also explain why we find interacting with pets so rewarding, and are emotionally impacted by both positive communication signals, like purring and negative, like meows or whines", says Christine Parsons.

She explains that the work was carried out as part of building a major database of emotional sounds, originally developed to test the instinctive responses that parents have to their children. In this study, Parsons worked with researchers from the University of Oxford, the University of California, Los Angeles, and King's College London.

The researchers tested more than 500 young adults and found that dog whines sounded 'more negative' to dog or cat owners, compared to people with no pets, whereas cat meows sounded sadder only to cat owners. Another finding was that regardless of pet ownership, dog whines sounded sadder than cat meows.

"The result suggests that dogs, more effectively than cats, communicate distress to humans and that pet ownership is linked to greater emotional sensitivity to these sounds. For sounds that we need to respond to, like a dog that is utterly dependent on its human host for food and care, it makes sense that we find these sounds emotionally compelling," says Christine Parsons.

Parsons's collaborator Katherine Young, a lecturer at King's College London and senior author, also points out that dog owners generally spend more time providing basic care to their pets than cat owners do. Dogs need to be taken for walks and require more dedicated care, while cat owners have fewer obligations. Cats are semi-domesticated and generally retain their independence, along with an air of mystique; they come and go as they please.

"This difference in animal dependence may explain why dog whines are rated as more negative than cat meows by all adults, including cat-owners. Dogs may simply have more effective distress signals than cats", says Katherine Young.

According to Christine Parsons, the study also found no evidence to support the longstanding "crazy cat lady" stereotype. Female cat owners have, for many years, been portrayed as neurotic, lonely, sexless and eccentric. Dog owners and dog ownership, by contrast, are seen more positively, associated with benefits like the "Lassie effect": named after the TV collie, it describes the fact that dog owners typically get more physical exercise than non-owners, a happy side effect of dog walks.

"In general, we think of dog owners in more positive terms than cat owners. In our study, we were able to test how cat-owners, dog owners and people with no pets responded on a series of robust psychological measures. We found no differences," Christine Parsons says.

"For symptoms of anxiety, depression and self-reported experiences in close relationships, we found no differences between adults with and without pets. This suggests that cat or dog ownership is not necessarily associated with individual differences in psychological health, at least as tested here."

Credit: 
Aarhus University

Scientists develop a metamaterial for applications in magnonics

image: A schematic representation of spin waves traveling through the metamaterial and the resultant wave spectrum, reflecting the properties of an artificial crystal.

Image: 
@tsarcyanide / MIPT Press Office

Physicists from Russia and Europe have demonstrated the real possibility of using superconductor/ferromagnet systems to create magnonic crystals, which will be at the core of spin-wave devices to come in the post-silicon era of electronics. The paper was published in the journal Advanced Science.

Magnonics investigates the possibilities of using spin waves to transmit and process information. Whereas photonics deals with photons and electromagnetic waves, the focus of magnonics is on spin waves, or magnons, which are harmonic oscillations of the orientation of magnetic moments. In ferromagnetic materials, the magnetic moments of the electrons, i.e., their spins, are aligned in a magnetic field. The waves of spin alignment observed in a magnetic system are called spin waves.

Magnonics is seen as a promising research area in post-silicon wave electronics, as spin waves have a number of advantages over, say, microwave photons. For instance, spin waves can be controlled by an external magnetic field. Microwaves, which are essentially electromagnetic waves, have wavelengths on the order of one centimeter, whereas spin waves in the same microwave frequency range have wavelengths of micrometers. This is why these controllable waves can be used to build very compact microdevices for processing microwave signals.
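The scale difference above follows from simple arithmetic. A rough illustration (not part of the study): the free-space wavelength of an electromagnetic wave is lambda = c / f, which gives centimeter-scale wavelengths at gigahertz frequencies; spin waves at the same frequencies have micrometer wavelengths set by material dispersion rather than the speed of light.

```python
# Illustrative arithmetic only: free-space electromagnetic wavelength
# lambda = c / f at microwave frequencies. Spin waves at these same
# frequencies are roughly 10,000x shorter, enabling compact devices.

C = 3.0e8  # speed of light, m/s


def em_wavelength_m(freq_hz: float) -> float:
    """Free-space electromagnetic wavelength in meters."""
    return C / freq_hz


for f_ghz in (1, 10, 30):
    lam_cm = em_wavelength_m(f_ghz * 1e9) * 100
    print(f"{f_ghz:>2} GHz -> {lam_cm:.1f} cm")
```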

Magnonic crystals are the most fundamental systems (sometimes referred to as the building blocks) required to build a device that operates using spin wave signals. These crystals have a wide range of potential applications and will lie at the heart of frequency filters, grating couplers, waveguides, and magnonic devices, which are analogs of transistors.

The authors of this study set out to test a basic hypothesis: can a magnonic crystal be created using a ferromagnet/superconductor hybrid system? Ferromagnetism and superconductivity are antagonistic phenomena. In a superconductor, the spins of the electrons bound into a Cooper pair are oriented in opposite directions, whereas in ferromagnets, they tend to align in the same direction. Scientists have traditionally tried to influence superconducting properties with ferromagnetism.

"Over the last couple of years, we have been successful in achieving the reverse. First, we examine ferromagnetic systems and see if their ferromagnetic properties can somehow be modified using superconductors. This is why the work has attracted global interest," explains Dr. Igor Golovchanskiy, a co-author of the study and researcher at MIPT's Laboratory of Topological Quantum Phenomena in Superconducting Systems. "Initially, magnonics included only room-temperature investigations. Therefore, hybridization of ferromagnets with superconductors, which do not exist at room temperature, was out of the question. Besides, ferromagnetism has traditionally been considered 'stronger' than superconductivity and, hence, unable to be influenced by it. Our laboratory studies cryogenic systems, and we set ourselves the goal of looking at how magnonic systems behave at cryogenic temperatures when they are forced to interact with superconductors."

The main result of this research is that the scientists proved it possible to create magnonic crystals using a superconductor/ferromagnet hybrid system. They also observed a peculiar magnonic band structure in their architecture, characterized by the presence of forbidden bands in the gigahertz frequency range.

The research was conducted in three stages: a sample was fabricated, its properties were measured, and then simulations were performed. The system consisted of a regular superconducting niobium (Nb) structure placed on top of a ferromagnetic Ni80Fe20 permalloy (Py) thin film.

The system was placed in a cryostat, and the microwave signal transmission coefficient was measured. When the signal frequency matched the fundamental frequencies of the system, resonant absorption was observed; this is called ferromagnetic resonance. The obtained spectrum showed two lines, indicating that the periodic structure consisted of two coupled regions with alternating ferromagnetic resonance conditions. The ferromagnetic properties were modulated by means of the superconducting structure.

During the third stage, "micromagnetic simulations" were performed. This helped the researchers recreate the magnonic band structure, which is formed by allowed and forbidden bands with a different geometry.

The technological process of the development of silicon-based microelectronic components is reaching the theoretical limit of available sizes. As a result, a further increase in computational capacity, and hence the continued miniaturization of components, requires new approaches. In this regard, the investigated superconductor/ferromagnet systems offer good prospects for wave electronics, since the critical sizes for superconducting materials are less than a micrometer. Therefore, it is possible to make superconducting elements very small.

The authors of the study believe the results of their research will find use in microwave electronics and magnonics, including the field of quantum magnonics. However, the range of potential applications is still limited as the system cannot survive at room temperature.

Credit: 
Moscow Institute of Physics and Technology

Yale study uses real-time fMRI to treat Tourette Syndrome

Characterized by repetitive movements or vocalizations known as tics, Tourette Syndrome is a neurodevelopmental disorder that affects many adolescents. In a new study, Yale researchers trained adolescents with Tourette Syndrome to control their tics using an imaging technique that allows patients to monitor their own brain function in real time.

This study is published in Biological Psychiatry.

The study utilized real-time functional magnetic resonance imaging neurofeedback (rt-fMRI-NF), which is a relatively new technique with great potential for treating neuropsychiatric disorders, according to Michelle Hampson, senior author and associate professor in the Department of Radiology and Biomedical Imaging. "It is a non-invasive, neuroscience-based intervention for training human brain function towards healthier patterns," Hampson said.

Although researchers have explored the clinical utility of this technique in treating conditions from depression to Parkinson's disease, this study marks the first time that it has been tested as a clinical intervention for Tourette Syndrome.

The study enrolled individuals with Tourette Syndrome ages 11 to 19 years who displayed a certain frequency of tics as measured by the Yale Global Tic Severity Scale. Subjects were tasked with alternately raising and lowering activity in the supplementary motor area, a brain region associated with tics in Tourette Syndrome, which was displayed to them as a real-time graph during the brain imaging scans. The researchers found a significant reduction of tics in subjects during the training, which exceeded symptom improvements in a control condition (the control condition was designed to induce similar placebo and motivation effects but did not involve real neurofeedback), suggesting that the neurofeedback may be helpful for treating Tourette symptoms.

"Currently available treatments for tics in Tourette Syndrome include behavior therapy and pharmaceuticals, but not everyone responds. This is the first study of its kind showing that rt-fMRI-NF has potential as a treatment for Tourette Syndrome," said Denis Sukhodolsky, co-author and associate professor in the Yale Child Study Center. This was an early stage study with a small sample size, but the promising results should encourage further research, said the investigators.

Credit: 
Yale University

Why initial UTIs increase susceptibility to further infection

image: White blood cells (small circles) attack infected bladder cells (large circles) in this scanning electron micrograph of a mouse bladder with a bacterial infection. Researchers at Washington University School of Medicine in St. Louis have discovered that an initial urinary tract infection (UTI) triggers changes to immune and other cells in the bladder that can prime the bladder to overreact to bacteria, worsening subsequent UTIs.

Image: 
Valerie O'Brien

More than 60% of women will experience a urinary tract infection (UTI) at some point in their lives, and about a quarter will get a second such infection within six months, for reasons that have been unclear to health experts.

Now, researchers at Washington University School of Medicine in St. Louis have discovered that an initial infection can set the tone for subsequent infections. In mouse studies, the researchers found that a transient infection triggers a short-lived inflammatory response that rapidly eliminates the bacteria. But if the initial infection lingers for weeks, the inflammation also persists, leading to long-lasting changes to the bladder that prime the immune system to overreact the next time bacteria find their way into the urinary tract, worsening the infection.

The findings, published Aug. 20 in eLife, suggest that reversing such changes to the bladder may help prevent or mitigate future UTIs.

"Millions of women suffer from recurrent bladder infections, and it can really take a toll on their quality of life," said co-senior author Thomas J. Hannan, DVM, PhD, an instructor in pathology and immunology. "In the process of fighting these infections, the immune system sometimes does more harm to the bladder than to the bacteria. If we could fine-tune the immune response to keep the body focused on eliminating infecting bacteria, we might be able to reduce the severity of these infections."

To understand why some people are more prone to severe, recurrent infections than others, Hannan and co-senior author Scott J. Hultgren, PhD, the Helen L. Stoever Professor of Molecular Microbiology - along with co-first authors Lu Yu, PhD, and Valerie O'Brien, PhD, both graduate students when the work was conducted - infected a strain of genetically identical mice with E. coli, the most common cause of UTIs in people. Mice of this strain can have widely divergent responses to bacterial bladder infections: some eliminate the bacteria within a few days, while others develop chronic infections that last for weeks.

The researchers infected these mice with E. coli, monitored them for signs of infection in their urine for four weeks, and then gave them antibiotics. After giving the mice a month to heal, the researchers infected them again. For comparison, they also infected a separate group of mice for the first time.

All the previously infected mice mounted immune responses more rapidly than the mice infected for the first time. The ones that had cleared the infection on their own the first time around did so again, eliminating the bacteria even faster than before. But the mice that failed to clear the infection the first time did much worse, despite the speed of their immune responses. A day after infection, 11 out of 14 had more bacteria in their bladders than they had started with, and many went on to again develop chronic infections that lasted at least four weeks.

The difference lay in an immune molecule called TNF-alpha that coordinates a powerful inflammatory response to infection, the researchers discovered. Both sets of mice turned on TNF-alpha within six hours of infection. But the mice that controlled the infection turned off TNF-alpha again within 24 hours, allowing the inflammation to resolve. In the mice with prolonged initial infections, TNF-alpha stayed on, driving persistent inflammation and triggering a change to the patterns of gene activity in immune cells and the cells of the bladder wall.

"After a chronic infection, even though the bacteria have been eliminated with antibiotics and the bladder lining has healed, the behavior of cells lining the bladder does not revert to normal even after the inflammation is finally resolved," Hannan said. "The bladder becomes trained to respond to infecting bacteria too harshly, causing tissue damage that predisposes to recurrent infection. So then the question becomes, 'Can we retrain these bladders to be more predisposed to resolving inflammation quickly?'"

To find out, the researchers took mice that had recovered from initial prolonged UTIs and depleted their TNF-alpha before re-infecting them with bacteria. Without TNF-alpha driving excessive inflammation, the mice fared better, significantly reducing the number of bacteria in their bladders within a day of infection.

The findings suggest that targeting TNF-alpha or another aspect of the inflammatory response that causes bladder tissue damage during acute infection may help prevent or alleviate recurrent UTIs, the researchers said.

"Improving our understanding of this underlying biological process will be essential for developing new therapies that target inflammation as an alternative strategy against rapidly increasing antimicrobial resistance," Hultgren said.

Credit: 
Washington University School of Medicine

In cystic fibrosis, lungs feed deadly bacteria

image: P. aeruginosa bacteria (blue) thrive in lungs that produce lots of succinate, a preferred food.

Image: 
Sebastián Riquelme, Columbia University Irving Medical Center.

In cystic fibrosis, Pseudomonas aeruginosa is a much-feared pathogen. The bacterium easily colonizes the lungs of people with cystic fibrosis, leading to chronic infections that are almost impossible to eradicate and are ultimately fatal.

Why does P. aeruginosa, but not other common bacteria, thrive in cystic fibrosis lungs?

A new study from researchers at Columbia University Vagelos College of Physicians and Surgeons suggests the answer has to do with the bacterium's culinary preference for succinate, a byproduct of cellular metabolism that is abundant in cystic fibrosis lungs.

"Preventing infection by P. aeruginosa could greatly improve the health of people with cystic fibrosis," says Sebastián A. Riquelme, PhD, the study's lead author and a postdoctoral fellow in the Department of Pediatrics. "And it's possible that we could control infection by targeting the bacteria's use of succinate in the lung."

Excess Succinate in CF Lungs

The excess succinate in the lungs of people with cystic fibrosis comes from an interaction between two proteins, CFTR and PTEN. Mutations in the CFTR gene cause cystic fibrosis by preventing the CFTR protein from transporting ions in and out of cells. But the mutations also disrupt CFTR's interaction with PTEN (a discovery made by the Columbia team in 2017).

It is this abnormal PTEN-CFTR interaction, the new study found, that causes lung cells to release an excessive amount of succinate. The succinate fueled growth of P. aeruginosa in the lungs of mice, but had no effect on Staphylococcus aureus, another major pathogen.

More Succinate, More Slime

Not only does P. aeruginosa thrive in a succinate-rich environment, it actively adapts to the abundance of its favored food.

"Succinate-adapted bacteria divert their metabolism into the production of extracellular slime that makes the organisms extremely difficult to eradicate from the lung," says the study's senior author Alice Prince, MD, the John M. Driscoll Jr., MD and Yvonne Driscoll, MD Professor of Pediatrics. "These bacteria are the cause of chronic infection in cystic fibrosis."

The succinate-fed bacteria also suppress the immune response, further hampering the body's ability to control the infection.

Targeting Succinate

The new findings--made in mice and in human cells in tissue culture--suggest that it may be possible to treat P. aeruginosa infection by restoring the interaction between PTEN and CFTR, even if CFTR's other functions are impaired.

New drugs for cystic fibrosis, such as the lumacaftor and ivacaftor combination currently available, restore the CFTR-PTEN interaction and may decrease the generation of succinate.

Limiting the accumulation of succinate may also reduce bacterial growth and adaptation. Succinate is mainly produced by immune cells during the inflammatory response, Riquelme says. "We predict that by controlling the exaggerated inflammation observed in the airway, we could reduce succinate and P. aeruginosa infection."

Credit: 
Columbia University Irving Medical Center

UC San Diego researchers convert pro-tumor macrophages into cancer killers

Epithelial cancers, such as cancers of the lung and pancreas, use the ανβ3 molecule to gain drug resistance to standard cancer therapies and to become highly metastatic. In a paper published in Cancer Research, University of California San Diego School of Medicine researchers identified a new therapeutic approach in mouse models that halts drug resistance and progression by using a monoclonal antibody that induces the immune system to seek and kill ανβ3-expressing cancer cells.

"This antibody is designed to seek and destroy the most stem-like, drug-resistant, aggressive tumor cells. It does this by building a bridge between tumor-associated macrophages and these highly aggressive tumor cells," said David Cheresh, PhD, Distinguished Professor and vice chair of Pathology. "What we have been able to observe in mice is that when we give this drug to drug-resistant tumors, it prolongs their response to standard of care and prevents their capacity to enter the blood stream."

Using the ανβ3 antibody LM609, Cheresh and his team exploited the appearance of ανβ3 receptors on tumor cells to redirect tumor-associated macrophages (TAMs) into recognizing and killing ανβ3 expressing tumor cells.

During the study period, no tumor progression or drug resistance was detected while untreated animals developed tumor growth and metastasis. The research in mouse models focused on pancreatic and lung cancer cells treated in combination with LM609 and the EGFR inhibitor erlotinib. But, the antibody is expected to work in combination with various drugs currently used to treat cancer patients, said Cheresh.

"We have observed a highly significant link between the appearance of ανβ3 expressing tumors and the appearance of tumor-associated macrophages," said Cheresh, associate director of innovation and industry alliances at UC San Diego Moores Cancer Center. "Normally, the appearance of tumor-associated macrophages promotes tumor growth and metastasis. However, our antibody arms these macrophages to join our fight against the cancer."

Macrophages are specialized immune cells that promote tissue inflammation, stimulate the immune system and rid the body of foreign debris, including cancer cells. TAMs instead create a pro-tumor environment that accelerates tumor growth, angiogenesis (the development of new blood vessels to support the tumor) and suppresses immune recognition of the tumor by the host immune response.

As tumors progress, the abundance of TAMs increases, allowing the cancer to become more aggressive and spread. As tumors become drug-resistant, ανβ3 appears on cell surfaces.

The Cheresh lab previously discovered that ανβ3 is upregulated on various cells during normal wound repair and in cancer cells as cancer becomes invasive. In both cases, this molecule triggers cells to enter a stress-tolerant state. In normal epithelial cells, this state enables them to initiate tissue remodeling, such as healing. In cancer, it allows cells to become drug-resistant and highly metastatic.

The current study revealed a new approach to induce TAMs to reverse course, killing cancer cells rather than supporting them. The antibody prompts these macrophages to begin killing tumor cells through a mechanism known as antibody-dependent cytotoxicity (ADCC).

"These results were initially unexpected since macrophages usually destroy cells via phagocytosis, a process that involves them literally devouring the foreign or target cell," said Cheresh, a faculty member of the Sanford Consortium for Regenerative Medicine. "Also, ADCC is typically known to be induced by natural killer cells, but we saw very few of these NK cells in the late-stage, drug-resistant cancers we have examined."

"We believe that the effectiveness of this antibody is based on three things: Its capacity to recognize drug-resistant cancers. Its ability to bind to a particular receptor on tumor-associated macrophages. And its capacity to induce ADCC of these highly aggressive tumor cells."

The protein CD47, which is found on many cells in the body and is often hijacked by cancer cells, tells macrophages not to eat these cells. The ανβ3 antibody bypasses the CD47 "don't eat me signal" by inducing ADCC as opposed to phagocytosis.

"In our studies, macrophages are not killing through phagocytosis, which would be blocked by the appearance of CD47 on the tumor cell target. Rather, we're inducing the macrophage to kill its tumor cell target through ADCC. The therapeutic antibody we are using bridges the macrophage to the ανβ3-expressing tumor cell. When this occurs, the macrophage releases a cytotoxic substance that kills the tumor cell."

The team is currently producing a humanized version of this antibody, which Cheresh hopes will do in humans what LM609 does in mice.

Credit: 
University of California - San Diego

UNM study confirms cannabis flower is an effective mid-level analgesic medication for pain

image: UNM researchers found further evidence that cannabis can significantly alleviate pain with the average user experiencing a three-point drop in pain suffering on a 0-10 point scale.

Image: 
Esteban Lopez

Using the largest database of real-time recordings of the effects of common and commercially available cannabis products in the United States (U.S.), researchers at The University of New Mexico (UNM) found strong evidence that cannabis can significantly alleviate pain, with the average user experiencing a three-point drop in pain suffering on a 0-10 point scale immediately following cannabis consumption.

With a mounting opioid epidemic at full force and relatively few alternative pain medications available to the general public, scientists found conclusive support that cannabis is very effective at reducing pain caused by different types of health conditions, with relatively minimal negative side effects.

Chronic pain afflicts more than 20 percent of adults and is the most financially burdensome health condition the U.S. faces, exceeding, for example, the combined costs of treating heart disease and cancer.

"Our country has been flooded with an over-prescription of opioid medications, which often leads to non-prescription opioid and heroin use for many people. This man-made disaster is killing our families and friends, regardless of socio-economic status, skin tone, and other superficial human differences," said Jacob Miguel Vigil, one of the lead investigators of the study, titled "The Effectiveness of Self-Directed Medical Cannabis Treatment for Pain," published in the journal Complementary Therapies in Medicine.

Vigil explains, "Cannabis offers the average patient an effective alternative to using opioids for general use in the treatment of pain with very minimal negative side effects for most people."

The researchers relied on information collected with the Releaf App, a mobile software program developed by co-authors Franco Brockelman, Keenan Keeling and Branden Hall. The app enables cannabis users to monitor the real-time effects of the breadth of available cannabis-based products, which are always variable given the complexity of the Cannabis plant from which these products are obtained.

Since its release in 2016, the commercially developed Releaf App has been the only publicly available, incentive-free app for educating patients on how different types of products (e.g., flower or concentrate), combustion methods, cannabis subspecies (Indica, Sativa, and hybrid), and major cannabinoid contents (THC and CBD) affect their symptom severity levels, providing the user invaluable feedback on their health status, medication choices, and the clinical outcomes of those choices as measured by symptom relief and side effects.

Scientifically, software like the Releaf App enables researchers to overcome the inherent limitations of government-funded clinical trials on the real-time effects of Cannabis, which are rare in general, but also often limited by onerous federal regulations, including its Schedule I status (no accepted medical use and a high abuse potential) and the mandate that investigators use the notoriously poor quality and low potency cannabis products supplied by the National Institute of Drug Abuse.

"Even rescheduling cannabis from Schedule I to Schedule II, i.e., classifying it with fentanyl, oxycodone, and cocaine rather than heroin and ecstasy, could dramatically improve our ability to conduct research and would only require that the DEA recognize that accepted medical uses for cannabis exist, as clearly evidenced by our results and the flourishing medical cannabis programs in the majority of U.S. states," pointed out co-author Sarah Stith.

Among the study's findings, the greatest analgesic responses were reported by people who used whole dried cannabis flower, or 'buds,' and particularly cannabis with relatively high levels of tetrahydrocannabinol, otherwise known as THC. The more recently popularized cannabinoid, cannabidiol or CBD, in contrast, showed little association with momentary changes in pain intensity, based on the massive database explored in the study.

"Cannabis likely has numerous constituents that possess analgesic properties beyond THC, including terpenes and flavonoids, which likely act synergistically for people who use whole dried cannabis flower," said Vigil. "Our results confirm that cannabis is a relatively safe and effective medication for alleviating pain, and that is the most important message to learn from our results. It can only benefit the public for people to be able to responsibly weigh the true risks and benefits of their pain medication choices, and when given this opportunity, I've seen numerous chronic pain patients substitute away from opioids, among many other classes of medications, in favor of medical cannabis."

"Perhaps the most surprising result is just how widespread relief was with symptom relief reported in about 95 percent of cannabis administration sessions and across a wide variety of different types of pain," added lead author of the study, Xiaoxue Li.

The authors do caution that cannabis use carries risks of addiction and short-term impairments in cognitive and behavioral functioning, and may not be effective for everyone. However, there are multiple mechanisms by which cannabis alleviates pain. In addition to its anti-inflammatory properties, cannabis activates receptors that are colocalized with opioid receptors in the brain. "Cannabis with high THC also causes mood elevation and adjusts attentional demands, likely distracting patients from the aversive sensations that people refer to as 'pain,'" explains Vigil.

"When compared to the negative health risks associated with opioid use, which currently takes the lives of over 115 Americans a day, cannabis may be of obvious value to patients. Chronic opioid use is associated with poorer quality of life, social isolation, lower immune functioning and early morbidity. In contrast, my own ongoing research increasingly suggests that cannabis use is associated with a reversal of each of these potential outcomes," said Vigil.

Credit: 
University of New Mexico

Babbling babies' behavior changes parents' speech

ITHACA, N.Y. - New research shows baby babbling changes the way parents speak to their infants, suggesting that infants are shaping their own learning environments.

Researchers from Cornell University's Behavioral Analysis of Beginning Years (B.A.B.Y) Laboratory found that adults unconsciously modify their speech to include fewer unique words, shorter sentences, and more one-word replies when they are responding to a baby's babbling, but not when they are simply speaking to a baby.

"Infants are actually shaping their own learning environments in ways that make learning easier to do," said Steven Elmlinger, lead author of "The Ecology of Prelinguistic Vocal Learning: Parents Simplify the Structure of Their Speech in Response to Babbling." "We know that parents' speech influences how infants learn - that makes sense - and that infants' own motivations also change how they learn. But what hasn't been studied is the link between how infants can change the parents, or just change the learning environment as a whole. That's what we're trying to do."

In the study, 30 mother-infant pairs went to the lab's play space for 30-minute sessions on two consecutive days. The 9- and 10-month-old babies could roam freely around the environment, which was filled with toys, a toy box and animal posters. The babies wore overalls with hidden wireless microphones to record their speech, and were also videotaped by three remote-controlled digital video cameras.

Researchers measured parents' vocabulary and syntax, and calculated the change in babies' vocal maturity from the first to the second day. They found that babies whose mothers provided more learning opportunities - by using simplified speech with fewer unique words and shorter utterances - were faster learners of new speech sounds on the second day.

The research contributes to a growing body of work that demonstrates the important role infants play in shaping their own language learning environment. Interventions to improve at-risk children's learning should encourage people to be responsive to their baby's babbling, said senior author Michael Goldstein, associate professor of psychology.

"It's not meaningless," he said. "Babbling is a social catalyst for babies to get information from the adults around them."

Credit: 
Cornell University

Physicists create world's smallest engine

image: The world's smallest engine works due to its intrinsic spin, which converts heat absorbed from laser beams into oscillations, or vibrations, of the trapped ion.

Image: 
Professor Goold, Trinity College Dublin.

Theoretical physicists at Trinity College Dublin are among an international collaboration that has built the world's smallest engine - which, as a single calcium ion, is approximately ten billion times smaller than a car engine.

Work performed by Professor John Goold's QuSys group in Trinity's School of Physics describes the science behind this tiny motor. The research, published today in international journal Physical Review Letters, explains how random fluctuations affect the operation of microscopic machines. In the future, such devices could be incorporated into other technologies in order to recycle waste heat and thus improve energy efficiency.

The engine itself - a single calcium ion - is electrically charged, which makes it easy to trap using electric fields. The working substance of the engine is the ion's "intrinsic spin" (its angular momentum). This spin is used to convert heat absorbed from laser beams into oscillations, or vibrations, of the trapped ion.

These vibrations act like a "flywheel", which captures the useful energy generated by the engine. This energy is stored in discrete units called "quanta", as predicted by quantum mechanics.

"The flywheel allows us to actually measure the power output of an atomic-scale motor, resolving single quanta of energy, for the first time," said Dr Mark Mitchison of the QuSys group at Trinity, and one of the article's co-authors.

Starting the flywheel from rest -- or, more precisely, from its "ground state" (the lowest energy in quantum physics) -- the team observed the little engine forcing the flywheel to run faster and faster. Crucially, the state of the ion was accessible in the experiment, allowing the physicists to precisely assess the energy deposition process.

Assistant Professor in Physics at Trinity, John Goold said: "This experiment and theory ushers in a new era for the investigation of the energetics of technologies based on quantum theory, which is a topic at the core of our group's research. Heat management at the nanoscale is one of the fundamental bottlenecks for faster and more efficient computing. Understanding how thermodynamics can be applied in such microscopic settings is of paramount importance for future technologies."

Credit: 
Trinity College Dublin