
Interpretation of material spectra can be data-driven using machine learning

image: An illustration of the approach: two trees draw up the spectrum, exchange information with each other, and make the "interpretation" (an apple) bloom.

Image: 
2018 Teruyasu Mizoguchi, Institute of Industrial Science, The University of Tokyo

Tokyo, Japan - Spectroscopy techniques are commonly used in materials research because they enable identification of materials from their unique spectral features. These features are correlated with specific material properties, such as their atomic configurations and chemical bond structures. Modern spectroscopy methods have enabled rapid generation of enormous numbers of material spectra, but it is necessary to interpret these spectra to gather relevant information about the material under study.

However, the interpretation of a spectrum is not always a simple task and requires considerable expertise. Each spectrum is compared with a database containing numerous reference material properties, but unknown material features that are not present in the database can be problematic, and often have to be interpreted using spectral simulations and theoretical calculations. In addition, the fact that modern spectroscopy instruments can generate tens of thousands of spectra from a single experiment is placing considerable strain on conventional human-driven interpretation methods, and a more data-driven approach is thus required.

Use of big data analysis techniques has been attracting attention in materials science applications, and researchers at The University of Tokyo Institute of Industrial Science realized that such techniques could be used to interpret much larger numbers of spectra than traditional approaches. "We developed a data-driven approach based on machine learning techniques using a combination of the layer clustering and decision tree methods," states co-corresponding author Teruyasu Mizoguchi.

The team used theoretical calculations to construct a spectral database in which each spectrum had a one-to-one correspondence with its atomic structure and where all spectra shared the same parameters. Combining the two machine learning methods allowed the development of both a spectral interpretation method and a spectral prediction method, the latter applicable when a material's atomic configuration is known.
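The two-step idea, grouping similar spectra and then learning a decision rule that links atomic structure to those groups, can be sketched in miniature. Everything below is invented for illustration (the spectra, the coordination numbers, and the threshold are not from the study), and the real work used far larger databases and full machine learning tooling:

```python
# Toy sketch of the two-step approach: (1) hierarchically cluster
# similar spectra, (2) learn a decision rule mapping a structural
# descriptor to the resulting spectral group. All data are invented.

def distance(a, b):
    # Euclidean distance between two spectra on a shared energy grid.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Four hypothetical spectra, each paired with a (made-up) coordination
# number describing the atomic environment that produced it.
spectra = {
    "site_A": [0.1, 0.9, 0.2, 0.0],
    "site_B": [0.2, 0.8, 0.3, 0.1],
    "site_C": [0.9, 0.1, 0.0, 0.8],
    "site_D": [0.8, 0.2, 0.1, 0.9],
}
coordination = {"site_A": 4, "site_B": 4, "site_C": 6, "site_D": 6}

# Step 1: single-linkage agglomerative clustering down to two groups.
clusters = [[name] for name in spectra]
while len(clusters) > 2:
    i, j = min(
        ((a, b) for a in range(len(clusters)) for b in range(a + 1, len(clusters))),
        key=lambda ab: min(
            distance(spectra[p], spectra[q])
            for p in clusters[ab[0]] for q in clusters[ab[1]]
        ),
    )
    clusters[i] += clusters.pop(j)  # j > i, so index i stays valid

group = {name: k for k, members in enumerate(clusters) for name in members}

# Step 2: a depth-1 "decision tree" (a stump) on the descriptor.
THRESHOLD = 5  # midpoint between the observed coordination numbers

def predict_group(coord_number):
    # Predict the spectral group from the atomic structure alone.
    low_group = next(group[n] for n, c in coordination.items() if c <= THRESHOLD)
    return low_group if coord_number <= THRESHOLD else 1 - low_group
```

Here the clustering pass groups the four toy spectra into two families, and the stump then predicts the family from a structural descriptor alone, mirroring how a prediction method can work once a material's atomic configuration is known.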

The method was successfully applied to interpretation of complex spectra from two core-electron loss spectroscopy methods, energy-loss near-edge structure (ELNES) and X-ray absorption near-edge structure (XANES), and was also used to predict the spectral features when material information was provided. "Our approach has the potential to provide information about a material that cannot be determined manually and can predict a spectrum from the material's geometric information alone," says lead author Shin Kiyohara.

Importantly, the proposed machine learning method is not restricted to ELNES/XANES spectra and can be used to analyze any spectral data quickly and accurately without the need for specialist expertise. As a result, the method is expected to have wide applicability in fields as diverse as semiconductor design, battery development, and catalyst analysis.

Credit: 
Institute of Industrial Science, The University of Tokyo

Drought predictive of decrease in snakebites

Grant Lipman, MD, an emergency medicine physician at Stanford Medicine, was running through the brown hills behind the campus a few years ago during a severe drought when he came across a 3-foot-long rattlesnake lying by the trail. When a colleague mentioned he'd experienced a similar rattlesnake sighting, Lipman got to thinking.

"I wondered if there are more snakebites during droughts," said Lipman, clinical associate professor of emergency medicine at the Stanford School of Medicine, who routinely treats patients with venomous snakebites.

Lipman launched a study with a team of researchers to investigate the question. What they found defies conventional wisdom: The number of snakebites actually decreases after a drought but goes up after periods of rainy weather. The study also reported that the increase in weather extremes caused by climate change has a direct influence on snakebite incidence in California. This could be useful to help guide public health measures, such as determining the best allocation of antivenom supplies, the study said.

Lipman shares lead authorship with Caleb Phillips, PhD, adjunct assistant professor of computer science at the University of Colorado-Boulder. The senior author is Derrick Lung, MD, assistant clinical professor of medicine at the University of California-San Francisco. The study will be published Sept. 5 in Clinical Toxicology.

Little scientific evidence links drought to an increase in snakebites, and yet everyone seems to believe there's a connection, including emergency medicine providers, since that's what they're taught during training, Lipman said. A quick Internet search of headlines from the popular press confirmed this: "Deadly Snakebites Set to Skyrocket During Record-breaking Drought," reads one in 2018 in the Daily Mail. Another, "Snakes Cross Paths with Humans in Bay Area Due to Drought," was reported on ABCNews.com in 2015. The prevailing theory goes that snakes wander farther to forage for food during times of drought, plus they are simply more active in warmer weather.

"We set out to prove that, yes, there are more snakebites during high drought time especially since that's what we were taught," Lipman said.

20 years of snakebite data

In fact, what they found was the exact opposite. The researchers collected and examined 20 years of snakebite data from every phone call made to the California Poison Control System from 1997 to 2017. Details included the date and time of the bite; the patient's age and sex; where the bite occurred on the body; call site; treatment; and medical outcomes. Cases were also grouped by the callers' ZIP codes into one of California's 58 counties.

A total of 5,365 snakebites were reported, all of them from rattlesnakes. Five deaths were reported over the 20-year period. The median age of the patients was 37. They were most likely to be male, and the bites most often occurred at home in the backyard. The majority of bites occurred during the spring or summer and in counties dominated by shrub or scrub growth. Mariposa County topped the list with 96 bites per 1 million people. In Santa Clara County, where Stanford is located, there were 4 bites per 1 million people.
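The per-county incidence figures are simple normalizations of bite counts by population. A minimal sketch, with made-up counts and populations chosen only to show the arithmetic:

```python
# Incidence per million = bites / population * 1,000,000.
# Counts and populations below are invented purely for illustration.
bites = {"CountyA": 50, "CountyB": 8}
population = {"CountyA": 520_000, "CountyB": 2_000_000}

def bites_per_million(county):
    # Normalize the raw count so counties of different sizes compare fairly.
    return round(bites[county] / population[county] * 1_000_000)
```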

"The most common comment I usually hear from snakebite victims in the emergency room is: 'I was just minding my own business,'" Lipman said. "Usually, though, it's the snakes that were minding their own business, having a nice nap. It's people who tend to disturb them."

'More food, more snakes, more snakebites'

The study found that snakebite incidence decreased by 10 percent following a drought but increased by 10 percent following high levels of precipitation. The researchers developed their own theory: an increase in rain results in more shrub growth and, with that, an increase in rodents, the snakes' primary food source.

"More food, more snakes, more snakebites," Lipman said. "But that's just our theory."

After accounting for seasonal trends, researchers observed that precipitation was a strong predictor of snakebites. The numbers of bites peaked following the heavy precipitation years of 2006 and 2011, the study found.
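The core of that finding, a positive association between bite counts and the previous period's precipitation, can be sketched with invented numbers (the real analysis spanned 20 years of county-level data and adjusted for seasonal trends):

```python
# Toy lagged-correlation check: does rainfall in year t track bites in
# year t + 1? All numbers are invented for illustration.

def pearson(x, y):
    # Pearson correlation coefficient between two equal-length series.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

precip_year_t = [10.0, 30.0, 25.0, 8.0]   # rainfall index in year t
bites_year_t1 = [40,   62,   58,   38]    # reported bites in year t + 1

r = pearson(precip_year_t, bites_year_t1)  # strongly positive here
```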

"While we were writing this up, we were seeing all these catastrophic weather events around the world," Lipman said. "Massive droughts, powerful hurricanes and floods. We were seeing this global climate change, and we started looking at the recent worst California drought followed by the state's highest precipitation levels on record."

By looking at reports of the wettest and driest years during the 20-year period, researchers saw quite visible comparative trends across the state in all 58 counties. After adjusting for population, the researchers found that the incidence of snakebites fell during two periods of extreme drought, 2002-05 and 2007-10. During the 2015-16 drought, the most severe on record in California, the number of snakebites reached its nadir, the study said.

As weather grows increasingly extreme, it becomes ever more important to anticipate periods when snakebite incidence may be high, Lipman said.

"We can predict a big snakebite season because of prior wet winters and have antivenom in places where there are a lot of hikers or trail runners," Lipman said. "It's important information for people who work and play in California."

Reminding outdoorsy Californians of snakebite-prevention practices is also helpful, particularly when risks of snakebites might be high, Lipman said. Such practices include staying at least two snake-lengths away from rattlesnakes, which can strike fast. Lipman noted that because snakes don't have ears, stomping on the ground works best to scare them away, and that antivenom is effective (but expensive) and needs to be administered quickly to snakebite victims with signs of poisoning.

Credit: 
Stanford Medicine

Jet-air dryers should not be used in hospital toilets

Jet-air hand dryers in hospital toilets spread more germs than disposable paper towels and should not be used, say researchers.

Writing in the Journal of Hospital Infection, they argue that the official guidance about how to prevent bacterial contamination in hospital buildings needs to be strengthened.

At the moment, the official Department of Health guidance says air dryers can be placed in toilets in the public areas of a hospital but not in clinical areas: not because of the risks they pose for cross-contamination but because they are noisy.

Mark Wilcox, Professor of Medical Microbiology at the University of Leeds who supervised the international study, said the guidance needs to focus on the infection risks given new evidence.

The new study looked at bacterial spread in a real world setting - in two toilets in each of three hospitals, which were in the UK, France and Italy. Each of the toilets had paper towel dispensers and jet-air dryers, but only one of these was in use on any given day.

Professor Wilcox said: "The problem starts because some people do not wash their hands properly.

"When people use a jet-air dryer, the microbes get blown off and spread around the toilet room.

"In effect, the dryer creates an aerosol that contaminates the toilet room, including the dryer itself and potentially the sinks, floor and other surfaces, depending on the dryer design and where it is sited. If people touch those surfaces, they risk becoming contaminated by bacteria or viruses.

"Jet-air dryers often rely on no-touch technology to initiate hand drying. However, paper towels absorb the water and microbes left on the hands and if they are disposed of properly, there is less potential for cross-contamination."

The study, led by researchers from the University of Leeds and Leeds Teaching Hospitals Trust, was the largest of its type to investigate whether the way people dry their hands has an impact on the spread of bacteria.

This research follows a previous laboratory-based study led by the same team, which found that jet-air dryers were much worse than paper towels or traditional warm air hand dryers when it came to spreading germs.

The hospitals used in the study were the Leeds General Infirmary in Yorkshire, the hospital of Saint Antoine (Assistance Publique-Hôpitaux de Paris) in France, and the Hospital of Udine in Italy.

On each day, over 12 weeks, levels of bacterial contamination in the toilets were measured, allowing comparisons to be made when either paper towels or jet-air dryers were in use. Samples were taken from the floors, air and surfaces in each of the toilets.

The main target bacteria were:

Staphylococcus aureus: responsible for a range of conditions from minor skin and wound infections to life-threatening septicaemia.

Enterococci: bacteria that can cause difficult-to-treat infections, including in immunocompromised patients.

Enterobacteria: including Escherichia coli. These bacteria cause a wide range of infections, including gastroenteritis, pneumonia and septicaemia.

Across the three hospitals, bacterial counts were significantly higher in the toilets on the days that jet-air dryers were in use.

In Leeds and Paris, at least five times more bacteria were recovered from the floors when jet-air dryers were in use, compared with paper towels.
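A comparison like that fold-change figure reduces to dividing typical contamination levels between the two conditions. A minimal sketch with invented colony counts (the study itself used formal statistical testing across 12 weeks of samples):

```python
# Invented colony counts (CFU) from floor samples on days when the
# jet-air dryer was in use versus days when paper towels were in use.
dryer_days = [120, 200, 150, 300, 180]
towel_days = [20, 35, 25, 40, 30]

def median(values):
    # Median is preferred over the mean for skewed microbial counts.
    s = sorted(values)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

fold_change = median(dryer_days) / median(towel_days)  # dryer vs. towel
```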

In Leeds, Staphylococcus aureus (including MRSA) was found three times more often and in higher amounts on the surface of the jet-air dryers compared with the paper towel dispensers. Significantly more enterococci and multidrug resistant bacteria were recovered from either the floors or dust of toilets when the jet-air dryers rather than paper towels were in use.

In Italy, the researchers found significantly fewer bacteria on the surface of paper towel dispensers compared with the jet-air dryers, although no significant difference on the floors.

Professor Wilcox said: "We found multiple examples of greater bacterial contamination on surfaces, including by faecal and antibiotic-resistant bacteria, when jet-air dryers rather than paper towels were in use. Choice of hand drying method affects how likely microbes can spread, and so possibly the risk of infection."

Frédéric Barbut, Professor of Microbiology at Saint Antoine (Assistance Publique-Hôpitaux de Paris), said: "The higher environmental contamination observed when using jet-air dryers compared with paper towels increases the risk for cross-contamination.

"These results confirm previous laboratory-based findings and support the recent French guidelines regarding hand hygiene, which discourage using jet-air dryers in clinical wards."

Credit: 
University of Leeds

Nerve pain in the legs? Medical marijuana may alter brain connections, bring relief

MINNEAPOLIS - When medical marijuana is taken for chronic nerve pain, it may provide pain relief by reducing connections between the areas of the brain that process emotions and sensory signals, according to a study published in the September 5, 2018, online issue of Neurology®, the medical journal of the American Academy of Neurology. The study looked specifically at radicular pain, a type of nerve pain that radiates from the spine into the legs. Sciatica is a common form of radicular pain.

The component of marijuana examined in this study was tetrahydrocannabinol (THC), one of many cannabinoids found in marijuana and the one most commonly associated with producing a high.

"Pain is a complex experience that involves both the senses and emotions," said study author Haggai Sharon, MD, of the Sagol Brain Institute, Tel Aviv Medical Center in Israel. "Our study results link pain relief from THC with a reduction in the connections between areas of the brain otherwise heavily connected, suggesting that THC may alleviate pain by disrupting signals between these pain processing pathways."

The study involved 15 men with chronic radicular nerve pain with an average age of 33. Women were excluded since hormone fluctuations during menstruation may affect pain sensitivity. All participants had medium to high radicular pain for over six months.

Before treatment, participants rated their pain levels and had brain scans with functional magnetic resonance imaging (fMRI) to look at the connections between various areas of the brain. Participants were then given treatment with THC.

For the first visit, nine participants were given an average of 15 milligrams of THC oil placed under the tongue and six were given placebo oil. One hour after treatment, participants were questioned again, and had another brain scan approximately two hours after treatment.

At least one week later, participants returned for a second visit and those who had the placebo now received the treatment, and vice versa.

Researchers found that THC reduced a person's pain when compared to placebo. On a scale of zero to 100, before taking medication, participants on average rated their pain levels at 53. After taking THC oil, they rated their pain levels at an average of 35, compared with an average of 43 for those who were given the placebo.
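The crossover design boils down to comparing each participant's pain rating under THC against that same person's rating under placebo. A toy sketch, with invented ratings centered near the reported averages:

```python
# Invented 0-100 pain ratings for six participants under each condition,
# chosen so the group means land near the reported averages (35 vs. 43).
thc_ratings     = [30, 40, 33, 38, 35, 34]
placebo_ratings = [45, 44, 40, 46, 42, 41]

# Because every participant received both treatments, relief can be
# measured within-person as the placebo-minus-THC difference.
diffs = [p - t for t, p in zip(thc_ratings, placebo_ratings)]
mean_relief = sum(diffs) / len(diffs)
```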

In addition, the more pain relief a person experienced, the greater the reduction of connections between the areas of the brain involved in processing pain.

"Interestingly, our results also show that the more connected the areas of the brain that process emotion and sensory prior to treatment, the greater the pain relief experienced when taking THC," said Sharon. "Larger studies are needed to confirm our findings."

Limitations of the study are that women were excluded and the number of participants was small. Also, this study looked only at THC. Future studies are needed to examine how other components of the marijuana plant, like cannabidiol, may be useful in relieving pain in combination with THC.

Credit: 
American Academy of Neurology

People who embrace traditional masculinity beliefs less likely to report rape

BINGHAMTON, N.Y. - Even in cases where a rape has clearly taken place, traditional beliefs and assumptions about masculinity can cause both witnesses and victims to be uncertain about reporting it, according to new research conducted at Binghamton University, State University of New York.

In a study exploring possible reasons for the underreporting of rape, researchers at Binghamton University and SUNY Broome Community College had both male and female college students read a series of vignettes describing a clear incident of rape. In the different vignettes, which were randomly assigned, the rape was perpetrated by either a man against a woman, a man against a man, or a woman against a man. Afterwards, participants were asked to indicate how much blame they felt was attributable to the perpetrator or the victim, and then to consider, if they were the victim, how likely they would be to (1) tell people they know that the rape happened, or (2) report it to authorities.

Even in situations that were clearly rape, individuals often appeared to be on the fence about whether or not they would disclose the rape to others.

"In general, participants were ambivalent about disclosing that they had been sexually assaulted, even though they identifed the attack as a definite rape," said Binghamton University Associate Professor of Psychology Richard Mattson, corresponding author for the study. "The participants' gender role beliefs and sexual orientation, together with the sex of the perpetrator, seems to affect their attributions of blame, which could influence this tenuous decisional balance in ways that map onto patterns of underreporting in actual rapes."

The researchers found that male and heterosexual participants were more likely to blame victims and less likely to blame perpetrators, and were also less likely to disclose the rape if they were the victim. Endorsement of traditional beliefs and assumptions about men and masculinity seemed to be driving these associations.

"Regardless of gender (and sexual orientation), those who believed men should act more stereotypically masculine were less likely to either report a rape or disclose having been assaulted," said Mattson. "In part, this was because those endorsing such ideologies blamed victims more and minimized the responsibility of the perpetrators. However, the overall pattern of effects suggest a more complex picture in which different aspects of the masculine gender role might relate to underreporting for different reasons."

One surprising finding was that decisions to report a rape to authorities were more strongly tied to judgements about the perpetrator's actions than those of the victim.

"Regardless of how much blame a person placed on the victim for being raped, it was how they viewed the perpetrator, how much blame they assigned to them, that affected their likelihood to report the incident to authorities," said Mattson.

The study highlights the importance of continuing to explore and critically reflect on our enduring traditional beliefs about gender and how these beliefs shape our understanding of both sexual behaviors and sexual assault, said Mattson.

"Our findings suggest that challenging belief systems and cultural narratives about rape that exonerate perpetrators - particularly those related to gender and sexual orientation - may help to increase the reporting of rapes, which has implications for both public safety and the support and resources available to, and accessed by, victims of rape," said Mattson. "We hope these findings will serve to prevent the inadvertent and unjust blaming of victims while giving guilty perpetrators a pass."

Credit: 
Binghamton University

Marmosets serve as an effective model for non-motor symptoms of Parkinson's disease

image: Marmosets in the study were monitored with Fitbit-like devices.

Image: 
Texas Biomedical Research Institute

San Antonio, Texas (September 5, 2018) - Small, New World monkeys called marmosets can mimic the sleep disturbances, changes in circadian rhythm, and cognitive impairment people with Parkinson's disease develop, according to a new study by scientists at Texas Biomedical Research Institute.

By developing an effective animal model that can emulate both the motor and non-motor symptoms of Parkinson's disease, scientists have a better chance of understanding the molecular mechanisms of the neuro-circuitry responsible for changes in the brain during the course of the disease. Scans such as magnetic resonance imaging (MRI) and post-dissection analyses may point to potential targets for new therapies for patients.

Associate Scientist Marcel Daadi, Ph.D., leader of the Regenerative Medicine and Aging Unit at the Southwest National Primate Research Center on the Texas Biomed campus, is the lead author of the study that tracked marmosets using devices around their necks similar to Fitbits humans use to track their activity and sleep. The study was published in a recent edition of the journal PLOS ONE. In the case of the tiny monkeys, investigators wanted to see if the marmosets with induced classic Parkinson's motor symptoms - like tremors - could also serve as an effective model for non-motor symptoms. In addition, scientists videotaped the animals to monitor their ability to perform certain tasks and how those abilities were impacted over time by the disease.

"Most of the early studies in Parkinson's have been conducted with rodents," Dr. Daadi explained, "but there are some complex aspects of this disease you simply cannot investigate using rodents in a way that is relevant to human patients. Nonhuman primates are critical in his aspect because we can see these symptoms clearly whether it is the dyskinesia (abnormality or impairment of voluntary movements), or the sleep disturbances that you can monitor or the fine motors skills."

Parkinson's disease affects a million people in the United States and 10 million people worldwide. With the aging population, the incidence of the neurodegenerative disorder is on the rise. Some 60,000 people are diagnosed with Parkinson's each year in the U.S. alone. The hallmark symptoms of Parkinson's include tremors, slow movements, balance problems and rigid or stiff muscles. However, non-motor symptoms, including disorders of the sleep-wake cycle and problems thinking clearly, can be just as difficult for patients to handle.

"This study is a great first step," Dr. Daadi stated. "More studies are needed to expand on these non-motor symptoms in marmosets in the longer-term, and perhaps, include other nonhuman primates at the SNPRC like macaques and baboons."

Credit: 
Texas Biomedical Research Institute

Excessive airway nerves tied to more severe asthma symptoms, study finds

image: Researchers Matthew Drake, M.D. (left), and lab manager Emily Blum use a confocal microscope to generate three-dimensional imagery of airway nerves. Their research demonstrated that inflammatory cells can alter nerve structure in the lungs to cause asthma.

Image: 
Kristyna Wentz-Graff/OHSU

A new study implicates remodeling of nerves in the airways as a key contributor to heightened sensitivity and airway constriction in patients with asthma.

The study was published today in the journal Science Translational Medicine.

The results provide new insight into a little-understood factor in the development of asthma, a condition that affects about 235 million people worldwide. The study is the first to demonstrate that inflammatory cells can alter nerve structure in the lungs to cause disease.

Airway nerves sense inhaled particles, such as pollen and smoke, in the environment and help regulate airway constriction. In asthma, these nerves become more sensitive, causing patients to develop symptoms of wheezing and cough. Although previous research had shown that two-thirds of patients with asthma have an overabundance of a type of immune cell, called eosinophils, the effects of eosinophils on airway nerves were not fully understood.

To study airway nerves in asthma, researchers used state-of-the-art confocal microscopes at Oregon Health & Science University (OHSU) to generate three-dimensional imagery capturing a complete picture of airway nerves and their interactions with eosinophils.

"Picture the branches of trees in a forest," said lead author Matthew Drake, M.D., assistant professor of medicine (pulmonary and critical care medicine) in the OHSU School of Medicine in Portland, Oregon. "In previous studies, researchers could only visualize small sections of the branches, which meant you could never see the whole tree or how multiple trees fit together. With our new method, you can see both the forest and the trees."

Using this new 3-D method, Drake's team studied the length of nerves and how often they branch in the airways of healthy patients and in patients with asthma. They found that in asthma, airway nerves are denser.

"In essence, the trees are growing more branches," Drake said. "As a result of those changes, nerves are more easily irritated, which leads to exaggerated responses that constrict the airway."

The research also showed that having more eosinophils increased the likelihood of denser nerves, and that denser nerves were associated with more severe asthma symptoms.

"Changes in nerve structure are clearly tied to worse lung function in asthma," Drake said.

However, future studies are needed to determine whether these changes are preventable, or if this process is reversible once it is established, either by treating with currently available asthma drugs or by developing new medications, Drake said.

Credit: 
Oregon Health & Science University

MIT Energy Initiative study reports on the future of nuclear energy

image: Earth in space with atom and twinkling stars.

Image: 
Christine Daniloff / MIT

How can the world achieve the deep carbon emissions reductions that are necessary to slow or reverse the impacts of climate change? The authors of a new MIT study say that unless nuclear energy is meaningfully incorporated into the global mix of low-carbon energy technologies, the challenge of climate change will be much more difficult and costly to solve. For nuclear energy to take its place as a major low-carbon energy source, however, issues of cost and policy need to be addressed.

In "The Future of Nuclear Energy in a Carbon-Constrained World," released by the MIT Energy Initiative (MITEI) on Sept. 3, the authors analyze the reasons for the current global stall of nuclear energy capacity -- which currently accounts for only 5 percent of global primary energy production -- and discuss measures that could be taken to arrest and reverse that trend.

The study group, led by MIT researchers in collaboration with colleagues from Idaho National Laboratory and the University of Wisconsin at Madison, is presenting its findings and recommendations at events in London, Paris, and Brussels this week, followed by events on Sept. 25 in Washington, and on Oct. 9 in Tokyo.

MIT graduate and undergraduate students and postdocs, as well as faculty from Harvard University and members of various think tanks, also contributed to the study as members of the research team.

"Our analysis demonstrates that realizing nuclear energy's potential is essential to achieving a deeply decarbonized energy future in many regions of the world," says study co-chair Jacopo Buongiorno, the TEPCO Professor and associate department head of the Department of Nuclear Science and Engineering at MIT. He adds, "Incorporating new policy and business models, as well as innovations in construction that may make deployment of cost-effective nuclear power plants more affordable, could enable nuclear energy to help meet the growing global demand for energy generation while decreasing emissions to address climate change."

The study team notes that the electricity sector in particular is a prime candidate for deep decarbonization. Global electricity consumption is on track to grow 45 percent by 2040, and the team's analysis shows that the exclusion of nuclear from low-carbon scenarios could cause the average cost of electricity to escalate dramatically.

"Understanding the opportunities and challenges facing the nuclear energy industry requires a comprehensive analysis of technical, commercial, and policy dimensions," says Robert Armstrong, director of MITEI and the Chevron Professor of Chemical Engineering. "Over the past two years, this team has examined each issue, and the resulting report contains guidance policymakers and industry leaders may find valuable as they evaluate options for the future."

The report discusses recommendations for nuclear plant construction, current and future reactor technologies, business models and policies, and reactor safety regulation and licensing. The researchers find that changes in reactor construction are needed to usher in an era of safer, more cost-effective reactors, including proven construction management practices that can keep nuclear projects on time and on budget.

"A shift towards serial manufacturing of standardized plants, including more aggressive use of fabrication in factories and shipyards, can be a viable cost-reduction strategy in countries where the productivity of the traditional construction sector is low," says MIT visiting research scientist David Petti, study executive director and Laboratory Fellow at the Idaho National Laboratory. "Future projects should also incorporate reactor designs with inherent and passive safety features."

These safety features could include core materials with high chemical and physical stability and engineered safety systems that require limited or no emergency AC power and minimal external intervention. Features like these can reduce the probability of severe accidents occurring and mitigate offsite consequences in the event of an incident. Such designs can also ease the licensing of new plants and accelerate their global deployment.

"The role of government will be critical if we are to take advantage of the economic opportunity and low-carbon potential that nuclear has to offer," says John Parsons, study co-chair and senior lecturer at MIT's Sloan School of Management. "If this future is to be realized, government officials must create new decarbonization policies that put all low-carbon energy technologies (i.e. renewables, nuclear, fossil fuels with carbon capture) on an equal footing, while also exploring options that spur private investment in nuclear advancement."

The study lays out detailed options for government support of nuclear. For example, the authors recommend that policymakers should avoid premature closures of existing plants, which undermine efforts to reduce emissions and increase the cost of achieving emission reduction targets. One way to avoid these closures is the implementation of zero-emissions credits -- payments made to electricity producers where electricity is generated without greenhouse gas emissions -- which the researchers note are currently in place in New York, Illinois, and New Jersey.

Another suggestion from the study is that the government support development and demonstration of new nuclear technologies through the use of four "levers": funding to share regulatory licensing costs; funding to share research and development costs; funding for the achievement of specific technical milestones; and funding for production credits to reward successful demonstration of new designs.

The study includes an examination of the current nuclear regulatory climate, both in the United States and internationally. While the authors note that significant social, political, and cultural differences may exist among many of the countries in the nuclear energy community, they say that the fundamental basis for assessing the safety of nuclear reactor programs is fairly uniform, and should be reflected in a series of basic aligned regulatory principles. They recommend regulatory requirements for advanced reactors be coordinated and aligned internationally to enable international deployment of commercial reactor designs, and to standardize and ensure a high level of safety worldwide.

The study concludes with an emphasis on the urgent need for both cost-cutting advancements and forward-thinking policymaking to make the future of nuclear energy a reality.

Credit: 
MIT Energy Initiative

Losing just six hours of sleep could increase diabetes risk, study finds

Rockville, Md. (September 5, 2018)--Losing a single night's sleep may affect the liver's ability to produce glucose and process insulin, increasing the risk of metabolic diseases such as hepatic steatosis (fatty liver) and type 2 diabetes. The findings of the mouse study are published ahead of print in the American Journal of Physiology--Endocrinology and Metabolism. The research was chosen as an APSselect article for September.

Sleep deprivation has been associated with eating more, moving less and having a higher risk of developing type 2 diabetes. However, a team of researchers from Toho University Graduate School of Medicine in Japan explained, "It was not clear whether glucose intolerance was due to the changes in food intake or energy expenditure or to the sleep deprivation itself."

The researchers studied two groups of mice: One group was kept awake for six hours each night ("sleep deprivation"), while the control group was allowed to sleep as desired. The research team offered unlimited high-fat food and sugar water--mimicking lifestyle-related food choices that people make--to both groups prior to the study. During the sleep/wake period, the animals had limited opportunity for physical activity.

The researchers measured glucose levels and fat content of the liver immediately after the trial period. Blood glucose levels were significantly higher in the sleep deprivation group than controls after one six-hour session of wakefulness. Triglyceride (fat) levels and the production of glucose in the liver also increased in the sleep deprivation group after a single wake period. Elevated liver triglycerides are associated with insulin resistance, or the inability of the body to process insulin properly. In addition, lack of sleep changed the expression of enzymes that regulate metabolism in the liver in the sleep deprivation group. These findings suggest that "intervention studies designed to prevent sleep deprivation-induced hepatic steatosis and insulin resistance should be performed in the future," the researchers wrote.

Credit: 
American Physiological Society

A new exoplanet is discovered by an international team led by a young Canadian student

image: This is a size comparison of the Earth, Wolf 503b and Neptune. The color blue for Wolf 503b is imaginary; nothing is yet known about the atmosphere or surface of the planet.

Image: 
NASA Goddard/Robert Simmon (Earth), NASA / JPL (Neptune).

MONTREAL, September 5, 2018 -- Wolf 503b, an exoplanet twice the size of Earth, has been discovered by an international team of Canadian, American and German researchers using data from NASA's Kepler Space Telescope. The find is described in a new study whose lead author is Merrin Peterson, a graduate student at the Institute for Research on Exoplanets (iREx) who started her master's degree at Université de Montréal (UdeM) in May.

Wolf 503b is about 145 light years from Earth in the Virgo constellation; it orbits its star every six days and is thus very close to it, about 10 times closer than Mercury is to the Sun.

"The discovery and confirmation of this new exoplanet was very rapid, thanks to the collaboration that I and my advisor, Björn Benneke, are a part of," Peterson said. "In May, when the latest release of Kepler K2 data came in, we quickly ran a program that allowed us to find as many interesting candidate exoplanets as possible. Wolf 503b was one of them."

The program the team used identifies distinct, periodic dips that appear in the light curve of a star when a planet passes in front of it. In order to better characterize the system Wolf 503b is part of, the astronomers first obtained a spectrum of the host star at the NASA Infrared Telescope Facility. This confirmed the star is an old 'orange dwarf', slightly less luminous than the Sun but about twice as old, and allowed a precise determination of the radius of both the star and its companion.
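The dip-finding idea behind such a program can be illustrated with a toy phase-folding search. This is a simplified sketch on synthetic data, not the pipeline the team actually ran; the period, depth, and noise values below are invented for illustration:

```python
import numpy as np

# Toy phase-folding transit search: fold the light curve at a trial
# period and measure the deepest phase bin. A dip that repeats at the
# true period stacks up; at a wrong period it smears out.
rng = np.random.default_rng(0)
period, depth = 6.0, 0.01            # days; fractional transit depth
t = np.linspace(0.0, 80.0, 4000)     # observation times (days)
flux = 1.0 + 0.001 * rng.standard_normal(t.size)
flux[(t % period) < 0.2] -= depth    # inject ~0.2-day transits

def dip_strength(t, flux, trial_period, nbins=50):
    """Depth of the deepest phase bin below the median flux."""
    phase = (t % trial_period) / trial_period
    bins = (phase * nbins).astype(int)
    binned = np.array([flux[bins == b].mean() for b in range(nbins)])
    return np.median(flux) - binned.min()

# Folding at the true 6-day period gives a far stronger dip signal
# than folding at an arbitrary wrong period.
print(dip_strength(t, flux, 6.0) > dip_strength(t, flux, 4.3))  # True
```

Real searches use more sophisticated statistics (and must also reject false positives, as described below), but the core idea of aligning periodic dips is the same.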

To confirm the companion was indeed a planet and to avoid making a false positive identification, the team obtained adaptive optics measurements from Palomar Observatory and also examined archival data. With these, they were able to confirm that there were no binary stars in the background and that the star did not have another, more massive companion that could be interpreted as a transiting planet.

Wolf 503b is interesting, firstly, because of its size. Thanks to the Kepler telescope, we know that most of the planets in the Milky Way that orbit close to their stars are about as big as Wolf 503b, somewhere between the size of Earth and that of Neptune (which is 4 times bigger than Earth). Since there is nothing like them in our solar system, astronomers wonder whether these planets are small and rocky 'super-Earths' or gaseous mini versions of Neptune. One recent study also shows that there are significantly fewer planets between 1.5 and 2 times the size of Earth than those either smaller or larger than that. This drop, called the Fulton gap, could be what distinguishes the two types of planets from each other, researchers suggested in the 2017 study describing it.

"Wolf 503b is one of the only planets with a radius near the gap that has a star that is bright enough to be amenable to more detailed study that will better constrain its true nature," explained Björn Benneke, an UdeM professor and member of iREx and CRAQ. "It provides a key opportunity to better understand the origin of this radius gap as well as the nature of the intriguing populations of 'super-Earths' and 'sub-Neptunes' as a whole."

The second reason for interest in the Wolf 503b system is that the star is relatively close to Earth, and thus very bright. One of the possible follow-up studies for bright stars is the measurement of their radial velocity to determine the mass of the planets in orbit around them. A more massive planet will have a greater gravitational influence on its star, and the variation in line-of-sight velocity of the star over time will be greater. The mass, together with the radius determined by Kepler's observations, gives the bulk density of the planet, which in turn tells us something about its composition. For example, at its radius, if the planet has a composition similar to Earth's, it would have to be about 14 times the mass of Earth. If, like Neptune, it has an atmosphere rich in gas or volatiles, it would be approximately half as massive.
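The density arithmetic behind that comparison can be made concrete: in Earth units, bulk density scales as mass divided by radius cubed. The sketch below plugs in the rough masses quoted above, which are illustrative figures from the article, not measured values for Wolf 503b:

```python
# Bulk density of a planet from its mass and radius (Earth units).
# Illustrative only; the masses are the article's rough figures.
EARTH_DENSITY_G_CM3 = 5.51  # mean density of Earth

def bulk_density(mass_earths, radius_earths):
    """Bulk density in g/cm^3 for mass and radius in Earth units."""
    return EARTH_DENSITY_G_CM3 * mass_earths / radius_earths**3

radius = 2.0  # Wolf 503b is about twice Earth's radius
print(bulk_density(14.0, radius))  # rocky, Earth-like composition
print(bulk_density(7.0, radius))   # gas/volatile-rich, Neptune-like
```

A 14-Earth-mass planet at this radius would be denser than Earth itself, while the 7-Earth-mass case comes out well below Earth's density, which is why the mass measurement discriminates between the two compositions.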

Because of its brightness, Wolf 503 will also be a prime target for the upcoming James Webb Space Telescope. Using a technique called transit spectroscopy, it will be possible to study the chemical content of the planet's atmosphere and to detect the presence of molecules like hydrogen and water. This is crucial for verifying whether the planet's atmosphere is similar to that of Earth or Neptune, or completely different from the atmospheres of planets in our solar system.

Similar observations can't be made of most planets found by Kepler, because their host stars are usually much fainter. As a result, the bulk densities and atmospheric compositions of most exoplanets are still unknown.

"By investigating the nature of Wolf 503b, we'll understand more about the structure of planets near the radius gap and more generally about the diversity of exoplanets present in our galaxy," said Peterson. "I look forward to learning more about it."

Credit: 
University of Montreal

Globally, 1.4 billion adults at risk of disease from not doing enough physical activity

No improvement in global levels of physical activity since 2001.

Worldwide, around 1 in 3 women and 1 in 4 men do not do enough physical activity to stay healthy.

Levels of insufficient physical activity are more than twice as high in high-income countries as in low-income countries, and rose by 5 percentage points in high-income countries between 2001 and 2016.

The highest rates of insufficient activity in 2016 were found in adults in Kuwait, American Samoa, Saudi Arabia, and Iraq where more than half of all adults were insufficiently active. Comparatively, around 40% of adults in the USA, 36% in the UK, and 14% in China were insufficiently active.

More than a quarter (1.4 billion) of the world's adult population were insufficiently active in 2016, putting them at greater risk of cardiovascular disease, type 2 diabetes, dementia, and some cancers, according to the first study to estimate global physical activity trends over time. The study was undertaken by researchers from the World Health Organization (WHO) and published in The Lancet Global Health journal.

Together, these estimates demonstrate that there has been little progress in improving physical activity levels between 2001 and 2016. The data show that if current trends continue, the 2025 global activity target of a 10% relative reduction in insufficient physical activity will not be met.

"Unlike other major global health risks, levels of insufficient physical activity are not falling worldwide, on average, and over a quarter of all adults are not reaching the recommended levels of physical activity for good health," warns the study's lead author, Dr Regina Guthold of the WHO, Switzerland. [1]

In 2016, around one in three women (32%) and one in four men (23%) worldwide were not reaching the recommended levels of physical activity to stay healthy - that is, at least 150 minutes of moderate-intensity or 75 minutes of vigorous-intensity physical activity per week.
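That threshold can be expressed as a simple check. Counting one vigorous minute as two moderate minutes is the usual WHO equivalence and is an assumption here; the article itself states only the two separate thresholds:

```python
# "Sufficiently active" check based on the WHO thresholds quoted in
# the article: 150 moderate-intensity minutes per week, or 75
# vigorous-intensity minutes, with one vigorous minute counted as
# two moderate minutes (the usual WHO equivalence, assumed here).
def sufficiently_active(moderate_min_per_week, vigorous_min_per_week):
    equivalent_minutes = moderate_min_per_week + 2 * vigorous_min_per_week
    return equivalent_minutes >= 150

print(sufficiently_active(100, 30))  # True: 100 + 60 = 160 minutes
print(sufficiently_active(60, 30))   # False: 60 + 60 = 120 minutes
```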

The new study is based on self-reported activity levels, including activity at work and at home, for transport, and during leisure time, in adults aged 18 years and older from 358 population-based surveys in 168 countries, including 1.9 million participants.

Among the study's main findings were:

In 2016, levels of insufficient activity among adults varied widely across income groups - 16% in low-income countries compared to 37% in high-income countries.

In 55 (33%) of 168 countries, more than a third of the population was insufficiently physically active.

In four countries, more than half of adults were insufficiently active - Kuwait (67%), American Samoa (53%), Saudi Arabia (53%), and Iraq (52%).

Countries with the lowest levels of insufficient physical activity in 2016 were Uganda and Mozambique (6% each).

Women were less active than men in all regions of the world, apart from east and southeast Asia. In 2016, there was a difference in levels of insufficient activity between women and men of 10 percentage points or more in three regions: South Asia (43% vs 24%), Central Asia, Middle East and north Africa (40% vs 26%), and high-income Western countries (42% vs 31%).

Across regions, many individual countries recorded large differences in insufficient activity between women and men. Examples include Bangladesh (40% vs 16%), Eritrea (31% vs 14%), India (44% vs 25%), Iraq (65% vs 40%), Philippines (49% vs 30%), South Africa (47% vs 29%), Turkey (39% vs 22%), the USA (48% vs 32%), and the UK (40% vs 32%).

"Addressing these inequalities in physical activity levels between men and women will be critical to achieving global activity targets and will require interventions to promote and improve women's access to opportunities that are safe, affordable and culturally acceptable," said co-author Dr Fiona Bull from WHO, Geneva [1].

From 2001-2016, substantial changes in insufficient physical activity levels were recorded in multiple regions. Key findings include:

The regions with the highest increase in insufficient activity over time were high-income Western countries (from 31% in 2001 to 37% in 2016), and Latin America and the Caribbean (33% to 39%). Countries from these regions driving this trend include Germany, New Zealand, the USA, Argentina, and Brazil.

The region with the largest decrease in insufficient activity was east and southeast Asia (from 26% in 2001 to 17% in 2016), which was largely influenced by uptake of physical activity in China, the most populated country in the region.

There has been an increase of 5 percentage points in the prevalence of insufficient activity in high-income countries, from 32% in 2001 to 37% in 2016. In comparison, there has been an average rise of just 0.2 percentage points among low-income countries (16.0% to 16.2%).

In wealthier countries, the transition towards more sedentary occupations, recreation and motorised transport could explain the higher levels of inactivity, while in lower-income countries, more activity is undertaken at work and for transport, according to the authors. While declines in occupational and domestic physical activity are inevitable as countries prosper and technology use increases, governments must provide and maintain infrastructure that promotes walking and cycling for transport, as well as active sports and recreation.

"Regions with increasing levels of insufficient physical activity are a major concern for public health and the prevention and control of noncommunicable diseases (NCDs)," says Dr Guthold [1].

"Although a recent NCD policy survey showed that almost three quarters of countries report having a policy or action plan to tackle physical inactivity, few have been implemented to have national impact. Countries will need to improve policy implementation to increase physical activity opportunities and encourage more people to be physically active. Governments have recognized the need for action by endorsing the WHO Global Action Plan on Physical Activity (2018-2030)," says Dr Bull. [1][2]

The action plan, titled More active people for a healthier world, launched in June 2018, recommends a set of 20 policy areas, which, combined, aim to create more active societies through improving the spaces and places for physical activity as well as increasing programs and opportunities for people of all ages and abilities to do more walking, cycling, sport, active recreation, dance and play. The plan is a road map for the actions needed by all countries to reduce insufficient physical activity in adults and adolescents. [3]

The study's release comes ahead of the Third United Nations General Assembly High-level Meeting on NCDs and their risk factors, including physical inactivity, being held on 27 September 2018 in New York.

Writing in a linked Comment, Dr Melody Ding from the University of Sydney in Australia discusses the important policy implications of the study, and says: "The gender gap in physical activity, particularly in central Asia, Middle East and North Africa and South Asia reveals a health equity issue where women face more environmental, social and cultural barriers to participate in physical activity, particularly in their leisure time...Although high-income countries have a higher prevalence of insufficient physical activity, it is important to note that low- and middle-income countries still bear the larger share of the global disease burden of physical inactivity. Furthermore, economic development and urbanisation lead to lifestyle and epidemiological transitions, characterised by increasing prevalence of physical inactivity and subsequent burdens from chronic diseases, as observed in China and Brazil. While declines in occupational and domestic physical activity are inevitable, it is essential to incentivise transport and leisure-time physical activity in emerging economies through improving public and active transportation infrastructure, promoting social norms for physical activity through mass sports and school-level participation, and implementing sustainable programs at scale that could yield economic, environmental, and social co-benefits while promoting physical activity."

Credit: 
The Lancet

More hospital doctors are opting to retire early

Hospital doctors in England and Wales are increasingly choosing to take early retirement, according to figures released to The BMJ by the NHS Business Services Authority in response to a freedom of information request.

The number of hospital doctors claiming their NHS pension on voluntary early retirement grounds increased from 164 in 2008 to 397 in 2018. The number retiring on ill health grounds rose from 12 to 79 over the same period.

The figures also show a greater proportion of doctors claiming their pensions now do so on health grounds or are taking voluntary early retirement.

For example, in 2008, 14% of hospital doctors claiming their NHS pension took voluntary early retirement; in 2018, 27% of doctors did so. In 2008, 1% of retirements were on grounds of ill health; in 2018, 5% were.

The total number of hospital doctors choosing to take their pension (whether on grounds of age, voluntary early retirement, or ill health) rose from 1,205 in 2008 to 1,475 in 2018, representing a 22% increase over this period.

The total number of hospital doctors working in the NHS rose by 21% over the same period, and the proportion of doctors choosing to retire was 1.25% in 2008 and 1.27% in 2018.
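The percentage changes quoted in these figures follow from simple arithmetic, sketched below as a quick check of the retirement counts reported above:

```python
# Percentage change between two counts, used to check the figures
# quoted above (1,205 hospital-doctor retirements in 2008 vs 1,475
# in 2018).
def pct_change(old, new):
    return 100.0 * (new - old) / old

print(round(pct_change(1205, 1475)))  # 22 -> the reported 22% increase
```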

Earlier this year, The BMJ reported on figures from the NHS Business Services Authority showing that GPs are also increasingly choosing to take early retirement.

The number of GPs claiming their NHS pension on voluntary early retirement grounds increased from 198 in 2007-08 to 721 in 2016-17, and the number retiring on ill health grounds rose from 12 to 63 over the same period. Over the same period, the number retiring on age grounds fell from 944 in 2007-08 to 380 in 2016-17.

Credit: 
BMJ Group

Researchers outline game-theory approach to better understand genetics

Principles of game theory offer new ways of understanding genetic behavior, a pair of researchers has concluded in a new analysis appearing in the Journal of the Royal Society Interface. Their work opens the possibility of understanding biological processes, and specifically biochemistry, through a new scientific lens.

The exploration considers signaling game theory, which involves sender and receiver interactions with both seeking payoffs.

"The view of genes as players in a signaling game effectively animates genes and bestows simple utilities and strategies--thus, unique personalities--on them," explains Bhubaneswar "Bud" Mishra, a professor at NYU's Courant Institute of Mathematical Sciences, who co-authored the analysis with Steven Massey, an associate professor at the University of Puerto Rico. "In this view, the genome possesses characteristics of a molecular society, complete with deception, imitation, cooperation, and competition--not unlike human society. This adds a grandeur to a traditional view of life and the interactions it is made up of."

The researchers note the long history of signaling game theory across different fields.

"Signaling game theory was developed in economics and biology and has subsequently found applications in the design of smart contracts, privacy, identity systems and cybersecurity," Massey and Mishra write.

The Journal of the Royal Society Interface piece applies signaling game theory to biochemistry.

Massey and Mishra, relying on existing research, propose a novel view of biochemistry as a signaling game between genes and their associated macromolecules. Mathematically, this view models an interaction between a sender and a receiver -- both biological macromolecules -- where the sender has important information and signals the receiver to act.

For instance, the macromolecules signal their identity to other macromolecules that bind to them, which then undertake a biochemical reaction. The communication of identity opens the possibility of certain behaviors associated with humans--such as molecular "deception" occurring between gene players.

Of particular note, signaling game theory shows that deception is expected in situations where there is a conflict of interest between parties. In the case of biochemistry, this could be observed through the activity of "selfish" elements (e.g., transposons, which are DNA sequences that change their position within the genome), pathogens, and instances of genetic conflict between the sexes and between parents and their offspring.

"The evolution of the genetic code and many of the other major evolutionary transitions that led to present-day lifeforms may be linked to the evolution of signaling conventions between macromolecules, and the possibility of subversion by selfish entities or pathogens," explains Mishra. "Notably, the occurrence of molecular deception has led to the evolution of mechanisms of 'molecular sanctioning' to control the offending behavior."

"Molecular sanctioning" is a novel concept developed by Mishra and Massey, derived from game theory, that describes the punishment of gene players that display "antisocial" behavior detrimental to the genome as a whole.

Credit: 
New York University

Hormone therapy can make prostate cancer worse, study finds

LOS ANGELES (Sept. 4, 2018) -- Scientists at Cedars-Sinai have discovered how prostate cancer can sometimes withstand and outwit a standard hormone therapy, causing the cancer to spread. Their findings also point to a simple blood test that may help doctors predict when this type of hormone therapy resistance will occur.

Prostate cancer is the second-leading cause of cancer death in men, behind lung cancer, killing nearly 30,000 in the U.S. each year, according to the American Cancer Society. In its early stages, the most common type, adenocarcinoma, is curable and generally responds well to therapies, including those that target androgen - a male sex hormone that stimulates tumor growth.

However, in certain patients, the cancer becomes resistant to androgen-targeted therapy, and the cancer recurs or spreads. One possible reason for that resistance, the study indicated, is that the therapy causes some adenocarcinoma cells to become neuroendocrine cancer-type cells - a rare type that normally appears in fewer than 1 percent of prostate cancer patients.

"This transformation is a problem because neuroendocrine prostate cancer is especially aggressive, metastasizes more readily and is more resistant to both androgen-targeted therapy and chemotherapy," said Neil Bhowmick, PhD, co-director of the Cancer Biology Program at the Samuel Oschin Comprehensive Cancer Institute at Cedars-Sinai. He is senior author of the study, published in the Journal of Clinical Investigation, and Rajeev Mishra, PhD, former project scientist in his laboratory, is the lead author.

Bhowmick said about one-fourth of the patients who receive androgen-targeted therapy may relapse with tumors that show features of neuroendocrine prostate cancer and develop treatment-resistant disease, according to published research.

To learn more about this process, the investigators examined how cancer cells interact with the supporting cells near the tumor, referred to as the tumor microenvironment, in laboratory mice. They found these interactions raised the level of the amino acid glutamine, turning the supporting cells into "factories" that supplied fuel for the cancer cells.

"While glutamine is known to spur cancer growth, its role in prostate cancer cells to trigger reprogramming of adenocarcinoma cells into neuroendocrine cancer cells is a new and important finding," said Roberta Gottlieb, MD, professor of Medicine and vice chair of translational medicine in the Department of Biomedical Sciences at Cedars-Sinai. Gottlieb was a co-author of the study.

The team also examined how androgen-targeted therapy affected the cancer microenvironment.

"To our surprise, we found this type of therapy further changed the cellular environment in a way that caused adenocarcinoma cells in the prostate to transform into neuroendocrine cancer-type cells," said Bhowmick, professor of Medicine and Biomedical Sciences.

As the final step in validating the findings in mice, investigators compared levels of glutamine in the plasma of small groups of patients - one with treatment-responsive prostate cancer and the other with treatment-resistant prostate cancer. They found that levels of glutamine were higher in the second group.

This finding has potential implications for treating prostate cancer patients, said Edwin Posadas, MD, co-director of the Translational Oncology Program at the cancer institute and associate professor and clinical chief of the Division of Hematology/Oncology in the Department of Medicine at Cedars-Sinai.

"The study raises the possibility that a simple blood test measuring glutamine might be able to pinpoint when androgen-targeted therapy is failing in a prostate cancer patient and even predict when therapy resistance will occur," said Posadas, who co-authored the study. He said the team is designing a new study to test this hypothesis.

Credit: 
Cedars-Sinai Medical Center

Study provides 10-year risk estimates for dementia, which may help identify high-risk individuals who could benefit from early targeted prevention

A Danish study provides 10-year absolute risk estimates for dementia specific to age, sex and common variation in the APOE gene, which may help identify high-risk individuals who potentially could benefit from early targeted prevention. The study is published in CMAJ (Canadian Medical Association Journal).

Dementia is a major cause of disability in older adults worldwide, yet no effective treatment is currently available. Reduction of risk factors for dementia may have the potential to delay or prevent development of the disease. Age, sex and common variation in the APOE gene identify high-risk individuals with the greatest potential to benefit from targeted interventions to reduce risk factors.

The apolipoprotein E (APOE) protein is key for metabolizing cholesterol and for clearing β-amyloid protein from the brain in individuals with Alzheimer disease.

"Recently, it was estimated that one-third of dementia most likely can be prevented. According to the Lancet Commission, early intervention for hypertension, smoking, diabetes, obesity, depression and hearing loss may slow or prevent disease development. If those individuals at highest risk can be identified, a targeted prevention with risk-factor reduction can be initiated early before disease has developed, thus delaying onset of dementia or preventing it," says Ruth Frikke-Schmidt, professor at the University of Copenhagen, and at the Department of Clinical Biochemistry, Rigshospitalet, Copenhagen, Denmark.

The study looked at data on 104,537 people in Copenhagen, Denmark, and linked it to diagnoses of dementia. Researchers found that a combination of age, sex and a common variation in the APOE gene could identify high-risk groups, with a 7% risk for women and 6% risk for men in their 60s; a 16% and 12% risk, respectively, for people in their 70s; and a 24% and 19% risk, respectively, for those aged 80 years and older.
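The age- and sex-specific estimates reported here amount to a small lookup table. The sketch below records them as quoted in the article; it is a summary for illustration, not the study's underlying risk model:

```python
# 10-year absolute dementia risk (%) by sex and age group, using the
# figures quoted in the article (not the study's full model, which
# also incorporates APOE genotype).
RISK_PCT = {
    ("women", "60s"): 7,  ("men", "60s"): 6,
    ("women", "70s"): 16, ("men", "70s"): 12,
    ("women", "80+"): 24, ("men", "80+"): 19,
}

def ten_year_risk(sex, age_group):
    """Look up the quoted 10-year absolute risk, in percent."""
    return RISK_PCT[(sex, age_group)]

print(ten_year_risk("women", "70s"))  # 16
print(ten_year_risk("men", "80+"))    # 19
```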

The generalizability of the study is limited as it included only people of white European background.

"The present absolute 10-year risk estimates of dementia by age, sex and common variation in the APOE gene have the potential to identify high-risk individuals for early targeted preventive interventions," the authors conclude.

Credit: 
Canadian Medical Association Journal