
Printing flattens polymers, improving electrical and optical properties

video: The video shows twisted polymers transitioning to a liquid crystal phase and forming polymer helices, which can be flattened by the flow of printing.

Image: 
Ying Diao

CHAMPAIGN, Ill. -- Researchers have found a way to use polymer printing to stretch and flatten twisted molecules so that they conduct electricity better. A team led by chemical and biomolecular engineers from the University of Illinois reports its findings in the journal Science Advances.

Conjugated polymers are formed by the union of electron-rich molecules along a backbone of alternating single and double chemical bonds. This conjugation allows electricity to travel very quickly through a polymer, making it highly desirable for use in electrical and optical applications. This mode of transporting charges works so well that conjugated polymers are now poised to compete with silicon materials, the researchers said.

However, these polymers tend to contort into twisted spirals when they join, severely impeding charge transport.

"The flatness or planarity of a conjugated polymer plays a large role in its ability to conduct electricity," said chemical and biomolecular engineering professor Ying Diao, who led the study. "Even a slight twist of the backbone can substantially hinder the ability of the electrons to delocalize and flow."

It is possible to flatten conjugated polymers by applying an enormous amount of pressure or by manipulating their molecular structure, but both techniques are very labor-intensive, Diao said. "There really is no easy way to do this."

Postdoctoral researcher Kyung Sun Park and graduate student Justin Kwok noticed something while running printing experiments and flow simulations in Diao's lab. Polymers go through two distinct phases of flow during printing: The first phase occurs when capillary action pulls on the polymer ink as it begins to evaporate, and the second phase is the result of the forces imposed by the printing blades and substrate, the researchers said.

"Park and Kwok uncovered another phase that occurs during printing in which the polymers appear to have vastly different properties," Diao said. "This third phase occurs in between the two already-defined phases, and shows the polymers being stretched into planar shapes."

Not only are the polymers stretched and flattened in this third phase, but they also remain that way after precipitating out of solution, Diao said, making it possible to fine-tune printer settings to produce conjugated polymers for use in new, faster biomedical devices and flexible electronics.

"We are discovering a whole zoo of new polymer phases, all sensitive to the forces that take place during the printing process," Diao said. "We envision that these unexplored equilibria and flow-induced phases will ultimately translate into new conjugated polymers with exciting optoelectronic properties."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Women gain more political and economic power, but gender gap persists

While women have made great strides in entering the workforce, running companies and getting elected to Congress, there remains a persistent gender gap in attitudes about equality between men and women, suggests a University of California, Davis, study.

Although the last half of the 1900s saw much progress, the trajectory of attitudes about gender equality slowed in recent decades as men began to work longer hours and take on increased responsibilities to get ahead at work, nudging their wives into more traditional roles at home. This influence on negative attitudes toward women's work was widespread and not limited to the women and men directly affected. The decline in egalitarian attitudes was starkest among highly educated, high-salaried white people in professional jobs, researchers said.

The phenomenon, in which men work 50 or more hours per week, is called "men's overwork."

The study will be presented on Saturday, Aug. 10, at the American Sociological Association Annual Meeting in New York. The paper, "Persistent Gender Gap, Shifting Racial & Educational Differentiations: Variations in Structural Influences on Gender Attitudes, 1977-2016," was written by Kelsey D. Meagher, a postdoctoral scholar, and Xiaoling Shu, a professor of sociology.

"Men's overwork depressed gender egalitarianism among the whole population, but had the strongest effect among the college-educated, who were more proximate to the rising demands of overwork among professional occupations," the researchers said. "We show how labor market dynamics work hand-in-hand with social classifications by race, gender, and education in shaping the contour of American gender ideology in the last four decades."

A look at changes in the labor market

Meagher and Shu looked at data from the General Social Survey for the years 1977-2016, seeking to learn how changes in the labor market affected the attitudes of different demographic groups. The analytic sample consists of more than 26,000 U.S. individuals, men and women, black and white, ages 16 and older; single or married, with or without children; and of various political ideologies and religions. Latinx, Asian and other races were not included in the research because of inadequate sample sizes. Participants answered a series of questions about their attitudes toward women in politics, the workplace, men's and women's social roles, and other issues.

Researchers found that college-educated respondents report more egalitarian attitudes than less-educated respondents. The researchers also found a racial gap in attitudes about gender equality, with white people holding more conservative gender attitudes than their black peers.

Women, across the board, held more egalitarian attitudes than men.

This gender gap has persisted in the last four decades despite large societal transformations in the United States, while education and racial gaps fluctuated with contextual economic forces.

Support for gender equality increased in each decade of the survey, with a slowdown only in the 1990s -- when men's overwork increased -- and an upward trend resuming in the 2000s. This "stalled revolution" in the '90s affected all demographic groups. The last year of the survey, 2016, showed the highest levels of egalitarian attitudes for almost every group except blacks, whose egalitarianism peaked in 2014.

Credit: 
University of California - Davis

Novel dual stem cell therapy improves cardiac regeneration

image: This is a schematic diagram of the underlying mechanism of dual treatment approach of hiPSC-CMs and hMSC-patch.

Image: 
Photo source: Nature Communications (https://www.nature.com/articles/s41467-019-11091-2)

As a medical emergency caused by severe cardiovascular disease, myocardial infarction (MI) can inflict permanent and life-threatening damage on the heart. A joint research team comprising scientists from City University of Hong Kong (CityU) has recently developed a multipronged approach for concurrently rejuvenating both the muscle cells and the vascular systems of the heart by utilizing two types of stem cells. The findings offer hope of a new treatment for repairing MI-damaged hearts, as an alternative to the complex and risky heart transplants currently required for seriously ill patients.

MI is a fatal disorder caused by a shortage of coronary blood supply to the myocardium. It leads to permanent loss of heart muscle cells (cardiomyocytes, CMs) and to scar tissue formation, resulting in irreversible damage to cardiac function or even heart failure. With limited therapeutic options for severe MI and advanced heart failure, a heart transplant is the last resort, but it is very risky, costly and limited by the scarcity of suitable donors. Therefore, stem cell-based therapy has emerged as a promising therapeutic option.

Dr Ban Kiwon, a stem cell biologist from the Department of Biomedical Sciences at CityU, has been focusing on developing novel stem cell-based treatments for cardiac regeneration. "The heart is an organ composed of cardiac muscles and blood vessels, where vessels are essential to supply oxygen and energy to the muscles. Since both cardiac muscles and vasculatures are severely damaged following MI, therapeutic strategies should focus on comprehensive repair of both at the same time. But so far the strategies have focused on only one or the other," he explains.

In this regard, Dr Ban and his collaborators, including researchers from Konkuk University, The Catholic University of Korea, Pohang University of Science and Technology, and T&R Biofab in South Korea, have recently developed a multipronged approach. It aimed to concurrently rejuvenate both the heart muscles and the vasculatures by utilizing two major types of stem cells, namely human bone marrow derived mesenchymal stem cells (hMSCs) and cardiomyocytes derived from human induced pluripotent stem cells (hiPSC-CMs).

The hMSCs were employed in the study for their prominent paracrine activity, secreting proteins that promote the regeneration of blood vessels and endothelial cell survival. The hiPSC-CMs were utilized for their similarities to human primary CMs in terms of the expression of cardiac-specific genes, structural proteins and ion channels and, more importantly, their spontaneous contraction.

First study of two distinct stem cell effects for cardiac repair

While several previous studies described the beneficial effects of either hiPSC-CMs or hMSCs on MI separately, this is the first study to simultaneously examine the effects of these two distinct stem cell types in cardiac repair. The researchers adopted a dual approach, in which the hMSCs and the hiPSC-CMs were delivered via two distinct routes: the hiPSC-CMs were injected intramyocardially, directly into the border zone of the rat's heart, while the hMSC-loaded patch was implanted on top of the infarct area, like a bandage.

The results showed that this dual approach led to a significant improvement of cardiac function and enhanced vessel formation in the MI heart. The implanted hMSC-loaded patch not only provided a micro-environment that enhanced vascular regeneration, as expected, but also improved the retention of hiPSC-CMs, ultimately augmenting heart function and restoring the injured myocardium.

Moreover, histological analyses demonstrated that implantation of the hMSC-loaded patch promoted the functional maturation of the injected hiPSC-CMs: they became more elongated and rectangular in shape and more orderly organized, typical morphological characteristics of mature adult CMs. Functional maturation of intramyocardially injected hiPSC-CMs is particularly important because it can reduce the potential risk of arrhythmias, meaning irregular heart contractions, a major cause of sudden cardiac death.

Application potential in cardiac regeneration and beyond

"We believe this novel dual approach can potentially provide translational and clinical benefit to the field of cardiac regeneration," said Dr Ban. "Based on the same principle, the protocol may also be utilized for repairing other organs including brain, liver and pancreas in which multiple types of stem cells are co-existing." The team is working on follow-up studies in larger animal model such as pigs. The patent application has been submitted.

The research findings were published in the scientific journal Nature Communications under the title "Dual stem cell therapy synergistically improves cardiac function and vascular regeneration following myocardial infarction". Dr Ban, together with Dr Moon Sung-Hwan from Konkuk University School of Medicine and Professor Park Hun-Jun from The Catholic University of Korea, are the corresponding authors of the paper. The first authors are Park Soon-Jung from Konkuk University School of Medicine, Kim Ri Youn and PhD student Lee Sunghun from CityU, and Park Bong-Woo from The Catholic University of Korea.

Credit: 
City University of Hong Kong

Despite temperature shifts, treehoppers manage to mate

image: Kasey Fowler-Finn, Ph.D., assistant professor of biology at Saint Louis University

Image: 
Saint Louis University / Ellen Hutti

ST. LOUIS - During the mating season, male treehoppers--small plant feeding insects--serenade potential mates with vibrational songs sent through plant stems. If a female treehopper's interest is sparked, a male-female duet ensues until mating occurs.

Scientists know that there is a thermal window within which the half-centimeter-long insects are active, and temperature shifts can throw this delicate coordination off. For example, the songs produced by males to attract mates vary with temperature. At some temperatures, male treehoppers can even sound like different species, potentially confusing females, who use these songs to pick a good mate.

A Saint Louis University research team wanted to know if temperature variation, which is increasing with global warming, could have a disruptive effect on the insects' reproduction.

To find out, the team studied how temperature variation affects male vibrational songs and female preferences for these songs in an experiment recently published in The Journal of Evolutionary Biology.

Led by Kasey Fowler-Finn, Ph.D., assistant professor of biology at Saint Louis University, the team tested four groups of Enchenopa binotata treehoppers, each from a different location. They measured the frequency (i.e. pitch) of male vibrational songs and the frequency most preferred by females across a range of temperatures from 18 to 36 degrees Celsius.

The results showed a strong temperature effect on both male signals and female preferences with changes in male signals across temperatures being matched by similar changes in the songs that females prefer. Because the male and female insects responded to temperature shifts together, changes in temperature did not significantly influence predicted patterns of female mate choice. Thus, it seems unlikely that thermal sensitivity in male songs will disrupt mating as temperatures shift.

Fowler-Finn is encouraged by the treehoppers' resilience.

"At a time when we are increasingly concerned about how global warming will influence animals, these findings provide some hope that treehoppers will persist in the face of change."

In addition to their research on global warming's potential impact on insects and ecosystems, the study team is also partnering with the Saint Louis University Museum of Art (SLUMA) to help educate the public about the vital role and fascinating attributes of vibrationally singing insects through a sound installation exhibit opening to the public on October 25.

Other researchers on the paper include Dowen Mae I. Jocson, Morgan E. Smeester, Noah T. Leith and Anthony Macchiano.

The study was funded by the National Science Foundation (grant number IOS-1656818).

The paper's digital object identifier is: https://doi.org/10.1111/jeb.13506

Key Take-aways

Male treehoppers serenade potential mates with vibrational songs sent through plant stems and if female treehoppers' interest is sparked, male-female duets ensue until mating occurs.

A Saint Louis University research team wanted to know if temperature variation, which is increasing with global warming, could disrupt the insects' reproduction through its effects on male songs.

The research team tested four groups of Enchenopa binotata treehoppers, measuring the frequency of male signals and the frequency most preferred by females across a range of temperatures from 18 to 36 degrees Celsius.

Though the results showed a strong temperature effect on both male signals and female preferences, changes in male signals across temperatures were matched by changes in female preferences. Because the male and female insects both responded to temperature shifts together, changes in temperature did not significantly influence female mate choice or disrupt mating.

Credit: 
Saint Louis University

Chicago water pollution may be keeping invasive silver carp out of Great Lakes, study says

URBANA, Ill. - Invasive silver carp have been moving north toward the Great Lakes since their accidental release in the 1970s. The large filter-feeding fish, which are known to jump from the water and wallop anglers, threaten aquatic food webs as well as the $7 billion Great Lakes fishery. But, for the past decade, the invading front hasn't moved past Kankakee. A new study, led by scientists at the University of Illinois, suggests that Chicago's water pollution may be contributing to this lack of upstream movement.

"It's a really toxic soup coming down from the Chicago Area Waterway, but a lot of those chemicals go away near Kankakee. They might degrade or settle out, or the Kankakee River might dilute them. We don't really know what happens, but there's a stark change in water quality at that point. That's right where the invading front stops," says Cory Suski, associate professor in the Department of Natural Resources and Environmental Sciences and co-author of the study. "And this fish never stops for anything."

The researchers think the fish stall out at Kankakee because they are responding negatively to compounds in the water flowing downstream from Chicago. They formulated their hypothesis after reading a 2017 water quality report from the U.S. Geological Survey. USGS researchers tracked changes in water chemistry in a single pocket of water as it moved from Chicago downstream through the Illinois River. Right near Kankakee, many of the pharmaceuticals, volatile organic compounds, and wastewater indicators dropped off the charts.

Suski says many of these compounds have been shown in other studies to induce avoidance behaviors in fish, but his team didn't look at behavior. Instead, they examined gene expression patterns in blood and liver samples from silver carp at three locations along the Illinois River: at Kankakee, approximately 10 miles downstream near Morris, and 153 miles downstream near Havana.

"We saw huge differences in gene expression patterns between the Kankakee fish and the two downstream populations," Suski explains. "Fish near Kankakee were turning on genes associated with clearing out toxins and turning off genes related to DNA repair and protective measures. Basically, their livers are working overtime and detoxifying pathways are extremely active, which seem to be occurring at the cost of their own repair mechanisms. We didn't see that in either of the downstream populations."

Suski stresses that his study wasn't designed to demonstrate a cause-and-effect relationship between water pollution and silver carp movement, but the results hint at a compelling answer to a decade-old mystery. The researchers hope to follow up to show how the fish are metabolizing the pollutants, which will give them a better understanding of which compounds are having the biggest effects. Right now, it's a black box - the USGS study documented approximately 280 chemicals in the Chicago Area Waterway and downstream sites.

Regardless of which specific pollutants may be responsible for stopping silver carp - if that hypothesis is later proven - the results could have interesting implications for management.

"We're not saying we should pollute more to keep silver carp out of the Great Lakes. That's not it," Suski says. "Right now, things are stable, but that might not always be the case. There's a lot of work in Chicago to clean up the Chicago Area Waterway. Already, water quality is improving, fish communities are getting healthier. Through the process of improving the water quality, which we should absolutely be doing, there's a possibility that this chemical barrier could go away. We don't need to hit the panic button yet, but at least we should be aware."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Ultracold quantum particles break classical symmetry

image: An expanding cloud of quantum particles violates the scaling symmetry.

Image: 
Heidelberg University

Many phenomena of the natural world exhibit symmetries in their dynamic evolution that help researchers better understand a system's inner mechanism. In quantum physics, however, these symmetries are not always realized. In laboratory experiments with ultracold lithium atoms, researchers from the Center for Quantum Dynamics at Heidelberg University have proven for the first time the theoretically predicted deviation from classical symmetry. Their results were published in the journal "Science".

"In the world of classical physics, the energy of an ideal gas rises proportionally with the pressure applied. This is a direct consequence of scale symmetry, and the same relation is true in every scale invariant system. In the world of quantum mechanics, however, the interactions between the quantum particles can become so strong that this classical scale symmetry no longer applies", explains Associate Professor Dr Tilman Enss from the Institute for Theoretical Physics. His research group collaborated with Professor Dr Selim Jochim's group at the Institute for Physics.

In their experiments, the researchers studied the behaviour of an ultracold, superfluid gas of lithium atoms. When the gas is moved out of its equilibrium state, it starts to repeatedly expand and contract in a "breathing" motion. Unlike classical particles, these quantum particles can bind into pairs and, as a result, the superfluid becomes stiffer the more it is compressed. The group headed by primary authors Dr Puneet Murthy and Dr Nicolo Defenu - colleagues of Prof. Jochim and Dr Enss - observed this deviation from classical scale symmetry and thereby directly verified the quantum nature of this system. The researchers report that this effect gives a better insight into the behaviour of systems with similar properties such as graphene or superconductors, which have no electrical resistance when they are cooled below a certain critical temperature.

Credit: 
Heidelberg University

Health effects of eating marijuana are the subject of a new study

INDIANAPOLIS -- Researchers have conducted a study in which mice voluntarily ate a dough containing THC, the primary psychoactive component in marijuana. That opens the door to additional studies that will help shed light on behavioral and physiological effects that occur in people when they eat food infused with marijuana.

The study is among the first to report on voluntary oral THC consumption in animals, a method of consumption that is similar to the way humans take the drug.

In a recently published paper in Drug and Alcohol Dependence, researchers at IUPUI and Indiana University Bloomington said they found the mice were less active, and their body temperatures were lower, after consuming the edible THC.

The researchers also noted that the effects of edible THC varied based on the subject's sex, said Michael Smoker, first author of the paper and an addiction neuroscience Ph.D. candidate in the lab of professor Stephen Boehm in the psychology department at IUPUI. The addiction neuroscience graduate program is a Purdue University program at IUPUI.

The study showed that mice will self-administer -- or voluntarily choose to consume -- behaviorally effective doses of edible THC, and do so repeatedly, Smoker said. The mice were given gradually increasing doses in a dough made from flour, sugar, salt, glycerol and THC.

Understanding the health effects of eating marijuana edibles is important, given the growing popularity of that method of consumption in states where marijuana has been legalized, Smoker said.

"People can buy cookies, candies and all sorts of things with THC in them. Back in the day, you had to make your own brownies, or something like that, and now they are becoming more widely available and increasing in popularity," he said.

Marijuana edibles can elicit extreme, adverse reactions, Smoker said. Many commercially made marijuana-based products have a higher concentration of THC than marijuana plant material. In some cases, people are unsure how much of a marijuana edible they should eat and end up eating more than they should.

Questions researchers want to answer include the impact of edibles on people's ability to think, whether there are any long-term consequences for someone who has been eating edibles repeatedly and then stops, and what the consequences are, if any, of a child accidentally eating a marijuana edible, Smoker said.

Researchers turned to mice to answer questions about edible forms of THC because of the ethical barriers to using humans in such studies and the lack of control over human subjects' prior exposure to THC and other drugs.

Mice have been used in studies previously to study the effects of marijuana, but figuring out a way for them to self-administer the drug, as humans do, has been notoriously difficult, Smoker said.

Credit: 
Indiana University

Turbulence meets a shock

image: A new theoretical framework was developed and tested using the Stampede2 supercomputer to understand turbulent jumps of mean thermodynamic quantities, shock structure and amplification factors. Turbulence comes in from the left in this image, hits the shock, and leaves the domain at the right. This three-dimensional picture shows the structure of enstrophy, colored by local Mach number, with the shock in gray.

Image: 
Chang-Hsin Chen, TAMU.

This may come as a shock, if you're moving fast enough. The shock in question: shock waves. A balloon's 'pop' comes from shock waves generated by exploded bits of the balloon moving faster than the speed of sound. Supersonic planes generate a much louder sonic 'boom,' also from shock waves. Farther out in the cosmos, a collapsing star generates shock waves from particles racing near the speed of light as the star goes supernova. Scientists are using supercomputers to better understand turbulent flows that interact with shock waves. This understanding could help develop supersonic and hypersonic aircraft and more efficient engine ignition, as well as probe the mysteries of supernova explosions, star formation, and more.

"We proposed a number of new ways in which shock turbulence interactions can be understood," said Diego Donzis, an associate professor in the Department of Aerospace Engineering at Texas A&M University. Donzis co-authored the study, "Shock-Turbulence Interactions at High Turbulence Intensities," published May of 2019 in the Journal of Fluid Mechanics. "We proposed that, instead of treating the shock as a discontinuity, one needs to account for its finite thickness as in real life which may be involved as a governing parameter in, for example, amplification factors," Donzis said.

The dominant theoretical framework for shock-turbulence interactions goes back to the 1950s, developed by Herbert Ribner while at the University of Toronto in Ontario. His work described the interaction of turbulence and shocks with a linear, inviscid theory, which assumes the shock to be a true discontinuity. The entire problem can thus be reduced to something mathematically tractable, where the results depend only on the shock's Mach number, the ratio of a body's speed to the speed of sound in the surrounding medium. As turbulence goes through the shock, it is typically amplified, depending on the Mach number.

Experiments and simulations by Donzis and colleagues suggested this amplification depends also on the Reynolds Number, a measure of how strong the turbulence is, and the turbulent Mach number. "We proposed a theory that combined all of these into a single parameter," Donzis said. "And when we proposed this theory a couple of years ago, we didn't have well-resolved data at very high resolution to test some of these ideas."
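For readers keeping track of the quantities involved, the definitions are standard conventions in compressible turbulence (general definitions, not expressions reproduced from the paper):

    M = \frac{U}{c}, \qquad M_t = \frac{u'}{c}, \qquad Re_\lambda = \frac{u'\,\lambda}{\nu},

where U is the mean upstream flow speed, c the speed of sound, u' the root-mean-square turbulent velocity fluctuation, λ the Taylor microscale and ν the kinematic viscosity. The single governing parameter Donzis mentions combines these three numbers; its precise form is given in the group's earlier papers.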

Enter Stampede2, an 18 petaflop supercomputer at the Texas Advanced Computing Center, part of The University of Texas at Austin. Stampede2 is the most powerful computer in the U.S. for open science research, where the results are made freely available. Donzis was awarded compute time on Stampede2 through XSEDE, the Extreme Science and Engineering Discovery Environment. Both Stampede2 and XSEDE are funded by the National Science Foundation.

"On Stampede2, we ran a very large data set of shock turbulence interactions at different conditions, especially at high turbulence intensity levels, with a degree of realism that is beyond what is typically found in the literature in terms of resolution at the small scales, in terms of the order of the scheme that we used," Donzis said. "Thanks to Stampede2, we can not only show how amplification factors scale, but also under what conditions we expect Ribner's theory to hold, and under what conditions our previously proposed scaling is the more appropriate one."

Study lead author Chang Hsin Chen added, "We also looked at the structure of the shock and, through highly resolved simulations, we were able to understand how turbulence creates holes in the shock. This was only possible due to the computational power provided by Stampede2." Chen is a postdoctoral researcher in the National Aerothermochemistry Laboratory at Texas A&M University. His research focuses on compressible turbulence and shock waves, and high-performance computational fluid dynamics.

Donzis added that "Stampede2 is allowing us to run simulations, some of them at unprecedented levels of realism, in particular the small-scale resolution that we need to study processes at the very small scales of turbulent flows. Some of these simulations run on half of the machine, or more, and sometimes they take months to run."

What's more, the scientists also explored the so-called shock jumps, which are abrupt changes in pressure and temperature as matter moves across a shock. "In this study we developed and tested a new theoretical framework to understand, for example, why an otherwise stationary shock starts moving when the incoming flow is turbulent," Donzis said. This implies that the incoming turbulence deeply alters the shock. "The theory predicts, and the simulations on Stampede2 confirm that the pressure jumps change, and how they do so when the incoming flow is turbulent. This is an effect that is actually not accounted for in the seminal work by Ribner, but now we can understand it quantitatively," Donzis said.

Making progress in understanding what happens when turbulence meets shocks didn't come easy. Extreme resolution, on the order of billions of grid points, is needed to capture the sharp gradients of a shock at high Reynolds number. "While we are limited by how much we can push the parameter range on Stampede2, or any other computer for that matter, we have been able to cover a very large region of this parameter space, spanning parameter ranges beyond what has been done before," Donzis said.

The input/output (I/O) also turned out to be challenging in writing the data to disk at very large core counts. "This is one instance in which we took advantage of the Extended Collaborative Support Services (ECSS) from XSEDE, and we were able to successfully optimize our strategy," Donzis said. "We are now confident that we can keep increasing the size of our simulations with the new strategy and keep doing I/O at a reasonable computational expense."

Donzis is no stranger to XSEDE, which he has used for years to develop his group's codes, going back to when it was called TeraGrid - starting with the LeMieux system at the Pittsburgh Supercomputing Center; Blue Horizon at the San Diego Supercomputer Center; Kraken at the National Institute for Computational Sciences; and now Stampede1 and Stampede2 at TACC.

"A number of the successes that we have today are because of the continued support of XSEDE, and Teragrid, for the scientific community. The research we're capable of doing today and all the success stories are in part the result of the continuous commitment by the scientific community and funding agencies to sustain a cyberinfrastructure that allows us to tackle the greatest scientific and technological challenges we face and may face in the future. This is true not just for my group, but perhaps also for the rest of the scientific computing community in the US. I believe the XSEDE project and its predecessors in this sense have been a tremendous enabler," Donzis said.

Donzis is a firm believer that advances in high performance computing (HPC) translate directly to benefits for all of society. "Any impact in HPC will have repercussions in transportation, industrial processes, manufacturing, defense, essentially the everyday life of ordinary people, as most of our lives are infused with technology products and services that at some stage or another benefit from numerical computations of different scales," Donzis said. And advances in the understanding of turbulence impact a broad range of applications, he added.

Said Donzis: "Advances in the understanding of shock turbulence interactions could lead to supersonic and hypersonic flight, making it a reality for people to fly in a few hours from here to Europe; to space exploration; and even to our understanding of the structure of the observable universe. It could help answer: why are we here? More down to Earth, understanding turbulence in compressible flows could lead to great improvements in combustion efficiency, drag reduction and general transportation."

Credit: 
University of Texas at Austin, Texas Advanced Computing Center

Making sense of remote sensing data

image: Example of RadialPheno workflow starting with a set of images from a fisheye phenocam and four regions of interest (ROIs); each ROI represents one individual species' crown. From the image processing, indices including the daily green chromatic coordinate (Gcc) values representing the leaf flushing phenophases for each species are obtained and saved as a CSV file, which is used as a data input in RadialPheno.

Image: 
Mariano, G. C., B. Alberton, L. P. C. Morellato, and R. da S. Torres. 2019. RadialPheno: A tool for near-surface phenology analysis through radial layouts. <i>Applications in Plant Sciences</i> 7(6): e1253. https://doi.org/10.1002/aps3.1253

Remote sensor technologies like cameras, GPS trackers, and weather stations have revolutionized biological data collection in the field. Now researchers can capture continuous datasets in difficult terrain, at a scale unimaginable before these technologies became available. But as this flood of data has rolled into laboratory computers around the world, researchers have found themselves without well-developed analytical tools to make sense of it all. In research presented in a recent issue of Applications in Plant Sciences, Dr. Greice Mariano and colleagues introduce a tool called RadialPheno to analyze leafing patterns of plants based on remote camera data.
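The green chromatic coordinate (Gcc) named in the figure caption is simply the green fraction of total pixel brightness, Gcc = G/(R + G + B), averaged over each region of interest. As a rough illustration of the preprocessing step that feeds RadialPheno, here is a hedged Python sketch; the directory name, ROI rectangles and CSV layout are invented for the example and are not part of the published tool.

import csv
from pathlib import Path

import numpy as np
from PIL import Image

def daily_gcc(image_path, roi):
    """Mean green chromatic coordinate Gcc = G / (R + G + B) inside one ROI.

    roi is (left, upper, right, lower) in pixel coordinates -- an
    illustrative rectangle standing in for one species' crown.
    """
    rgb = np.asarray(Image.open(image_path).convert("RGB"), dtype=float)
    left, upper, right, lower = roi
    patch = rgb[upper:lower, left:right, :]
    r, g, b = patch[..., 0].mean(), patch[..., 1].mean(), patch[..., 2].mean()
    return g / (r + g + b)

# Hypothetical camera archive: one image per day, file name encodes the date.
rois = {"species_A": (100, 200, 400, 500), "species_B": (600, 150, 900, 450)}
with open("gcc_daily.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["date"] + list(rois))
    for img in sorted(Path("phenocam_images").glob("*.jpg")):
        date = img.stem  # e.g. "2015-09-01"
        writer.writerow([date] + [round(daily_gcc(img, r), 4) for r in rois.values()])

The resulting CSV, one row per day and one column per species, is the kind of time series a tool like RadialPheno can then wrap onto a radial layout.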

Before the widespread use of remote sensing equipment for field data collection, manual observation by scientists was a laborious, monotonous, and error-prone task, yielding much smaller and more dubious datasets. This is perhaps particularly true of phenology---the study of the timing of developmental events like leafing, flowering, and fruiting---because solid phenological observation requires being in the right place at the right time.

"On-the-ground phenological observations are accomplished periodically, require much more labor, and rely on the people that perform the observations," explains Dr. Mariano, corresponding author on the article and currently a post-doctoral research fellow at OCAD University, Toronto, Canada. "On top of that, results are stored in spreadsheets, which does not allow interoperability between data, making it difficult to analyze the data."

The ability to obtain rich sets of observations by setting up relatively inexpensive remote cameras has changed all that. But these observations don't mean much unless they can be translated into integrated, tractable datasets that can yield meaningful insights. "Since remote monitoring is something new to phenology studies, there is a need to develop and improve methods for detecting changes in phenological series and data images, setting up standards," says Dr. Mariano, who conducted this research during her PhD at the Institute of Computing, University of Campinas, Brazil. "Thus, in this research we introduced RadialPheno as a tool to support phenology experts in their analyses."

To identify the needs of these phenology experts, Dr. Mariano and her team worked with researchers at the Phenology Laboratory at São Paulo State University (UNESP). Dr. Mariano investigated what they wanted to see from data visualization tools and the issues they experienced with existing software. Based on this investigation, they "developed a tool focused on the visualization of temporal data, where the identification of recurrent events is the most important task. We also thought about the integration of the visualization with the common statistical methods used by the experts," explains Dr. Mariano. "We chose the radial representation because those structures are useful for understanding cyclical events, such as phenological ones."

The team tested their new tool by measuring leafing patterns in the Brazilian cerrado, a vast savanna ecosystem dominated by a wet and dry season. "Many plant phenology studies in Brazil have been conducted on a cerrado savanna area, and a digital camera for near-surface monitoring was installed in that area," says Dr. Mariano. "Thus, we also have phenological data based on on-the-ground observations, which allowed us to validate our tool with both on-the-ground data and [camera-derived] data." The camera network they set up is part of a set of cameras installed in different areas called the Phenocam Network, which is available online.

Solid data about phenological patterns are more urgently needed than ever, as climate change alters the timing of events in ecosystems around the world with possibly dire consequences for ecological interactions. Fortunately, remote sensing technologies have made collecting these datasets possible; tools like RadialPheno can now make them more meaningful.

Credit: 
Botanical Society of America

Virtual 'universe machine' sheds light on galaxy evolution

image: A UA-led team of scientists generated millions of different universes on a supercomputer, each of which obeyed different physical theories for how galaxies should form.

Image: 
NASA, ESA, and J. Lotz and the HFF Team/STScI

How do galaxies such as our Milky Way come into existence? How do they grow and change over time? The science behind galaxy formation has remained a puzzle for decades, but a University of Arizona-led team of scientists is one step closer to finding answers thanks to supercomputer simulations.

Observing real galaxies in space can only provide snapshots in time, so researchers who want to study how galaxies evolve over billions of years have to resort to computer simulations. Traditionally, astronomers have used this approach to invent and test new theories of galaxy formation, one by one. Peter Behroozi, an assistant professor at the UA Steward Observatory, and his team overcame this hurdle by generating millions of different universes on a supercomputer, each of which obeyed different physical theories for how galaxies should form.

The findings, published in the Monthly Notices of the Royal Astronomical Society, challenge fundamental ideas about the role dark matter plays in galaxy formation, how galaxies evolve over time and how they give birth to stars.

"On the computer, we can create many different universes and compare them to the actual one, and that lets us infer which rules lead to the one we see," said Behroozi, the study's lead author.

The study is the first to create self-consistent universes that are such exact replicas of the real one: computer simulations that each represent a sizeable chunk of the actual cosmos, containing 12 million galaxies and spanning the time from 400 million years after the Big Bang to the present day.

Each "Ex-Machina" universe was put through a series of tests to evaluate how similar galaxies appeared in the generated universe compared to the true universe. The universes most similar to our own all had similar underlying physical rules, demonstrating a powerful new approach for studying galaxy formation.

The results from the "UniverseMachine," as the authors call their approach, have helped resolve the long-standing paradox of why galaxies cease to form new stars even when they retain plenty of hydrogen gas, the raw material from which stars are forged.

Commonly held ideas about how galaxies form stars involve a complex interplay: cold gas collapses under the effect of gravity into dense pockets that give rise to stars, while other processes counteract star formation.

For example, it is thought that most galaxies harbor supermassive black holes in their centers. Matter falling into these black holes radiates tremendous energies, acting as cosmic blowtorches that prevent gas from cooling down enough to collapse into stellar nurseries. Similarly, stars ending their lives in supernova explosions contribute to this process. Dark matter, too, plays a big role, as it provides for most of the gravitational force acting on the visible matter in a galaxy, pulling in cold gas from the galaxy's surroundings and heating it up in the process.

"As we go back earlier and earlier in the universe, we would expect the dark matter to be denser, and therefore the gas to be getting hotter and hotter. This is bad for star formation, so we had thought that many galaxies in the early universe should have stopped forming stars a long time ago," Behroozi said. "But we found the opposite: galaxies of a given size were more likely to form stars at a higher rate, contrary to the expectation."

In order to match observations of actual galaxies, Behroozi explained, his team had to create virtual universes in which the opposite was the case - universes in which galaxies kept churning out stars for much longer.

If, on the other hand, the researchers created universes based on current theories of galaxy formation - universes in which the galaxies stopped forming stars early on - those galaxies appeared much redder than the galaxies we see in the sky.

Galaxies appear red for two reasons. The first is extrinsic and has to do with a galaxy's age: if it formed earlier in the history of the universe, it will be moving away faster, shifting its light toward the red end of the spectrum. Astronomers call this effect redshift. The other reason is intrinsic: if a galaxy has stopped forming stars, it will contain fewer blue stars, which typically die out sooner, and be left with older, redder stars.
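For reference, redshift has a simple quantitative definition (a standard relation, not specific to this study):

    z = \frac{\lambda_{\text{observed}} - \lambda_{\text{emitted}}}{\lambda_{\text{emitted}}},

so light emitted at wavelength \lambda_{\text{emitted}} arrives stretched to (1 + z)\,\lambda_{\text{emitted}}; the larger the redshift, the earlier in cosmic history the light set out.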

"But we don't see that," Behroozi said. "If galaxies behaved as we thought and stopped forming stars earlier, our actual universe would be colored all wrong. In other words, we are forced to conclude that galaxies formed stars more efficiently in the early times than we thought. And what this tells us is that the energy created by supermassive black holes and exploding stars is less efficient at stifling star formation than our theories predicted."

According to Behroozi, creating mock universes of unprecedented complexity required an entirely new approach that was not limited by computing power and memory, and provided enough resolution to span the scales from the "small" - individual objects such as supernovae - to a sizeable chunk of the observable universe.

"Simulating a single galaxy requires 10 to the 48th computing operations," he explained. "All computers on Earth combined could not do this in a hundred years. So to just simulate a single galaxy, let alone 12 million, we had to do this differently."

In addition to utilizing computing resources at NASA Ames Research Center and the Leibniz-Rechenzentrum in Garching, Germany, the team used the "Ocelote" supercomputer at the UA High Performance Computing cluster. Two thousand processors crunched the data simultaneously over three weeks. Over the course of the research project, Behroozi and his colleagues generated more than 8 million universes.

"We took the past 20 years of astronomical observations and compared them to the millions of mock universes we generated," Behroozi explained. "We pieced together thousands of pieces of information to see which ones matched. Did the universe we created look right? If not, we'd go back and make modifications, and check again."

To further understand how galaxies came to be, Behroozi and his colleagues plan to expand the UniverseMachine to include the morphology of individual galaxies and how their shapes evolve over time.

Credit: 
University of Arizona

Rice chemists show it's hip to be square

image: Rice University chemists developed a synthetic pathway to azetidines, molecules that expose nitrogen atoms that serve as precursors to drug design.

Image: 
László Kürti/Rice University

HOUSTON - (Aug. 9, 2019) - Rice University chemists want to make a point: Nitrogen atoms are for squares.

The nitrogens are the point. The squares are the frames that carry them. These molecules are called azetidines, and they can be used as building blocks in drug design.

The Rice lab of chemist László Kürti introduced its azetidines in an Angewandte Chemie paper. The lab's goal is to establish a library of scaffolds for pharmaceutical design through the simple synthesis of a class of molecules that were previously hard to find in nature and very hard to copy.

Azetidines already appear in several drugs and are promising components in the development of treatments for neurological diseases like Parkinson's disease, Tourette's syndrome and attention deficit disorder, according to the researchers.

So there's value in making a fast, inexpensive synthetic route to azetidines with unprotected nitrogen atoms called NH-azetidines -- NH for nitrogen and hydrogen -- that were first found in several kinds of Pacific sea sponges and have more recently been made in arduous laboratory processes.

Rice graduate students Nicole Behnke and Kaitlyn Lovato, lead authors of the paper, quickly learned why there are so few references to synthetic NH-azetidines in the scientific literature.

It required more than 250 experiments for the students to optimize their process, which takes about 24 hours, including product purification. Azetidine molecules come in many configurations, but they all share the square motif, a four-atom "ring" that contains one nitrogen atom and three carbon atoms. This ring is heterocyclic -- that is, it contains at least two different elements.

Kürti noted the square ring is always connected to another ring via one shared carbon atom, a structure called a spiro azetidine. In this way, the two rings are perpendicular to each other, further isolating the highly reactive nitrogen for access by chemists. The nitrogen atoms in the Rice lab's variations were often, though not always, paired with a hydrogen atom "cap" that still allows the nitrogen to react with outside agents.

"Dr. Kürti was inspired by the mechanism of a synthetic process called the Kulinkovich reaction, which is used to make three-membered all-carbon rings, called cyclopropanes, that have the heteroatom (the nitrogen or oxygen) on the outside," Lovato said.

"Once we started looking into making four-membered azetidines, we found that most of them didn't have the NH structures," she said. "The known synthetic methods predominantly yield azetidines in which the ring nitrogen atom is connected to a carbon outside of the ring, but the NH connectivity was hard to access directly. If there's a carbon there, the nitrogen is considered protected, but having the hydrogen there leaves it free to engage in further reactions."

"Once you make this NH heterocycle, you have the flexibility to put whatever you want on the nitrogen," Kürti said. "Or to leave it as it is."

A titanium reagent turned out to be a key middleman in the chemical reaction, allowing it to proceed quickly. "This metal complex mediates the overall transformation, and it's very good because titanium is non-toxic and very abundant," he said.

"It's commercially available and cheap," Behnke said. "If we don't have the titanium added to the flask, the reaction doesn't work."

The Rice team did not patent the process, Kürti said. "The reality is that synthetic organic chemists in academia can contribute a lot to biomedical sciences and pharmaceutical drug discovery when we develop a new mechanism or reaction," he said.

"Biotech and pharmaceutical companies can use the products of these reactions to build structurally diverse compound libraries and quickly test them for biological activities towards different cancer cell lines, pathogens or other important disease biochemical pathways they have assays for," Kürti said. "Once they have access to novel core structures like these spiro azetidines, it's up to medical chemists to decide what diverse functionalities they wish to add on."

Muhammed Yousufuddin, a lecturer of chemistry at the University of North Texas at Dallas, is co-author of the paper. Kürti is an associate professor of chemistry.

Rice University, the National Institutes of Health, the National Science Foundation (NSF), the Robert A. Welch Foundation, the Amgen Young Investigators Award, the Biotage Young Principal Investigator Award and an NSF Graduate Research Fellowship supported the research.

Credit: 
Rice University

Enhanced natural gas storage to help reduce global warming

image: This is a comparison of the highest reported volumetric working capacities.

Image: 
KAIST

Researchers have designed plastic-based materials that can store natural gas more effectively. These new materials can not only make large-scale, cost-effective, and safe natural gas storage possible, but further hold a strong promise for combating global warming.

Natural gas (predominantly methane) is a clean energy alternative. It is stored by compression, liquefaction, or adsorption. Among these, adsorbed natural gas (ANG) storage is a more efficient, cheaper, and safer alternative to conventional compressed natural gas (CNG) and liquefied natural gas (LNG) storage approaches that have drawbacks such as low storage efficiency, high costs, and safety concerns. However, developing adsorptive materials that can more fully exploit the advantages of ANG storage has remained a challenging task.

A KAIST research team led by Professor Cafer T. Yavuz from the Graduate School of Energy, Environment, Water, and Sustainability (EEWS), in collaboration with Professor Mert Atilhan's group from Texas A&M University, synthesized 29 unique porous polymeric structures with inherent flexibility, and tested their methane gas uptake capacity at high pressures. These porous polymers had varying synthetic complexities, porosities, and morphologies, and the researchers subjected each porous polymer to pure methane gas under various conditions to study the ANG performances.

Of these 29 distinct chemical structures, COP-150 was particularly noteworthy, as it achieved a high deliverable gravimetric methane working capacity when cycled between 5 and 100 bar at 273 K: 98% of the total uptake capacity. This result surpassed the target set by the United States Department of Energy (US DOE).
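The working (deliverable) capacity behind that 98% figure is defined by the cycling window itself. Writing q(P) for the amount of methane adsorbed at pressure P (notation introduced here for illustration; the 98% value is the paper's), the deliverable capacity is

    \Delta q = q(100\ \text{bar}) - q(5\ \text{bar}), \qquad \frac{\Delta q}{q(100\ \text{bar})} = 0.98,

meaning only about 2% of the methane taken up at the 100 bar charging pressure remains stranded in the adsorbent at the 5 bar discharge pressure.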

COP-150 is the first ever structure to fulfil both the gravimetric and volumetric requirements of the US DOE for successful vehicular use, and the total cost to produce the COP-150 adsorbent was only 1 USD per kilogram.

COP-150 can be produced using freely available and easily accessible plastic materials, and moreover, its synthesis takes place at room temperature, open to the air, and no previous purification of the chemicals is required. The pressure-triggered flexible structure of COP-150 is also advantageous in terms of the total working capacity of deliverable methane for real applications.

The research team believes that increased pressure flexes the network structure of COP-150, producing a "swelling" behavior, and suggested that this flexibility provides rapid desorption and thermal management, while the hydrophobicity and the nature of the covalently bonded framework allow these promising materials to tolerate harsh conditions.

This swelling mechanism of expansion-contraction solves two other major issues, the team noted. Firstly, when using adsorbents based on such a mechanism, unsafe pressure spikes that may occur due to temperature swings can be eliminated. In addition, contamination can also be minimized, since the adsorbent remains contracted when no gas is stored.

Professor Yavuz said, "We envision a whole host of new designs and mechanisms to be developed based on our concept. Since natural gas is a much cleaner fuel than coal and petroleum, new developments in this realm will help the switch to less polluting fuels."

Professor Atilhan agreed the most important impact of their research is on the environment. "Using natural gas more than coal and petroleum will significantly reduce greenhouse gas emissions. We believe, one day, we might see vehicles equipped with our materials that are run by a cleaner natural gas fuel," he added.

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

Abundant screen time linked with overweight among children

A recently completed study indicates that Finnish children who spend a lot of time in front of screens have a heightened risk for overweight and abdominal obesity, regardless of the extent of their physical activity.

The increase in childhood obesity is one of the largest health problems globally. The study investigated links between screen time and overweight by utilising the Finnish Health in Teens data (Fin-HIT), encompassing more than 10,000 children from across Finland. The children studied were between 9 and 12 years of age.

The subjects reported the time spent viewing television programmes and films on screens and the amount of sedentary computer use outside school hours. The children were measured for height, weight and waist circumference.

The results, published in the journal Scientific Reports, demonstrated that heavy screen time is associated with both overweight and abdominal obesity. The findings did not vary by age, gender, native language, sleep duration or leisure-time exercise. Watching a lot of television was also associated with overweight and abdominal obesity in children who exercised the most.

"It must be noted that this cross-sectional study does not reveal anything about causality. It may be that overweight children spend more time in front of screens, or that abundant screen time may result in overweight," says researcher Elina Engberg from the University of Helsinki and Folkhälsan.

"Neither did the study measure the intensity of exercise; the participants were only asked about the amount of time they spent exercising in their free time. Further research on the combined effect of screen time, physical activity and diet on children's weight is needed."

Previously, little research had been carried out on the link between screen time and children's abdominal obesity. Overweight in children and its related adverse health effects are captured by the waist-to-height ratio as well as the body mass index.

Credit: 
University of Helsinki

Sleep, snacks and shift work

image: In a new study, University of South Australia researchers investigated whether altering food intake during the nightshift could optimise how shiftworkers feel during the night and reduce their sleepiness.

Image: 
Matthew Henry

If you’re one of Australia’s 1.4 million shiftworkers, eating at irregular times is just par for the course – but have you ever stopped to think about the impact this might have on your body?

In a new research study by the University of South Australia, researchers have investigated whether altering food intake during the nightshift could optimise how shiftworkers feel during the night and reduce their sleepiness.

Testing the impact of either a snack, a meal, or no food at all, the study found that a simple snack was the best choice for maximising alertness and productivity.

Lead researcher and UniSA PhD candidate Charlotte Gupta says the finding has the potential to help thousands of shiftworkers who work during the night.

“In today’s 24/7 economy, working the nightshift is increasingly common, with many industries – health care, aviation, transport and mining – requiring employees to work around the clock,” Gupta says.

“As a nightshift worker, finding ways to manage your alertness when your body is naturally primed for sleep can be really challenging.

“We know that many nightshift workers eat on-shift to help them stay awake, but until now, no research has shown whether this is good or bad for their health and performance.

“This is the first study to investigate how workers feel and perform after eating different amounts of food.

“The findings will inform the most strategic eating patterns on-shift and can hopefully contribute to more alert and better performing workers.”

In Australia, of the 1.4 million shiftworkers, 15 per cent (or over 200,000) regularly work a night or evening shift. Working at night-time conflicts with a person’s internal circadian clock, making it harder to stay focused and awake. Managing fatigue is therefore critical for workplace health and safety.

Over a seven-day simulated shiftwork protocol, the study assessed the impact of three eating conditions, each consumed at 12:30 am: a meal comprising 30 per cent of 24-hour energy intake (for example, a sandwich, muesli bar and apple); a snack comprising 10 per cent of energy intake (for example, just the muesli bar and apple); and no food at all. The 44 participants were randomly split across the three test conditions and were asked to report on their levels of hunger, gut reaction and sleepiness.
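To put those percentages into rough figures, take the 8,700 kJ value used as the average adult daily intake on Australian food labels as an illustrative reference (the study's exact energy prescription may differ): the meal condition then corresponds to about 0.30 × 8,700 ≈ 2,600 kJ (roughly 620 kcal) and the snack to about 0.10 × 8,700 ≈ 870 kJ (roughly 210 kcal), eaten in the middle of the night.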

The results showed that while all participants reported increased sleepiness and fatigue, and decreased vigour, across the nightshift, consuming a snack reduced these effects more than a meal or no food at all. The snack group also reported none of the uncomfortable fullness noted by the meal group.

Gupta says the next step in the research is to investigate the different types of snacks and how they affect shiftworkers differently.

“Now that we know that consuming a snack on nightshift will optimise your alertness and performance without any adverse effects, we’re keen to delve more into the types of snacks shiftworkers are eating,” Gupta says.

“Lots of shiftworkers snack multiple times over a nightshift, and understanding the different macronutrient balances is important, especially as many report consuming foods high in fat, such as chips, chocolate and fast foods.

“We’re keen to assess how people feel and perform after a healthy snack versus a less-healthy, but potentially more satisfying snack like chocolate or lollies.

“Ultimately, the goal is to help Australian shiftworkers on the nightshift to stay alert, be safe, and feel healthy.”


Journal: Nutrients

DOI: 10.3390/nu11061352

Credit: 
University of South Australia

Scientists at DGIST discover how chronic stress causes brain damage

image: Professor Seong-Woon Yu in the Department of Brain and Cognitive Sciences at DGIST (right), Seonghee Jung in the M.S.-Ph.D. Integrated Program in the Department of Brain and Cognitive Sciences at DGIST (left)

Image: 
DGIST

DGIST announced on July 2 that Professor Seong-Woon Yu's team in the Department of Brain and Cognitive Sciences discovered that chronic stress causes autophagic death of adult hippocampal neural stem cells (NSCs). These findings are expected to open up new strategies for combatting stress-associated neural diseases.

Chronic stress is infamous for its association with various mental diseases such as depression and schizophrenia, which have become very serious social problems. Stress can even raise the risk of neurodegenerative diseases, such as Alzheimer's disease. However, the exact mechanisms by which stress damages brain function have not been well understood. While previous animal studies found that the generation of new neurons is much lower in stressed mice, apoptosis, a well-known cell suicide pathway, was not found in NSCs, leading to the conclusion that cell death is not related to the loss of NSCs during stress. Thus, the cause of the decline in adult neurogenesis, the generation of new neural cells in the adult brain, especially in the hippocampus, has remained elusive.

Professor Yu's team discovered for the first time that chronic stress causes autophagic death of adult hippocampal NSCs. Autophagy ("self-eating" in Greek) is a cellular process that protects cells from unfavorable conditions through the digestion and recycling of inner cell materials, whereby cells remove toxic or old intracellular components and gain nutrients and metabolites for survival. However, autophagy can turn into a self-destruction process under certain conditions, leading to autophagic cell death, a form of cell death distinguished from apoptosis by the causative role of autophagy in the cell's demise. Using NSCs derived from rodents and genetically modified mice, the research team discovered that the death of hippocampal NSCs is prevented, and normal brain functions are maintained without stress symptoms, when Atg7, one of the major autophagy genes, is deleted.

The research team further examined the mechanism controlling autophagy induction in NSCs, showing that the SGK3 (serum/glucocorticoid-regulated kinase 3) gene is the trigger for autophagy initiation. When the SGK3 gene is removed, hippocampal NSCs do not undergo cell death and are spared from stress.

Professor Yu of the Department of Brain and Cognitive Sciences said, "It is clear from our study that the cognitive defects and mood disorders brought about by stress arise through autophagic death of adult hippocampal NSCs. With continuous research, we'll be able to take a step further toward the development of effective treatments for psychological disorders such as depression and anxiety. Furthermore, stress-related neurodegenerative diseases, including dementia, could also benefit from our study. We hope to develop much faster and more effective mental disease treatments through joint research with the Chinese National Compound Library to develop an SGK3 inhibitor together."

Credit: 
DGIST (Daegu Gyeongbuk Institute of Science and Technology)