
Getting to the root of how plants tolerate too much iron

video: Salk scientists uncover a gene responsible for helping plants thrive in stressful environments.

Image: 
Salk Institute

LA JOLLA--(August 29, 2019) Iron is essential for plant growth, but with heavy rainfall and poor aeration, many acidic soils become toxic with excess iron. In countries with dramatic flood seasons, such as in West Africa and tropical Asia, toxic iron levels can have dire consequences on the availability of staple foods, such as rice.

Despite dozens of attempts in the last two decades to uncover the genes responsible for iron tolerance, these remained elusive until recently. Now, Salk scientists have found a major genetic regulator of iron tolerance, a gene called GSNOR. The findings, published in Nature Communications on August 29, 2019, could lead to the development of crop species that produce higher yields in soils with excess iron.

"This is the first time that a gene and its natural variants have been identified for iron tolerance," says Associate Professor Wolfgang Busch, senior author on the paper and a member of Salk's Plant Molecular and Cellular Biology Laboratory as well as its Integrative Biology Laboratory. "This work is exciting because we now understand how plants can grow in stressful conditions, such as high levels of iron, which could help us make more stress-resistant crops."

In plants such as rice, elevated soil iron levels cause direct cellular damage by harming fats and proteins, decreasing roots' ability to grow. Yet, some plants appear to have inherent tolerance to high iron levels; scientists wanted to understand why.

"We believed there were genetic mechanisms that underlie this resistance, but it was unclear which genes were responsible," says first author Baohai Li, a postdoctoral fellow in the Busch lab. "To examine this question, we used the power of natural variation of hundreds of different strains of plants to study genetic adaption to high levels of iron."

The scientists first tested a number of strains of a small mustard plant (Arabidopsis thaliana), to observe if there was natural variation in iron resistance. Some of the plants did exhibit tolerance to iron toxicity, so the researchers used an approach called genome-wide association studies (GWAS) to locate the responsible gene. Their analyses pinpointed the gene GSNOR as the key to enabling plants and roots to grow in iron-heavy environments.

The researchers also found that the iron-tolerance mechanism is, to their surprise, related to the activities of nitric oxide, a gaseous molecule with a variety of roles in plants including responding to stress. High levels of nitric oxide induced cellular stress and impaired the plant roots' tolerance for elevated iron levels. This occurred when plants did not have a functional GSNOR gene. GSNOR likely plays a central role in nitric oxide metabolism and regulates the plants' ability to respond to cellular stress and damage. This nitric oxide mechanism and the GSNOR gene also affected iron tolerance in other species of plants, such as rice (Oryza sativa) and a legume (Lotus japonicus), suggesting that this gene and its activities are likely critical in many, if not all, species of plants.

"By identifying this gene and its genetic variants that confer iron tolerance, we hope to help plants, such as rice, become more resistant to iron in regions with toxic iron levels," says Busch. "Since we found that this gene and pathway was conserved in multiple species of plants, we suspect they may be important for iron resistance in all higher plants. Additionally, this gene and pathway may also play a role in humans, and could lead to new treatments for conditions associated with iron overload."

Next, Li will be starting his own laboratory at Zhejiang University, in China. He plans to identify the relevant genetic variants in rice and observe if iron-tolerance variants could increase crop yields in flooded Chinese fields.

Credit: 
Salk Institute

Biological 'rosetta stone' brings scientists closer to deciphering how the body is built

video: Time lapse of developing fruit fly larva

Image: 
Carlos Sanchez-Higueras/Hombría Lab/CABD

NEW YORK AND SEVILLE, SPAIN -- Every animal, from an ant to a human, carries in its genome pieces of DNA called Hox genes. Architects of the body, these genes are keepers of the body's blueprints; they dictate how embryos grow into adults, including where a developing animal puts its head, legs and other body parts.

Scientists have long searched for ways to decipher how Hox genes create this body map, a key to decoding how we build our bodies.

Now an international group of researchers from Columbia University and the Spanish National Research Council (CSIC) based at the Universidad Pablo de Olavide in Seville, Spain, has found one such key: a method that can systematically identify the role each Hox gene plays in a developing fruit fly. Their results, reported recently in Nature Communications, offer a new path forward for researchers hoping to make sense of a process that is equal parts chaotic and precise, and that is critical to understanding not only growth and development but also aging and disease.

"The genome, which contains thousands of genes and millions of letters of DNA, is the most complicated code ever written," said Richard Mann, PhD, principal investigator at Columbia's Mortimer B. Zuckerman Mind Brain Behavior Institute and the paper's co-senior author. "Deciphering this code has proven so difficult because evolution wrote it in fits and starts over hundreds of millions of years. Today's study offers a key to cracking that code, bringing us closer than ever to understanding how Hox genes build a healthy body, or how this process gets disrupted in disease."

Hox genes are ancient; they can be found across all animal species. Even primitive jellyfish have them. Each type of organism has different combinations of these genes. Fruit flies have eight Hox genes, while humans have 39.

These genes work by producing special proteins called transcription factors, which work together with similar proteins called Hox cofactors to bind to different segments of DNA and turn many other genes on and off at just the right time -- a Rube Goldberg machine of microscopic proportions.

"Because these genes are intricately involved in many aspects of development, it has proven incredibly challenging to isolate individual Hox genes and trace their activity over time," said James Castelli-Gair Hombría, PhD, a principal investigator at the Centro Andaluz de Biología del Desarrollo at the Universidad Pablo de Olavide and the paper's co-senior author. "We had this incredibly complex equation to solve but too many unknowns to make significant progress."

Recently, Dr. Hombría and his team hit upon a bit of luck. While examining genetic activity in a developing fruit fly, they stumbled upon a small piece of regulatory DNA, called vvI1+2, that had an unusual -- and surprising -- attribute. Although it was active in cells across the fruit fly's entire developing body, it appeared to be regulated by all of the fruit fly's eight Hox genes.

"The ubiquity of the vvI1+2 DNA segment across the entire developing fruit fly, combined with the fact that every Hox gene touches it, made it an ideal system by which to study the Hox gene family," said Carlos Sánchez-Higueras, PhD, a postdoctoral researcher in the Hombría lab and the paper's first author. "In this single piece of DNA, we had the perfect tool; we could now devise a method to systematically manipulate vvI1+2 activity to see how each Hox gene functioned."

First, Dr. Sánchez-Higueras teamed up with Dr. Mann at Columbia's Zuckerman Institute and used a sophisticated computer algorithm called No Read Left Behind, or NRLB. NRLB was recently developed by Dr. Mann, his lab, and his collaborators, including Columbia systems biology professor Harmen Bussemaker, PhD. This powerful algorithm pinpoints the locations where transcription factors bind to a stretch of DNA, even if these binding sites are very weak and difficult to capture. For this study, the researchers focused on the Hox transcription factors and Hox cofactors that bind to vvI1+2.
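The release does not describe how NRLB works internally. As a rough, generic illustration of the underlying idea of scoring candidate binding sites along a stretch of DNA, the sketch below uses a simple position-weight-matrix scan; the matrix, sequence and threshold are invented for the example, and this is not the NRLB algorithm itself.

```python
# Hypothetical position weight matrix (log-odds scores) for a 4-bp motif.
# Rows: motif positions 1-4; values are invented for illustration only.
PWM = [
    {"A": 1.2, "C": -0.8, "G": -0.5, "T": 0.3},
    {"A": -0.4, "C": 0.9, "G": 0.1, "T": -0.6},
    {"A": 0.2, "C": -0.7, "G": 1.1, "T": -0.3},
    {"A": -0.9, "C": 0.4, "G": -0.2, "T": 1.0},
]

def scan(sequence, pwm, threshold=1.5):
    """Slide the motif along the sequence and report every window whose
    summed log-odds score exceeds the threshold."""
    hits = []
    width = len(pwm)
    for start in range(len(sequence) - width + 1):
        window = sequence[start:start + width]
        score = sum(pwm[i][base] for i, base in enumerate(window))
        if score >= threshold:
            hits.append((start, window, round(score, 2)))
    return hits

if __name__ == "__main__":
    dna = "ACGTACGGTCAGTA"  # toy sequence, not from vvI1+2
    for start, site, score in scan(dna, PWM):
        print(f"candidate site at {start}: {site} (score {score})")
```

Lowering the threshold in a scan like this is one crude way to surface weaker sites, which is the regime NRLB is designed to handle rigorously.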

"Our analyses provided a precise road map of Hox binding sites in vvI1+2, which we could then apply to a living fruit fly," said Dr. Mann, who is also the Higgins Professor of Biochemistry and Molecular Biophysics (in Systems Biology) at Columbia's Vagelos College of Physicians and Surgeons.

By employing a combination of elegant genetic manipulations in living, or in vivo, fly embryos, together with advanced biochemical and computational analysis, the researchers could then systematically manipulate Hox target activity with an unprecedented level of precision.

"We now had a starting point from which to systematically decode Hox gene regulation," said Dr. Hombría, "a kind of Rosetta Stone to help us decipher the genetics of body development."

The researchers' findings are especially promising because they can be applied to the entire genome. The steps that Hox genes undergo to regulate vvI1+2 can inform how Hox genes regulate other DNA, not just in fruit flies but beyond -- including in vertebrates, such as mammals, and even humans.

"While there is much about Hox genes that remains to be elucidated, our work is a significant step forward," said Dr. Sánchez-Higueras. "These continued efforts, combined with those of our peers, will help shed light on how the whole system works together in a growing embryo and how they contribute to disease."

Credit: 
The Zuckerman Institute at Columbia University

Overcome the bottleneck of solid electrolytes for Li batteries

On Aug 21st, Prof. MA Cheng from the University of Science and Technology of China (USTC) and his collaborators proposed an effective strategy to address the electrode-electrolyte contact issue that is limiting the development of next-generation solid-state Li batteries. The solid-solid composite electrode created this way exhibited exceptional capacities and rate performances.

Replacing the organic liquid electrolyte in conventional Li-ion batteries with solid electrolytes can greatly alleviate safety issues and potentially break the "glass ceiling" for improving energy density. However, mainstream electrode materials are also solids. Since contact between two solids can hardly be as intimate as that between a solid and a liquid, batteries based on solid electrolytes currently tend to exhibit poor electrode-electrolyte contact and unsatisfactory full-cell performance.

"The electrode-electrolyte contact issue of solid-state batteries is somewhat like the shortest stave of a wooden barrel," said Prof. MA Cheng from USTC, the lead author of the study. "Actually, over these years researchers have already developed many excellent electrodes and solid electrolytes, but the poor contact between them is still limiting the efficiency of Li-ion transport."

Fortunately, MA's strategy may overcome this formidable challenge. The study began with the atom-by-atom examination of an impurity phase in a prototype, perovskite-structured solid electrolyte. Although the crystal structure differed greatly between the impurity and the solid electrolyte, they were observed to form epitaxial interfaces. After a series of detailed structural and chemical analyses, researchers discovered that the impurity phase is isostructural with the high-capacity Li-rich layered electrodes. That is to say, a prototype solid electrolyte can crystallize on the "template" formed by the atomic framework of a high-performance electrode, resulting in atomically intimate interfaces.

"This is truly a surprise," said the first author LI Fuzhen, who is currently a graduate student of USTC. "The presence of impurities in the material is actually a very common phenomenon, so common that most of the time they will be ignored. However, after taking a close look at them, we discovered this unexpected epitaxial behavior, and it directly inspired our strategy for improving the solid-solid contact."

Taking advantage of the observed phenomenon, the researchers intentionally crystallized the amorphous powder with the same composition as the perovskite-structured solid electrolyte on the surface of a Li-rich layered compound, and successfully realized a thorough, seamless contact between these two solid materials in a composite electrode. With the electrode-electrolyte contact issue addressed, such a solid-solid composite electrode delivered a rate capability even comparable to that from a solid-liquid composite electrode. More importantly, the researchers also found this type of epitaxial solid-solid contact may tolerate large lattice mismatches, and thus the strategy they proposed could also be applicable to many other perovskite solid electrolytes and layered electrodes.

"This work pointed out a direction that is worth pursuing," MA said. "Applying the principle raised here to other important materials could lead to even better cell performances and more interesting science. We are looking forward to it."

The researchers intend to continue their exploration in this direction, and apply the proposed strategy to other high-capacity, high-potential cathodes.

Credit: 
University of Science and Technology of China

Most-comprehensive analysis of fentanyl crisis urges innovative action

The U.S. overdose crisis worsened dramatically with the arrival of synthetic opioids like fentanyl -- now responsible for tens of thousands of deaths annually -- and the problem requires innovative new strategies because the epidemic is unlike others that have struck the nation, according to a new RAND Corporation study.

"This crisis is different because the spread of synthetic opioids is largely driven by suppliers' decisions, not by user demand," said Bryce Pardo, lead author of the study and an associate policy researcher at RAND, a nonprofit research organization. "Most people who use opioids are not asking for fentanyl and would prefer to avoid exposure."

While fentanyl had appeared in U.S. illicit drug markets before, production was limited to one or a few capable chemists, and bottlenecks in production and distribution slowed the drug's diffusion. Law enforcement was able to detect and shut down illicit manufacture to contain these outbreaks.

RAND researchers found that today's synthetic opioid surge is fueled by multiple sources. Mexican drug trafficking organizations smuggle fentanyl into the U.S., and China's pharmaceutical and chemical industries are inadequately regulated, allowing producers to advertise and ship synthetic opioids to buyers anywhere in the world.

While traditional criminal organizations play a role in the spread of fentanyl, the internet also has made it easier to traffic these drugs and to share information about their synthesis.

Overdose deaths involving fentanyl and other synthetic opioids have increased from about 3,000 in 2013 to more than 30,000 in 2018. These deaths have remained concentrated in Appalachia, the mid-Atlantic and New England.

"While synthetic opioids have not yet become entrenched in illicit drug markets west of the Mississippi River, authorities must remain vigilant," said Jirka Taylor, study co-author and senior policy analyst at RAND. "Even delaying the onset in these markets by a few years could save thousands of lives."

For U.S. policymakers, nontraditional strategies may be required to address this new challenge. The researchers avoid making specific policy recommendations, but advocate consideration of a broad array of innovative approaches such as supervised consumption sites, creative supply disruption, drug content testing, and increasing access to novel treatments that are available in other countries, such as heroin-assisted treatment.

"Indeed, it might be that the synthetic opioid problem will eventually be resolved with approaches or technologies that do not currently exist or have yet to be tested," said Beau Kilmer, study co-author and director of the RAND Drug Policy Research Center. "Limiting policy responses to existing approaches will likely be insufficient and may condemn many people to early deaths."

RAND researchers say that since the diffusion of fentanyl is driven by suppliers' decisions, it makes sense to consider supply disruption as one piece of a comprehensive response, particularly where that supply is not yet firmly entrenched.

But the researchers note there is little reason to believe that tougher sentences, including drug-induced homicide laws for low-level retailers and couriers, will make a difference. Instead, they call for an exploration of innovative disruption efforts that confuse or dissuade online sourcing.

The study is the most comprehensive document to be published on the past, present and future of illicit synthetic opioids. RAND researchers analyzed mortality and drug seizure data, reviewed existing literature, and conducted expert interviews and international case studies.

RAND researchers examined synthetic opioid markets across the U.S. and in other parts of the world, such as Estonia (where fentanyl first appeared 20 years ago). Canada's experience with synthetic opioids is most similar to that in the United States in terms of its timing, sudden increase in drug-related harms, and regional concentration.

"Problems in parts of Canada are as severe as in the Eastern United States despite substantial differences in drug policy, and the delivery of public health and social services," said Jonathan Caulkins, study co-author and Stever University Professor at Carnegie Mellon University.

A handful of other countries in Europe also have seen synthetic opioids increasingly displace heroin. Their experience is varied and shows a range of directions some future markets in the United States may take. For instance, Sweden developed an online market with fentanyl analogs sold primarily as nasal sprays.

Evidence from abroad suggests synthetic opioids may be here to stay: the study found no instance where fentanyl lost ground to another opioid after attaining a dominant position in drug markets.

Credit: 
RAND Corporation

Burgundy wine grapes tell climate story, show warming accelerated in past 30 years

image: Vineyards in Beaune, Burgundy.

Image: 
Olivier Duquesne via Flickr / This photograph is distributed under a Creative Commons Attribution-ShareAlike 2.0 Generic (CC BY-SA 2.0) Licence

A newly published series of grape harvest dates covering the past 664 years is the latest line of evidence confirming how unusual the climate of the past 30 years has been. The record shows wine grapes in Burgundy, eastern France, have been picked 13 days earlier on average since 1988 than they were in the previous six centuries, pointing to the region's hotter and drier climate in recent years. The results are now published in the European Geosciences Union (EGU) journal Climate of the Past.

"We did not anticipate that the accelerated warming trend since the mid-1980s would stand out so clearly in the series," says Christian Pfister, a professor at the Oeschger Centre for Climate Change Research at the University of Bern, Switzerland. He conducted the study with other scientists and historians in Switzerland, France and Germany.

Thomas Labbé, a researcher at the universities of Burgundy and Leipzig and lead author of the study, meticulously reconstructed dates of grape harvest in Beaune - the wine capital of Burgundy - going back to 1354. He used a large number of unedited archival sources, including information on wage payments made to grape pickers, Beaune city council records and newspaper reports. The continuous record of grape harvest dates now published in Climate of the Past extends until 2018 and is the longest ever reconstructed.

"The record is clearly divided in two parts," says Labbé. Until 1987, wine grapes were typically picked from 28 September onward, while harvests have begun 13 days earlier on average since 1988. The team's analysis of the series shows very hot and dry years were uncommon in the past, but have become the norm in the last 30 years.

Grape harvest dates can be used as a proxy to study the climate because wine grapes are very sensitive to temperature and rainfall. As an article on the French tourism website france.fr puts it, "Mother Nature is really the one who decides" when grapes are ripe enough to be picked. In years when the spring and summer (the growing season) are hot and dry, the grapes are ready for harvest earlier than in colder years.

The team validated their grape harvest series using detailed temperature records of Paris covering the past 360 years. This allowed them to estimate the April-to-July temperatures in the Beaune region for the entire 664 years covered by their record of grape harvest dates.
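The release does not spell out how the calibration against the Paris temperature series was done. The sketch below illustrates the general proxy logic with a simple linear regression, using invented numbers; it is not the authors' reconstruction.

```python
import numpy as np

# Illustrative overlap data: April-July temperature anomalies (degC) and
# harvest dates (days after 31 August). Values are invented for the sketch.
temps = np.array([0.2, -0.5, 1.1, 0.8, -0.2, 1.6, 0.4, -0.9])
harvest_days = np.array([29, 34, 22, 25, 31, 18, 27, 36])

# Calibrate over the instrumental period: harvest_day = a * temperature + b.
a, b = np.polyfit(temps, harvest_days, 1)

def estimate_temperature(harvest_day):
    """Invert the regression to turn a documented harvest date into an
    April-July temperature estimate for years before instrumental records."""
    return (harvest_day - b) / a

print(f"slope: {a:.2f} days per degC")
print(f"estimated anomaly for an early harvest (day 15): {estimate_temperature(15):+.1f} degC")
```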

"The transition to a rapid global warming period after 1988 stands out very clearly. The exceptional character of the last 30 years becomes apparent to everybody," says Pfister.

"We hope people start to realistically consider the climate situation in which the planet is at present," he concludes.

Credit: 
European Geosciences Union

Closing the gap -- a two-tier mechanism for epithelial barrier

image: Tight junctions seal the gaps between neighboring cells to form epithelial barrier. In epithelial cells lacking claudins, the barrier against small molecules is disrupted but the barrier against large molecules is still maintained. In epithelial cells lacking both claudins and JAM-A, the space between neighboring cells is widened, and the barrier against small and large molecules is disrupted.

Image: 
Tetsuhisa Otani

Okazaki, Japan - Epithelia are cell sheets that act as a barrier to protect our body from the external environment. The epithelial barrier is critical for maintaining the body's homeostasis, and its disruption has been linked to various diseases, including atopic dermatitis and inflammatory bowel disease. In order to maintain the epithelial barrier, it is important to completely seal the space between cells to restrict the movement of substances across the epithelial sheet - and that is the task that tight junctions fulfill.

Previous studies have identified claudins as a critical component of tight junctions. However, the roles of other molecules, including JAM-A, have remained relatively unclear. In this study, the researchers used genome editing to systematically knock out claudins and succeeded in generating epithelial cells lacking claudins for the first time.

In examining these cells, the researchers found that claudins form a barrier against small molecules, including ions. To their surprise, loss of claudins did not lead to separation of neighboring cells; instead, the two membranes remained closely attached to each other, and the barrier against larger molecules such as proteins was still maintained.

"We were puzzled with the results, as claudins were thought to be critical for the epithelial barrier", said the first author Tetsuhisa Otani. "However, when we noticed that JAM-A was more accumulated at cell junctions lacking claudins, we started to think that JAM-A may be maintaining the barrier against large molecules", he said.

To test the idea, the researchers further removed JAM-A from the claudin-deficient cells. Indeed, removal of JAM-A led to an expansion of the space between two neighboring cells, and to a disruption of the barrier against large molecules.

Mikio Furuse, the lead scientist of the study, says, "The study shows that the epithelial barrier is made by a combination of two distinct systems: a tight barrier against small molecules made by claudins, and a crude barrier against large molecules made by JAM-A. This challenges the textbook view of tight junctions and the epithelial barrier, and may impact our understanding of many diseases involving problems with the epithelial barrier."

Credit: 
National Institutes of Natural Sciences

Pancreas on a chip

image: A fully integrated, thermoplastic "Islet on a Chip" designed for scalable manufacturing, automated loading of islets into parallel channels, synchronized nutrient stimulation, and continuous insulin sensing.

Image: 
Image courtesy of Michael Rosnach, Harvard John A. Paulson School of Engineering and Applied Sciences.

By combining two powerful technologies, scientists are taking diabetes research to a whole new level. In a study led by Harvard University's Kevin Kit Parker, microfluidics and human, insulin-producing beta cells have been integrated in an "Islet-on-a-Chip". The new device makes it easier for scientists to screen insulin-producing cells before transplanting them into a patient, test insulin-stimulating compounds, and study the fundamental biology of diabetes.

The design of the Islet-on-a-Chip was inspired by the human pancreas, in which islands of cells ("islets") receive a continuous stream of information about glucose levels from the bloodstream, and adjust their insulin production as needed.

"If we want to cure diabetes, we have to restore a person's own ability to make and deliver insulin," explained Douglas Melton, Xander University Professor of Stem Cell and Regenerative Biology and co-director of the Harvard Stem Cell Institute (HSCI). "Beta cells, which are made in the pancreas, have the job of measuring sugar and secreting insulin, and normally they do this very well. But in diabetes patients these cells can't function properly. Now, we can use stem cells to make healthy beta cells for them. But like all transplants, there is a lot involved in making sure that can work safely."

Before beta cells are transplanted into a patient, they must be tested to see whether they are functioning properly. The current method for doing this is based on technology from the 1970s: giving the cells glucose to elicit an insulin response, collecting samples, adding reagents, and taking measurements to see how much insulin is present in each one. The manual process takes so long to run and interpret that many clinicians give up on it altogether.
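For readers unfamiliar with how such assays are scored, islet function in a static test of this kind is commonly summarized as a glucose stimulation index: insulin released at high glucose divided by insulin released at low glucose. The sketch below uses invented numbers purely to show that arithmetic; it is not the chip's analysis pipeline.

```python
# Illustrative static glucose-stimulated insulin secretion (GSIS) readout.
# Insulin concentrations (e.g. ng/mL) are invented example values.
low_glucose_insulin = [0.8, 1.1, 0.9]     # samples at basal (low) glucose
high_glucose_insulin = [3.6, 4.2, 3.9]    # samples at stimulating (high) glucose

def stimulation_index(basal, stimulated):
    """Ratio of mean stimulated to mean basal insulin secretion; islet
    preparations with a higher index respond more strongly to glucose."""
    return (sum(stimulated) / len(stimulated)) / (sum(basal) / len(basal))

si = stimulation_index(low_glucose_insulin, high_glucose_insulin)
print(f"stimulation index: {si:.1f}")  # values well above 1 indicate responsive islets
```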

The new, automated, miniature device gives results in real time, which can speed up clinical decision making.

"Our device arranges islets into separate lines, delivers a pulse of glucose to each one simultaneously, and detects how much insulin is produced," said Aaron Glieberman, co-first author on the paper and Ph.D. candidate in the Parker lab. "It couples glucose stimulation and insulin detection in the same flow path, so it can give a clinician actionable information, quickly. The design also uses materials that are amenable to larger-scale manufacturing, which means more people will be able to use it."

"The Islet-on-a-Chip lets us monitor how donated or manufactured islet cells are releasing insulin, as cells in the body can," said Parker, Tarr Family Professor of Bioengineering and Applied Physics at Harvard. "That means we can make serious headway towards cell therapies for diabetes. The device makes it easier to screen drugs that stimulate insulin secretion, test stem cell-derived beta cells, and study the fundamental biology of islets. There is no other quality-control technology out there that can do it as fast, and as accurately."

Harvard's Office of Technology Development has filed patent applications relating to this technology and is actively exploring commercialization opportunities.

"It was exciting to see our lab's method for measuring islet function taken forward from individual islets to much bigger groups of islets, and incorporated into a device that can be used widely in the community," said co-author Michael Roper of Florida State University, whose lab focuses on the fundamental biology of islets. "Now, we have a device that integrates glucose delivery, islet positioning and capture, reagent mixing, and insulin detection, and requires far fewer reagents. So labs can use it to do more experiments at the same cost, using a much shorter and easier process."

"My main interest is in diabetes itself - all the adults in my family have type 2 diabetes, and that is the reason I've pursued science as a career," said Benjamin Pope, co-first author of the study and postdoctoral fellow in the Parker lab. "I am really excited about seeing this technology used in diabetes research and transplantation screening, because it enables cellular therapies for diabetes.

"It's also a beautiful integration of many different technologies," Pope added. "The physics behind the automatic islet trapping, the microfluidics, the real-time sensor and the biochemistry that underlies it, the electronics and data acquisition components - even the software. The overall device and operation system - integrating so many things from different fields, I learned a ton in the process."

Aside from its application to diabetes, the device has promise for use with other tissues and organs. "We can modify the core technology to sense function in a range of microphysiological systems," added Glieberman. "With the ability to detect cell secretions continuously, we want to make it easier to explore how cells use protein signals to communicate. This technology may eventually develop new insights into dynamic metrics of health for both diagnostics and treatment."

Credit: 
Harvard University

Much fridge food 'goes there to die'

COLUMBUS, Ohio - Americans throw out a lot more food than they expect they will, food waste that is likely driven in part by ambiguous date labels on packages, a new study has found.

"People eat a lot less of their refrigerated food than they expect to, and they're likely throwing out perfectly good food because they misunderstand labels," said Brian Roe, the study's senior author and a professor agricultural, environmental and development economics at The Ohio State University.

This is the first study to offer a data-driven glimpse into the refrigerators of American homes, and provides an important framework for efforts to decrease food waste, Roe said. It was published online this month and will appear in the November print issue of the journal Resources, Conservation & Recycling.

Survey participants expected to eat 97 percent of the meat in their refrigerators but really finished only about half. They thought they'd eat 94 percent of their vegetables, but consumed just 44 percent. They projected they'd eat about 71 percent of the fruit and 84 percent of the dairy, but finished off just 40 percent and 42 percent, respectively.
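Laid out side by side, those survey figures imply a sizable gap between intention and consumption in every category. The short calculation below simply restates the percentages reported above (with "about half" for meat taken as roughly 50 percent); the arithmetic is the only addition.

```python
# Expected vs. actual shares of refrigerated food eaten, from the survey figures above.
categories = {
    # category: (expected to eat %, actually eaten %)
    "meat":       (97, 50),
    "vegetables": (94, 44),
    "fruit":      (71, 40),
    "dairy":      (84, 42),
}

for food, (expected, eaten) in categories.items():
    gap = expected - eaten
    print(f"{food:<11} expected {expected:>3}%  eaten {eaten:>3}%  gap {gap:>3} pct. points")
```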

Top drivers of discarding food included concerns about food safety - odor, appearance and dates on the labels.

"No one knows what 'use by' and 'best by' labels mean and people think they are a safety indicator when they are generally a quality indicator," Roe said, adding that there's a proposal currently before Congress to prescribe date labeling rules in an effort to provide some clarity.

Under the proposal, "Best if used by" would, as Roe puts it, translate to "Follow your nose," and "Use by" would translate to "Toss it."

Other findings from the new study:

People who cleaned out their refrigerators more often wasted more food.

Those who frequently check nutrition labels wasted less food. Roe speculated that those consumers may be more engaged with food and therefore less likely to waste what they buy.

Younger households were less likely to use up the items in their refrigerators, while households of people 65 and older were most likely to avoid waste.

Household food waste happens at the end of the line of a series of behaviors, said Megan Davenport, who led the study as a graduate student in Ohio State's Department of Agricultural, Environmental and Development Economics.

"There's the purchasing of food, the management of food within the home and the disposal, and these household routines ultimately increase or decrease waste. We wanted to better understand those relationships, and how individual products - including their labels - affect the amount of food waste in a home," Davenport said.

The web-based pilot study used data from the State of the American Refrigerator survey and included information about refrigerator contents and practices from 307 initial survey participants and 169 follow-up surveys.

The researchers asked about fruits, vegetables, meats and dairy - in particular how much was there and how much people expected to eat. Then they followed up about a week later to find out what really happened. The surveys also asked about a variety of factors that may have influenced decisions to toss food, including date labels, odor, appearance and cost.

An estimated 43 percent of food waste is due to in-home practices - as opposed to waste that happens in restaurants, grocery stores and on the farm - making individuals the biggest contributors. They're also the most complicated group in which to drive change, given that practices vary significantly from home to home, Roe said.

"We wanted to understand how people are using the refrigerator and if it is a destination where half-eaten food goes to die," he said.

"That's especially important because much of the advice that consumers hear regarding food waste is to refrigerate (and eat) leftovers, and to 'shop' the refrigerator first before ordering out or heading to the store."

Roughly one-third of the food produced worldwide for human consumption -- approximately 1.3 billion tons annually -- is lost or wasted, according to the Food and Agriculture Organization of the United Nations. The organization estimates the annual dollar value of that waste at $680 billion in industrialized countries and $310 billion in developing countries.

This study looked at refrigerated food because that's where most perishable foods are found in a household and where the bulk of efforts to encourage people to waste less food have been focused. In addition to better understanding food waste patterns, the researchers wanted to help identify opportunities to design policy or public messaging that will work in driving down waste.

"Our results suggest that strategies to reduce food waste in the U.S. should include limiting and standardizing the number of phrases used on date labels, and education campaigns to help consumers better understand the physical signs of food safety and quality," Davenport said.

Credit: 
Ohio State University

'Mental rigidity' at root of intense political partisanship on both left and right -- study

People who identify more intensely with a political tribe or ideology share an underlying psychological trait: low levels of cognitive flexibility, according to a new study.

This "mental rigidity" makes it harder for people to change their ways of thinking or adapt to new environments, say researchers. Importantly, mental rigidity was found in those with the most fervent beliefs and affiliations on both the left and right of the political divide.

The study of over 700 US citizens, conducted by scientists from the University of Cambridge, is the largest - and first for over 20 years - to investigate whether the more politically "extreme" have a certain "type of mind" through the use of objective psychological testing.

The findings suggest that the basic mental processes governing our ability to switch between different concepts and tasks are linked to the intensity with which we attach ourselves to political doctrines - regardless of the ideology.

"Relative to political moderates, participants who indicated extreme attachment to either the Democratic or Republican Party exhibited mental rigidity on multiple objective neuropsychological tests," said Dr Leor Zmigrod, a Cambridge Gates Scholar and lead author of the study, now published in the Journal of Experimental Psychology.

"While political animosity often appears to be driven by emotion, we find that the way people unconsciously process neutral stimuli seems to play an important role in how they process ideological arguments."

"Those with lower cognitive flexibility see the world in more black-and-white terms, and struggle with new and different perspectives. The more inflexible mind may be especially susceptible to the clarity, certainty, and safety frequently offered by strong loyalty to collective ideologies," she said.

The research is the latest in a series of studies from Zmigrod and her Cambridge colleagues, Dr Jason Rentfrow and Professor Trevor Robbins, on the relationship between ideology and cognitive flexibility.

Their previous work over the last 18 months has suggested that mental rigidity is linked to more extreme attitudes with regards to religiosity, nationalism, and a willingness to endorse violence and sacrifice one's life for an ideological group.

For the latest study, the Cambridge team recruited 743 men and women of various ages and educational backgrounds from across the political spectrum through the Amazon Mechanical Turk platform.

Participants completed three psychological tests online: a word association game, a card-sorting test - where colours, shapes and numbers are matched according to shifting rules - and an exercise in which participants have a two-minute window to imagine possible uses for everyday objects.

"These are established and standardized cognitive tests which quantify how well individuals adapt to changing environments and how flexibly their minds process words and concepts," said Zmigrod.

The participants were also asked to score their feelings towards various divisive social and economic issues - from abortion and marriage to welfare - and the extent of "overlap" between their personal identity and the US Republican and Democrat parties.

Zmigrod and colleagues found that "partisan extremity" - the intensity of participants' attachment to their favoured political party - was a strong predictor of rigidity in all three cognitive tests. They also found that self-described Independents displayed greater cognitive flexibility compared to both Democrats and Republicans.

Other cognitive traits, such as originality or fluency of thought, were not related to heightened political partisanship, which the researchers argue points to the unique contribution of cognitive inflexibility.

"In the context of today's highly divided politics, it is important we work to understand the psychological underpinnings of dogmatism and strict ideological adherence," said Zmigrod.

"The aim of this research is not to draw false equivalences between different, and sometimes opposing, ideologies. We want to highlight the common psychological factors that shape how people come to hold extreme views and identities," said Zmigrod.

"Past studies have shown that it is possible to cultivate cognitive flexibility through training and education. Our findings raise the question of whether heightening our cognitive flexibility might help build more tolerant societies, and even develop antidotes to radicalization."

"While the conservatism and liberalism of our beliefs may at times divide us, our capacity to think about the world flexibly and adaptively can unite us," she added.

Credit: 
University of Cambridge

A unique conducting state under UV-irradiation

image: A shot of UV light produced a considerable flow of electric current, like spring water, in an insulating molecular crystal. This phenomenon originates from the unique physical properties of the material, realized by the interplay between the two kinds of molecules shown in the image.

Image: 
Royal Society of Chemistry, Image reproduced by permission of Naito Toshio et al. from Journal of Materials Chemistry C.

Photoconduction, where an insulating material exhibits a semiconducting property under photoirradiation, was discovered in 1873 and is now applied in various devices, including optical sensors, CCD cameras, remote controls and solar cells. This long-known and important phenomenon has a weak point: it cannot produce highly conducting, metallic states. Metallic conduction is more suitable or advantageous for electronic devices because it consumes less energy than semiconduction.

In our present work, we found metal-like conduction behavior in a molecular crystal under UV irradiation. This material itself is unusual in that, without irradiation, its behavior falls between that of metals and insulators, yet it proved impossible to make it behave metallically by any method other than UV irradiation. Such a material is often "metallized" by applying high pressure or high temperature, but this was not the case for this particular material. In addition to the newly found metallic photoconduction, our finding is important because it indicates that there are other photoexcited states of matter possessing unique properties. In other words, we should explore the photoexcited states of various materials to find novel properties and functions, just as chemists have done for hundreds of years by synthesizing new materials.

Credit: 
Ehime University

Negative interest rate policies are backfiring -- new research

Negative interest rate policies - where nominal rates are set below zero percent - have been introduced in Europe and Japan to stimulate flagging economies but research from the University of Bath shows the unconventional monetary strategy may be doing more harm than good.

Recently, several major European banks announced plans to pass on negative interest rates to corporations and wealthy individuals. Since 2012, Japan and six European economies - the Eurozone, Denmark, Hungary, Norway, Sweden and Switzerland - have introduced negative interest rates, making it costly for commercial banks to hold their excess reserves with central banks.

Negative interest rates are supposed to stimulate the domestic economy by facilitating an increase in the demand for bank loans. In theory this could increase new capital investment by firms and domestic consumption, via credit creation.

But the research showed bank margins were being squeezed, curbing loan growth and damaging banking profits.

"This is a good example of unintended consequences. Our study shows negative interest rate policy has backfired, particularly in an environment where banks are already struggling with profitability, slow economic recovery, historically high levels of non-performing loans, and a post banking-crisis deleveraging phase," said Dr. Ru Xie of the university's School of Management.

"If bank margins are compressed due to low long term yields, and if there is limited loan growth, then bank profits will fall accordingly. The decline in profits can erode bank capital bases and hitherto further limit credit growth, thus stifling any positive impact on domestic demand from negative interest rate policy monetary transmission effects," Xie said.

Xie, working with researchers from Bangor Business School, the U.S. Department of the Treasury and the University of Sharjah in the United Arab Emirates, identified new evidence that bank margins and profitability fared worse in countries that adopted negative interest rates than in countries that did not pursue this policy.

The results also suggested that, following the introduction of negative interest rates, bank lending was weaker than in countries that did not adopt the policy. This was largely driven by net interest margins compressed by persistently low long-term yields.

Xie said negative interest rates also appear to have cancelled out the stimulus impact of other forms of unconventional monetary policy such as quantitative easing.

Credit: 
University of Bath

Researchers develop process flow for high-res 3D printing of mini soft robotic actuators

image: A generic process flow is proposed to guide the 3D printing of miniature soft pneumatic actuators that are smaller than a coin. A soft debris remover with an integrated miniature gripper can realize navigation through a confined space and collection of small objects in hard-to-reach positions.

Image: 
SUTD

Soft robots are a class of robotic systems made of compliant materials and capable of safely adapting to complex environments. They have seen rapid growth recently and come in a variety of designs spanning multiple length scales, from meters to submicrometers.

In particular, small soft robots at millimeter scale are of practical interest as they can be designed as a combination of miniature actuators simply driven by pneumatic pressure. They are also well suited for navigation in confined areas and manipulation of small objects.

However, scaling soft pneumatic robots down to the millimeter range shrinks their features by more than an order of magnitude. The design complexity of such robots demands great delicacy when they are fabricated with traditional processes such as molding and soft lithography. Although emerging 3D printing technologies like digital light processing (DLP) offer high theoretical resolutions, dealing with microscale voids and channels without causing clogging has remained challenging. Indeed, successful examples of 3D-printed miniature soft pneumatic robots are rare.

Recently, researchers from Singapore and China, namely from the Singapore University of Technology and Design (SUTD), Southern University of Science and Technology (SUSTech) and Zhejiang University (ZJU), proposed a generic process flow for guiding DLP 3D printing of miniature pneumatic actuators for soft robots with overall size of 2-15 mm and feature size of 150-350 μm (please refer to image). Their research was published in Advanced Materials Technologies.

"We leveraged the high efficiency and resolution of DLP 3D printing to fabricate miniature soft robotic actuators," said Associate Professor Qi (Kevin) Ge from SUSTech, lead researcher of the research project. "To ensure reliable printing fidelity and mechanical performance in the printed products, we introduced a new paradigm for systematic and efficient tailoring of the material formulation and key processing parameters."

In DLP 3D printing, photo-absorbers are commonly added to polymer solutions to enhance the printing resolution in both lateral and vertical directions. However, increasing the dose too far rapidly degrades the material's elasticity, which is crucial for soft robots to sustain large deformations.

"To achieve a reasonable trade-off, we first selected a photo-absorber with good absorbance at the wavelength of the projected UV light and determined the appropriate material formulation based on mechanical performance tests. Next, we characterized the curing depth and XY fidelity to identify the suitable combination of exposure time and sliced layer thickness," explained co-first author Yuan-Fang Zhang from SUTD.

"By following this process flow, we are able to produce an assortment of miniature soft pneumatic robotic actuators with various structures and morphing modes, all smaller than a one Singapore Dollar coin, on a self-built multimaterial 3D printing system. The same methodology should be compatible with commercial stereolithography (SLA) or DLP 3D printers as no hardware modification is required," said corresponding author Professor Qi Ge from SUSTech.

To exemplify the potential applications, the researchers also devised a soft debris remover comprising a continuum manipulator and a 3D printed miniature soft pneumatic gripper. It can navigate through a confined space and collect small objects in hard-to-reach positions.

The proposed approach paves the way for 3D printing miniature soft robots with complex geometries and sophisticated multimaterial designs. This integration of printed miniature soft pneumatic actuators into a robotic system offers opportunities for potential applications such as jet-engine maintenance and minimally invasive surgery.

Credit: 
Singapore University of Technology and Design

Providing a solution to the worst-ever prediction in physics

The cosmological constant, introduced a century ago by Albert Einstein in his theory of general relativity, is a thorn in the side of physicists. The difference between the theoretical prediction of this parameter and its measurement based on astronomical observations is of the order of 10^121. It is no surprise that this estimate is considered the worst in the entire history of physics. In an article to be published in Physics Letters B, a researcher from the University of Geneva (UNIGE), Switzerland, proposes an approach that may resolve this inconsistency. The original idea in the paper is to accept that another constant - Newton's universal gravitational constant G, which also appears in the equations of general relativity - may vary. This potentially major breakthrough, which has been positively received by the scientific community, still needs to be pursued in order to generate predictions that can be confirmed (or refuted) experimentally.

"My work consists of a new mathematical manipulation of the equations of general relativity that finally makes it possible to harmonise theory and observation on the cosmological constant," begins Lucas Lombriser, assistant professor in the Department of Theoretical Physics in UNIGE's Faculty of Sciences and sole author of the article.

Expansion in full acceleration

The cosmological constant Λ (lambda) was introduced into the equations of general relativity by Einstein over a century ago. The celebrated physicist needed the constant to ensure that his theory would be compatible with a universe he believed was static. However, in 1929 another physicist - Edwin Hubble - discovered that the galaxies are all moving away from each other, a sign that the universe is actually expanding. On learning this, Einstein rued the fact that he had introduced the cosmological constant, which had become useless in his eyes, and even described it as "the greatest blunder of my life".

In 1998, the precise analysis of distant supernovae offered proof that the expansion of the universe, far from being constant, is actually accelerating, as though a mysterious force is swelling the cosmos more and more rapidly. The cosmological constant was then once more called on in order to describe what physicists call "vacuum energy" - an energy whose nature is unknown (we talk about dark energy, quintessence, etc.) but which is responsible for the accelerated expansion of the universe.

The most precise observations of supernovae, and especially of the cosmic microwave background (microwave radiation that comes from all parts of the sky and which is considered to be left over from the Big Bang), have made it possible to measure an experimental value for this cosmological constant. The result is a very small figure (1.11 × 10^-52 m^-2) that is nevertheless large enough to generate the desired effect of accelerated expansion.

Huge gap between theory and observation

The problem is that the theoretical value of the cosmological constant is very different. This value is obtained using quantum field theory: this holds that pairs of particles on a very small scale are created and destroyed almost instantaneously at every point of space and at any moment. The energy of this "vacuum fluctuation" - a very real phenomenon - is interpreted as a contribution to the cosmological constant. But when its value is calculated, an enormous figure is obtained (3.83 × 10^+69 m^-2), which is largely incompatible with the experimental value. This estimate represents the largest gap ever obtained (by a factor of 10^121, i.e. a "1" followed by 121 zeros) between theory and experiment across science.
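The size of that mismatch follows directly from the two values quoted above; the quick check below simply divides the theoretical estimate by the measured one.

```python
observed = 1.11e-52    # measured cosmological constant, in m^-2 (quoted above)
predicted = 3.83e69    # quantum field theory estimate, in m^-2 (quoted above)

print(f"theory / observation = {predicted / observed:.2e}")
# prints ~3.45e+121, i.e. a discrepancy of the order of 10^121
```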

This problem of the cosmological constant is one of the "hottest" subjects in current theoretical physics, and it is mobilising numerous researchers around the world. Everyone is looking at the equations of general relativity from all sides in an attempt to unearth ideas that will solve the question. Although several strategies have been put forward, there is no general consensus for the time-being.

Professor Lombriser, for his part, had the original idea a few years ago of introducing a variation in Newton's universal gravitational constant G, which appears in Einstein's equations. This means that the universe in which we live (with G equal to 6.674 08 × 10^-11 m^3 kg^-1 s^-2) becomes a special case among an infinite number of different theoretical possibilities.

After numerous developments and hypotheses, Professor Lombriser's mathematical approach makes it possible to calculate the parameter ΩΛ (omega lambda), which is another way of expressing the cosmological constant but which is much easier to manipulate. This parameter also designates the fraction of the present-day universe that is made up of dark energy (the rest being composed of matter). The theoretical value obtained by the Geneva-based physicist is 0.704, or 70.4%. This figure is in close agreement with the best experimental estimate obtained to date, 0.685 or 68.5%, a huge improvement over the 10^121 discrepancy.

This initial success now needs to be followed by further analyses in order to verify whether the new framework proposed by Lombriser can be used to reinterpret or clarify other mysteries of cosmology. The physicist has already been invited to present and explain his approach in scientific conferences, which reflects the interest shown by the community.

Credit: 
Université de Genève

Mechanism of epilepsy-causing membrane protein is discovered

image: (Left) CLC transporter structure identified by Dr. Mackinnon's group.
(Right) CLC transporter structure newly identified in this study.

Image: 
© Korea Brain Research Institute

On August 21, Korea Brain Research Institute (KBRI, President Pann-Ghill Suh) announced that a team led by principal researcher Lim Hyun-Ho discovered a new 3D structure and mechanism of a membrane protein whose dysfunction causes epilepsy and muscle problems.

The study results were published in the August issue of the Proceedings of the National Academy of Sciences (PNAS), an international journal, with the paper title and authors as follows.

* Paper name: Mutation of external glutamate residue reveals a new intermediate transport state and anion binding site in a CLC Cl-/H+ antiporter

* Authors: Kunwoong Park (first author), Byoung-Cheol Lee, and Hyun-Ho Lim (corresponding author)

Neurons control physiological phenomena such as the delivery of electrical signals and the secretion of signaling molecules by exchanging chloride ions (Cl-) and hydrogen ions (H+) across the cell membrane. If there is a problem with the CLC transporter protein that is involved in this process, muscle problems, epilepsy, hearing loss and blindness can develop.

The research team led by Dr. Lim Hyun-Ho succeeded in identifying, for the first time in the world, a new structure of the external glutamate residue, which plays a critical role in ion exchange by single CLC transporter proteins.

The research team produced mutant CLC proteins in which the external glutamate residue is changed and determined their 3D structures under nine different conditions. In addition, the team identified new sites in the transporter where chloride ions (Cl-) bind. Based on this, the team showed for the first time that a single CLC protein can adopt four different structures during the ion exchange process.

Dr. Mackinnon of Rockefeller University, who won the Nobel Prize in Chemistry in 2003 and was the first to identify a CLC protein structure, predicted that this protein would show more than three kinds of structural diversity, but fewer than two types of structure had been reported for the same species.

This research is meaningful in that it identifies a new structure and function in a membrane protein, a class of proteins for which structural determination is difficult, and clarifies the principle of material transport. The study is expected to lead to technologies that control various physiological phenomena and diseases by modulating functions based on membrane proteins.

Dr. Lim Hyun-Ho said, "Our team could achieve good results thanks to KBRI, which provided long-standing support for systematic research that integrates structure and function, even though the results could not be achieved immediately." He added, "Our team will continue our research on membrane proteins, which are essential to maintaining the physiological function of the brain."

Credit: 
Korea Brain Research Institute

Friendships factor into start-up success (and failure)

New research co-authored by Cass Business School academics has found entrepreneurial groups with strong friendship bonds are more likely to persist with a failing venture and escalate financial commitment to it.

Hundreds of thousands of new businesses are registered in the UK every year but 20 per cent fail within the first 12 months and 60 per cent are terminated within their first three years.

Decisions to terminate a venture typically occur as financial losses increase, and founding entrepreneurs typically face the decision to escalate their commitment to a failing venture multiple times before finally terminating it.

Against this backdrop, researchers Tori Yu-wen Huang and Vangelis Souitaris of Cass Business School and Sigal G. Barsade of The Wharton School, University of Pennsylvania, sought to understand how entrepreneurial teams react when their venture's finances begin to suffer.

The study found that the stronger the friendship bond among the team members, the more likely the teams were to escalate their commitment to failing ventures rather than terminate them.

"Our results indicate the importance of entrepreneurs understanding and managing their team emotions for best decision making," write the researchers.

"It also helps explain the continued engagement of entrepreneurial teams who, even when fearful, have hope."

Emotions have been shown to be important in entrepreneurial decision-making, and those emotions are strongly felt when ventures face termination.

"We focus on the influence of group fear and group hope because, compared to other emotions, fear and hope are more associated with uncertainty, which is inherent to the decision to escalate commitment to a venture," write the researchers.

"We compare a founding team's fear that a currently failing venture will ultimately increase financial losses to their hope that the venture can be turned around, recover the losses, and ultimately make money."

Using a simulation based on data from 66 entrepreneurial teams across 569 decision-making rounds, they found that "hope trumps fear" -- that is, the relationship between group hope and escalating commitment to a failing venture is stronger than the relationship between group fear and terminating that venture.

Since entrepreneurs invest not only money but time, effort, and attention in their ventures, the researchers examined the team's engagement as a mediator between fear and hope and escalation of commitment versus termination and found that it explained the results.

"We employed an immersive laboratory methodology to realistically simulate and observe teams of three business students serving as co-founders of a computer startup," write the researchers.

"To examine the dynamic nature of these decisions, we longitudinally tracked each team's joint level of fear, hope and behavioural engagement through multiple rounds of simulation."

The researchers conclude that although escalating commitment to what is forecasted to be a failing venture can be costly, persistence is a quality widely valued in business and entrepreneurial contexts.

"Distinguishing 'problematic' escalation from 'fruitful' persistence is kind of an art, and a skill that entrepreneurs have to develop," Professor Vangelis Souitaris said.

Credit: 
City St George’s, University of London