How to feed the world by 2050? Recent breakthrough boosts plant growth by 40 percent

image: Four unmodified plants (left) grow beside four plants (right) engineered with alternate routes to shortcut photorespiration -- an energy-expensive process that costs yield potential. The modified plants are able to reinvest their energy and resources to boost productivity by 40 percent.

Image: 
Claire Benjamin/RIPE Project

One of the most significant challenges of the 21st Century is how to sustainably feed a growing and more affluent global population with less water and fertilizers on shrinking acreage, despite stagnating yields, threats of pests and disease, and a changing climate. Recent advances to address hunger through agricultural discovery will be highlighted at this year's annual meeting of the American Association for the Advancement of Science (AAAS) at 8 a.m. Feb. 16, 2019, at the Marriott Wardman Park.

"The meeting this year is about 'Science Transcending Boundaries'--the idea for the session is to highlight research that is transcending scientific and knowledge boundaries, with the ultimate goal to transcend geographic boundaries and reach smallholder farmers in Africa," said Lisa Ainsworth, a scientist with the U.S. Department of Agriculture, Agricultural Research Service (USDA-ARS) and an adjunct professor of plant biology at the University of Illinois. Recently, Ainsworth was awarded the 2019 National Academy of Sciences Prize in Food and Agriculture Sciences.

Session speaker Donald Ort, the Robert Emerson Professor of Plant Biology and Crop Sciences at Illinois' Carl R. Woese Institute for Genomic Biology, will discuss the global food security challenge and a recent breakthrough in Science (see original news release) that boosted crop growth by 40 percent by creating a shortcut for a glitch that plagues most food crops.

"Plants have to do three key things to produce the food we eat: capture sunlight, use that energy to manufacture plant biomass, and divert as much of the biomass as possible into yields like corn kernels or starchy potatoes," Ort said. "In the last century, crop breeders maximized the first and third of these, leaving us with the challenge to improve the process where sunlight and carbon dioxide are fixed--called photosynthesis--to boost crop growth to meet the demands of the 21st Century."

This landmark work is part of Realizing Increased Photosynthetic Efficiency (RIPE), an international research project that is engineering crops to photosynthesize more efficiently to sustainably increase worldwide food productivity with support from the Bill & Melinda Gates Foundation, the Foundation for Food and Agriculture Research (FFAR), and the U.K. Government's Department for International Development (DFID).

"Land plants evolved with a biochemical glitch whereby a photosynthetic enzyme frequently captures oxygen instead of carbon dioxide, necessitating a convoluted and energy-expensive process called photorespiration to mitigate this glitch," said Ort, who is also the deputy director of the RIPE project. "Crops like soybean and wheat waste more than 30 percent of the energy they generate from photosynthesis dealing with this glitch, but modeling suggested that photorespiratory shortcuts could be engineered to help the plant conserve its energy and reinvest it into growth."

Borrowing genes from algae and pumpkins, the team engineered three alternate routes to replace the circuitous native photorespiration pathway in tobacco, a model plant used to show proof of concept before scientists move technologies to food crops that are much more difficult and time-consuming to engineer and test. Now, the team is translating this work to boost the yields of other crops including soybean, cowpea, rice, potato, tomato, and eggplant.

"It is incredible to imagine the calories lost to photorespiration each year around the globe," Ort said. "To reclaim even a portion of these calories would be a huge success in our race to feed 9.7 billion people by 2050."

Of course, Ort cautions, it will take 15 years or more for these technologies to be translated into food crops and achieve regulatory approval for distribution to farmers. When that day comes, RIPE and its sponsors are committed to ensuring that smallholder farmers, particularly in Sub-Saharan Africa and Southeast Asia, will have royalty-free access to this technology.

Other talks in this session will include "Discoveries to Improve Nitrogen Fixation in Cereals" by Jean-Michel Ané, a professor of agronomy at the University of Wisconsin-Madison, and "Genome Editing for Sustainable Crop Improvement" for the staple food crop cassava by Rebecca Bart, an assistant member at the Donald Danforth Plant Science Center, whose work is also supported by the Gates Foundation. The session will conclude with a panel discussion of how agricultural science is crossing traditional disciplines.

In addition, two leading plant scientists from the IGB will be inducted as Fellows of the AAAS: Andrew Leakey is a professor of plant biology and crop sciences at Illinois who studies plant responses to climate change as well as the development of crops that are more drought tolerant. Ray Ming is a professor of plant biology and an expert on plant genomics and sex chromosome evolution, which could help improve papaya production.

Credit: 
Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign

Political and policy feedbacks in the climate system

image: Two graphs outlining Democrat vs. Republican beliefs that Global Warming is happening.

Image: 
Matto Mildenberger

Matto Mildenberger of the University of California, Santa Barbara, explains how perceived experiences with climate change in the United States can be linked to political shifts in Congress, culture and society. He will demonstrate how partisan opinions about the prevalence and dangers of climate change in each of the 50 states and 435 congressional districts can shape policymaking by Congress.

Announcing the 2018 Partisan Climate Opinion Maps

We are pleased to announce our new estimates of the Democrats and Republicans who hold particular beliefs, attitudes, and policy preferences about global warming. These estimates cover both states and US congressional districts. They visualize the distribution of climate and energy beliefs among US Democrats and US Republicans.

This new data release will be made available shortly at: http://climatecommunication.yale.edu/visualizations-data/

About the Partisan Climate Opinion Maps

Even as US partisan polarization shapes climate and energy beliefs and attitudes, substantial heterogeneity in climate opinions still exists among both Republicans and Democrats. To date, our understanding of this partisan variability has been limited to analyses of national or, less commonly, state-level opinion poll subsamples. The Partisan Climate Opinion Maps provide new data about how Republican and Democratic climate and energy opinions vary across all 50 states and all 435 congressional districts. They reveal new spatial patterns with policy-relevant implications for the trajectory of US climate change policy reforms. These maps have now been updated through 2018 and give new information about the state of partisan climate and energy beliefs in the current political context.

The public opinion estimates were generated using a statistical model that combines nationally representative survey data gathered by the Yale Program on Climate Change Communication and the George Mason Center for Climate Change Communication between 2008 and 2016 with voter registration, U.S. census, and geographic data. Party registration data is available for 32 states, and is imputed in the remaining states (i.e., in Alabama, Georgia, Illinois, Indiana, Michigan, Minnesota, Mississippi, Missouri, Montana, North Dakota, South Carolina, Tennessee, Texas, Vermont, Virginia, Washington, and Wisconsin).
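The release does not publish the model itself, but the core downscaling idea it describes, combining national survey responses with local demographic composition, can be sketched as a simple poststratification step. Everything below (the cell definitions, the opinion shares, the district counts) is invented for illustration and is not the Yale/George Mason model.

```python
# Hedged sketch of poststratification: opinion estimates for demographic
# cells (as a multilevel model fit to national surveys might produce) are
# reweighted by each district's composition. All numbers are invented.

def poststratify(cell_estimates, cell_counts):
    """Population-weighted average of per-cell opinion estimates."""
    total = sum(cell_counts.values())
    return sum(cell_estimates[cell] * n for cell, n in cell_counts.items()) / total

# Invented cells: (party, age bracket) -> modeled share saying warming is happening
cell_estimates = {
    ("D", "18-44"): 0.90, ("D", "45+"): 0.82,
    ("R", "18-44"): 0.56, ("R", "45+"): 0.42,
}

# Two hypothetical districts with different partisan/age compositions
district_a = {("D", "18-44"): 40_000, ("D", "45+"): 30_000,
              ("R", "18-44"): 15_000, ("R", "45+"): 15_000}
district_b = {("D", "18-44"): 10_000, ("D", "45+"): 15_000,
              ("R", "18-44"): 30_000, ("R", "45+"): 45_000}

print(poststratify(cell_estimates, district_a))  # higher: more Democratic district
print(poststratify(cell_estimates, district_b))  # lower: more Republican district
```

The same cell-level estimates yield different district-level opinion shares purely because the districts' compositions differ, which is how a single national survey can be projected down to 435 districts.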

Credit: 
University of California - Santa Barbara

Altered data sets can still provide statistical integrity and preserve privacy

Synthetic networks may increase the availability of some data while still protecting individual or institutional privacy, according to a Penn State statistician.

"My key interest is in developing methodology that would enable broader sharing of confidential data in a way that can aid in scientific discovery," said Aleksandra Slavkovic, professor of statistics and associate dean for graduate education, Eberly College of Science, Penn State. "Being able to share confidential data with minimal quantifiable risk for discovery of sensitive information and still ensure statistical accuracy and integrity is the goal."

Slavkovic has found solutions to this data privacy problem through interdisciplinary collaborations, especially with computer and social scientists. Her research focuses on various data, including network data that capture relationship information between entities such as individuals or institutions. She reported her approaches to providing synthetic networks that satisfy a notion of differential privacy today (Feb 16) during the 2019 annual meeting of the American Association for the Advancement of Science in Washington, D.C.

Differential privacy provides a mathematically provable guarantee of the level of privacy loss to individuals.
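The talk itself does not include code, but the flavor of that guarantee can be illustrated on a single network statistic. The sketch below (an assumption-laden toy, not Slavkovic's method) releases a network's edge count with Laplace noise: adding or removing one edge changes the count by at most 1, so noise with scale 1/epsilon makes this one query edge-level differentially private.

```python
# Toy illustration of differential privacy on a network statistic.
# Not the method from the talk; invented example data.
import math
import random

def laplace_noise(scale, rng):
    """Draw one sample from Laplace(0, scale) via inverse transform sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_edge_count(edges, epsilon, rng):
    sensitivity = 1.0  # one edge changes the count by at most 1
    return len(edges) + laplace_noise(sensitivity / epsilon, rng)

# A tiny toy network: edges as unordered pairs of node ids
edges = {(1, 2), (1, 3), (2, 3), (3, 4), (4, 5)}
print(dp_edge_count(edges, epsilon=1.0, rng=random.Random(0)))
```

Smaller epsilon means more noise and stronger privacy; releasing a whole synthetic network that preserves many statistics at once, as described above, is a far harder version of this same trade-off.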

Scientists want access to data collected by others for their research, but such access could also compromise personal privacy, even after removal of so-called personally identifiable data.

"An abundance of auxiliary data is the main culprit," said Slavkovic. "With methodological and technological advances in data collection and record linkage, easier access to a variety of data sources that could be linked with a dataset in hand, and funding agencies' requirements to share data, the risks to data privacy are increasing. But finding good solutions for managing privacy loss is essential for enabling sound scientific discovery."

Publicly available information from a drug trial on an HIV drug, for example, would indicate who was in the treatment group and who was in the control group. The treatment group would contain only people diagnosed with HIV and even though the data owners withheld personal particulars from that data set, some identifying information would remain. Because so much information is today available online in social media and in other datasets, it is possible to connect the dots and identify people, potentially revealing their HIV status.

"Techniques to link two data sets, say voter records and health insurance data, have greatly improved," said Slavkovic. "In one of the earliest findings, Latanya Sweeney (now at Harvard) showed that by linking these types of data, you can identify 87 percent of the people in the U.S. Census from 1990 based on their date of birth, gender and 5-digit zip code. More recently, researchers used tweets and associated Twitter metadata to show that they can identify users with 96.7 percent accuracy."

Slavkovic notes that it is not just people or institutions whose data are contained in the databases: people outside the database can also suffer from invasion of privacy, directly or by association. Linkages between information in a dataset and information on social media might lead to a serious privacy breach -- something like HIV status or sexual orientation could have severe repercussions if revealed.

While privacy is important, collected datasets make up an essential source of information for researchers. Currently, in some cases when the data are exceptionally sensitive, researchers must physically go to the data repositories to do their research, making research more difficult and expensive.

Slavkovic is interested in network data: information that captures the interconnectedness of people or institutions -- the nodes -- and the connections between them. Her approach is to create slightly altered, mirrored network datasets with a few of the nodes moved, connections shifted or edges altered.

"The aim is to create new networks that satisfy the rigorous differential privacy requirements and at the same time capture most of the statistical features from the original network," said Slavkovic.

These synthetic datasets might be sufficient for some researchers to satisfy their research needs. For others, they would be sufficient to test their approaches and hypotheses before having to go to the data storage site. Researchers could test code, do exploratory research and perhaps basic analysis while waiting for permission to use the original data at its repository site.

"We can't satisfy demands for all statistical analysis with the same type of altered data," said Slavkovic. "Some people will need the original data, but others might go a long way with synthetic data such as synthetic networks."

Credit: 
Penn State

Large-scale window material developed for PM2.5 capture and light tuning

image: Photograph of a large-scale conductive nylon mesh. Inset: photograph of Ag nanowire ink using ethanol as a solvent at a concentration of 3.92 mg mL⁻¹.

Image: 
YU Shuhong

Tuning the light intensity and reducing the concentration of atmospheric particulate matter (PM) in commercial buildings are both crucial to keeping occupants comfortable and healthy. Smart windows fabricated on flexible transparent electrodes can change their transmittance in response to electrical or thermal stimuli, tuning indoor light intensity to maintain thermal comfort. Until now, however, fabricating a large-scale flexible transparent smart window capable of high-efficiency PM2.5 capture has remained a significant challenge.

Recently, a research team led by Prof. YU Shuhong from the University of Science and Technology of China (USTC) developed a simple solution-based process to fabricate large-area Ag-nylon flexible transparent windows for high-efficiency PM2.5 capture.

It takes only about $15.03 and 20 minutes to fabricate a 7.5 m² Ag-nylon flexible transparent window that, without any modification, shows a sheet resistance as low as 8.87 Ω sq⁻¹ and an optical transmittance of 86.05%.

The resulting Ag-nylon mesh serves not only to tune indoor light intensity as a thermochromic smart window, after being uniformly coated with a thermochromic dye, but also to purify indoor air as a high-efficiency PM2.5 filter.

Time-dependent temperature profiles and uniform heat distribution show that the Ag-nylon electrodes can serve as an ideal intelligent thermochromic smart window with excellent mechanical stability: performance remains stable even after 10,000 bending cycles at a minimum bending radius of 2.0 mm and 1,000 cycles of stretching deformation at mechanical strains as high as 10%.

In addition, the Ag-nylon electrodes can be used as a PM filter, showing a removal efficiency of 99.65% and remaining stable even after 100 cycles of PM filtration and cleaning.

The success of the present design strategy provides more choices in developing next-generation flexible transparent smart windows and air pollution filters.

Credit: 
University of Science and Technology of China

First evidence discovered of a gigantic remnant around an exploding star

A San Diego State University astrophysicist has helped discover evidence of a gigantic remnant surrounding an exploding star--a shell of material so huge, it must have been erupting on a regular basis for millions of years.

When a white dwarf, the core of a dead star, is in a close orbit with another star, it pulls gas from the other star. The gas becomes heated and compressed, eventually exploding to create a nova. This explosion causes the star to brighten by a millionfold and eject material at thousands of miles per second. The ejected material forms a remnant or shell surrounding the nova.

Allen Shafter and former SDSU postdoc Martin Henze, along with a team of astrophysicists led by Matthew Darnley at Liverpool John Moores University in England, have been studying a nova in the nearby Andromeda galaxy known as M31N 2008-12a. What makes the nova unusual is that it erupts far more frequently than any other known nova system.

"When we first discovered that M31N 2008-12a erupted every year, we were very surprised," said Shafter. A more typical pattern is about every 10 years.

Shafter and his team believe M31N 2008-12a has been erupting regularly for millions of years. These frequent eruptions over time have resulted in a "super remnant" surrounding the nova measuring almost 400 light years across.

Using Hubble Space Telescope imaging along with ground-based telescopes, the team worked to determine the chemical composition of the super-remnant and confirm its association with M31N 2008-12a. These findings, published in an article in the journal Nature, open the door to the possibility that this nova and remnant are linked to something more crucial to the universe.

Type Ia supernovae are among the most powerful and luminous objects in the universe and are believed to occur when a white dwarf exceeds its maximum allowable mass. At that point, the entire white dwarf is blown apart instead of experiencing explosions on the surface as other novae do. These events are relatively rare, and none has been seen in our own galaxy since the early 1600s.

Theoretical models show that novae experiencing frequent explosions surrounded by large remnants must harbor massive white dwarfs that are nearing their limit. This means M31N 2008-12a is behaving precisely the way astronomers believe a nova does before it potentially explodes as a supernova.

The discovery of additional large remnants around other novae will help identify systems undergoing repeated eruptions and help astronomers determine how many Type Ia supernovae are formed, how frequently they occur, and their potential association with novae like M31N 2008-12a. Type Ia supernovae are a critical part of understanding how the entire universe expands and grows.

"They are, in effect, the measuring rods that allow us to map the visible universe," said Shafter. "Despite their importance, we don't fully understand where they come from."

Shafter and his team are now working to understand if what they observed with M31N 2008-12a is rare, or if there is an unseen population of novae experiencing this as well.

Credit: 
San Diego State University

Study shows hope for fighting disease known as Ebola of frogs

image: UCF Biologist Anna Savage and her team are unraveling the mystery surrounding frogs on the brink of mass extinction.

Image: 
William Hawthorne

Despite widespread infection, some frog populations are surviving a deadly disease that is the equivalent of mankind's Ebola virus. The reason: genetic diversity.

That's the finding of a new study published this week in the journal Immunogenetics. Anna Savage, an assistant professor of biology at the University of Central Florida, is the lead author of the study.

The research is important because frogs are facing what may be a mass extinction as a result of disease, Savage says.

"If you have more genetic variation, you have more potential to respond and adapt to anything," Savage says.

However, protecting frog habitats from destruction and pollution is critical, she says.

"Don't destroy habitats, maintain large population sizes -- these simple things are the best actions to implement, given whatever limited information we have, to give populations the chance to rebound," she says.

The virus Savage and her colleagues studied is called Ranavirus. It affects cold-blooded animals, such as amphibians, reptiles and fish. It causes a tadpole's internal organs to fill with blood and explode, much like the Ebola virus does in humans. It is one of the top two pathogens causing worldwide amphibian decline.

Researchers suspect that Ranavirus and other similar pathogens have long been in the environment, but they are exploring why the pathogens are now causing so many disease outbreaks.

"Certainly, the rise of these infectious pathogens coincides with the period when global temperatures started to significantly increase," she says. "There are a lot of biologists working on studies trying to tease apart the relationship between climate and amphibian health and how that might translate to some of these global disease problems."

It is important to study frogs because of the roles they play, Savage says. They help control diseases by eating insects that can infect humans and also are an essential part of the food chain.

"If we lost them, there would be this major energetic crisis where we wouldn't have a food source for many other animals that depend on them to survive," Savage says.

In the study, researchers collected tail clippings from tadpoles in 17 randomly selected ponds in Patuxent Research Refuge in Maryland over the course of two years. Tail clipping is a minimally invasive and nonlethal method for tissue collection. The clippings were used to analyze and determine the presence and severity of Ranavirus in the tadpoles. The team also checked for major histocompatibility complex (MHC) genes, which can help a tadpole's immune system fight off disease.

They found Ranavirus infection in 26 percent of the 381 tadpoles they sampled and that the presence of a particular combination of MHC genes was associated with decreased severity of the virus.
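The paper reports the association itself rather than code, but the kind of 2x2 comparison behind such a finding can be sketched as a one-sided Fisher exact test built from the hypergeometric distribution. The counts below are invented for illustration and are not the study's data.

```python
# Hedged sketch of a 2x2 association test between carrying a protective
# MHC gene combination and infection severity. Counts are invented.
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """P(observing a or fewer severe cases among variant carriers), for the
    table [[a, b], [c, d]] = [[carrier severe,     carrier mild],
                              [non-carrier severe, non-carrier mild]]."""
    n = a + b + c + d
    carriers = a + b
    severe = a + c
    def pmf(k):  # hypergeometric probability of k severe carriers, margins fixed
        return comb(carriers, k) * comb(n - carriers, severe - k) / comb(n, severe)
    k_min = max(0, severe - (n - carriers))
    return sum(pmf(k) for k in range(k_min, a + 1))

# Invented example: carriers of the variant combination are rarely severe
p = fisher_exact_one_sided(a=2, b=28, c=25, d=45)
print(p)  # a small p-value: fewer severe carriers than chance would predict
```

A small p-value here says that, with the table's margins fixed, so few severe cases among carriers is unlikely under independence, which is the statistical shape of "this gene combination limits how bad the infection gets."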

"There was evidence that this combination of immune genes was helping those individuals limit how bad the viral infection can get," she says. "To our knowledge this is the first study that shows that this group of immune genes is actually important for Ranavirus susceptibility."

The findings could have implications for frog species in Florida, as Ranavirus threatens frogs in the state, including the American bullfrog, the southern leopard frog and the endangered gopher frog.

"These immune genes aren't completely different across different species," she says. "We actually see a lot of the same variants shared at the level of the entire genus or even the whole family. So, some of the work we've done is showing that we're finding the same genetic variants in wood frogs as in other frogs, including species in Florida."

Credit: 
University of Central Florida

Artificial intelligence can predict survival of ovarian cancer patients

The artificial intelligence software, created by researchers at Imperial College London and the University of Melbourne, has been able to predict the prognosis of patients with ovarian cancer more accurately than current methods. It can also predict what treatment would be most effective for patients following diagnosis.

The trial, published in Nature Communications, took place at Hammersmith Hospital, part of Imperial College Healthcare NHS Trust.

Researchers say that this new technology could help clinicians administer the best treatments to patients more quickly and paves the way for more personalised medicine. They hope that the technology can be used to stratify ovarian cancer patients into groups based on the subtle differences in the texture of their cancer on CT scans rather than classification based on what type of cancer they have, or how advanced it is.

Professor Eric Aboagye, lead author and Professor of Cancer Pharmacology and Molecular Imaging, at Imperial College London, said:

"The long-term survival rates for patients with advanced ovarian cancer are poor despite the advancements made in cancer treatments. There is an urgent need to find new ways to treat the disease. Our technology is able to give clinicians more detailed and accurate information on how patients are likely to respond to different treatments, which could enable them to make better and more targeted treatment decisions."

Professor Andrea Rockall, co-author and Honorary Consultant Radiologist, at Imperial College Healthcare NHS Trust, added:

"Artificial intelligence has the potential to transform the way healthcare is delivered and improve patient outcomes. Our software is an example of this and we hope that it can be used as a tool to help clinicians with how to best manage and treat patients with ovarian cancer."

Ovarian cancer is the sixth most common cancer in women and usually affects women after the menopause or those with a family history of the disease. There are 6,000 new cases of ovarian cancer a year in the UK but the long-term survival rate is just 35-40 per cent as the disease is often diagnosed at a much later stage once symptoms such as bloating are noticeable. Early detection of the disease could improve survival rates.

Doctors diagnose ovarian cancer in a number of ways including a blood test to look for a substance called CA125 - an indication of cancer - followed by a CT scan that uses x-rays and a computer to create detailed pictures of the ovarian tumour. This helps clinicians know how far the disease has spread and determines the type of treatment patients receive, such as surgery and chemotherapy.

However, the scans can't give clinicians detailed insight into patients' likely overall outcomes or on the likely effect of a therapeutic intervention.

Researchers used a mathematical software tool called TEXLab to identify the aggressiveness of tumours in CT scans and tissue samples from 364 women with ovarian cancer between 2004 and 2015.

The software examined four biological characteristics of the tumours which significantly influence overall survival - structure, shape, size and genetic makeup - to assess the patients' prognosis. The patients were then given a score known as Radiomic Prognostic Vector (RPV) which indicates how severe the disease is, ranging from mild to severe.

The researchers compared the results with blood tests and current prognostic scores used by doctors to estimate survival. They found that the software was up to four times more accurate for predicting deaths from ovarian cancer than standard methods.

The team also found that five per cent of patients with high RPV scores had a survival rate of less than two years. High RPV was also associated with chemotherapy resistance and poor surgical outcomes, suggesting that RPV can be used as a potential biomarker to predict how patients would respond to treatments.

Professor Aboagye suggests that this technology can be used to identify patients who are unlikely to respond to standard treatments and offer them alternative treatments.

The researchers will carry out a larger study to see how accurately the software can predict the outcomes of surgery and/or drug therapies for individual patients.

The study was funded by the NIHR Imperial Biomedical Research Centre, the Imperial College Experimental Cancer Medicine Centre and Imperial College London Tissue Bank.

This research is an example of the work carried out by Imperial College Academic Health Science Centre, a joint initiative between Imperial College London and three NHS hospital trusts. It aims to transform healthcare by turning scientific discoveries into medical advances to benefit local, national and global populations in as fast a timeframe as possible.

Credit: 
Imperial College London

Surprise findings turn up the temperature on the study of vernalization

image: A new study highlights surprise findings on vernalization in wheat.

Image: 
Sam Sapin

Researchers have uncovered new evidence about the agriculturally important process of vernalization in a development that could help farmers deal with financially damaging weather fluctuations.

Vernalization is the process by which plants require prolonged exposure to cold temperature before they transition from the vegetative state to flower. For decades it's been a key focus of research into plant development and crop productivity.

But how vernalization might work under variable temperatures in the field has been unclear, as have some of the underlying molecular controls of the process.

The research carried out by John Innes Centre scientists in collaboration with colleagues in Hungary and France shows that vernalization is influenced by warm conditions as well as cold, and a much wider temperature range than previously thought.

Led by Dr Laura Dixon, the study began as an exploration into how variance in ambient temperatures might influence flowering regulation in winter wheat. But it unexpectedly uncovered an "extreme vernalization response".

"We have shown that vernalization responds to warmer conditions than those classically associated with vernalizing. Before this study we thought vernalization only happened up to a maximum of about 12°C, but the true temperature is much higher. This information is immediately useful to breeders," says Dr Dixon.

The researchers used a panel of 98 wheat cultivars and landraces and exposed them to temperatures ranging from 13 to 25 °C in controlled environments.

Normally, once the vernalization process completes, plant growth is accelerated under warm temperatures. But the team identified one cultivar, named Charger, which did not follow this standard response.

Gene expression analysis revealed that the wheat floral activator gene (VRN-A1) was responsible for this trait. Further experiments showed that expression of genes that delay flowering is reactivated in response to high temperatures (of up to 24 °C), demonstrating that vernalization is not only a consequence of how long the plant experiences continuous cold.

This study published in the journal Development highlights complex workings of a genetic network of floral activators and repressors that coordinate a plant's response to a range of temperature inputs. It also finds that the Charger cultivar is an extreme version of a response to warmer temperatures that may be prevalent in winter wheat cultivars.

The team is now looking to provide diagnostic genetic markers which will allow breeders to track the distinct allele responsible for this warm-temperature vernalization trait. They also hope to use their new knowledge of warm weather interruption to reduce the length of vernalization in the breeding cycle, so that new wheat lines can be generated more quickly.

Dr Dixon explains: "This study highlights that to understand the vernalization response in agriculture we must dissect the process in the field and under variable conditions. The knowledge can be used to develop new wheat cultivars that are more robust to changing temperatures."

Credit: 
John Innes Centre

Lithium-air batteries can store energy for cars, houses and industry

image: Growth in the supply of renewable energy sources will mean increased demand for optimal energy-storage devices, said Rubens Maciel Filho, a professor at the School of Chemical Engineering of the University of Campinas (UNICAMP), during FAPESP Week London.

Image: 
André Julião

Current lithium ion battery technology will probably not be able to handle the coming decades' huge demand for energy. It is estimated that by 2050, electricity will make up 50% of the world's energy mix. Today that rate is 18%. But installed capacity for renewable energy production is expected to increase fourfold. This will require batteries that are more efficient, cheaper and environmentally friendly.

One of the alternatives being studied today in many parts of the world is the lithium-air battery. Some of the Brazilian efforts in the search for such a device were presented on Day Two of FAPESP Week London, held February 11-12, 2019.

"There is a lot of talk today about electric cars. Some European countries are also thinking about banning combustion engines. In addition, renewable sources like solar energy need batteries to store what is generated during the day through solar radiation," said Rubens Maciel Filho, a professor at the School of Chemical Engineering of the University of Campinas (UNICAMP).

The lithium-air battery, currently functioning only on a laboratory scale, uses ambient oxygen as a reagent. The battery stores additional energy through an electrochemical reaction that results in the formation of lithium oxide.

"It is a sustainable way to store electrical energy. With advances, it can support numerous discharge/charge cycles. It has great potential for use in transportation, in light and heavy vehicles alike. It can also work in electric power distribution networks," said the researcher.

But turning experiments into commercially viable products involves understanding the fundamentals of the electrochemical reactions that occur in the process.

"It also requires the development of new materials that allow us to leverage desirable reactions and minimize or avoid undesirable ones," said Maciel, director of the New Energy Innovation Center (CINE). With units at UNICAMP, the Nuclear Energy Research Institute (IPEN) and the São Carlos Chemistry Institute at the University of São Paulo (USP), the center is supported by FAPESP and Shell under the scope of the Engineering Research Centers Program (ERC).

He went on to explain that some of the phenomena need to be observed in operando, or in other words, in real time. "The idea is to keep track of the reactions that occur in dynamic experiments and the different chemical species that are formed, even if temporarily. Otherwise, some of the stages in the process get lost and the battery becomes inefficient in terms of charge time and duration of charge."

To conduct these measurements, the researchers are using the National Synchrotron Light Laboratory (LNLS) at the Brazilian Center for Light Research in Energy and Materials (CNPEM), located in Campinas.

Another project presented during the session involved sulfur-air batteries. Despite not being as efficient, they are inexpensive and can store energy for many hours. "They can store energy for up to 24 hours at a very low cost. Their main ingredients are sulfur and caustic soda, which are extremely inexpensive. That is why we are investing in them," said Nigel Brandon, a professor at Imperial College.

Because of these characteristics, sulfur-air batteries can be used in homes or businesses. Brandon believes, however, that their greatest potential is in charging stations for electric cars, which will become much more commonplace due to the European goal of cutting carbon emissions 80% by 2050.

"It is important to underscore the fact that the different battery projects are not competing with each other but rather are complementing each other," said Geoff Rodgers of Brunel University London, session facilitator.

Sun, hydrogen and biofuels

More efficient batteries are particularly important in a scenario in which the use of solar energy is expected to increase. Because solar radiation peaks during the day, energy must be stored efficiently so it can be drawn on at night.

Maciel also talked about a project at CINE to develop more efficient photovoltaic cells that could be used in the future to convert solar energy to electricity, as well as to obtain chemical products, or even hydrogen from water electrolysis.

Liquid hydrogen is a very efficient fuel, but its production entails high energy costs. It is one of the options being considered in the United Kingdom, since biofuels are not as viable there as in Brazil.

"We are looking for new bacterial enzymes for oxidation of lignin, an aromatic polymer that makes up more than 25% of plant cell walls and is part of the residue of biofuel production. The goal is to develop new products such as biofuels, new plastics and chemical products for industry," said Timothy Bugg of the University of Warwick.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

UTMB develops drug to rejuvenate muscle cells

image: Climbing man

Image: 
The University of Texas Medical Branch at Galveston

GALVESTON, Texas - Researchers from The University of Texas Medical Branch at Galveston have developed a promising drug that has proven to significantly increase muscle size, strength and metabolic state in aged mice, according to a study just published in Biochemical Pharmacology.

As we age, our bodies increasingly lose the ability to repair and rebuild degenerating skeletal muscles. Beginning around age 35, muscle mass, strength and function continually decline as we get older. This can dramatically limit the ability of older adults to live fully active and independent lives.

"We identified a protein in muscle stem cells that appears to be responsible for their age-related dysfunction, and then developed a small molecule drug that limits the effects of this protein," said senior author Stanley Watowich, UTMB associate professor in the department of biochemistry and molecular biology. "By resetting muscle stem cells to a more youthful state, we were able to rejuvenate them so that they could more effectively repair muscle tissues."

In the study, aged mice with a muscle injury were treated with either the drug or a placebo. Following seven days of drug treatment, researchers found that the aged mice that received the drug had more functional muscle stem cells that were actively repairing the injured muscle. In the treated group, muscle fiber size doubled, and muscle strength increased by 70 percent, compared with the placebo group. In addition, the blood chemistry of the drug-treated and untreated mice was similar, suggesting no adverse drug effects were occurring.

Adults over 65 are the fastest growing segment of the population in many countries. In the next decade, the U.S. elderly population will increase by 40 percent and the cost of their health care is expected to double, accounting for over half of all U.S. health care spending. Much of this spending will be used to treat health problems related to muscle decline, including hip fractures, falls and heart disease.

"There are no treatments currently available to delay, arrest or reverse age-related muscle degeneration," said senior author Harshini Neelakantan, a UTMB research scientist in the department of biochemistry and molecular biology. "These initial results support the development of an innovative drug treatment that has the potential to help the elderly to become fitter, faster and stronger, thus enabling them to live more active and independent lives as they age."

Credit: 
University of Texas Medical Branch at Galveston

Brain-computer interface, promise of restoring communication discussed at AAAS presentation

Brain-computer interfaces promise to restore communication for individuals with severe speech and physical impairments. Current brain-computer interfaces share many features with high-tech, conventional augmentative and alternative communication systems, but operate via a direct link to the brain. Choosing the "right" brain-computer interface, one that maximizes the reliability of the neural control signal while minimizing fatigue and frustration, is critical.

Jonathan Brumberg, assistant professor of speech-language-hearing at the University of Kansas, will present on this subject and demonstrate a variety of brain-computer interfaces in his talk, "Evolution in Technology to Aid and Restore Communication," at the AAAS Annual Meeting in Washington, D.C.

What: "Talking without Speaking: Overcoming Communication Challenges with Technology," a scientific sessions panel at AAAS.
Who: Jonathan Brumberg, University of Kansas, Lawrence, KS; brumberg@ku.edu
When: 10:00 AM - 11:30 AM Sunday, February 17, 2019
Where: Marriott Wardman Park - Thurgood Marshall Ballroom East, 2660 Woodley Rd NW, Washington, D.C., 20008

Background (panel description): Millions live with developmental or acquired communication disorders that significantly limit their ability to communicate with those around them. People can be left at a loss for words because of disorders such as autism, cerebral palsy, or intellectual disability, as well as acquired disorders such as stroke and brain injury. Augmentative and alternative communication (AAC) helps people overcome communication barriers via a range of high- and low-tech options. No longer simply science fiction, brain-computer interfaces can now be a plausible solution for acquired disorders. Evolving mobile technology has helped to normalize AAC use by making tablet and smartphones central to everyday interaction. However, the attitude that there's an app for everything creates its own problems. First, basic language challenges, such as aphasia and autism, require well-organized interface designs and partner support for successful AAC use. For people with relatively intact cognitive-linguistic skills, barriers include physical access to devices. The recipe for successful communication for people needing AAC requires the right technology as well as an understanding of user abilities and limitations. While possibilities are endless, considerations about the application of technology must always be at the forefront of AAC implementation practice. The session explores these scientific opportunities and pragmatic challenges.

Credit: 
University of Kansas

New study shows hidden genes may underlie autism severity

AURORA, Colo. (Feb. 15, 2019) - Scientists at the University of Colorado Anschutz Medical Campus have implicated a largely hidden part of the human genome in the severity of autism symptoms, a discovery that could lead to new insights into the disorder and eventually to clinical therapies for the condition.

The researchers found that the critical genes lie in a part of the human genome so complex and difficult to study that it has gone unexamined by conventional genome analysis methods.

In this case, the region encodes most copies of the Olduvai (formerly DUF1220) protein domain, a highly duplicated (~300 copies in the human genome) and highly variable coding family that has been implicated in both human brain evolution and cognitive disease.

The researchers, led by James Sikela, PhD, a professor in the Department of Biochemistry and Molecular Genetics at the CU School of Medicine, analyzed the genomes of individuals with autism and showed that, as the number of copies of Olduvai increased, the severity of autism symptoms became worse.
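The reported pattern is a dose-response association: severity rises as Olduvai copy number increases. A rank correlation is one standard way to test such a monotonic trend; the sketch below uses invented copy numbers and severity scores, not the study's data.

```python
# Spearman rank correlation on made-up numbers -- NOT the study's data --
# to illustrate testing a dose-response trend: does severity rise with copies?

def spearman(x, y):
    """Spearman's rho via the no-ties formula: 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    def rank(values):
        ordered = sorted(values)
        return [ordered.index(v) + 1 for v in values]  # assumes distinct values
    d2 = sum((a - b) ** 2 for a, b in zip(rank(x), rank(y)))
    n = len(x)
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

olduvai_copies = [250, 255, 261, 268, 272, 280, 288, 295]  # hypothetical
severity_score = [40, 38, 45, 50, 47, 55, 60, 58]          # hypothetical

print(f"rho = {spearman(olduvai_copies, severity_score):.2f}")  # rho = 0.93
```

A rho near 1 indicates that higher copy numbers consistently rank alongside higher severity, the kind of relationship the Sikela lab reports.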

While the Sikela lab has shown this same trend previously, the discovery has not been pursued by other researchers due to the complexity of the Olduvai family.

"It took us several years to develop accurate methods for studying these sequences, so we fully understand why other groups have not joined in," Sikela said. "We hope that by showing that the link with autism severity holds up in three independent studies, we will prompt other autism researchers to examine this complex family."

In order to provide more evidence that the association with autism severity is real, the Sikela lab used an independent population and developed a different, higher resolution measurement technique. This new method also allowed them to zero in on which members of the large Olduvai family may be driving the link with autism.

Though autism is thought to have a significant genetic component, conventional genetic studies have come up short in efforts to explain this contribution, Sikela said.

"The current study adds further support to the possibility that this lack of success may be because the key contributors to autism involve difficult-to-measure, highly duplicated and highly variable sequences, such as those encoding the Olduvai family, and, as a result, have never been directly measured in other studies of autism," Sikela said.

Credit: 
University of Colorado Anschutz Medical Campus

Exposure to chemical in Roundup increases risk for cancer, says statistical analysis

Exposure to glyphosate — the world’s most widely used, broad-spectrum herbicide and the primary ingredient in the weedkiller Roundup — increases the risk of some cancers by more than 40 percent, according to new research from the University of Washington.
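A risk increase of "more than 40 percent" corresponds to a relative risk above 1.4. As a minimal illustration of how a relative risk is computed from cohort counts (the numbers below are invented for the example, not data from the UW analysis):

```python
# Relative risk (risk ratio) from a hypothetical 2x2 cohort table.
# These counts are illustrative only, not data from the UW study.
exposed_cases, exposed_total = 70, 10_000      # exposed to glyphosate
unexposed_cases, unexposed_total = 50, 10_000  # not exposed

risk_exposed = exposed_cases / exposed_total
risk_unexposed = unexposed_cases / unexposed_total
relative_risk = risk_exposed / risk_unexposed

print(f"RR = {relative_risk:.2f}")  # RR = 1.40, i.e. a 40% increased risk
```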

Various reviews and international assessments have come to different conclusions about whether glyphosate leads to cancer in humans.

Get fit for a fit gut

Bacteria, often synonymous with infection and disease, may have an unfair reputation. Research indicates there are at least as many bacterial cells in our bodies as human cells, and they play an important role in our physiology (1). In fact, a growing body of evidence shows that greater gut microbiota diversity (the number of different species and the evenness of these species' populations) is related to better health. Now, research published in Experimental Physiology has suggested that the efficiency with which we transport oxygen to our tissues (cardiorespiratory fitness) is a far greater predictor of gut microbiota diversity than either body fat percentage or general physical activity.

The findings suggest that exercise at an intensity high enough to improve cardiorespiratory fitness may support health through favourable alterations in the presence, activity and clustering of gut microbes. Such exercise-induced improvements in cardiorespiratory fitness often correspond with central adaptations (e.g. an increased volume of blood pumped by the heart each beat) and peripheral adaptations (e.g. an increased number of capillaries to transport oxygen from blood to muscles).

Before now, it was understood that higher cardiorespiratory fitness tended to coincide with greater gut microbiota diversity, but it was unclear whether this relationship was attributable to body fat percentage or to the physical activity of daily living. Since cancer treatment is known to trigger physiological changes detrimental to cardio-metabolic health, including increased body fat percentage and declining cardiorespiratory fitness, this research was performed on cancer survivors. In total, 37 non-metastatic breast cancer survivors, who had completed treatment at least one year prior, were enrolled.

Participants performed a graded exercise test to estimate peak cardiorespiratory fitness, along with assessments of total energy expenditure and examination of gut microbiota from faecal samples. The results showed that participants with higher cardiorespiratory fitness had significantly greater gut microbiota diversity than less fit participants. Further statistical analyses highlighted that cardiorespiratory fitness accounted for roughly a quarter of the variance in species richness and evenness, independent of body fat percentage.
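Diversity, as used here, combines species richness with evenness. One common way to quantify both at once is the Shannon index; the sketch below uses hypothetical abundance counts, and `shannon_index` is an illustrative helper rather than the study's actual analysis pipeline.

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i), combining richness and evenness."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical abundance counts for bacterial species in two stool samples:
even_community = [25, 25, 25, 25]  # four species, evenly distributed
skewed_community = [85, 5, 5, 5]   # same richness, dominated by one species

print(shannon_index(even_community))   # ln(4) ~ 1.386, the maximum for 4 species
print(shannon_index(skewed_community)) # lower, because evenness is poor
```

Two samples with identical species counts can therefore score quite differently, which is why evenness matters alongside the raw number of species.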

These data offer intriguing insight into the relationship between cardiorespiratory fitness and gut microbiota diversity. However, given the cross-sectional nature of the study design, the research team's findings are correlative in nature. The participant sample was restricted to women with a history of breast cancer, who tended to exhibit low cardiorespiratory fitness and other health problems, meaning generalisation to other groups should be made with caution.

Stephen Carter, lead author of the paper from Indiana University, is enthusiastic about continuing his team's research:

"Our group is actively pursuing an interventional study to determine how variation in exercise intensity can influence gut microbiota diversity under controlled-feeding conditions, to uncover how exercise may affect functional outcomes of gut microbiota, as well as studying how exercise prescription may be optimized to enhance health outcomes among clinical populations."

Credit: 
The Physiological Society

Personal and social factors impact return to work after ill-health

Support from managers and colleagues, as well as a positive attitude, is most likely to enable a lasting return to work for employees after a sickness absence, according to a new review of research led by the University of East Anglia (UEA).

The review evaluated the impact of personal and social factors on sustainable return to work after ill-health due to musculoskeletal disorders, such as joint and back pain, and common mental health conditions, for example stress, depression or anxiety.

It also compared the effects of these factors across the two types of conditions, which are recognised as the most common causes of sickness absence in developed countries.

Personal and social factors were found to play a role in enabling sustainable return to work after ill health. However, sustainable return to work does not appear to be the result of a single factor. Instead, it seems to be influenced by a combination of multiple factors.

Researchers from UEA's Norwich Business School and Uppsala University in Sweden found that the most consistent evidence for achieving sustainable return to work concerned support from line managers or supervisors and from co-workers; employees having a positive attitude and high self-efficacy (their belief in their capability to achieve a goal or outcome); being younger; and having a higher level of education.

The review examined evidence from 79 previous studies conducted between 1989 and 2017. Its findings are published in the Journal of Occupational Rehabilitation.

For the purposes of the review, sustainable return to work was defined as a stable full-time or part-time return to work to either the original or a modified job for a period of at least three months, without relapse or sickness absence re-occurrence.

Lead author Abasiama Etuknwa, a postgraduate researcher at UEA, said: "These findings will help us understand what factors may either bring about or hinder a sustainable return to work. The relationship between the social environment and personal factors like attitudes and self-efficacy appears to impact positively on maintainable return to work outcomes.

"Promoting a culture of support at the workplace is essential, a culture that makes returning workers feel valued, worthy and not necessarily blamed for absence, as the former would improve work attitudes and ease the transition back to work."

The economic cost of sickness absence is growing yearly. Extended sickness absence is associated with reduced probability of return to work, which becomes costly for employers, increasing the urgency to help workers return early.

Co-author Kevin Daniels, professor of organisational behaviour at UEA, said: "To reduce costs related to sickness absence and reduce the risk of long-term disability associated with extended absence from work, there is a big need for a better understanding of the factors that either impede or facilitate a sustainable return to work for staff sick-listed with musculoskeletal and common mental health disorders.

"Previous studies have shown how poor quality jobs can cause ill-health. However, there is also strong evidence that good quality jobs, for example those that enable reasonable work-life balance, allow staff some say in how their work is done and have supportive managers, are an important component for a speedy recovery after ill-health episodes and are generally beneficial for physical and mental health."

Other personal factors identified as impacting return to work included economic status/income, length of sickness absence, and job contract/security. There was no consistent evidence of whether gender affected sustainable return to work.

Social factors also included job crafting - employees redesigning their job task to fit their motives, strengths and passions - and its related practices, such as employee-initiated changes to their job or how work is done.

The authors say the review provides employers and policymakers with knowledge of the key factors that will help with implementing more effective return to work programmes.

"Existing return to work programmes need to encourage supportive interactions between leaders and co-workers and returning workers during the process, especially as this could have a direct effect on sustainable return to work, as well as an indirect effect through enhanced returners' attitudes towards work and self-efficacy," said Miss Etuknwa.

"Although return to work takes place within a complex system involving employing organizations and the healthcare system, given the consistent evidence of the role line managers play, we recommend that policymakers consider ways to provide guidance for employers."

This guidance could: outline the supportive role of line managers and other key workplace professionals, for example human resources professionals and occupational health providers, during the return to work process; train these professionals on the return to work process and how to effectively manage and support returning workers; and outline ways to assist line managers in providing necessary support.

Credit: 
University of East Anglia