Pre-life building blocks spontaneously align in evolutionary experiment

image: A detail from a mural on the origin of life celebrates famous experimental milestones in the science that seeks to explain how chemicals evolved into the first building blocks of life on a lifeless early Earth. The NSF Center for Chemical Evolution, headquartered at Georgia Tech, has adopted the mural as a symbol.

Image: 
Painted by Christine He and David Fialho for Georgia Tech

When Earth was a lifeless planet about 4 billion years ago, chemical components came together in tiny molecular chains that would later evolve into proteins, crucial life building blocks. A new study has shown how fortuitously some early predecessors of protein may have fallen into line.

In the laboratory, under conditions mimicking those on pre-life Earth, a small selection of amino acids linked up spontaneously into neat segments in a way that surprised researchers at the Georgia Institute of Technology. They had given these amino acids found in proteins today some stiff competition by adding amino acids not found in proteins, thinking these non-protein, or non-biological, amino acids would probably not allow predominantly biological segments to come together.

The non-biological amino acids had the potential to react chemically as well as or better than the biological ones and to become frequent members of the tiny chains, perhaps serving as an in-between step in the greater evolution toward proteins. The experiment dashed those expectations, but to the researchers' delight: the reactions produced mostly strings that were closer to today's actual proteins, and fewer chains that included non-biological amino acids.

"The non-biological amino acids were being excluded to some extent," said Nick Hud, one of the study's principal investigators and a Regents Professor in Georgia Tech's School of Chemistry and Biochemistry.

Doorway to evolution

In particular, the researchers had thought the non-biological amino acids would outcompete the biological amino acid lysine, but that was not the case. They also thought lysine would often not fit neatly into the chains the way it does in proteins. The reaction surprised them again.

"Lysine went into the chains predominantly in the way that it is connected in proteins today," said Hud, who is also director of the National Science Foundation/NASA Center for Chemical Evolution (CCE), which is headquartered at Georgia Tech and explores the chemistry that may have paved the way to first life.

The research team, which included collaborators from The Scripps Research Institute, published their results in the journal Proceedings of the National Academy of Sciences on July 29, 2019. The research was funded by the NSF and NASA.

The study's experiment points to chemical evolution having prefabricated some amino acid chains useful in living systems before life had evolved a way to make proteins. The preference for incorporating the biological amino acids over their non-biological counterparts also adds to possible explanations for why life selected just 20 amino acids from the roughly 500 that occurred naturally on the Hadean Earth.

"Our idea is that life started with the many building blocks that were there and selected a subset of them, but we don't know how much was selected on the basis of pure chemistry or how much biological processes did the selecting. Looking at this study, it appears today's biology may reflect these early prebiotic chemical reactions more than we had thought," said Loren Williams, another principal investigator in the study and a professor in Georgia Tech's School of Chemistry and Biochemistry.

Mono, oligo, poly

To help understand the study's significance, let's look at how proteins form, then at the study's core experiment, which revealed an unexpectedly high preference for bonds between sites called alpha-amines (α-amines) on the biological amino acids. Those bonds gave resulting molecular segments a protein-like shape in the lab.

In a protein, one amino acid is a single chemical unit, or monomer. A few of them linked together is called an oligomer, and really long chains are polymers. In proteins, the polymer is called a polypeptide -- named after the peptide bonds that link its monomers together.

Polypeptides are long chains that often form helices, like old phone cords, or flat sheets. They kink and fold into specific, mostly functional three-dimensional shapes, which are called proteins. The study looked at how amino acid monomers linked up to make interesting oligomers that resemble small pieces of proteins.

Hadean Eon experiment

Late in the Hadean Eon, Earth's earliest phase, when prebiotic chemistry was taking shape, the planet's surface was awash in volcanism and rain, and large meteors pummeled it with new chemicals. The researchers' experimental lab setup reflected relatively mild conditions for the era and ingredients that could feasibly have been present.

First author Moran Frenkel-Pinter placed the biological amino acids lysine, arginine, and histidine together with three non-biological competitors in water containing hydroxy acids. Hydroxy acids are known to facilitate amino acid reactions and would have been common on prebiotic Earth.

The mixture was heated to 85 degrees Celsius, pushing the reaction and evaporating the water, and the researchers analyzed the products formed.

"We found this high preference for the inclusion of these biological amino acids and the linkage via the α-amine," said Frenkel-Pinter, a NASA postdoctoral researcher in the CCE.

Amine groups are made of nitrogen and hydrogen and are quite reactive. The α-amine is part of the core of an amino acid, while the other amines in this experiment sat at the end of a sidechain extending off the core; such sidechain amines are often the more reactive of the two.

"It surprised us that this chemistry favored the α-amine connection found in proteins, even though chemical principles might have led us to believe that the non-protein connection would be favored," Frenkel-Pinter said. "The preference for the protein-like linkage over non-protein was about seven to one."

Easy chemical evolution

Most of the resulting oligomers had the evenly spaced links in the chain that are used in life, as opposed to non-α-amine-bonded oligomers, which formed more irregular chains.

The finished products were mostly depsipeptides, which the CCE previously established as stepping stone products in an easy, reliable pathway to peptides.

In another reflection of life chemistry, the abiotic depsipeptide transition to peptides is the same basic reaction (ester-amide) carried out by ribosomes, the cellular machines that make proteins today.

Surprise reactions, in which potential pre-life chemistry casually falls into place, have happened often in the CCE's research. They have shored up the center's core hypothesis that most biological polymers formed in wet and dry cycles, perhaps on rain-swept dirt flats or lakeshore rocks regularly baked by the sun's heat.

Despite its grounded simplicity, the premise of everyday wet-dry cycles being key to the origin of life is unconventional, challenging a more established narrative that improbable concurrences of cataclysms and multiple ingredients were necessary to produce life's early materials in rare and volatile events.

Credit: 
Georgia Institute of Technology

Human genetic diversity of South America reveals complex history of Amazonia

image: The vast cultural and linguistic diversity of Latin American countries is still far from being fully represented by genetic surveys. A new study explores the genetic roots of 26 populations from diverse regions and cultures of western South America and Mexico. The results reveal long-distance connections between speakers of the same language, and new traces of genetic diversity within Amazonia.

Image: 
Chiara Barbieri and Rodrigo Barquera

The vast cultural and linguistic diversity of Latin American countries is still far from being fully represented by genetic surveys. Western South America in particular holds a key role in the history of the continent due to the presence of three major ecogeographic domains (the Andes, the Amazonia, and the Pacific Coast), and for hosting the earliest and largest complex societies. A new study in Molecular Biology and Evolution by an international team led by scholars from the Max Planck Institute for the Science of Human History and from the University of Zurich reveals signatures of history, ecology and cultural diversity in the genetic makeup of living rural populations.

A thorough study, a collaboration between scientists and institutes from Europe, the USA, Mexico, Ecuador, Colombia and Peru, including geneticists, linguists and anthropologists, has shed new light on the population history of South America. The study, which involved working closely with local populations in numerous regions, was published in Molecular Biology and Evolution. The results confirmed the impact of large, complex societies already known from archaeological evidence, but also revealed previously unknown migrations and connections across vast distances, including in Amazonia, an area that has not been as deeply studied archaeologically.

Genetic studies have played a fundamental role in understanding the population history of the American continent. By reconciling genetic evidence with the archaeological record and with paleoclimatic data, scientists have been able to pinpoint the time and scale of the earliest migrations, possible routes through the continent, the subsequent formation of population structure, and preferential routes of population migration and contact. Yet the picture is necessarily superficial because of the lack of representative data from all the diverse regions of the continent. One recurrent simplification relies on the contrast between the Andes, site of the famous large complex societies of the Wari, Tiahuanaco and Inca who built a vast network of roads, and Amazonia, where people apparently have been living in small isolated groups. The Pacific Coast, a key player in the earliest migration routes and the theatre of other large-scale societies (like the Moche and Chimú, among others), is not properly incorporated in this traditional model.

"We wanted to bring attention to the fine-grained, complex events taking place during the pre-colonial and post-European contact times. We therefore visited diverse regions of South America, collecting new samples from rural populations with different cultural backgrounds. In our analysis, we focused on signals of contact and shared ancestry, trying to find exceptions to the current paradigm on the diversity of the continent," explains Chiara Barbieri, a geneticist from the Max Planck Institute for the Science of Human History in Jena now working at the University of Zurich, and lead author of the study.

"On a continental scale, one of the major findings is the presence of a distinct ancestry component in Amazonia, present at high frequency in populations from Ecuador and Colombia, near the Eastern Andes slope. This genetic component, previously undetected, might have started to differentiate from other ancestry components (for example the one frequently found in Amazonia and the one found on the Coast and in the Andes) more than 4,000 years ago. This has implications for understanding the early migrations and structure of the continent, and suggests that human diversity in Amazonia is greater than we thought," adds Barbieri.

On a local scale, the high-resolution genomic data generated for this study was able to distinguish fine-grained cases of genetic exchange that correspond to population contact. These exchanges connect populations that are separated by hundreds of kilometers and live in different ecogeographic domains. "Connections have been found between speakers of Quechua (a widely spoken Andean language) through the Andes and part of Amazonia. Also, some populations of Loreto and San Martín (Amazonian regions of Peru) are genetically very close to speakers of Cocama (an Amazonian language) in Colombia. These genetic signatures suggest that here languages were diffused by the movement of actual people rather than by cultural diffusion alone," explain José Sandoval and Ricardo Fujita of the University of San Martín de Porres in Lima, Peru, who coauthored the study.

Finally, the genetic analyses show demographic traces of large populations that correspond to ancient complex societies both in the Andes and on the North Coast of Peru. Signatures of relatively large populations are also found in a few populations of Amazonia, suggesting the possibility of complex interactions in this region, where archaeological excavations are less numerous and relevant findings are therefore less likely to have been uncovered. Lars Fehren-Schmitz, a biological anthropologist from the University of California, Santa Cruz, who contributed to the study, concludes, "Taken all together, these findings bring attention to the diversity and complexity of people from Amazonia and their interactions with neighboring regions of the Andes. The genetic inheritance of South American people still bears traces of the important events that took place before the historical record in colonial times."

Credit: 
Max Planck Institute of Geoanthropology

How roads can help cool sizzling cities

image: A permeable concrete specimen.

Image: 
Hao Wang/Rutgers University-New Brunswick

Special permeable concrete pavement can help reduce the "urban heat island effect" that causes cities to sizzle in the summer, according to a Rutgers-led team of engineers.

Their study appears in the Journal of Cleaner Production.

Impermeable pavement made of concrete or asphalt covers more than 30 percent of most urban areas and can exceed 140 degrees Fahrenheit in the summertime. It heats both the air, posing risks to human health, and surface runoff, threatening aquatic life.

In cities with 1 million or more people, average air temperatures can be 1.8 to 5.4 degrees Fahrenheit higher than in less densely populated areas. The difference can be up to 22 degrees at night. The heat can increase peak demand for energy in the summertime, air conditioning costs, air pollution and greenhouse gas emissions, heat-related illness and deaths, and water pollution, according to the U.S. Environmental Protection Agency.

The engineering team at Rutgers developed designs for permeable concrete that is highly effective in handling heat. Permeable pavement contains large connected pores, allowing water to drain through and reducing pavement temperature. Water in pores will also evaporate, reducing pavement surface temperature. Moreover, permeable concrete pavement does a better job reflecting heat than asphalt pavement.

The study found that permeable concrete pavement gives off slightly more heat on sunny days compared with conventional concrete pavement, but 25 to 30 percent less heat on days after rainfall. The engineers improved the design of permeable concrete with high thermal conductivity - meaning it can transfer heat more quickly to the ground - further reducing heat output by 2.5 percent to 5.2 percent.

"Highly efficient permeable concrete pavement can be a valuable, cost-effective solution in cities to mitigate the urban heat island effect, while benefitting stormwater management and improving water quality," said corresponding author Hao Wang, an associate professor in the Department of Civil and Environmental Engineering in the School of Engineering at Rutgers University-New Brunswick. He is also an affiliated researcher at the Center for Advanced Infrastructure and Transportation.

Incorporating industry byproducts and waste into permeable concrete can increase its economic and environmental benefits. In another study in the Journal of Cleaner Production, Wang's team designed permeable concrete with fly ash and steel slag to reduce the costs, energy consumption and greenhouse gas emissions linked to raw materials.

Previously, permeable pavement has been used as green infrastructure to reduce stormwater runoff and flooding risk in urban areas. Today, permeable concrete is mainly used in lightly trafficked areas, such as sidewalks, parking lots and rest areas. The Rutgers-led team is studying how to make permeable concrete stronger and more durable so it can be used in urban streets.

Credit: 
Rutgers University

Promising new solar-powered path to hydrogen fuel production

image: Enzymatic synthesis of supported CdS quantum dot/reduced graphene oxide photocatalysts (Steven McIntosh et al.)

Image: 
Courtesy of Green Chemistry

Engineers at Lehigh University are the first to utilize a single enzyme biomineralization process to create a catalyst that uses the energy of captured sunlight to split water molecules to produce hydrogen. The synthesis process is performed at room temperature and under ambient pressure, overcoming the sustainability and scalability challenges of previously reported methods.

Solar-driven water splitting is a promising route towards a renewable energy-based economy. The generated hydrogen could serve as both a transportation fuel and a critical chemical feedstock for fertilizer and chemical production. Both of these sectors currently contribute a large fraction of total greenhouse gas emissions.

One of the challenges to realizing the promise of solar-driven energy production is that, while the required water is an abundant resource, previously-explored methods utilize complex routes that require environmentally-damaging solvents and massive amounts of energy to produce at large scale. The expense and harm to the environment have made these methods unworkable as a long-term solution.

Now a team of engineers at Lehigh University has harnessed a biomineralization approach to synthesize both quantum confined metal sulfide nanoparticles and the reduced graphene oxide material that supports them, creating a photocatalyst that splits water to form hydrogen. The team reported its results in an article entitled "Enzymatic synthesis of supported CdS quantum dot/reduced graphene oxide photocatalysts," featured on the cover of the August 7th issue of Green Chemistry, a journal of the Royal Society of Chemistry.

The paper's authors include Steven McIntosh, a professor in Lehigh's Department of Chemical and Biomolecular Engineering; Leah C. Spangler, a former Ph.D. student; John D. Sakizadeh, a current Ph.D. student; Christopher J. Kiely, Harold B. Chambers Senior Professor in Lehigh's Department of Materials Science and Engineering; and Joseph P. Cline, a Ph.D. student working with Kiely.

"Our water-based process represents a scalable green route for the production of this promising photocatalyst technology," said McIntosh, who is also Associate Director of Lehigh's Institute for Functional Materials and Devices.

Over the past several years, McIntosh's group has developed a single enzyme approach for biomineralization, the process by which living organisms produce minerals, to create size-controlled, quantum confined metal sulfide nanocrystals. In a previous collaboration with Kiely, the lab successfully demonstrated the first precisely controlled, biological way to manufacture quantum dots. Their one-step method began with engineered bacterial cells in a simple, aqueous solution and ended with functional semiconducting nanoparticles, all without resorting to high temperatures and toxic chemicals. The method was featured in a New York Times article: "How a Mysterious Bacteria Almost Gave You a Better TV."

"Other groups have experimented with biomineralization for chemical synthesis of nanomaterials," says Spangler, lead author and currently a Postdoctoral Research Fellow at Princeton University. "The challenge has been achieving control over the properties of the materials such as particle size and crystallinity so that the resulting material can be used in energy applications."

McIntosh describes how Spangler was able to tune the group's established biomineralization process to not only synthesize the cadmium sulfide nanoparticles but also to reduce graphene oxide to the more conductive reduced graphene oxide form.

"She was then able to bind the two components together to create a more efficient photocatalyst consisting of the nanoparticles supported on the reduced graphene oxide," says McIntosh. "Thus her hard work and resulting discovery enabled both critical components for the photocatalyst to be synthesized in a green manner."

The team's work demonstrates the utility of biomineralization to realize benign synthesis of functional materials for use in the energy sector.

"Industry may consider implementation of such novel synthesis routes at scale," adds Kiely. "Other scientists may also be able to utilize the concepts in this work to create other materials of critical technological importance."

McIntosh emphasizes the potential of this promising new method as "a green route, to a green energy source, using abundant resources."

"It is critical to recognize that any practical solution to the greening of our energy sector will have to be implemented at enormous scale to have any substantial impact," he adds.

Credit: 
Lehigh University

Doing more with less: Flexible, reduced-load jobs a win-win for workers, employers

WEST LAFAYETTE, Ind. - With the unemployment rate hovering near a 50-year low, attracting and retaining the best and brightest employees is a challenge for companies.

The challenge is made more difficult as more workers reach retirement age, birth rates decline, and fewer replacement workers are available to fill job openings.

One emerging answer: Get rid of unproductive "busy work" and commit to learning how to effectively design and implement flexible workload arrangements for interested high-potential employees on a career path.

Other countries, such as the Netherlands, have been trying it for years and seeing success, as well as increased gender equality between men and women after the birth of a child, says a Purdue University work-life balance expert.

Reduced-load work is finally gaining a little more traction in the United States, says Ellen Ernst Kossek, the Basil S. Turner Professor at Purdue's Krannert School of Management and research director of Purdue's Susan Bulkeley Butler Center for Leadership Excellence.

Reduced-load work is a flexible form of part-time work in which an individual works with a manager to customize their job, reducing the workload while still progressing in a career. For instance, a corporate sales manager with a portfolio of 10 products might keep eight, taking a 20% pay cut, and choose to work longer hours four days a week or fewer hours across the traditional five-day work week.

The supervisor and employee work together to determine the duties that add the most value to the organization and then cut low-value tasks or identify tasks that others can cover. Or they can identify legacy tasks that aren't adding much value and should go away.

"We need to jettison our old conceptions of part-time work being low pay or a career dead end, or something relegating one to a secondary mommy or daddy track," Kossek said. "Given that many salaried professional jobs today have ballooned to be up to 50 or 60 hours a week, reduced-load work is desperately needed as a temporary or ongoing career option for workers at all career ages and stages."

The Department of Labor considers full time for hourly paid jobs to be at least 30 hours a week.

"A reduced workload can enable sustainable careers for managers and other professionals," Kossek said. "Yet we see not only hesitancy from organizations to implement it, but also implementation hurdles such as insufficient workload reductions fitting the pay cut and stalled careers often adversely affecting women and caregivers.

"One reason there is hesitancy is that managers sometimes think having someone work less than full-time hours means they are getting less work done. But the reality is that many reduced-load workers work more intently and often get as much done as a full-time worker. This is because they enjoy the opportunity to have an interesting job while still being able to work flexibly in a way that leaves time for other life interests - from continuing education to caregiving to community involvement."

Another barrier to implementing reduced-load work is that many current accounting systems use standardized headcount for labor costing. This approach can inadvertently penalize managers who experiment with workload redesign, because many firms make it hard to hire additional staff as a traditional tactic for controlling overhead costs. Employers instead need to move away from counting bodies toward full-time equivalent costing.
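The difference between the two costing approaches comes down to simple arithmetic. A minimal sketch follows; the team, salary figure, and function names are hypothetical illustrations, not data from the study:

```python
# Headcount vs. full-time-equivalent (FTE) labor costing.
# All names and figures here are hypothetical illustrations.

def headcount_cost(staff, overhead_per_head):
    """Headcount costing: each person counts as one head,
    no matter what fraction of a full load they carry."""
    return len(staff) * overhead_per_head

def fte_cost(staff, overhead_per_fte):
    """FTE costing: overhead scales with each person's load,
    expressed as a percent of a full load to keep the math exact."""
    total_load_pct = sum(load_pct for _, load_pct in staff)
    return total_load_pct * overhead_per_fte // 100

# A four-person team in which two members took an 80% reduced load.
team = [("A", 100), ("B", 100), ("C", 80), ("D", 80)]

print(headcount_cost(team, 100_000))  # 400000: reduced loads cost as much as full ones
print(fte_cost(team, 100_000))        # 360000: 3.6 FTEs reflect the actual workload
```

Under headcount costing, a manager who splits one full load between two reduced-load workers looks twice as expensive; FTE costing makes the same arrangement cost-neutral.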

Kossek says it is important to identify which tasks can be integrated with other workers' roles, from current staff to interns or trainees, and which tasks need to be differentiated as unique as part of the reduced-load redesign strategy.

"This enables managers to have more flexibility in organizing tasks and not be penalized for learning how to reallocate workloads or hours for individual talented workers," Kossek said. "Doing so can also develop other team members' knowledge and provide better backup for clients.

"In sum, the concept of reduced-load work requires a new approach to assigning job tasks and a way of thinking about how to manage work and careers. For those employers who consider such a prospect, the rewards can be great in productivity and company morale."

The Purdue work appears in a June edition of the Journal of Vocational Behavior. The study was partially funded by the Alfred P. Sloan Foundation.

Kossek examined reduced-load work by interviewing nearly 100 managers, human resources experts and executives at 20 leading North American employers across five industries that have been early adopters of reduced workloads. She identified work redesign tactics that either reduced or reshuffled workloads.

"Reduced workload work is important to study because it is one of the few flexible work forms that prompt organizations to quantify the parameters of a full-time load and experiment with tactics to actually reduce workloads," Kossek said. "Reduced workload work matters for individuals and societies because being able to reduce one's workload while staying on a career track is an important asset to build careers that are sustainable."

Kossek says reduced workload also keeps more older individuals in the workforce and not seeking retirement as soon or dropping out to care for elderly parents. It also may help keep mothers or fathers of young children or children with special needs in the workforce rather than taking very long leaves of absence or quitting altogether. Reduced-load work can foster retention and prevent burnout. Many reduced-load workers can shift back to full time later in their careers.

Kossek and her team developed a three-stage process of collaborative crafting of reduced-load work: exploration, implementation and career embedding. The third stage involves troubleshooting issues to ensure that the arrangement continues to work well over time.

Kossek has received worldwide attention for her research on work-life balance and has worked with the Purdue Research Foundation on some of her studies.

Credit: 
Purdue University

Scientists crack the code to improve stress tolerance in plants

In any eukaryotic organism, the DNA in a cell exists not as a loose strand, but as a highly condensed complex of DNA and proteins known as histones. This condensed structure is referred to as chromatin, and the packaging is important for maintaining the integrity of DNA structure and sequence. However, because chromatin restricts the topology of DNA, modification of chromatin (via modification of histones) is an important form of gene regulation, referred to as epigenetic regulation. Now, a group of scientists, led by Prof Sachihiro Matsunaga from Tokyo University of Science, has uncovered a novel epigenetic regulation mechanism, at the center of which lies a histone demethylase enzyme called lysine-specific demethylase 1-like 1 (LDL1). Prof Matsunaga states, "The novel mechanism of epigenetic regulation that we found is related to DNA damage repair in plants, and we believe that it has a lot of real-world applications." This study is published in Plant Physiology.

An organism's genome is constantly subjected to various stresses that cause instabilities or errors, resulting in damage, or "breaks," in its sequences. These breaks are repaired autonomously by a process called homologous recombination (HR), and thus HR is essential for maintaining the stability of a genome. As with other genetic regulatory processes, the chromatin structure needs to be modified for HR to occur smoothly. Prof Matsunaga and team had previously found that a conserved protein called RAD54 is involved in chromatin remodeling in the model plant Arabidopsis and thus aids genomic stability and the response to DNA damage. However, both the recruitment of RAD54 to the site of HR and its proper dissociation from that site are important for it to exert its effects. When asked about the motivation for this study, Prof Matsunaga candidly states, "Our previous study identified that RAD54 aids HR, but the mechanisms of recruitment and dissociation were poorly understood. Our new study attempts to shed light on these mechanisms."

Using techniques such as co-immunoprecipitation and mass spectrometry, the scientists first identified and shortlisted proteins that interact with RAD54 and regulate its dynamics with chromatin during HR-based DNA damage repair in Arabidopsis. They then identified, for the first time, that the histone demethylase LDL1 interacts with RAD54 at DNA damage sites. They found that RAD54 specifically interacts with the dimethylated fourth lysine of one of the four core histones in chromatin, H3 (a mark known as H3K4me2). The scientists then found that LDL1 suppresses this interaction by demethylating H3K4me2. They concluded that LDL1 removes excess RAD54 from DNA damage sites via the demethylation of H3K4me2 and thus promotes HR repair in Arabidopsis. In this way, LDL1 ensures proper dissociation of RAD54 from the HR repair site in the DNA.

Hailing this exciting result, Prof Matsunaga says, "This finding is an important addition to plant science as well as basic molecular biology. This is an extension of our previous research that showed that RAD54 accumulated in damaged sites in Arabidopsis and that excessive RAD54 suppresses damage repair, which could be dangerous to the plant. Our new study shows that LDL1 aids and improves DNA damage repair by removing RAD54 from the damaged site."

So, why are the findings of this study so important? Prof Matsunaga explains this, too. "Unlike animals, plants are stationary and, therefore, more vulnerable to environmental stresses such as high temperatures, dryness, pathogens, parasites, and poor soil conditions," says Prof Matsunaga, "and these stresses suppress the development and growth of plants by causing DNA damage. Therefore, an efficient DNA damage response is crucial to ensure optimum growth and survival of plants. Our study reveals one possible epigenetic regulation mechanism that can improve the DNA damage response in plants."

Finally, Prof Matsunaga touches upon the most important application of this group's research. "Plants can be treated with LDL1 to artificially control epigenetic modification so that they become more tolerant to stresses such as infections, environmental stresses and mechanical stress," says Prof Matsunaga. "We think that this will be useful in creating resistant varieties of crop plants with improved growth and longevity and better characteristics, thus contributing to global food security."

Credit: 
Tokyo University of Science

Species aren't adapting fast enough to cope with climate change, according to new study

image: The study focused on abundant bird species such as the common magpie and the European pied flycatcher, which are known to have developed adaptations to climate change. The study also included data on turtles, contributed by Fredric Janzen, and some mammal species.

Image: 
Leibniz Institute for Zoo and Wildlife Research

AMES, Iowa - Climate change is outpacing the ability of birds and other species to adapt to their changing environment.

That's the conclusion made by an international team of scientists that includes an Iowa State University biologist. The team's recent paper in the academic journal Nature Communications evaluated more than 10,000 published scientific studies. The analysis concludes that animals can respond to climate change, but those responses generally don't allow species to cope with the rapid pace of rising temperatures. The research team included 64 researchers and was led by Viktoriia Radchuk, Alexandre Courtiol and Stephanie Kramer-Schadt from the Leibniz Institute for Zoo and Wildlife Research in Berlin, Germany.

The researchers analyzed abundant bird species such as the common magpie and the European pied flycatcher, which are known to have developed adaptations to climate change. Fredric Janzen, an ISU professor of ecology, evolution and organismal biology, contributed data on turtles to the study.

"The big picture is that climate is already changing. We know this," Janzen said. "We also know that lots of organisms are responding to changing climatic conditions. What we found out is that, while these species are adapting, it's just not happening fast enough."

The international research team identified relevant data from the scientific literature to relate changes in climate over the years to possible changes in traits possessed by the species included in the study. The team then evaluated whether observed trait changes were associated with desired outcomes, such as higher survival rates or increased number of offspring.

Species respond to climate change by shifting the timing of important biological processes, such as hibernation, reproduction and migration. The study found that these shifts in phenological traits occurred more commonly in temperate regions, where biological processes have moved to earlier dates than in the past. Species also can experience changes in morphological traits, such as body size and mass. But the study found no systematic pattern to explain how climate change affects morphological traits.

The researchers compared the rate of climate change responses observed in the scientific literature with a rate modeled to reflect how traits would have to shift to track climate change precisely. This comparison found populations undergoing adaptive change aren't adapting fast enough to guarantee long-term persistence.
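The logic of that comparison can be sketched numerically. The rates below are purely illustrative stand-ins, not figures from the paper:

```python
# Illustrative rates only -- not values from the study.
# Units: days of phenological shift (e.g., earlier egg-laying) per decade.
observed_rate = 2.0   # how fast the trait is actually shifting
required_rate = 3.5   # shift needed to track the local pace of warming

# A positive deficit means the population lags behind climate change,
# which is the overall pattern the analysis reports.
adaptation_deficit = required_rate - observed_rate  # 1.5 days per decade
```

Even a population that is genuinely adapting shows a positive deficit whenever its observed rate falls short of the required one, which is why the study can report adaptive change and insufficient adaptation at the same time.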

Janzen's lab has studied turtles on the Mississippi River for decades. He said his research has found the same general patterns in painted turtles noted in the new study, but the longevity enjoyed by the species can mask those trends. Because painted turtles can live for decades, it may seem like their populations are successfully adapting to altered environmental conditions brought on by climate change. But climate change may cause breeding declines in turtle populations that could threaten their existence years hence.

"Individual turtles live such a long time, it's possible we'll have populations that are functionally extinct because they won't be able to produce sufficient offspring to replenish themselves in the long term," Janzen said.

Credit: 
Iowa State University

New approach could make HVAC heat exchangers five times better

image: Turbulent heat exchangers use fluid dynamics to move heat. Plumes of warmer water rise from a hot bottom surface to a cold upper surface, where they lose their heat and sink back to the bottom. A new study shows that adding a small amount of a liquid additive to a water-based heat exchanger can speed the motion of the plumes and increase heat exchange capacity.

Image: 
Chao Sun

PROVIDENCE, RI [Brown University] -- Researchers from Tsinghua University and Brown University have discovered a simple way to give a major boost to turbulent heat exchange, a method of heat transport widely used in heating, ventilation and air conditioning (HVAC) systems.

In a paper published in Nature Communications, the researchers show that adding a readily available organic solvent to common water-based turbulent heat exchange systems can boost their capacity to move heat by 500%. That's far better than other methods aimed at increasing heat transfer, the researchers say.

"Other methods for increasing heat flux -- nanoparticle additives or other techniques -- have achieved at best about 50% improvement," said Varghese Mathai, a postdoctoral researcher at Brown and co-first author of the study, who worked with Chao Sun, a professor at Tsinghua who conceived of the idea. "What we achieve here is 10 times more improvement than other methods, which is really quite exciting."
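As a back-of-envelope check of the comparison Mathai draws (normalized numbers, not data from the study):

```python
baseline = 1.0                      # normalized heat flux for plain water
hfe_flux = baseline * (1 + 5.00)    # "up to 500%" enhancement -> 6x the flux
nano_flux = baseline * (1 + 0.50)   # "about 50%" from nanoparticle additives

ratio_of_improvements = 5.00 / 0.50  # 10x, the factor Mathai cites
```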

Turbulent heat exchangers are fairly simple devices that use the natural movements of liquid to move heat. They consist of a hot surface, a cold surface and a tank of liquid in between. Near the hot surface, the liquid heats up, becomes less dense and forms warm plumes that rise toward the cold side. There, the liquid loses its heat, becomes denser and forms cold plumes that sink back down toward the hot side. The cycling of water serves to regulate the temperatures of each surface. This type of heat exchange is a staple of modern HVAC systems widely used in home heaters and air conditioning units, the researchers say.

In 2015, Sun had the idea to use an organic component known as hydrofluoroether or HFE to speed the cycling of heat inside this kind of exchanger. HFE is sometimes used as the sole fluid in heat exchangers, but Sun suspected that it might have more interesting properties as an additive in water-based systems. Working with the study's co-first author Ziqi Wang, Mathai and Sun experimented with adding small amounts of HFE and, after three years of work, were able to maximize its effectiveness in speeding heat exchange. The team showed that concentrations of around 1% HFE created dramatic heat flux enhancements up to 500%.

Using high-speed imaging and laser diagnostic techniques, the researchers were able to show how the HFE enhancement works. When near the hot side of the exchanger, the globules of HFE quickly boil, forming biphasic bubbles of vapor and liquid that rise rapidly toward the cold plate above. At the cold plate, the bubbles lose their heat and descend as liquid. The bubbles affect the overall heat flux in two ways, the researchers showed. The bubbles themselves carry a significant amount of heat away from the hot side, but they also increase the speed of the surrounding water plumes rising and falling.

"This basically stirs up the system and makes the plumes move faster," Sun said. "Combined with the heat that the bubbles themselves carry, we get a dramatic improvement in heat transfer."

That stirring action could have other applications as well, the researchers say. It could be useful in systems designed to mix two or more liquids. The extra stir makes for faster and more complete mixing.

The researchers pointed out that the specific additive they used -- HFE7000 -- is non-corrosive, non-flammable and ozone friendly. One limitation is that the approach only works on vertical heat exchange systems -- ones that move heat from a lower plate to an upper one. It doesn't currently work on side-to-side systems, though the researchers are considering ways to adapt the technique. Still, vertical exchangers are widely used, and this study has shown a simple way to improve them dramatically.

"This biphasic approach generates a very large increase in heat flux with minimal modifications to existing heating and cooling systems," Mathai said. "We think this has great promise to revolutionize heat exchange in HVAC and other large-scale applications."

Credit: 
Brown University

System to image the human eye corrects for chromatic aberrations

image: These are images of the smallest cone photoreceptors in the retina, about 2 microns wide. Coloration was added to represent the different wavelengths of light used to capture the images after compensating for chromatic aberration.

Image: 
Xiaoyun Jiang, Ramkumar Sabesan, University of Washington

WASHINGTON -- Researchers report a new imaging system that cancels the chromatic optical aberrations present in a specific person's eye, allowing for a more accurate assessment of vision and eye health. By taking pictures of the eye's smallest light-sensing cells with multiple wavelengths, the system also provides the first objective measurement of longitudinal chromatic aberrations (LCA), which could lead to new insights on their relationship to visual halos, glare and color perception.

In Optica, The Optical Society's journal for high-impact research, the researchers, from the University of Washington, Seattle, U.S.A., say the technology can be readily deployed in the clinic, where it could be particularly useful for assessing eye changes associated with aging and can also help inform the design of new multifocal lenses by accounting for chromatic aberrations in the lenses themselves. For vision research, the technique could advance studies of color blindness and how different people perceive color.

"The previous methods of compensating the eye's native LCA rely on population average estimates, without individualized correction on a person-by-person basis," said research team leader, Ramkumar Sabesan. "We demonstrate a modified filter-based Badal optometer that offers the capability to tune LCA across different wavelength bands and for each individual in a customized fashion."

The researchers report incorporating a new optical assembly into conventional adaptive optics instruments to produce individually tailored high-resolution, multiple-wavelength pictures of the smallest cone photoreceptors in the eye, measuring about 2 microns across.

"Our study establishes a flexible tool to compensate for chromatic aberration in different wavelength bands and in an individualized manner, thus facilitating future investigations into how we see color in our environment, unimpeded by the native chromatic imperfections of the individual," said Sabesan. "Now equipped with the tools to control chromatic aberration, we plan to conduct studies on normal and deficient color vision."

Compensating for aberrations

Like manufactured optical elements such as microscopes and camera lenses, the cornea and lens of the eyeball contain optical aberrations that distort the image formed on the retina. Aberrations blur the images projected on a person's retina, degrading their vision. They also affect the images doctors obtain when viewing the inside of the eye with ophthalmologic instruments.

Adaptive optics is a method to compensate for these aberrations. Adaptive optics technology, currently utilized by astronomers to address aberrations that occur when viewing space through Earth's atmosphere, has been incorporated into eye imaging tools. However, while current instruments are effective at correcting for monochromatic aberrations (those that do not change depending on the wavelength of light being applied), chromatic aberrations (those that are affected by wavelength) are more challenging.

To get around this problem, today's instruments use assumptions about the aberrations expected in an average or "typical" eye, rather than information about the actual aberrations in a specific person's eye. While this is sufficient for many applications, it is less suited for other applications that demand the simultaneous and fine focus control of multiple wavelengths.

To overcome this limitation, the researchers used a device known as a Badal optometer, which consists of a pair of lenses that are a certain distance apart. Changing the distance between the two lenses changes the focus without altering the size of an image viewed through the lenses.
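The relation behind this is textbook optics rather than something stated in the article: displacing the target (or stage) of a Badal lens of focal length f by a distance x changes the defocus at the eye by x/f² diopters, while the retinal image size stays fixed. A minimal sketch:

```python
def badal_defocus_diopters(shift_m: float, focal_length_m: float) -> float:
    """Defocus change (diopters) from moving a Badal stage by shift_m.

    Badal principle: the vergence at the eye changes by shift / f**2
    while the retinal image size stays constant.
    """
    return shift_m / focal_length_m ** 2

# Example: a 10 mm stage shift with a 200 mm Badal lens gives 0.25 D.
defocus = badal_defocus_diopters(0.010, 0.200)  # 0.25
```

Because the defocus scales linearly with the shift, small stage movements give fine, repeatable focus control, which is what makes the Badal arrangement attractive for per-wavelength tuning.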

The researchers modified this simple Badal optometer by adding two filters that transmit longer wavelengths of light while reflecting shorter ones. These filters were kept stationary within a traditional Badal optometer, such that now, when the distance between the lenses is changed, the transmitted and reflected wavelength bands have subtly different levels of focus sufficient to compensate for the eye's native chromatic aberration for the two wavelength bands.

By finely tuning the selection of filters, the distances between the lenses, and the multi-color illumination, this setup can measure and compensate for chromatic aberration in a customized fashion.

A valuable tool for the clinic and the laboratory

The researchers implemented their new LCA compensator in two different adaptive optics instruments: adaptive optics vision simulation and adaptive optics scanning laser ophthalmoscopes. They used the new instruments to image the eyes of human volunteers.

They found that the new method successfully overcame inconsistencies in previous estimates of the human eye's native LCA related to depth of focus, monochromatic aberration and wavelength-dependent light interactions with retinal tissue. When both monochromatic and chromatic aberrations were compensated, a person's vision was limited only by the arrangement of cone photoreceptors - light-detecting cells - in the retina, while removing the chromatic aberration compensation meant that either red or green vision was optimized.

The researchers also demonstrated the system's ability to image the smallest cone photoreceptors with multiple wavelengths simultaneously by minimizing chromatic aberration, showing that the Badal LCA compensator offers a fine level of detail, an important advance for enabling color vision research.

In addition to providing better images of the inside of the retina, the technology is useful for studying how chromatic aberrations affect retinal image quality and visual performance. This has previously been difficult because tools that provide fine individualized control of LCA did not exist. Also, measurements of LCA obtained subjectively and objectively did not match up.

"By applying the technology to two different adaptive optics-based modalities, we show a high fidelity of visual performance and retinal imaging once chromatic and monochromatic aberrations are compensated," said Sabesan. "The high resolution retinal images thus obtained allowed us to quantify chromatic aberration objectively and compare against a large body of literature dedicated to measuring chromatic aberration."

Credit: 
Optica

Study compares HIV, cancer treatments, outcomes in older patients

Bottom Line: This study compared outcomes after a cancer diagnosis in patients with and without HIV who were 65 or older, had similar stages of cancer, and had received stage-appropriate cancer treatment in the year following diagnosis.

Authors: Anna E. Coghill, Ph.D., M.P.H., of the H. Lee Moffitt Cancer Center & Research Institute, in Tampa, Florida, is the corresponding author.

(doi:10.1001/jamaoncol.2019.1742)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

#  #  #

Media advisory: The full study is linked to this news release.

Embed this link to provide your readers free access to the full-text article: This link will be live at the embargo time: https://jamanetwork.com/journals/jamaoncology/fullarticle/2740690?guestAccessKey=56450181-17df-4f40-a766-345a9b5ecf85&utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_content=tfl&utm_term=080119

Credit: 
JAMA Network

Do yellow-lens night-driving glasses improve visibility, reduce headlight glare?

What The Study Did: With the use of a driving simulator, three commercially available yellow-lens night-driving glasses were compared with clear lenses to examine their ability to detect pedestrians or reduce the negative effects of headlight glare. The study included 22 participants.

Authors: Alex D. Hwang, Ph.D., of Harvard Medical School in Boston, is the corresponding author.

(doi:10.1001/jamaophthalmol.2019.2893)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

#  #  #

Media advisory: The full study and commentary are linked to this news release.

Embed this link to provide your readers free access to the full-text article: This link will be live at the embargo time: https://jamanetwork.com/journals/jamaophthalmology/fullarticle/2740739?guestAccessKey=eb9fae31-be7e-467e-b28a-99227617a891&utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_content=tfl&utm_term=080119

Credit: 
JAMA Network

Scientists identified a new signaling component important for plant symbiosis

image: Confocal microscopy images showing NICK4-GFP translocation to the nucleus upon perception of nod factors in Lotus japonicus roots.

Image: 
Marcin Nadzieja/AU

A proteomics-based protein-protein interaction study has led to the discovery of proteins that interact with a legume receptor that mediates signal transduction from the plasma membrane to the nucleus. This shows how signals from symbiotic bacteria are transmitted upon perception, ultimately leading to the bacteria's accommodation within the host plant.

Legumes are of significant agricultural importance mainly due to their abilities to establish symbiotic relationship with nitrogen-fixing bacteria known as rhizobia. A deeper understanding of biological nitrogen fixation (BNF) and subsequent transfer of this knowledge to crop plants would allow us to circumvent the use of fertilizers and grow crops sustainably. In addition, the successful transfer of BNF to non-legume crops would especially benefit smallholder farmers who can then increase crop yield without facing cash constraints associated with the purchase of inorganic fertilizers.

To contribute to this goal, members of the Plant Molecular Biology group at Aarhus University directed by Professor Jens Stougaard have dedicated their research to understanding legume-rhizobial symbiosis. In 2003, the group identified the plasma-membrane localized Nodulation (Nod) factor receptors 1 and 5 (NFR1 and NFR5) responsible for the recognition of compatible symbionts; Nod factors are symbiotic signalling molecules that vary in structure depending on the rhizobium species. Using a stringent lock (Nod factor receptors) and key (Nod factor) mechanism, only compatible rhizobia are allowed to enter the plant while incompatible bacteria will not be able to infect and colonize the root nodules.

A breakthrough after 15 years’ research

For over 15 years, components involved in directly relaying Nod factor signals downstream of Nod factor receptors have remained elusive, limiting the researchers’ understanding of how these Nod factor receptors ultimately lead to changes in root hair structure and formation of new organs (nodules) required for rhizobia entry and accommodation. A breakthrough was finally achieved using an elegant proteomics approach and by tapping into the expertise of colleagues from Cyril Zipfel’s research group (The Sainsbury Laboratory, UK), who have repeatedly unravelled new players in plant defence signalling.

The work, published in PNAS, describes how the researchers fished for interactors using NFR5 as bait. Similar to signalling processes involved in plant defence, vegetative, and reproductive growth, they observed that a receptor-like cytoplasmic kinase is pivotal for transducing signals downstream of receptors after ligand perception in symbiosis signalling. The researchers named this cytoplasmic kinase NFR5-interacting cytoplasmic kinase 4 (NiCK4). They hypothesize that upon Nod factor perception, a phosphorylation cascade involving NiCK4, NFR1 and NFR5 results in NiCK4 localization to the nucleus, where several critical legume symbiosis components are present, subsequently promoting the formation of nodules used to house compatible rhizobia.

The Nod factor- and Nod factor receptor-triggered movement of NiCK4 from the plasma membrane to the nucleus is an exciting finding, as calcium oscillations in the nucleus are a hallmark of symbiosis signalling in legumes. Following this discovery, the research team hopes to assemble and connect more symbiosis signalling components. A thorough understanding of all components involved in the symbiosis signalling pathway is crucial for successful transfer of BNF to non-legume crops.

Credit: 
Aarhus University

Levels of 'ugly cholesterol' in the blood are much higher than previously imagined

The amount of remnant particle cholesterol in the blood, the so-called ugly cholesterol, is much higher than previously believed. This is shown in new research from the University of Copenhagen and Copenhagen University Hospital. The discovery may have implications for future prevention and treatment of cardiovascular disease.

Three quarters of the Danish population have moderately elevated levels of cholesterol. If cholesterol levels are too high, risk of cardiovascular disease is increased. Often, LDL cholesterol, the so-called bad cholesterol, is considered the culprit. However, new research from the Faculty of Health and Medical Sciences at the University of Copenhagen and Copenhagen University Hospital shows that a completely different type of cholesterol may be more responsible than previously assumed. What we are talking about is remnant cholesterol - also known as ugly cholesterol.

To their surprise, the researchers have discovered that the amount of remnant cholesterol in the blood of adult Danes is much higher than previously believed. From the age of 20 until the age of 60, the amount in the blood is constantly increasing, and for many people it remains at a high level for the rest of their lives.

'Our results show that the amount of remnant cholesterol in the blood of adult Danes is just as high as the amount of the bad LDL cholesterol. We have previously shown that remnant cholesterol is at least as critical as LDL cholesterol in relation to an increased risk of myocardial infarction and stroke, and it is therefore a disturbing development', says Professor and Chief Physician Børge Nordestgaard from the University of Copenhagen and Copenhagen University Hospital.

The results are based on data from people from the Copenhagen General Population Study. For a total of 9,000 individuals, the cholesterol content of the fat particles in the blood was measured by means of new advanced measuring equipment, known as 'metabolomics'. The measurements show that total cholesterol in the blood consists of equal parts of "ugly", "bad" and "good" cholesterol.

Overweight and Obesity Are the Main Cause

'Previous studies from the Copenhagen General Population Study show that overweight and obesity are the main cause of the very high amount of remnant cholesterol in the blood of adult Danes. In addition, diabetes, hereditary genes and lack of exercise play a part', says one of the authors, MD Mie Balling from the University of Copenhagen and the Department of Clinical Biochemistry, Copenhagen University Hospital.

In 2018, a large international, controlled clinical trial was published that clearly showed that when triglycerides and thus remnant cholesterol were reduced with the help of medication in people with elevated levels in the blood, the risk of cardiovascular disease was reduced by 25%.

'Our findings point to the fact that prevention of myocardial infarction and stroke should not just focus on reducing the bad LDL cholesterol, but also on reducing remnant cholesterol and triglycerides. So far, both cardiologists and GPs have focused mostly on reducing LDL cholesterol, but in the future, the focus will also be on reducing triglycerides and remnant cholesterol', says Professor Børge Nordestgaard.

According to Børge Nordestgaard, the most important thing you yourself can do to achieve the lowest possible level of remnant cholesterol and triglycerides in the blood is to maintain a normal body weight.

The three kinds of cholesterol:

Remnant cholesterol = ugly cholesterol: the cholesterol content in triglyceride-rich lipoproteins or remnant particles. Elevated remnant cholesterol leads to cardiovascular disease.

LDL cholesterol = bad cholesterol: the cholesterol content in low-density lipoprotein (LDL). Elevated LDL cholesterol leads to cardiovascular disease.

HDL cholesterol = "good" cholesterol = innocent cholesterol: the cholesterol content in high-density lipoprotein (HDL). Levels of HDL cholesterol do not affect cardiovascular disease risk.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

Two fraudsters, one passport

Computers are more accurate than humans at detecting digitally manipulated ID photos, which merge the images of two people, new research has found.

Face morphing is a method used by fraudsters in which two separate identity photographs are digitally merged to create a single image that sufficiently resembles both people. This image is then submitted as part of the application for a genuine passport or driving licence, and if accepted, potentially allows both people to use the same genuine identification document without arousing suspicion.

A new study by psychologists at the University of Lincoln asked participants in one experiment to decide whether an image showed the person standing in front of them. In this task, participants accepted the digitally created morphs around half of the time, while a basic computer model could correctly identify morphs over two thirds of the time.

The research used high quality 'face morphs' over a series of four experiments which included screen-based image comparison tasks alongside a live task, designed to mimic a real-life border-control situation in which an agent would have to accept or reject a passport image based on its resemblance to the person in front of them.

Results showed that participants not only failed to spot 51 percent of these fraudulent images, but once they were provided with more information on face-morphing attacks, detection rates only rose to 64 percent. In another experiment, the researchers showed that training did not help participants to detect morphs presented onscreen, and detection rates remained around chance level. The results suggest that the morphs were accepted as legitimate ID photos often enough that they may be feasible as tools for committing fraud, especially in border control situations where the final acceptance decision is often made by a human operator.

When similar images were put through a simple computer algorithm trained to differentiate between morphs and normal photos, 68 percent of the images were correctly identified as morph images, showing the programme to be significantly more accurate than human participants. The algorithm used was relatively basic as a demonstration, and recent software being developed by computer scientists is far more sophisticated and shows even greater levels of success.
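The article does not describe the algorithm's internals, so the sketch below only illustrates the general idea with an invented cue: naive pixel-averaged morphs have less local contrast than genuine photos, which even a one-feature threshold classifier can exploit on synthetic data.

```python
import numpy as np

rng = np.random.default_rng(0)

def detail_energy(img):
    # Mean absolute horizontal gradient: a crude local-contrast feature.
    # Averaging two independent textures smooths the result, so morphs
    # score lower than genuine photos on this cue.
    return np.abs(np.diff(img, axis=1)).mean()

# Synthetic stand-ins for ID photos: random 32x32 textures.
photos = rng.random((200, 32, 32))
genuine = photos[:100]
morphs = (photos[:100] + photos[100:]) / 2.0   # naive pixel-average morphs

g = np.array([detail_energy(p) for p in genuine])
m = np.array([detail_energy(p) for p in morphs])

threshold = (g.mean() + m.mean()) / 2.0        # split the two class means
accuracy = ((g > threshold).mean() + (m <= threshold).mean()) / 2.0
```

On these artificial textures the single feature separates the classes almost perfectly; real morphs of real faces are far harder, which is why the basic algorithm in the study reached 68 percent rather than near-perfect accuracy.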

Lead researcher Dr Robin Kramer from the University of Lincoln's School of Psychology said: "The advancements and availability of high quality image editing software has made these kinds of 'face morphing attacks' more sophisticated and the images harder to detect.

"Our results show that morph detection is highly error-prone and the level at which these images were accepted represents a significant concern for security agencies. Training did not provide a useful solution to this problem.

"Our research could be significant for security agencies and suggests that the use of computer algorithms may be a better method for minimising how often these kinds of morphing attacks slip through the net."

Credit: 
University of Lincoln

Bats use leaves as mirrors to find prey in the dark

image: This is a portrait of Micronycteris microtis.

Image: 
Inga Geipel

On moonless nights in a tropical forest, bats slice through the inky darkness, snatching up insects resting silently on leaves--a seemingly impossible feat. New experiments at the Smithsonian Tropical Research Institute (STRI) show that by changing their approach angle, the echolocating leaf-nosed bats can use this sixth sense to find acoustically camouflaged prey. These new findings, published in Current Biology, have exciting implications for the evolution of predator-prey interactions.

"For many years it was thought to be a sensory impossibility for bats to find silent, motionless prey resting on leaves by echolocation alone," said Inga Geipel, Tupper Postdoctoral Fellow at STRI. Geipel's team discovered how the bats achieve the impossible. By combining evidence from experiments using a biosonar device to create and measure artificial signals, with evidence from high-speed video observations of bats as they approach prey, the importance of the approach angle was revealed.

Bats have a superpower humans do not share: they flood an area with sound waves and then use information from the returning echoes to navigate through the environment. Leaves reflect echolocation signals strongly, masking the weaker echoes from resting insects. So in the thick foliage of a tropical forest, echoes from the leaves may act as a natural cloaking mechanism for the insects, known as acoustic camouflage.

To understand how bats overcome acoustic camouflage and seize their prey, the researchers aimed sound waves at a leaf with and without an insect from more than 500 positions in order to create a full, three-dimensional representation of the echoes. At each position, they calculated the intensity of the echoes for five different frequencies of sound that represent the frequencies of a bat's call.

Leaves both with and without insects strongly reflect back the sound if it comes from straight ahead (i.e., from angles smaller than 30 degrees). When a bat approaches from these angles, it cannot find its prey as strong echoes from the leaves mask the echoes from the insect. But Geipel and colleagues found that if the sound originates from oblique angles greater than 30 degrees, the sound is reflected away from the source and leaves act like a mirror, just as a lake reflects the surrounding forest at dusk or dawn. The approach angle makes a resting insect detectable.

Based on these experiments, Geipel and colleagues predicted that bats should approach resting insects on leaves from angles between 42 and 78 degrees, the optimal angles for discerning whether a leaf has an insect on it or not.
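The mirror geometry can be caricatured in a toy model. Every constant below (lobe width, echo levels, detection margin) is invented for illustration, tuned only so that the crossover falls near the study's 42-degree lower bound:

```python
def leaf_echo_db(angle_deg, lobe_width_deg=25.0):
    # Toy specular lobe: the leaf reflects strongly straight back near
    # normal incidence and deflects sound away at oblique angles.
    return -10.0 * (angle_deg / lobe_width_deg) ** 2

INSECT_ECHO_DB = -25.0   # assumed weak, roughly angle-independent insect echo

def insect_detectable(angle_deg, margin_db=3.0):
    # Detectable when the insect echo exceeds the leaf clutter by a margin.
    return INSECT_ECHO_DB - leaf_echo_db(angle_deg) > margin_db

# Near head-on (20 deg) the leaf mirror masks the insect; on an oblique
# approach (60 deg) the clutter is deflected away and the insect stands out.
```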

Next, Geipel recorded actual bats at STRI's Barro Colorado Island research station in Panama as they approached insects positioned on artificial leaves. Using recordings from two high-speed cameras, she reconstructed the three-dimensional flight paths of the bats as they approached their prey and determined their positions. She discovered that, as predicted, almost 80 percent of the approach angles were within the range of angles that makes it possible for the bats to distinguish insect from leaf.

"This study changes our understanding of the potential uses of echolocation," Geipel said. "It has important implications for the study of predator-prey interactions and for the fields of sensory ecology and evolution."

Credit: 
Smithsonian Tropical Research Institute