
Conserving rare species for the maintenance of Mediterranean forests

A study led by researchers from the Department of Plant Biology and Ecology at the University of Seville has shown the importance of conserving rare species for the maintenance of complex ecosystems like Mediterranean forests. For these species, it is therefore essential to understand the factors that make conservation efforts successful. The research has been published in Forest Ecology and Management, an important journal in the field of forestry management.

Specifically, this work focused on the pine forests of the Iberian Peninsula. Currently, these forests are exposed to threats that will worsen in the future, including the consequences of climate change (which are severe in this area of the Mediterranean) and disturbances caused by human beings (forest exploitation, expansion of farming, etc.).

In the case study, the start of mining activity made it necessary to manage a pine forest situated in Niebla (Huelva), near the Doñana National Park. One of the species present in this forest system is a woodland carnation, Dianthus inoxianus. "It is not a very common plant, and is only present in this and other nearby pine forests in the provinces of Huelva and Seville. It is always found in sandy soil. This carnation is officially catalogued as an endangered species, and previous studies have shown that it is unique in numerous characteristics. Due to this uniqueness, it probably has a key role in the ecosystem it lives in, so it should be considered in any local conservation and restoration plan", states University of Seville researcher Francisco Balao. For that reason, he adds, an experiment was carried out in a fenced-off area within the pine forest in the study, with the aim of understanding the best conditions for "translocating" it, that is to say, planting it somewhere else as a means of conserving this species of carnation.

This study indicated that the success of translocation is limited by two periods of stress: the phase immediately after planting and the first summer. To survive these stages, the experts showed that several factors are important, such as a mild climate during the first weeks and proper watering until the end of the first summer. As for climate, radiation and temperature influenced the success of translocation. The method of translocation also affected the results: outcomes differ depending on whether the plant is transplanted directly from its natural habitat or brought from a greenhouse. Taking all this into account, and weighing the economic cost, the best options according to the University of Seville research group are to translocate plants taken directly from their habitat in winter (without the need for watering) or, in spring, to translocate greenhouse-grown plants and water them until the end of the first summer.

The use of this information will improve future conservation plans for Mediterranean pine forests and this species of endangered carnation.

Credit: 
University of Seville

Researchers explore the many factors impacting the pH of dicamba spray mixtures

image: Tractor spraying crops.

Image: 
Shutterstock/ Valentin Valkov

WESTMINSTER, Colorado - September 11, 2019 - The EPA now requires new dicamba formulations registered for dicamba-resistant crops to have a pH of 5.0 or higher because of volatility and off-target damage concerns. When it comes to applying spray mixtures under field conditions, though, how do you ensure that pH remains sufficiently high?

In an article written for the journal Weed Technology, researchers summarize studies to determine the pH effect of various commercial products used in dicamba-based spray mixtures - including dicamba formulations, glyphosate, drift retardant, ammonium sulfate and several pH modifiers. In each instance, the products were added to water with an initial pH of 4.6 to 8.4.

Factors Increasing pH. The team found that the BAPMA salt formulation of dicamba (Engenia) increased spray solution pH. In addition, all the pH modifiers they tested raised pH above 5.0 - the critical value on the latest dicamba application labels.

Factors Decreasing pH. Adding potassium salt of glyphosate to dicamba spray mixtures decreased pH by 1 to 2.1 units, which could have a profound effect on dicamba volatility. Isopropylamine salts of glyphosate produced similar results. Ammonium sulfate, which is commonly used to increase the activity of glyphosate, also decreased pH - but typically by less than 0.5 pH units.

Factors Producing Mixed Results. In contrast to BAPMA, diglycolamine formulations of dicamba plus the pH modifier VaporGrip™ produced a mixed response.

Factors with Little to No Impact. The drift retardant Intact had no effect on pH. Spray carrier volume and the mixing order of various pH modifiers had only limited influence.

"Though no direct efficacy or volatility measurements are made in our report, it is clear that having an accurate understanding of what is happening to spray mixture pH is foundational to the sustainable and environmentally responsible use of dicamba products," says Tom Mueller, Ph.D., lead author of the paper and a professor at the University of Tennessee.

Credit: 
Cambridge University Press

Focusing on key sustainable development goals would boost progress across all, analysis finds

By using a mathematical network analysis to map the relationships identified in an International Council for Science report, the University of Bath research reveals that directing efforts at a critical few goals - Life Below Water, Life on Land, and Gender Equality - would reinforce the virtuous circles buried in the network and hence lead to greater overall progress.

The UN Sustainable Development Goals (SDGs) were agreed in 2015 as a blueprint for a sustainable world to be achieved by the year 2030. The seventeen goals cover economic, societal and environmental issues including ending poverty and hunger, providing quality education, reducing inequalities, conserving marine ecosystems and acting on climate change, and between them contain 169 separate targets.

Professor Jonathan Dawes from the Centre for Networks and Collective Behaviour and the Department of Mathematical Sciences, University of Bath, set out to analyse the direction of influences between all the goals using a systems perspective - the first time this has been done.

His analysis draws on an influential report from the International Council for Science (ICSU, now the International Science Council - ISC) and reveals that if resources were allocated unequally, prioritising the goals that sit further 'upstream' in the system, this prioritisation would generate many more positive impacts. It would also mitigate the trade-offs that would otherwise mean achieving some goals only at the expense of others. The clearest directionality in these effects runs from goals 4-16 to goals 1-3 (no poverty, zero hunger, and good health and wellbeing).

Progress on almost every one of the broader societal and environmental objectives encapsulated within goals 4-16 will have significant positive impacts on goals 1-3. For example, the ICSU report notes that achieving gender equality (goal 5) is fundamental to poverty eradication (goal 1), will result in increased engagement in food security and initiatives to improve nutrition (goal 2), and will cause greater attention to be given to the need for access to healthcare (goal 3). In these ways, progress on goal 5 will, in itself, have significant positive impacts on progress towards goals 1-3; but the impacts in the other direction are far smaller - it is possible to meet goals 1-3 without achieving gender equality. This pattern is repeated in many other cases, so it follows that, in order to achieve every goal, more attention should be focused directly on goals 4-16 than on goals 1-3.
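To make the systems framing concrete, here is a minimal sketch in Python (assuming the networkx library and an invented toy set of influence links - not the ICSU scoring used in the study) of how a directed network of goal-to-goal influences can be analysed to find the 'upstream' goals whose progress propagates most widely:

```python
# Illustrative only: a toy directed graph of SDG influences, not the ICSU
# expert scores analysed in the Bath study. An edge (a, b) means "progress
# on goal a supports progress on goal b".
import networkx as nx

influence = nx.DiGraph()
influence.add_edges_from([
    (5, 1), (5, 2), (5, 3),    # gender equality -> poverty, hunger, health
    (14, 1), (14, 2),          # life below water -> poverty, hunger
    (15, 2), (15, 3),          # life on land -> hunger, health
    (4, 5), (4, 1),            # quality education -> gender equality, poverty
])

# Count how many other goals each goal can reach through chains of influence.
# Goals with the widest downstream reach are the 'upstream' candidates for
# prioritisation; goals 1-3 sit at the receiving end of most links.
reach = {goal: len(nx.descendants(influence, goal)) for goal in influence.nodes}
for goal, downstream in sorted(reach.items(), key=lambda item: -item[1]):
    print(f"Goal {goal:2d} influences {downstream} other goal(s) downstream")
```

In the study itself the influence links and their directions come from the ICSU report's expert assessments; the edges above are purely illustrative.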

The study is published in the journal Sustainable Development.

Professor Dawes, who is also Director of the Institute for Mathematical Innovation at the University of Bath, said: "Prioritisation is both politically and operationally challenging, but by focusing efforts on a few key areas we could sharpen up the SDG messages to ensure that this incredibly broad global agenda results in clearer global action.

"This analysis, using a very simple mathematical model, takes a systems level look at all the linkages between the goals and what we see is a directionality - or a domino effect - in that progress in some areas leads to more progress in others.

"Some goals are better reinforced by others in virtuous circles, but others aren't: we need to prioritise and support the goals that aren't in order to make sure that the system as a whole is successful."

Prof Dawes would like to see the wider academic community, particularly in applied mathematics and economics, contribute more to the challenge of understanding the linkages between the SDGs, including understanding variations from country to country and over time.

He said: "Almost all academic research focusses on issues related to one or only a handful of SDGs and targets. In contrast, this research is intended to be complementary - looking at the system-level linkages between goals, and their implications. It's definitely not the last word on the topic: these are complex problems, and there's a great opportunity for the academic community to become more involved in these systemic issues.

"Modelling also allows us to make predictions as to which goals are most likely to be achieved by 2030 under 'business as usual' scenarios, and then to propose alternatives. We're privileged as academic researchers to able to discuss courses of action that politicians would find very difficult to introduce into the public debate.

"Perhaps we can follow the example of climate modellers, where different computer models make predictions that vary in detail, but when taken together provide consistent estimates of future trends. Multiple research clusters examining the SDGs and their linkages from different viewpoints might allow us to pick out common factors - and a systems level approach gives us the ability to pull out clear priorities from a complicated, difficult picture. "

On September 24-25 world leaders will meet at the UN for the first Sustainable Development Goals Summit since the adoption of the 2030 Agenda, to discuss the 2019 Global Sustainable Development Report, and highlight SDG 'acceleration actions', as part of the Action for People and Planet week within the 74th Session of the UN General Assembly.

Credit: 
University of Bath

Shoppers more likely to pay for upgrades when extra cost is an 'add-on,' study finds

Shoppers are up to one-third more likely to shell out for the premium option when the extra cost is expressed as an add-on, as opposed to a higher overall price, according to new research from the UBC Sauder School of Business.

The study, published recently in the Journal of Marketing Research, is the first of its kind to examine the effect of add-on pricing on product upgrades. The researchers say consumers could benefit from this research by being more aware of how pricing may influence their shopping decisions.

"Imagine booking a plane ticket - comparing a ticket that is $200 when it involves a two-hour layover with a ticket to fly direct for $250. Put another way, a regular ticket is $200, but upgrading to a direct flight costs $50 more. Which option is more appealing?" asked David Hardisty, study co-author and assistant professor of marketing and behavioural science at UBC Sauder.

The answer boils down to dollars and cents, said Hardisty. Consumers perceive $250 as expensive because the number is higher than the base price of $200, whereas $50 as an add-on price seems inexpensive.

"When you see '$50 more' as an add-on price, it's a smaller number than the total, and we focus on that smaller number," said co-author Dale Griffin, professor and advisory council chair in consumer behaviour at UBC Sauder. "Mathematically, the prices are the same, and on consideration we can see that, but intuitively add-on prices just feel less expensive."

The researchers found this effect applied whether participants were being asked to donate to a local food bank, buy a computer monitor, choose an external hard drive or even order breakfast. They also observed this effect when reminding consumers of the final price of their purchase, suggesting that the shift in preference does not occur because of deception or confusion, but rather because of how people justify their purchase decisions. However, the effect only occurs with pricing, not with other kinds of product upgrades. For example, if shoppers are looking at a two-terabyte hard drive, a four-terabyte hard drive is no more appealing than one that is presented as "two terabytes more."

But not everyone is susceptible to the add-on pricing effect. "Individuals who are very careful and deliberate when making decisions naturally compare prices whether they are expressed as included or as add-ons," Hardisty said.

The information could prove invaluable to retailers and other businesses who provide "premium" products and services -- while also benefiting consumers the next time they're offered an add-on price.

"Businesses typically earn higher margins on more expensive products, so it would be good for them to use the add-on price framing if they want to promote these kind of higher quality items," said Hardisty. "For the consumer, it's good to be aware of how these different price frames influence you. Why are they doing that? And what effect is that having on me? Now we know."

Credit: 
University of British Columbia

Soil scientist researches nature versus nurture in microorganisms

image: Ember Morrissey

Image: 
West Virginia University

A West Virginia University researcher used science and data to solve the timeless argument of nature versus nurture - at least when it comes to microorganisms.

Ember Morrissey, assistant professor of environmental microbiology in the Davis College of Agriculture, Natural Resources and Design, uncovered that nature significantly affects how the tiny organisms under our feet respond to their current surroundings.

"We found that evolutionary history (nature) shapes the traits of microbes in the soil more than their local environment (nurture)," she said. "I'm hopeful that we can use that information to make predictions about undiscovered species and organisms."

Microorganisms cycle nutrients and play a vital role in the carbon cycle. According to Morrissey, understanding them better will help inform decisions related to fighting climate change.

"Soils contains more carbon than what is found in the atmosphere and plants," she explained. "Microorganisms break down and consume this carbon as they live and grow, converting it into the greenhouse gas carbon dioxide. Consequently, the activity of microorganisms in soil has the potential to alleviate or worsen climate change, so we need to form predictions regarding their activities."

Her research focused on four different ecosystems in Arizona that varied in temperature and moisture. Measuring growth rate and carbon assimilation rate, Morrissey and her team discovered patterns in the biodiversity of soil microorganisms in each ecosystem.

They were able to identify hundreds of species present at all four sites and assess their evolutionary history. In some families - or groups of evolutionarily related species - all member species grew quickly, while in others they grew slowly.

According to Morrissey, this pattern of growth indicates that evolutionary history is a significant factor in determining the function of microbes.

"This enabled us to decide whether the functional traits of microorganisms are determined more by their evolutionary history or local environment," she said. "Because the activity of species within a family are similar, and consistent across environmental variation, we may be able to generalize about how a species is likely to behave in a variety of environments based on what family it belongs to."

Morrissey's goal is to better understand microbial activity in the environment to help scientists predict how ecosystems respond to change. With more accurate predictions, they can help manage ecosystems to mitigate some of the human impacts on the environment.

"I think it is important for people to be aware that the soil under their feet is alive and is playing a really important role in determining the health of our ecosystems," she said. "Microbes in the soil respond to all the different ways humans are changing the environment."

Credit: 
West Virginia University

Study of newly homeless ED patients finds multiple contributors to homelessness

image: Semistructured interviews, 31 English-speaking adult patients newly homeless (shelter or street living) in previous six months.

Image: 
KIRSTY CHALLEN, B.SC., MBCHB, MRES, PH.D., LANCASHIRE TEACHING HOSPITALS, UNITED KINGDOM.

DES PLAINES, IL -- A qualitative study of recently homeless emergency department (ED) patients found multiple contributors to homelessness that can inform future homelessness prevention interventions. The study findings are published in the September 2019 issue of Academic Emergency Medicine (AEM), a journal of the Society for Academic Emergency Medicine (SAEM).

The lead author of the study is Kelly M. Doran, MD, MHS, assistant professor in the Departments of Emergency Medicine and Population Health at NYU School of Medicine and Bellevue Hospital Center.

The study is the first to examine pathways to homelessness among ED patients. The findings of the study are discussed in a recent AEM podcast, "'It Wasn't Just One Thing': A Qualitative Study of Newly Homeless Emergency Department Patients."

Homelessness plays an outsized role in U.S. EDs, in part due to the ED's role as a medical and social safety net and in part due to the greater-than-average health needs of people who are homeless. The researchers found that among the contributors to homelessness are the unexpectedness of becoming homeless, health and social conditions, lack of support from family or friends, and structural issues such as the job market and the availability of affordable housing.

The findings demonstrate gaps in current homeless prevention services and can help inform future interventions for unstably housed and homeless ED patients. More broadly, the findings may help ED providers to better understand the life experiences of their patients that contribute to their health and ED use.

Commenting on the study is Lewis R. Goldfrank, MD, Herbert W. Adams Professor of Emergency Medicine at Bellevue Hospital Center and the Ronald O. Perelman Department of Emergency Medicine of the New York University School of Medicine:
 

"This fascinating qualitative study demonstrates that listening carefully to our patient's needs will allow us to discern social determinants that left unattended may lead to homelessness. Our task in the emergency department is to presume that each patient who comes to our doors is there because of a critical lesion in the public health system. In addressing these lesions, we will begin to achieve our dreams of preventing homelessness."

Credit: 
Society for Academic Emergency Medicine

Mako shark tracking off west coast reveals 'impressive' memory and navigation

image: Mako sharks demonstrate impressive memories and navigational skills by returning to Southern California regularly to feed and reproduce, scientists found after tagging 105 mako sharks.

Image: 
NOAA Fisheries/Walter Heim

The largest effort ever to tag and track shortfin mako sharks off the West Coast has found that they can travel nearly 12,000 miles in a year. The sharks range far offshore, but regularly return to productive waters off Southern California, an important feeding and nursery area for the species.

The findings demonstrate "an impressive show of memory and navigation." The sharks maneuver through thousands of miles of the Pacific but return to where they have found food in years past, said Heidi Dewar, a research fisheries biologist at NOAA Fisheries' Southwest Fisheries Science Center in La Jolla, California.

Researchers tagged 105 mako sharks over 12 years--from 2002 to 2014. The tags record the sharks' movements, as well as the environments the sharks pass through. Researchers have long recognized that ocean waters from Santa Barbara south to San Diego, known as the Southern California Bight, are an important habitat for mako sharks. Prior to this study, however, they knew little about what the sharks do and where they went beyond those waters.

The researchers are from NOAA Fisheries, Stanford University, Tagging of Pacific Predators, and the Center for Scientific Research and Higher Education in Baja California. They reported their results in the journal Animal Biotelemetry.

"We did not know what their overall range was. Were there patterns that they followed?" asked Nicole Nasby-Lucas, a NOAA Fisheries research scientist at the Southwest Fisheries Science Center and lead author of the new research. "It turns out they have their own unique movement patterns." Sharks tracked over multiple years returned to the same offshore neighborhoods year after year.

Long-Range Travelers

The tagging data overall revealed that the sharks travel widely along the West Coast. They venture as far north as Washington, as far south as Baja California, and westward across the Pacific as far as Hawaii. The sharks tagged off California remained on the eastern side of the Pacific east of Hawaii. This indicates that they do not mix much with mako sharks in other parts of the Pacific.

Although there are examples of mako sharks crossing the ocean, it is probably the exception rather than the rule, said Dewar, a coauthor of the new research.

The finding provides insight into population dynamics of mako sharks across the Pacific. It also allows scientists to identify which fisheries the tagged mako sharks might encounter. Muscular mako sharks are a popular sport fishing target. They are also caught in U.S. longline and drift gillnet fisheries and are common in the international trade in shark fins. Mako sharks are overfished in the Atlantic Ocean, but not in the Pacific.

The researchers used two types of tags to track the sharks. The first type, called a pop-up tag, collects data and eventually pops off the animal and floats to the surface, where it transmits its data via satellite. The second type transmits data to satellites each time the shark surfaces, determining the animal's location by measuring tiny shifts in the frequency of the radio transmission.

Remembering Southern California

Mako sharks are among the fastest swimmers in the ocean, hitting top speeds of more than 40 miles per hour. The larger tagged sharks traveled an average of about 20 miles a day and a maximum of about 90 miles per day. They travel long distances in part because they must swim to move water through their gills so they can breathe, Dewar said.

Large numbers of juvenile sharks caught in the Southern California Bight indicate that it is a nursery area for the species. Tagged mako sharks returned there annually, most typically in summer when the waters are most productive. The tracks of the tagged sharks may look at first like random zig-zags across the ocean, Dewar said. They actually illustrate the sharks searching for food and mates based on what they remember from previous years.

"If you have some memory of where food should be, it makes sense to go back there," Dewar said. "The more we look at the data, the more we find that there is a pattern behind their movements."

The tagging results also provide a wealth of data that scientists can continue to plumb for details of the sharks' biology and behavior. About 90 percent of the time the sharks remained in the top 160 feet of ocean, for example, occasionally diving as deep as 2,300 feet. Although the sharks traveled widely, they mainly stayed in areas with sea surface temperatures between about 60 and 70 degrees Fahrenheit.

"We can continue to ask new questions of the data to understand these unique movement patterns," Nasby-Lucas said. "There's a lot more to learn."

Credit: 
NOAA Fisheries West Coast Region

Brain: How to optimize decision making?

Our brains are constantly faced with different choices: Should I have a chocolate éclair or macaroon? Should I take the bus or go by car? What should I wear: a woollen sweater or one made of cashmere? When the difference in quality between two choices is great, the choice is made very quickly. But when this difference is negligible, we can get stuck for minutes at a time - or even longer - before we're capable of making a decision. Why is it so difficult to make up our mind when faced with two or more choices? Is it because our brains are not optimised for taking decisions? In an attempt to answer these questions, neuroscientists from the University of Geneva (UNIGE), Switzerland, - in partnership with Harvard Medical School - developed a mathematical model of the optimal choice strategy. They demonstrated that optimal decisions must be based not on the true value of the possible choices but on the difference in value between them. The results, which you can read all about in the journal Nature Neuroscience, show that this decision-making strategy maximises the amount of reward received.

There are two types of decision-making: first, there is perceptual decision-making, which is based on sensory information: Do I have time to cross the road before that car comes nearer? Then there is value-based decision-making, when there is no good or bad decision as such but a choice needs to be made between several proposals: Do I want to eat apples or apricots? When taking value-based decisions, choices are made very quickly if there is a large difference in value between the different proposals. But when the propositions are similar, decision-making becomes very complex even though, in reality, none of the choices is worse than any other. Why is this?

The value of a choice lies in the difference

Satohiro Tajima, a researcher in the Department of Basic Neurosciences in UNIGE's Faculty of Medicine, designed a simple mathematical model that demonstrates the following: the optimal strategy when faced with two propositions is to sum up the values associated with the memories you have of each choice, then calculate the difference between these two sums (do I have more positive memories linked to chocolate eclairs or macaroons?). The decision is made when this difference reaches a threshold value, fixed in advance, which determines the time taken in making the decision. This model leads to rapid decision-making when the values of the two possibilities are very far apart. But when two choices have almost the same value, we need more time, because we need to draw on more memories so that this difference reaches the decision threshold. Is the same process at work when we have to choose between three or more possibilities?

The average of the values for each choice decides the winner

For each choice, we want to maximise the possible gain in the minimum amount of time. So, how do we proceed? "The first step is exactly the same as when making a binary choice: we amass the memories for each choice so we can estimate their combined value," explains Alexandre Pouget, a professor in the Department of Basic Neurosciences at UNIGE. Then, using a mathematical model based on the theory of optimal stochastic control, instead of looking at the cumulative value associated with each choice independently, the decision rests on the difference between the cumulative value of each choice and the average of the accumulated values over all the choices. As in the earlier case, the decision is made when one of these differences reaches a pre-determined threshold value. "The fact that the decision is based on the cumulative value minus the average of the values of all the possibilities explains why the choices interfere with each other, even when some differences are glaring," continues Professor Pouget.
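As a rough illustration of this rule (a minimal sketch in Python with invented value, noise and threshold parameters - not the authors' model code), each option accumulates noisy 'memory samples' and a decision is triggered when one option's accumulated value exceeds the running average across all options by a fixed threshold:

```python
import random

def decide(option_values, threshold=30.0, noise=1.0, max_steps=100_000):
    """Difference-to-average decision rule (illustrative parameters only).

    option_values: mean value of the noisy 'memory samples' for each option.
    Returns (index of the chosen option, number of samples drawn).
    """
    accumulated = [0.0] * len(option_values)
    for step in range(1, max_steps + 1):
        # Draw one noisy memory sample per option and add it to its running sum.
        for i, value in enumerate(option_values):
            accumulated[i] += random.gauss(value, noise)
        average = sum(accumulated) / len(accumulated)
        # Decide as soon as one option's sum exceeds the average by the threshold.
        for i, total in enumerate(accumulated):
            if total - average >= threshold:
                return i, step
    # Fallback: give up after max_steps and pick the current leader.
    best = max(range(len(accumulated)), key=lambda i: accumulated[i])
    return best, max_steps

# Similar values -> slow decisions; clearly different values -> fast decisions.
print(decide([1.0, 0.95, 0.9]))   # close call: many samples needed
print(decide([1.0, 0.3, 0.2]))    # easy call: few samples needed
```

With only two options this reduces to the binary difference rule described earlier, and options with similar values require many more samples - that is, more time - than options with very different values.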

If the different possible choices have similar values, the average will be almost identical to the value of each choice, resulting in a very lengthy decision-making time. "Making a simple choice can take 300 milliseconds but a complicated choice sometimes lasts a lifetime," notes the Geneva-based researcher.

The UNIGE study shows that the brain does not make decisions according to the value of each opportunity but based on the difference between them. "This highlights the importance of the feeling of having to maximise the possible gains that can be obtained," says Professor Pouget. The neuroscientists will now focus on how the brain revisits memory to call on the memories associated with every possible choice, and how it simulates information when faced with the unknown and cannot make a decision based on memories.

Credit: 
Université de Genève

Research on the good life

image: Pictured here: Professor Dr. Minh Nguyen.

Image: 
Bielefeld University

In recent decades, the centrally planned socialist economy in countries such as Laos, China, and Vietnam has been replaced by a market economy that remains under the political rule of the Communist party. The resulting changes to society have had profound implications on the idea of the good life--from deliberations on housing to those around religion. This is the topic of the conference entitled 'The Good Life in Late Socialist Asia: Aspirations, Politics, and Possibilities' to be held from 16 to 18 September at Bielefeld University's Center for Interdisciplinary Research (ZiF). The conference also marks the launch of the 'WelfareStruggles' project for which the social anthropologist Professor Dr Minh Nguyen is receiving an ERC Starting Grant from the European Research Council--one of the most important EU research grants.

'Logics of privatization increasingly prevail in Laos, China, and Vietnam. The socialist state is no longer committed to ensuring universal care and well-being. People themselves are made responsible for their individual well-being,' says Professor Dr Minh Nguyen from the Faculty of Sociology. She is organizing the conference together with Dr Phill Wilcox. 'On the one hand, that puts pressure on people and generates a feeling of moral decline and uncertainty. On the other hand, economic developments open up space for a whole range of aspirations and desires.'

The conference will address the good life with interdisciplinary analyses of a variety of issues such as infrastructure, politics, religion, and marriage. For example, the talk entitled 'The good life as the green life' will ask which challenges arise if one wants to bring about a vision of the good life that focuses on ecological balance. The talk on 'Europe's new Kulturbürger' will address Chinese migration and the desire for a cosmopolitan life. 'In this conference, we want to integrate the different perspectives. Therefore, we have put together a team of international researchers from different disciplines--such as political science, anthropology, and geography,' says Nguyen.

In her project 'WelfareStruggles', Nguyen is studying the welfare of migrant workers in China and Vietnam. 'The conference at the ZiF ties in very well with this project. To understand welfare--the organization of health insurance and pensions--in late socialist contexts, you need to ground it in underlying ideas of the good life.' Nguyen's project has been awarded an ERC Starting Grant by the European Research Council. These grants aim to promote excellent and promising young academics. Nguyen will be receiving approximately 1.5 million euro over a five-year period. 'The conference coincides with the kick-off phase of the project. We have just finished setting up our research team and the conference is a good opportunity to explore our research together with colleagues,' says Nguyen.

Credit: 
Bielefeld University

UBC study finds health isn't the only issue with bacteria growth

Microorganisms growing inside aging buildings and infrastructure are more than just a health issue, according to new research from UBC Okanagan.

The research, coming from the School of Engineering and the biology department, examined the impact of fungal mould growth and associated microbes within structures on university campuses. The study focuses on the observed biodeteriorative capabilities of indoor fungi upon gypsum board material (drywall) and how these vary with a building's age and room functionality.

Assistant Professor Sepideh Pakpour says fungal growth significantly affected the physical (weight loss) and mechanical (tensile strength) properties of moisture-exposed gypsum board samples. In some cases, tensile strength and weight of some boards decreased by more than 80 per cent.

And she notes the issue of fungal growth, intensified by climate change, is two-fold.

"Increasing flooding and rainfall related to climate change is aiding fungi to grow more rapidly, causing degradation of the mechanical properties of buildings and infrastructure," she says. "Not only are the fungi breaking down the integrity of our buildings, but their proliferation is increasing health hazards for the people who live and work in these buildings."

The researchers also looked at other factors that can impact microbial growth, including temperature, humidity, dustiness and occupancy levels--the more people, the quicker it can grow.

According to the study, drywall experienced a significant effect on its mechanical properties when microbes were present. If the microbes were bolstered by moisture, the drywall's ability to withstand breakage when under tension dropped 20 per cent. Older buildings, on average, exhibited higher concentrations and types of fungi in the air, leading to higher mould coverage and biodeterioration on the drywall.

"Our findings would suggest a critical need towards multi-criteria design and optimization of next-generation healthy buildings," explains Pakpour. "Furthermore, we hope this study will enable engineers, architects and builders to develop optimal designs for highly microbial-resistant building materials that will decrease long-term economic losses and occupant health concerns."

The interdisciplinary research was overseen by UBCO biology professor John Klironomos; Professor Abbas Milani, director of the School of Engineering's Materials and Manufacturing Research Institute; and Pakpour, who supervised the microbial and material degradation analyses conducted by their doctoral student Negin Kazemian.

The researchers plan on turning their attention next to the exposure levels of airborne microorganisms and possible remedies.

Credit: 
University of British Columbia Okanagan campus

First 'overtones' heard in the ringing of a black hole

image: This simulation shows how a black hole merger would appear to our eyes if we could somehow travel in a spaceship for a closer look. It was created by solving equations from Albert Einstein's general theory of relativity using LIGO data from the event called GW150914.

Image: 
SXS, the Simulating eXtreme Spacetimes (SXS) project

When two black holes collide, they merge into one bigger black hole and ring like a struck bell, sending out ripples in space and time called gravitational waves. Embedded in these gravitational waves are specific frequencies, or tones, which are akin to individual notes in a musical chord.

Now, researchers have detected two such tones for the first time in the "ringdown" of a newly formed black hole. Previously, it was assumed that only a single tone could be measured and that additional tones, called overtones, would be too faint to be detected with today's technologies.

"Before, it was as if you were trying to match the sound of a chord from a guitar using only a single string," says Matthew Giesler, a graduate student at Caltech and second author of a new study detailing the results in the September 12 issue of Physical Review Letters. Giesler is lead author of a related paper submitted to Physical Review X about the technique used to find the overtones.

The results, which were based on reanalyzing data captured by the National Science Foundation's LIGO (Laser Interferometer Gravitational-wave Observatory), have put Albert Einstein's general theory of relativity to a new kind of test. Because merging black holes experience crushing gravity, studies of these events allow researchers to test the general theory of relativity under extreme conditions. In this particular case, the researchers tested a specific prediction of general relativity: that black holes can be fully described by just their mass and rate of spin. Yet again, Einstein passed the test.

"This kind of test had been proposed long before the first detection, but everybody expected it would have to wait many years before detectors would be sensitive enough," says Saul Teukolsky (PhD '73), the Robinson Professor of Theoretical Astrophysics at Caltech and advisor to Giesler. "This result shows that we can start carrying out the test already with today's detectors by including the overtones, an unexpected and exciting result."

LIGO made history in 2015 when it made the first-ever direct detection of gravitational waves, 100 years after Einstein first predicted them. Since then, LIGO and its European-based partner observatory, Virgo, have detected nearly 30 gravitational-wave events, which are being further analyzed. Many of these gravitational waves arose when two black holes collided, sending quivers through space.

"A new black hole forms out of a violent astrophysical process and thus is in an agitated state," says Maximiliano (Max) Isi (PhD '18), lead author of the Physical Review Letters study, now at MIT. "However, it quickly sheds this surplus energy in the form of gravitational waves."

As part of Giesler's graduate work, he started to investigate whether overtones could be detected in current gravitational-wave data in addition to the main signal, or tone, even though most scientists believed these overtones were too faint. He specifically looked at simulations of LIGO's first detection of gravitational waves, from a black hole merger event known as GW150914.

During the end-phase of the merger, a period of time known as the ringdown, the newly merged black hole is still shaking. Giesler found that the overtones, which are loud but short-lived, are present in an earlier phase of the ringdown than previously had been realized.

"This was a very surprising result. The conventional wisdom was that by the time the remnant black hole had settled down so that any tones could be detected, the overtones would have decayed away almost completely," says Teukolsky, who is also a professor of physics at Cornell University. "Instead, it turns out that the overtones are detectable before the main tone becomes visible."

The newfound overtones helped the researchers test the "no hair" theorem for black holes--the idea that there are no other characteristics, or "hairs," needed to define a black hole other than mass or spin. The new results confirm that the black holes do not have hairs, but scientists suspect that future tests of the theory, in which even more detailed observations are used to probe black hole mergers, may show otherwise.

"Einstein's theory could break down if there are quantum effects at play," says Giesler.

"Newton's theory of gravity passes many tests where gravity is weak, but completely fails when it comes to describing gravity at its most extreme, like when it comes to trying to describe merging black holes. Similarly, as we eventually probe the signal from black holes with increasing accuracy, it is possible that even general relativity might someday fail the test."

Over the next few years, planned upgrades to LIGO and Virgo will make the observatories even more sensitive to gravitational waves, revealing more hidden tones.

"The bigger and louder an event, the more likely LIGO can pick up these overtones," says Alan Weinstein, a professor of physics at Caltech and a member of the LIGO Laboratory, who is not associated with this study. "With LIGO's first detection of gravitational waves, we confirmed predictions made by general relativity. Now, by searching for overtones, and even fainter signals called higher-order modes, we are looking for deeper tests of the theory, and even potential evidence of the theory breaking down."

Says Isi, "Little by little, black holes will shed their mysteries, revolutionizing our understanding of gravity, space, and time."

Credit: 
California Institute of Technology

Caregiver stress: The crucial, often unrecognized byproduct of chronic disease

Philadelphia, September 10, 2019 - There is growing evidence that caregivers of patients with cardiovascular disease (CVD) are vulnerable to developing their own poor cardiovascular health. Investigators report in the Canadian Journal of Cardiology, published by Elsevier, on a proof-of-concept couples-based intervention in a cardiac rehabilitation setting. This intervention has shown potential for reducing caregiver distress, and future studies are evaluating its impact on both caregivers' and patients' cardiovascular health.

Nearly half of Canadians have been in caregiving roles to family and friends, with similar figures in the United States and Europe. A caregiver is broadly defined as someone who provides informal or unpaid care to a family member or friend with a chronic condition or disability. Caregivers provide crucial, rarely remunerated support to sick family members or friends. About 40 per cent of caregivers, of whom more than half are women, report high psychological, emotional, physical, social, and financial stresses imposed by the caregiving role. These factors can contribute to a higher risk of CVD among caregivers themselves. However, despite an appreciation of these issues, few approaches have been effective in reducing caregiver stress. This need is expected to grow because pressures on "cardiac" caregivers are projected to rise in the next decade as the population ages, hospital stays shorten, and CVD and associated risk factors continue to increase.

"It is abundantly clear that caregivers need to be better supported!" said lead investigator Heather Tulloch, PhD, Division of Cardiac Prevention and Rehabilitation, University of Ottawa Heart Institute (UOHI), and University of Ottawa, Ottawa, ON, Canada. First author, Karen Bouchard, PhD, postdoctoral fellow in behavioral medicine at UOHI added, "Caregivers are critical for patients' cardiovascular health management and are an invaluable healthcare resource, contributing enormously to the Canadian healthcare system. Individuals who care for their partners may experience additional cardiovascular risk - a risk that should be recognized and to which we should respond."

In this narrative review, the investigators look at evidence from the fields of health psychology and relationship science and highlight the direct (e.g., physiological) and indirect (e.g., behavioral, emotional) factors that link caregiver distress with caregivers' own cardiovascular risk. For example, caregivers are more likely to continue to smoke and less likely to be physically active than individuals who provide no or low levels of care; their diets tend to be high in saturated fat, leading to greater body mass indexes; they spend less time engaging in self-care activities and report poor preventive health behaviors; they experience less or disordered sleep; and they demonstrate poor adherence to medication. Spousal caregivers have higher levels of depressive symptoms, physical and financial burden, and relationship strain, and lower levels of positive psychological wellbeing, compared to adult children caregivers, for example.

The researchers report that the risk of hypertension and metabolic syndrome may be directly related to high-intensity caregiving, defined as providing more than 14 hours of caregiving per week over two consecutive years. They also report findings that estimate the economic contribution of caregivers' unpaid labor to be $26 billion annually in Canada, which is projected to increase to $128 billion by 2035 (likely translating to over a trillion dollars each in the USA and Europe).

The investigators contend that the cardiovascular health of both patient and caregiver could be improved by enhancing the quality of the patient-caregiver relationship. They describe a proof of concept testing of Healing Hearts Together, a relationship-enhancement and educational program for patients and partners. Based on attachment theory, which states that close emotional bonds are essential when faced with a threat such as a cardiac event, the program guides couples through conversations in which they review information on heart health and attachment; share their unique experiences with heart disease with partners and peers; and learn to clearly communicate their need for connection and reassurance. This connection enhances couple satisfaction and problem solving. Participants reported improvements in relationship quality, mental health, and select quality of life measures. A controlled evaluation of the impact of the program on cardiovascular risk factors is underway.

"The aim of Healing Hearts Together is to increase emotional accessibility and responsiveness in couples facing CVD," explained Dr. Tulloch. "Taken together, couples-based interventions in a cardiac rehabilitation setting may be a timely and appropriate approach to reduce caregiver distress and enhance caregivers' comprehensive health outcomes. There is an emerging opportunity to care for those who care for their partners and enhance the health of both. It is important that healthcare professionals recognize the burden of caregiving and act sensitively and strategically to address these challenges."

"Detrimental effects of the caregiving experience are greater among middle-aged caregivers, those known as the 'sandwich generation,' because they balance paid work commitments and interpersonal relationships with care delivery tasks for parents, children, and/or partners," commented Monica Parry, NP-Adult, MEd, MSc, PhD, University of Toronto, Toronto, ON, Canada, in an accompanying editorial. She points out that men and women deal differently with caregiving and, as the landscape of heart disease in women is changing, so must our approach and understanding of the caregiving experiences of men. For example, male caregivers may struggle with the societal views of caring; feel invisible at times; and may be unsure how to assimilate the caring role, masculinity, and accessing help for themselves. "We are facing an epidemic of caregiver burden. Caregivers cannot remain under-researched, under-diagnosed, under-treated and/or under-supported," she concluded.

A recent policy statement from the American Heart Association/American Stroke Association on palliative care for CVD and stroke recommended attention to the physical, emotional, spiritual, and psychological distress of the patient's family and care system.

Credit: 
Elsevier

Scientists find biology's optimal 'molecular alphabet' may be preordained

image: The coded amino acids each have unique properties that help biological proteins fold optimally. The set of amino acids biology uses appears to have been evolutionarily optimized.

Image: 
H.J. Cleaves, ELSI/ Wikimedia Commons

An international and interdisciplinary team working at the Earth-Life Science Institute (ELSI) at the Tokyo Institute of Technology has modeled the evolution of one of biology's most fundamental sets of building blocks and found that it may have special properties that helped it bootstrap itself into its modern form.

All life, from bacteria to blue whales to human beings, uses an almost universal set of 20 coded amino acids (CAAs) to construct proteins. This set was likely "canonicalized" or standardized during early evolution; before this, smaller amino acid sets were gradually expanded as organisms developed new synthetic proofreading and coding abilities. The new study, led by Melissa Ilardo, now at the University of Utah, explored how this set evolution might have occurred.

There are millions of possible types of amino acids that could be found on Earth or elsewhere in the Universe, each with its own distinctive chemical properties. Indeed, scientists have found these unique chemical properties are what give biological proteins, the large molecules that do much of life's catalysis, their own unique capabilities. The team had previously measured how the CAA set compares to random sets of amino acids and found that only about 1 in a billion random sets had chemical properties as unusually distributed as those of the CAAs.

The team thus set out to ask the question of what earlier, smaller coded sets might have been like in terms of their chemical properties. There are many possible subsets of the modern CAAs or other presently uncoded amino acids that could have comprised the earlier sets. The team calculated the possible ways of making a set of 3-20 amino acids using a special library of 1913 structurally diverse "virtual" amino acids they computed and found there are ~10^48 ways of making sets of 20 amino acids. In contrast, there are only ~10^19 grains of sand on Earth, and only ~10^24 stars in the entire Universe. "There are just so many possible amino acids, and so many ways to make combinations of them, a computational approach was the only comprehensive way to address this question," says team member Jim Cleaves of ELSI. "Efficient implementations of algorithms based on appropriate mathematical models allow us to handle even astronomically huge combinatorial spaces," adds co-author Markus Meringer of the Deutsches Zentrum für Luft- und Raumfahrt.
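As a back-of-the-envelope check on the size of this search space, here is a small Python sketch that simply counts unordered subsets of a 1913-member library; the study's own counting conventions may differ in detail, but the point - that the space dwarfs everyday large numbers - is the same:

```python
from math import comb

LIBRARY_SIZE = 1913  # structurally diverse 'virtual' amino acids in the library

# Number of distinct unordered amino acid sets of a given size.
for size in (3, 10, 20):
    order_of_magnitude = len(str(comb(LIBRARY_SIZE, size))) - 1
    print(f"sets of {size:2d} amino acids: about 10^{order_of_magnitude}")

total = sum(comb(LIBRARY_SIZE, size) for size in range(3, 21))
print(f"all set sizes 3-20 combined: about 10^{len(str(total)) - 1}")
# Either way the space is astronomically large -- far larger than the number of
# grains of sand on Earth (~10^19) or stars in the observable Universe (~10^24).
```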

As this number is so large, they used statistical methods to compare the adaptive value of the combined physicochemical properties of the modern CAA set with those of billions of random sets of 3-20 amino acids. What they found was that the CAAs may have been selectively kept during evolution due to their unique adaptive chemical properties, which help them to make optimal proteins, in turn helping organisms that could produce those proteins become more fit.

They found that even hypothetical sets containing only one or a few modern CAAs were especially adaptive. It was difficult to find sets even among a multitude of alternatives that have the unique chemical properties of the modern CAA set. These results suggest that each time a modern CAA was discovered and embedded in biology's toolkit during evolution, it provided an adaptive value unusual among a huge number of alternatives, and each selective step may have helped bootstrap the developing set to include still more CAAs, ultimately leading to the modern set.

If true, the researchers speculate, it might mean that even given a large variety of starting points for developing coded amino acid sets, biology might end up converging on a similar set. As this model was based on the invariant physical and chemical properties of the amino acids themselves, this could mean that even Life beyond Earth might be very similar to modern Earth life. Co-author Rudrarup Bose, now of the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, further hypothesizes that "Life may not be just a set of accidental events. Rather, there may be some universal laws governing the evolution of life."

Credit: 
Tokyo Institute of Technology

Scientists listed ways of applying genetic engineering to treat Parkinson's disease

image: Possible mechanisms of development of Parkinson's disease and the most promising ways of applying CRISPR/Cas9 for its studying and treating.

Image: 
Alena Manuzina

Researchers of Sechenov University and University of Pittsburgh described the most promising strategies in applying genetic engineering for studying and treating Parkinson's disease. This method can help evaluate the role of various cellular processes in pathology progression, develop new drugs and therapies, and estimate their efficacy using animal disease models. The study was published in Free Radical Biology and Medicine.

Parkinson's disease is a neurodegenerative disorder accompanied by a wide array of motor and cognitive impairments. It develops mostly among elderly people (after the age of 55-60). Parkinson's symptoms usually begin gradually and get worse over time. As the disease progresses, people may have difficulty controlling their movements, walking and talking and, more importantly, taking care of themselves. Although there is no cure for Parkinson's disease, medicines, surgical treatment, and other therapies can often relieve some symptoms.

The disease is characterized by significant (up to 50-70%) loss of dopaminergic neurons, i.e. nerve cells that synthesize neurotransmitter dopamine which enables communication between the neurons. Another hallmark is the presence of Lewy bodies - oligomeric deposits of a protein called alpha-synuclein inside the neurons.

Scientists have discovered several mechanisms that seem to trigger and exacerbate the development of Parkinson's disease. In nearly 10% of all cases the disease is genetically predetermined; in most cases, however, the disease is believed to involve both genetic and environmental risk factors, e.g. taking certain drugs or pesticide/herbicide poisoning.

The authors of the study focused on possible causes of the disease that are related to oxidation-reduction (redox) reactions in cells, as well as to the mechanisms of cell death known as apoptosis and ferroptosis. These processes were discussed in the context of direct manipulation (e.g. insertion or deletion of DNA sequences) of the genetic material of living organisms or cells using a novel genome editing approach - CRISPR/Cas9 technology.

"Parkinson's disease is the second most common neurodegenerative disorder in the world, and its incidence is growing with age. The disease affects patients' quality of life and impose significant social and economic burden on societies. CRISPR is a promising technology, a strategy to find new effective treatments to neurodegenerative diseases. The technology gives researchers ample opportunity to knock in and knock out any gene of interest, to define the processes it is involved in and unearth entire metabolic pathways, in order to understand the underlying mechanisms and causes of the disease. Special attention should be drawn to CRISPR editing applications for creation of cellular and animal models that fully reproduce pathological states observed in human patients, provide insights into the mechanisms underlying the disorder and are applicable for drug discovery," says Margarita Artyukhova, co-author of the work, 4th year student, Institute for Regenerative Medicine, Sechenov University, and member of Sechenov Biomedical Club.

The first group of mechanisms links Parkinson's disease pathology to the role of mitochondria. Along with being the powerhouses of the cell, mitochondria form reactive oxygen species (ROS) that play important roles in cell signaling and homeostasis. Damage to mitochondria or disruption of their work may lead to the extensive accumulation of ROS, which can cause significant damage to cell structures. Because of the danger of having damaged mitochondria in the cell, the timely elimination of damaged and aged mitochondria is essential for maintaining the integrity of the cell. This turnover is known as mitophagy. Inefficient or excessive mitophagy may contribute to neurodegeneration. Previous studies demonstrated that animals with mutations in the genes that encode PINK1 and Parkin - proteins essential for proper mitochondrial functioning - developed mitochondrial dysfunction, muscle degeneration and a remarkable loss of dopaminergic neurons typical of Parkinson's disease. Results from several research groups suggest that the role of these proteins could be much more complicated and may involve other partners that contribute to the execution and regulation of mitophagy; thus, the time for implementing genome editing to manipulate them is yet to come. However, there are several other proteins and lipids that impact mitophagy and mitochondrial function (DJ-1, alpha-synuclein, Fbxo7, and cardiolipin), and CRISPR/Cas9 can be used to search for as-yet-unknown genes and protein products involved in disease progression and development. Moreover, the generation of animal and cellular models with various mutations in these genes will help to define the role of the corresponding proteins in the disease pathology and make them targets for drug discovery.

The second group of cellular processes is related to iron homeostasis. A certain amount of iron and ROS is needed for the proper functioning of the cell, so disruption of this balance may set off a chain of harmful, pathological alterations that affect crucial cellular processes. Iron can accumulate in the brain tissues of elderly people and animals, particularly in the areas of the brain responsible for motor and cognitive functions. A high level of iron in one of these areas, the substantia nigra, has been shown to be accompanied by the death of dopaminergic neurons. Furthermore, the interaction between dopamine and iron can promote the production of toxic metabolites that damage mitochondria and trigger the aggregation of alpha-synuclein in neurons. Scientists are also examining proteins involved in iron import and export in neurons: tau protein, transferrin, the transferrin receptor, ferroportin and amyloid precursor protein. CRISPR/Cas9 can help in the development of drugs that normalize iron homeostasis in affected tissues.

The third group of processes reviewed in the article comprises the cell death programs apoptosis and ferroptosis. During apoptosis, the cell's proteins and DNA are broken down by specialized enzymes and the cell itself disintegrates. Ferroptosis follows the iron-dependent oxidation of specific lipid classes; the products of oxidation accumulate and poison the cell. Both apoptosis and ferroptosis play a role in the development of neurological diseases, speeding up the death of dopaminergic neurons. Here, CRISPR/Cas9 will be useful for detailed studies of the cell death pathways that promote the Parkinson's disease-associated loss of dopaminergic neurons, as well as for regulating specific enzymes and proteins involved in the execution of these cell death programs.

Functional analysis of the genome, a powerful approach to gaining comprehensive information on the entire genome, helped obtain many of these results and has already proved its efficiency in research on schizophrenia, bipolar disorder and autism spectrum disorder. Paired with CRISPR/Cas9, it can contribute to understanding the genetic basis of Parkinson's disease and promote the generation of individual cellular and animal models as well as the design of new drugs and treatment approaches.

"This is one important step of many toward bringing the promise of this new technology to patients with serious diseases like neurodegenerative disorders. It has already been used in human trials (in China, Germany, the USA) to treat patients with cancer at a late stage and beta-thalassemia. Such studies allow us to see vast potential of genome editing as a therapeutic strategy. It's hard not to be thrilled and excited when you understand that progress of genome editing technologies can completely change our understanding of treatment of Parkinson's disease and other neurodegenerative disorders," added Margarita Artyukhova.

Credit: 
Sechenov University

Every time the small cabbage white butterfly flaps its wings it has us to thank

image: An unassuming, small, white butterfly is among the world's most invasive pests affecting crops like cabbage, kale and broccoli. A newly published study in the Proceedings of the National Academy of Sciences (PNAS) documents how humans have helped Pieris rapae, the small cabbage white butterfly, spread across the globe for thousands of years.

Image: 
Image by Lauren Nichols, Department of Applied Ecology, North Carolina State University. Used by permission.

KNOXVILLE, Tenn. -- The caterpillar form of an unassuming, small, white butterfly is among the world's most invasive pests affecting agricultural crops, and a newly published paper by a consortium of scientists documents how humans have helped it spread for thousands of years.

Through close examination of genetic variation and similarities between existing populations, and comparisons of historical data regarding infestations of Pieris rapae in Brassicaceae crops--like cabbage, canola, bok choy and turnips--the researchers document how humans helped the small cabbage white butterfly spread from Europe across the world. Led by Sean Ryan, formerly a postdoctoral researcher in the Department of Entomology and Plant Pathology at the University of Tennessee Institute of Agriculture, the team of scientists from eight institutions partnered with more than 150 volunteer citizen scientists from 32 countries to detail the pest's range and current genetic diversity.

Published online on September 10, 2019, in the Proceedings of the National Academy of Sciences (PNAS), the paper traces the pest's invasive spread across the world to human travel and trade, beginning with the ancient overland Silk Road routes from Europe to Asia, followed by the tall ships that traveled the more modern silk trade routes, and finally the "iron horses" that traversed North America beginning in the second half of the 19th century.

"The success of the small cabbage white butterfly is the consequences of human activities. Through trade and migration humans humans helped to inadvertently spread the pest beyond its natural range, and through the domestication and diversification of mustard crops, like cabbage, kale and broccoli, humans provided it with the food its caterpillars would need to flourish," says Ryan.

Prior to the study, historical records gave some indication of when this agricultural pest arrived on each new continent it invaded, but the timing, sources and routes remained unresolved. Such detailed knowledge is crucial for developing an effective biological control program, as well as for answering basic questions about the invasion process, such as what genetic changes occur and how species adapt to new environments.

The research team took to social media to ask the public for help. The approach was similar to how researchers have been expanding our understanding of human ancestry through in-home DNA sampling kits. Instead of asking people to swab their cheek, the butterfly research team asked citizen scientists to grab a butterfly net, then catch and send small cabbage white butterflies to the team for genetic testing. Ryan, currently with Exponent, Inc., in Menlo Park, California, then used the DNA from the submitted specimens to analyze genetic data and determine how the small cabbage white spread across the world. More than 3,000 butterflies were submitted. The samples cover nearly the entire native and invaded ranges of the butterfly and comprise 293 localities.
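As a rough illustration of the kind of between-population comparison described above (a sketch, not the authors' actual pipeline), the Python snippet below computes Hudson's FST, a standard measure of allele-frequency differentiation, between two hypothetical sampling localities; higher values indicate more genetically distinct populations, which is the signal used to trace invasion sources and routes. The allele counts are invented for illustration.

import numpy as np

# Minimal sketch: Hudson's FST estimator between two populations from per-variant
# allele counts. Rows are variants; columns are [ref_count, alt_count] per locality.
def hudson_fst(ac1: np.ndarray, ac2: np.ndarray) -> float:
    n1, n2 = ac1.sum(axis=1), ac2.sum(axis=1)    # alleles sampled per variant
    p1, p2 = ac1[:, 1] / n1, ac2[:, 1] / n2      # alternate-allele frequencies
    num = (p1 - p2) ** 2 - p1 * (1 - p1) / (n1 - 1) - p2 * (1 - p2) / (n2 - 1)
    den = p1 * (1 - p2) + p2 * (1 - p1)
    return float(num.sum() / den.sum())          # ratio-of-averages estimator

# Toy allele counts for two hypothetical localities, four variants each
locality_a = np.array([[18, 2], [10, 10], [15, 5], [4, 16]])
locality_b = np.array([[5, 15], [9, 11], [14, 6], [19, 1]])
print(f"FST = {hudson_fst(locality_a, locality_b):.3f}")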

The researchers found that the small cabbage white butterfly likely originated in eastern Europe and then spread into Asia and Siberia when trade was increasing along the Silk Road. The researchers also found that, as expected, Europe was responsible for the introduction of the small cabbage white to North America. Surprisingly, the introduction into New Zealand came from San Francisco, California. Also, the butterflies living in central California and the surrounding area are genetically distinct from all other butterflies in North America and appear to be the consequence of a few butterflies hitching a train ride from the eastern U.S. to San Francisco. Although each invasion into a new area or country led to significant loss of genetic diversity, the invasions were successful, hence the abundance of small cabbage white butterflies today.

Citizen science--research in which members of the public play a role in project development, data collection or discovery--is subject to the same system of peer review as conventional science. Its power lies in its ability to help conventional studies overcome challenges involving large spatial and temporal scales. Social media and the internet are key tools that allow citizen scientists, who often share interests through membership in nature-based groups or professional societies, to enhance the scale and scope of a project and its impact on society.

"Citizen science projects have been growing exponentially over the last decade, opening doors to new scientific frontiers and expanding the limits of what was once feasible," says DeWayne Shoemaker, professor and head of the UT Department of Entomology and Plant Pathology, and one of the paper's co-authors. "The relatively unique approach we took was asking the public to help collect--not just observe--these agricultural pests, and in so doing we were able to extract information recorded within the DNA of each individual butterfly. That information, when aggregated, told a story about the collective past of the small cabbage white butterfly."

"The international success of our citizen science project--the Pieris Project--demonstrates the power of the public to aid scientists in collections-based research addressing important questions in invasion biology, and ecology and evolutionary biology more broadly," says Ryan. He believes the use of collection-based citizen science projects will help society more accurately document ecological and evolutionary changes, which can lead to improvements in crop management and success as well as better environmental controls for invasive species.

Credit: 
University of Tennessee Institute of Agriculture