
Frequent COVID-19 testing key to efficient, early detection, study finds

Image: COVID-19 testing works best when administered multiple times per week as part of a routine screening program, such as the SHIELD Illinois saliva-testing protocol, a new study found. (Photo by Fred Zwicky, University of Illinois)

CHAMPAIGN, Ill. -- The chance of detecting the virus that causes COVID-19 increases with more frequent testing, no matter the type of test, a new study found. Both polymerase chain reaction and antigen tests, paired with rapid results reporting, can achieve 98% sensitivity if deployed at least every three days.

"This study shows that frequent testing can be really effective at catching COVID-19 infections and potentially blocking transmission," said study leader Christopher Brooke, a virologist and professor of microbiology at the University of Illinois Urbana-Champaign. "There are many places where vaccination is not yet widespread. With the rise of variants, testing remains an important tool for blocking the spread of the virus."

Part of the Rapid Acceleration of Diagnostics Tech program of the National Institutes of Health, the study brought together researchers at Illinois; the University of Massachusetts Medical School, Worcester; Johns Hopkins School of Medicine, Baltimore; and the NIH National Institute of Biomedical Imaging and Bioengineering. The researchers published their results in the Journal of Infectious Diseases.

Students and employees at the U. of I. who had tested positive for COVID-19 or were identified as close contacts of a person who tested positive were invited to participate. Because of the SHIELD Illinois screening program, which required students and employees to take multiple saliva-based tests each week and returned results in less than 24 hours, the university provided an ideal location for identifying cases before they became symptomatic, the researchers said.

The 43 study participants received three tests daily for 14 days: a PCR nasal swab, a PCR saliva test and an antigen nasal swab. The results of each were compared with live viral cultures taken from the PCR nasal swab, which show when a person is actively infectious. The study also examined how the frequency of testing affected each method's efficacy at detecting an infection.

"Different tests have different advantages and limitations. Antigen tests are fast and cheap, but they are not as sensitive as PCR tests. PCR are the gold standard, but they take some time to return results and are more expensive," said Rebecca Lee Smith, a professor of epidemiology at Illinois and the first author of the study. "This study was to show, based on real data, which test is best under which circumstances and for what purpose."

The results showed that the PCR tests - particularly saliva-based ones - were best at detecting cases before the person had an infectious viral load, a key to isolating individuals before they can spread the virus, Smith said. For all three methods, testing every three days had 98% sensitivity for detecting infection.

If that testing frequency declined to once a week, the PCR methods maintained their high sensitivity but the antigen tests dropped to around 80%. That means organizations that wish to deploy antigen testing as part of a reopening strategy or individuals who wish to monitor their status at home should use antigen tests multiple times each week to achieve similar results to PCR testing, the researchers said.

"This work also shows how the PCR and antigen tests could be used in combination," Smith said. "For example, I work with a lot of school districts, helping them to plan for fall, since vaccines are not yet available to those under 12 years old. If a student had a known exposure or comes to school symptomatic, give them both tests. Antigen tests are really good at finding those highly infectious people, so that can tell administrators right away if the child needs to be sent home, rather than waiting 24 hours for PCR results. If the antigen test is negative, the PCR test is a backup, as it may detect the infection earlier than an antigen test would, before the student becomes contagious."

The results of the study helped inform the U.S. Food and Drug Administration's recommendations and instructions on how to use at-home antigen tests that recently received emergency use authorization. The researchers said they hope the results assist schools, businesses and other organizations as they reopen.

"If you are in a situation where you have the resources and capacity to do large-scale PCR testing with rapid results reporting like we did here at Illinois, you can identify infections early and potentially isolate people before they become contagious," Brooke said. "In places where PCR testing is not readily available or rapid results reporting is not possible, but the cheaper and more rapid tests are available, our data show how those tests can be deployed in a way that can increase their sensitivity - through repeated serial testing, ideally three times a week or more."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Do 'Made in USA' claims make a difference in marketing results?

Key Takeaways:

Research reveals that consumer demand declined when product packaging and marketing materials removed the "Made in USA" claim and increased when the claim was featured.

The impact on sales is insufficient to convince some companies to manufacture more products in the United States but enough to incentivize companies to make deceptive "Made in USA" claims.

CATONSVILLE, MD, June 30, 2021 - Pick up any product in just about any store and you're likely to find information that indicates the country of origin of the product. The U.S. Federal Trade Commission (FTC) requires this for any imported product, but not for products made in the United States. When you see the words "Made in USA" on a product, it's purely for marketing purposes. So, does it work?

According to a new research study, yes, it does. Seeing "Made in USA" does in fact affect consumer purchasing. However, some products falsely claim to be American made.

The study, "Do 'Made in USA' Claims Matter?," to be published in the July issue of the INFORMS journal Marketing Science, is authored by Xinyao Kong and Anita Rao, both of the University of Chicago.

Since 2010, the FTC has investigated more than 150 cases of deceptive or misleading claims about products made in the United States. For a brand to make the "Made in USA" claim, the FTC requires that the product be "all or virtually all" made in the United States with no or negligible foreign content.

For this study, the researchers focused on four of the cases that were found not to fit the FTC's criteria for allowing claims of being made in the United States. The researchers were able to examine and compare sales of the products before and after the information was removed from products, packaging, advertising, websites and other marketing channels.

"We focused our attention on four brands that included Gorilla Glue, Loctite Glue, Gorilla Tape and Tramontina cookware," said Kong. "For three of the four brands, the removal of the information had a negative impact on sales. Tramontina cookware saw a 19.5% decline in weekly store sales; Loctite Glue experienced a 6.1% decline; and Gorilla Glue suffered a 1.9% decline. The fourth brand we studied, Gorilla Tape, experienced a 'trend decline' following the FTC decision."

In addition to the study of those four brands, the authors also ran a field experiment on eBay, conducting more than 900 auctions over the course of three months and varying only whether a product was advertised with or without the "Made in USA" claim. This was done to learn whether the American-made claim yielded significantly higher prices, and thus an incentive for sellers to include that information.

The experiment showed the claims did matter. Auction transaction prices were 28% higher with the "Made in USA" claim, indicating that resellers and auctioneers have more incentive to display that information.

"In the field experiments, we chose a product category in which demand was already high on eBay," said Kong. "We then offered two variants of the product, one with the country-of-origin information, and one without. The products we chose were screen protectors for handheld devices. We eventually sold 912 screen protectors using three-day auctions on eBay."

The "Made in USA" claim on the auction prices placed significant price premium on products.

"While the increase in sales is not sufficient to justify the economics of relocating manufacturing operations to the United States, it is enough to incentivize some firms to engage in making improper and deceptive country-of-origin claims," added Rao.

Credit: 
Institute for Operations Research and the Management Sciences

Machine learning helps in predicting when immunotherapy will be effective

When it comes to defense, the body relies on attack thanks to the lymphatic and immune systems. The immune system is like the body's own personal police force as it hunts down and eliminates pathogenic villains.

"The body's immune system is very good at identifying cells that are acting strangely. These include cells that could develop into tumors or cancer in the future," says Federica Eduati from the department of Biomedical Engineering at TU/e. "Once detected, the immune system strikes and kills the cells."

Stopping the attack

But it's not always so straightforward, as tumor cells can develop ways to hide themselves from the immune system.

"Unfortunately, tumor cells can block the natural immune response. Proteins on the surface of a tumor cell can turn off the immune cells and effectively put them in sleep mode," says Oscar Lapuente-Santana, PhD researcher in the Computational Biology group.

Fortunately, there is a way to wake up the immune cells and restore their antitumor immunity, and it's based on immunotherapy.

Introducing immunotherapy

Immunotherapy is a cancer treatment that assists the immune system in its fight against cancer cells. One type of immunotherapy involves immune checkpoint blockers (ICB), which are drugs that tell the immune cells to ignore the shutdown orders coming from cancer cells.

The discovery of ICB has been revolutionary for cancer treatment, with James P. Allison and Tasuku Honjo jointly awarded the 2018 Nobel Prize in Physiology or Medicine for their work on ICB.

Although ICB has been used successfully to treat many patients and different cancer types, only one-third of patients respond to the treatment.

"ICB has had a big impact, but it could be bigger if we could figure out quickly which patients are most likely to respond to the treatment," says Eduati. "And it would also be great if we could understand why other patients are not responding to ICB."

To solve this problem, Lapuente-Santana and Eduati, along with colleagues Maisa van Genderen (TU/e), Peter Hilbers (TU/e) and Francesca Finotello (Medical University of Innsbruck), turned to machine learning to predict how patients might respond to ICB. Their work has just been published in the journal Patterns.

Searching the tumor microenvironment

To predict whether a patient will respond to ICB, the researchers first needed to find particular biomarkers in tumor samples from the patients.

"Tumors contain more than just tumour cells, they also contain several different types of immune cells and fibroblasts, which can have a pro- or anti-tumour role, and they communicatie with each other", explains Lapuente-Santana. "We needed to find out how complex regulatory mechanisms in the tumor microenvironment affect response to ICB. We turned to RNA-sequencing datasets to provide a high-level representation of several aspects of the tumor microenvironment."

To find the right mechanisms that could serve as biomarkers to predict patients' response to ICB, the team searched the microenvironment of tumors using computational algorithms and datasets from previous clinical patient care.

"RNA-sequencing datasets are publicly available, but the information about which patients responded to ICB therapy is only available for a small subset of patients and cancer types," says Eduati. "So, we used a trick to solve the data problem."

The trick

For their trick, instead of looking for the actual biological response to ICB treatment, the researchers picked out several substitute immune responses from the same datasets. Despite not being the primary response to ICB, together they could be used as an indicator of the effectiveness of ICB.

Thanks to this approach, the team could use a large public dataset with thousands of patient samples to robustly train machine learning models.

"A significant challenge with this work was the proper training of the machine learning models. By looking at substitute immune responses during the training process, we were able to solve this," says Lapuente-Santana.

With the machine learning models in place, the researchers then tested the accuracy of the model on different datasets where the actual response to ICB treatment was known. "We found that overall, our machine learning model outperforms biomarkers currently used in clinical settings to assess ICB treatments," says Eduati.
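As a rough illustration of the surrogate-label idea described above (not the authors' published pipeline, which the Patterns paper details), one can train a model to predict a substitute immune-response score on a large cohort and then check whether the predicted scores separate responders from non-responders in a smaller cohort with known ICB outcomes. All data, feature names, and labels below are synthetic assumptions.

```python
# Minimal sketch of training on surrogate immune responses, then evaluating
# against actual ICB outcomes. Everything here is hypothetical/synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X_large = rng.normal(size=(2000, 50))                 # hypothetical RNA-seq-derived features
surrogate = X_large[:, :5].sum(axis=1)                # hypothetical surrogate response score
X_icb = rng.normal(size=(100, 50))                    # hypothetical ICB-treated cohort
icb_response = (X_icb[:, :5].sum(axis=1) > 0).astype(int)  # hypothetical true responses

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_large, surrogate)                         # train only on the surrogate signal

scores = model.predict(X_icb)                         # rank ICB patients by predicted score
print("AUC vs. actual ICB response:", round(roc_auc_score(icb_response, scores), 2))
```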

But why are Eduati, Lapuente-Santana, and their colleagues turning to mathematical models to solve a medical treatment problem? Will this replace the doctor? "Mathematical models can provide a big picture of how individual molecules and cells are interconnected, while at the same time approximating the behavior of tumors in a particular patient. In clinical settings, this means that immunotherapy treatment with ICB can be personalized to a patient. It's important to remember that the models can help doctors with their decisions on the best treatment; they won't replace them," says Eduati.

In addition, the model also helps in understanding which biological mechanisms are important for the biological response. Understanding and identifying the mechanisms that mediate ICB response can guide how best to combine ICB with other treatments to improve its clinical efficacy. However, this will first require experimental validation of the identified mechanisms before translating these results to clinical settings.

Dare to DREAM

The machine learning approach presented in the paper was also used by some of the researchers to take part in a DREAM challenge called "Anti-PD1 Response Prediction DREAM Challenge".

DREAM is an organization dedicated to running crowd-sourced challenges involving algorithms in biomedicine. "We competed as the cSysImmunoOnco team and came first in one of the sub-challenges," adds Eduati.

Our immune system might be an efficient detective and disease hunter, but every now and then it needs a helping hand to eradicate elusive villains like cancer cells. Immunotherapy using immune checkpoint blockers is one such approach, but it doesn't work for everyone.

Lapuente-Santana, Eduati, and colleagues have certainly dared to dream, and their work could prove pivotal in quickly identifying those who could be successfully treated with ICB in the future.

Thanks to machine learning, the researchers hope to rapidly deliver proper and effective cancer treatments to specific patients.

And for some cancer cells, it means that there could be no place to run, and no place to hide.

Credit: 
Eindhoven University of Technology

Conservatives' sensitivity to pandemic threat suppressed by distrust of science, media

Researchers studying the intersection of politics and psychology have long documented a link between threat sensitivity and social conservatism: People who are more socially conservative tend to react more strongly to threats. Conversely, those who are more socially liberal tend to be less sensitive to threats, viewing the world as a generally safe place and embracing change to explore new possibilities.

These findings have held across a variety of events, but during the pandemic, U.S. polls show that Democrats, who tend to be more liberal, have generally been more concerned about the COVID-19 threat than Republicans, who tend to be more conservative. A new UCLA study explores this reversal, probing the relationship between innate dispositions toward threats, the social environment and responses to the pandemic.

Led by UCLA graduate student Theodore Samore, anthropology professor Daniel Fessler and postdoctoral scholar Adam Sparks, along with cognitive scientist Colin Holbrook from UC Merced, the study found that Republicans' and independents' inclinations to embrace protective behaviors in proportion to their degree of conservatism were overruled by distrust in science and in liberal or moderate information sources. Republicans and independents also focused on the negative economic impacts of the lockdowns and the perceived infringement upon personal liberties. Together, these factors led socially conservative Republicans and independents to take fewer precautionary measures, such as mask-wearing, physical distancing and sanitizing.

"The distrust of science and public health officials, as well as distrust of moderate and liberal media sources, actually countermanded responses that reflected people's underlying personality traits," Fessler said. "For example, we can think about how behaviors among Republicans and independents may have been different if evangelical church pastors had promoted mask-wearing to protect the elderly and most vulnerable. Instead, conservative Republicans and independents were influenced by high-profile individuals who downplayed the severity of the virus and undermined reliable information sources."

Both major political parties span a range of social perspectives. Democrats were less influenced by messaging that undercut scientists and moderate journalists. Socially conservative Democrats were more willing to adopt precautionary measures than more liberal party members, demonstrating the connection between social conservatism and threat reactivity.

"There's been a broad tendency to see partisan responses to the pandemic as existing along a simple left-right axis, where more liberal Americans have exhibited greater precautions than more conservative ones," Samore said. "However, we find that more socially conservative Democrats were taking greater COVID-19 precautions than more liberal ones, suggesting that these political dynamics are in fact more complex than is commonly presumed."

As Holbrook explained, "The data show that conservatives who trusted scientific authorities and media reporting that advised precaution actually took greater COVID-19 safety measures than progressives did, and these patterns are not explained by factors such as differences in age or employment."

The research raises disturbing implications. "The findings suggest that Republicans would have been substantially more careful had their media environment encouraged them to do so, plausibly saving many thousands of lives and preventing scores of long-term health problems related to COVID infection," Holbrook said.

The investigators note that today's media environment is a key factor in how people understand and respond to major events.

"As we flatten the information highway, giving everyone a voice on social media, we undermine or lose the authority of professional journalists and scientists," Fessler said. "There is great potential for misinformation about science to be disseminated, and for people to actually act against their gut instincts or self-interest."

The investigators ran two studies, using identical methods, six weeks apart to ensure results would not reflect a specific time period's social and political landscape. For each study, 1,000 paid participants were recruited through an online crowdsourcing platform. Survey questions asked about political party affiliation; hot-button political issues, such as abortion, tax rates and military intervention abroad; and attitudes toward social change more broadly. Other questions assessed participants' views on science and attitudes toward a variety of media sources, individual journalists, and prominent politicians and scientists. Participants also reported the extent to which they followed various COVID-19 protocols, such as hand-washing, physical distancing and mask-wearing.

The good news, Fessler said, is that individual decisions aren't necessarily set in stone.

"As the success of the ongoing U.S. vaccination campaign demonstrates, when people of diverse political orientations are able to unite in the face of danger, everyone benefits," he said. "Understanding how individuals differ in their reactions to threats, and how this interacts with their political leanings and information consumption, may provide a vital link to understanding and addressing shared challenges in our increasingly interconnected world."

Credit: 
University of California - Los Angeles

Protein 'big bang' reveals molecular makeup for medicine and bioengineering

Image: Research by Gustavo Caetano-Anollés and Fayez Aziz, University of Illinois, reveals a "big bang" during evolution of protein subunits known as domains. The team looked for protein relationships and domain recruitment into proteins over 3.8 billion years across all taxonomic units. Their results could have implications for vaccine development and disease management. (Fred Zwicky, University of Illinois)

URBANA, Ill. - Proteins have been quietly taking over our lives since the COVID-19 pandemic began. We've been living at the whim of the virus's so-called "spike" protein, which has mutated dozens of times to create increasingly deadly variants. But the truth is, we have always been ruled by proteins. At the cellular level, they're responsible for pretty much everything.

Proteins are so fundamental that DNA - the genetic material that makes each of us unique - is essentially just a long sequence of protein blueprints. That's true for animals, plants, fungi, bacteria, archaea, and even viruses. And just as those groups of organisms evolve and change over time, so too do proteins and their component parts.

A new study from University of Illinois researchers, published in Scientific Reports, maps the evolutionary history and interrelationships of protein domains, the subunits of protein molecules, over 3.8 billion years.

"Knowing how and why domains combine in proteins during evolution could help scientists understand and engineer the activity of proteins for medicine and bioengineering applications. For example, these insights could guide disease management, such as making better vaccines from the spike protein of COVID-19 viruses," says Gustavo Caetano-Anollés, professor in the Department of Crop Sciences, affiliate of the Carl R. Woese Institute for Genomic Biology at Illinois, and senior author on the paper.

Caetano-Anollés has studied the evolution of COVID mutations since the early stages of the pandemic, but that timeline represents a vanishingly tiny fraction of what he and doctoral student Fayez Aziz took on in their current study.

The researchers compiled sequences and structures of millions of protein sequences encoded in hundreds of genomes across all taxonomic groups, including higher organisms and microbes. They focused not on whole proteins, but instead on structural domains.

"Most proteins are made of more than one domain. These are compact structural units, or modules, that harbor specialized functions," Caetano-Anollés says. "More importantly, they are the units of evolution."

After sorting proteins into domains to build evolutionary trees, they set to work building a network to understand how domains have developed and been shared across proteins throughout billions of years of evolution.

"We built a time series of networks that describe how domains have accumulated and how proteins have rearranged their domains through evolution. This is the first time such a network of 'domain organization' has been studied as an evolutionary chronology," Fayez Aziz says. "Our survey revealed there is a vast evolving network describing how domains combine with each other in proteins."

Each link of the network represents a moment when a particular domain was recruited into a protein, typically to perform a new function.

"This fact alone strongly suggests domain recruitment is a powerful force in nature," Fayez Aziz says. The chronology also revealed which domains contributed important protein functions. For example, the researchers were able to trace the origins of domains responsible for environmental sensing as well as secondary metabolites, or toxins used in bacterial and plant defenses.

The analysis showed domains started to combine early in protein evolution, but there were also periods of explosive network growth. For example, the researchers describe a "big bang" of domain combinations 1.5 billion years ago, coinciding with the rise of multicellular organisms and eukaryotes, organisms with membrane-bound nuclei that include humans.

The existence of biological big bangs is not new. Caetano-Anollés' team previously reported the massive and early origin of metabolism, and they recently found it again when tracking the history of metabolic networks.

The historical record of a big bang describing the evolutionary patchwork of proteins provides new tools to understand protein makeup.

"This could help identify, for example, why structural variations and genomic recombinations occur often in SARS-CoV-2," Caetano-Anollés says.

He adds that this new way of understanding proteins could help prevent pandemics by dissecting how virus diseases originate. It could also help mitigate disease by improving vaccine design when outbreaks occur.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Business professors study ideal responses to ransomware attacks

Image: Kay-Yut Chen. (UTA)

A pair of College of Business professors and their doctoral student at The University of Texas at Arlington are exploring how ransomware attacks sometimes pit organizations against the law enforcement agencies trying to protect them.

Kay-Yut Chen, Jingguo Wang and Yan Lang are authors of a new study in the journal Management Science titled "Coping with Digital Extortion: An Experimental Study on Benefit Appeals and Normative Appeals." Chen and Wang are professors of information systems and operations management at UTA. Lang is a doctoral student in the department.

A ransomware attack is like a cyber hijacking, with criminals infiltrating and seizing an organization's data or computer systems and demanding a payment or ransom to restore access.

In its study, the UTA trio explains that companies are finding that it makes sense to negotiate with their attackers to drive down the cost of the ransom. But such behavior in turn incentivizes attackers to continue their illegal activities and runs counter to FBI guidance.

"From a policy perspective, the FBI is telling businesses not to give in," Wang said. "But we've found that when you're trying to run a business, there is almost always a ransom that becomes similar to a break-even point."

This study investigates in part how to nudge companies toward adopting strategies that decrease the risk of digital extortion. The researchers used behavioral game theory to study tactics such as investing in cybersecurity or refusing to pay ransoms and used human subject experiments to analyze strategic decisions made by interacting players.

"We reason that when companies are hit with ransomware attacks, even if they pay the ransom, they still must pay for added security," Chen said.

National data shows these ransomware attacks are spiking, with experts saying an organization is attacked by ransomware every 40 seconds. Earlier this year, one of the nation's largest pipelines, carrying gasoline and jet fuel from Texas to the East Coast, shut down after a ransomware attack.

"We must convince companies that just because the bad actors come down on the ransom, it doesn't make it right to pay them--and you'll probably continue to have problems," Wang said. "We need to encourage firms to do the right thing in security investing. Recognizing the long-term benefits of this approach could help other companies come to the right decision."

Credit: 
University of Texas at Arlington

Newly discovered proteins protect against progression of diabetic kidney disease

Image: The cumulative incidence of end-stage renal disease (ESRD) according to the index of protection of three proteins; Index=0/3 indicates no protection, whereas Index=3/3 indicates full protection. (Copyright Joslin Diabetes Center)

Elevated levels of three specific circulating proteins are associated with protection against kidney failure in diabetes, according to research from the Joslin Diabetes Center that will be published 30th June in Science Translational Medicine.

"As well as acting as biomarkers for advancing kidney disease risk in diabetes, the proteins may also serve as the basis for future therapies against progression to the most serious types of kidney disease," said Andrzej S. Krolewski MD, PhD, senior author on the publication, senior investigator at Joslin Diabetes Center and professor of medicine at Harvard Medical School. This would likely include the delay and prevention of end stage renal disease (ESRD), which is the most serious and advanced stage of diabetic kidney disease.

The study marks a move toward looking for markers associated with protection against, rather than increased individual risk of, rapid progression of diabetic kidney disease. This approach should more directly point to potential targets for slowing progression, since it is based on the thinking that individuals with slow progression harbor protective factors of some sort.

"Our research became possible only recently," said Dr. Krolewski. "We were able to search for these markers thanks to the development of high-throughput proteomic platforms. More importantly, the availability of biobank specimens that we established many years ago in the Joslin Kidney Study was critical."

According to the report, the researchers profiled just over 1,000 proteins in plasma samples taken at baseline from two cohorts of individuals with either type 1 or type 2 diabetes who were followed for between 7 and 15 years. All participants had diabetes and moderately impaired kidney function at baseline.

The main aim was to identify proteins that were elevated in individuals with slow or minimal decline in kidney function over the follow-up period. Notably, they validated the initial findings in a further cohort of individuals with type 1 diabetes.

Working through potential candidate proteins, they found three proteins that appeared to offer protection against progressive decline. These were fibroblast growth factor 20 (FGF20), angiopoietin-1 (ANGPT1) and tumor necrosis factor ligand superfamily member 12 (TNFSF12).

In each case, elevated circulating levels reduced the odds of progressive kidney decline and progression toward ESRD. The combined effect of having elevated levels of all three proteins translated to a very low risk of ESRD.
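The figure caption above refers to a 0/3-to-3/3 index of protection. As a rough illustration of how such an index can be computed, the sketch below counts how many of the three proteins are elevated in a sample; the cutoff rule and numbers are hypothetical assumptions, not the thresholds defined in the study.

```python
# Illustrative only: a 0/3-to-3/3 protection index counting how many of the
# three proteins (FGF20, ANGPT1, TNFSF12) exceed an assumed elevation cutoff.
ELEVATION_CUTOFFS = {"FGF20": 1.0, "ANGPT1": 1.0, "TNFSF12": 1.0}   # assumed cutoffs

def protection_index(levels: dict) -> int:
    """Count how many protective proteins exceed their (assumed) cutoff."""
    return sum(levels[p] > cutoff for p, cutoff in ELEVATION_CUTOFFS.items())

patient = {"FGF20": 1.4, "ANGPT1": 0.7, "TNFSF12": 1.2}   # hypothetical values
print(f"Protection index: {protection_index(patient)}/3")
```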

"The protective effects of these proteins seem to be independent, which suggests that there are multiple mechanisms involved. They may be causally related to the disease process or represent as-yet unidentified pathways involved in progressive renal decline," said first author Zaipul Md Dom PhD, a research fellow in the Dr. Krolewski's laboratory

The authors go on to review current biological knowledge relating to the individual proteins and kidney disease, identifying a number of potential mechanisms that might explain their protective effects. According to Dr. Krolewski, these are potential new routes for research that the team will follow.

Dr. Kevin Duffin, co-author on the publication and chief operating officer at Eli Lilly, Diabetes and Diabetic Complications, said: "Our study identified specific circulating proteins that were depleted in diabetes patients with kidney disease who progressed to ESRD. These results suggest a personalized medicine approach might be possible for treating patients with low levels of the protective proteins. We think that administering protein therapeutic mimetics or treatments that enhance circulating levels of these depleted proteins might be the future."

Dr. Krolewski added: "We have already started to develop protocols on how to measure concentrations of the protective proteins in clinical settings. We hope that these proteins can then be used to identify patients at risk of progression to ESRD, who can then be treated with new therapies."

Credit: 
Joslin Diabetes Center

Proteins could offer risk markers and therapy targets in diabetic kidney disease

A 7- to 15-year longitudinal study of 358 diabetics has linked 3 proteins in blood with a slower progression of diabetic kidney disease and progressive kidney failure. The results from Zaipul Md Dom and colleagues suggest that the proteins could help researchers identify diabetics most at risk of kidney damage, potentially enabling earlier interventions and treatment. Despite advancements in blood sugar control and kidney therapies, patients with type 1 or type 2 diabetes still face a high risk of diabetic kidney disease. This condition can eventually progress to end-stage kidney disease, but some patients show slower kidney decline than others. In recent years, scientists have focused on understanding why some individuals progress at slower rates and whether they might harbor proteins that protect the kidneys from the effects of diabetes. As part of the Joslin Kidney Study, Md Dom et al. followed two groups of patients with type 1 or type 2 diabetes and varying degrees of diabetic kidney disease (358 total) for 7 to 15 years. While analyzing more than 1,000 proteins in the patients' plasma, the researchers discovered that patients who progressed slowly had higher amounts of the proteins ANGPT1, TNFSF12, and FGF20. The team confirmed this protective link in an independent group of 294 type 1 diabetics; they also found that FGF20 was elevated in healthy, non-diabetic parents of type 1 diabetics who remained free of kidney complications. If validated in larger studies, this finding "could have a profound implication in future research on determinants of progressive renal decline in [type 1 diabetes]," the authors say. However, they caution that more studies are necessary to confirm a causal link between the 3 proteins and protection from diabetic kidney disease.

Credit: 
American Association for the Advancement of Science (AAAS)

AI and marshmallows: Training human-AI collaboration

Despite unprecedented advancements in technology and countless depictions of complex human-AI interactions in sci-fi movies, we have yet to fully achieve AI bots that can engage in conversation as naturally as humans can. Kushal Chawla, a researcher at the USC Institute for Creative Technologies (ICT) and a doctoral student in computer science, and collaborators at the USC Information Sciences Institute (ISI) and ICT are taking us one step closer to this reality by teaching AI how to negotiate with humans.

The research, presented this month at the 2021 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), relied on a scenario-based dataset that was collected to teach negotiation skills to human users through role-play dialogue. With a campsite setting as the imaginary backdrop, participants in the data collection study were instructed to interact with each other as if they were campers negotiating for resources. The researchers identified a total of nine strategies that the participants used throughout the exercise. The standout lesson: cooperative negotiation strategies were more effective than selfish ones. This information can be used in the future to inform the creation of an automated system that takes various negotiation strategies into account.

Training AI

CaSiNo, which stands for Camp Site Negotiations, is a scenario-based dataset that was collected to teach negotiation skills to human users through role-play dialogue. It consists of more than a thousand negotiation dialogues, each carried out by two participants at a time. At the core of these dialogues are three essential camping items that the participants negotiate for: food, water and firewood. Each participant is assigned a preference order for these items and negotiates accordingly. As the participants negotiate with one another, they come to conclusions about how to allocate the items to best maximize each person's rewards.

Prior to these dialogues, participants underwent a training module that consisted of watching a video tutorial on negotiation. Doing so allowed participants to learn negotiation best practices and incorporate them into the exercise.

"We evaluate the negotiation performance of the participants in three ways: final points scored depending on what they were able to negotiate for, how satisfied they were with the outcome, and how much their opponents like them," explained Chawla. "All these metrics are crucial in the context of real-world negotiations."

Standing Out

Chawla has extensive prior research in AI, but CaSiNo is his most ambitious approach yet.

"One difference with these prior works is that in these cases, the negotiations don't involve language-based communication, but rather are based on button clicks and drop-down choices in a menu," explains Chawla. "However, our work on the CaSiNo dataset would promote the development of AI systems that can negotiate using language (such as in English) and have real, rich conversations with humans."

Similarly, most work in the field of automated negotiation systems has been focused on a menu-driven interface rather than language-based communication. Though these technologies have been easy to navigate, Chawla argued that "they fail to capture free-form emotion and persuasion, which are key components of real-world negotiations." Language, on the other hand, encapsulates human-like characteristics that help ground AI communication in the real world.

Achieving this new level of AI communication requires construction of complex negotiation datasets through which AI can be trained. It can be a challenge to construct the perfect dataset -- prior efforts at doing so have often been either too restrictive or too open-ended. In order to find the perfect balance between the two, Chawla and his team approached this challenge by "proposing a novel task that enables linguistically rich and personal conversations, but still in a constrained environment."

Applications in Pedagogy and Beyond

As an effective way of automating negotiation rather than relying on humans, CaSiNo has a variety of real-world applications. This technology can be applied to various industries, including business, education, entrepreneurship and tech. Specifically, CaSiNo can help with teaching negotiation skills in various pedagogical contexts, whether it be training business students to secure deals or helping lawyers to assess settlement rates more accurately.

CaSiNo is also highly valuable for improving negotiation skills of conversational AI assistants. Chawla mentions the Google Duplex prototype as an example, in which AI assistants express negotiation skills to automatically make appointments over the phone.

Future Directions

Going forward, Chawla and his team are broadly interested in looking deeper into other types of non-collaborative dialogue outside of negotiation, such as persuasion. Non-collaborative dialogue is generally defined as communication "where the goals of the involved parties may not perfectly align with each other."

More specifically, Chawla outlines two directions of future research based on the current work with CaSiNo. Firstly, the team is interested in looking at the predictive capabilities of AI through how emotional expression in CaSiNo dialogues correlates with the outcomes of negotiation. By doing so, these AI agents can be improved to become more emotion-aware. Secondly, the team is looking to improve the believability of negotiation skills by building upon realistic free-form language training. Ultimately, CaSiNo is a groundbreaking system that will serve as a solid foundation for improvements in human-computer interactions.

Credit: 
University of Southern California

New markers for coronary microvascular disease identified

Image: Zeynep Madak-Erdogan, left, and first author Alicia Arredondo Eve identified biomarkers for coronary microvascular disease in postmenopausal women. (Jillian Nickel)

Although cardiovascular disease is the main cause of illness among women in the U.S., certain conditions such as coronary microvascular disease (CMD) cannot be easily diagnosed. In a new study, researchers at the University of Illinois Urbana-Champaign have identified specific biomarkers for CMD, which might reduce future hospitalizations.

CMD damages the inner walls of blood vessels causing spasms and decreased blood flow to the heart muscle. "Clinicians look for plaque formation in the blood vessels, which does not occur in CMD," said Zeynep Madak-Erdogan (CGD/EIRH/GSP), an associate professor of nutrition. "Usually, women leave without having the root causes of the chest pain addressed and they come back with further complications within a year. Since this condition is more common in postmenopausal women, we want to identify the biomarkers that are associated with CMD."

The researchers collected blood samples from three different groups containing 20-25 women each: postmenopausal women who were healthy, those with coronary artery disease, which is characterized by plaque formation, and those with CMD. The blood serum samples were then analyzed to see if there were any molecules that were different in the CMD group.

Out of 175 molecules scanned, the researchers identified stearic acid, which is found in animal and plant fats, and ornithine, an amino acid commonly found in meat, fish, dairy, and eggs, as indicators of CMD.

Ornithine is formed from the amino acid arginine, which is broken down by two separate pathways: one forms ornithine and the other forms nitric oxide, which helps maintain the normal functioning of the blood vessels.

"Our observations imply that the increase in ornithine means that the second branch is not working, which is why we can use this molecule as a biomarker for the disease," Madak-Erdogan said.
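The paper's title mentions machine-learning techniques for picking out diagnostic biomarkers. As a generic illustration of that kind of workflow (not the authors' actual pipeline), the sketch below ranks candidate metabolites by how strongly a classifier relies on them to separate CMD samples from healthy controls; the data are synthetic and the "elevated" metabolite is planted.

```python
# Illustrative only: ranking candidate metabolite biomarkers by feature
# importance in a classifier trained on synthetic serum profiles.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
metabolites = [f"metabolite_{i}" for i in range(175)]
X = rng.normal(size=(60, 175))                 # 60 hypothetical serum samples
y = np.array([0] * 30 + [1] * 30)              # 0 = healthy, 1 = CMD (synthetic labels)
X[y == 1, 0] += 1.5                            # pretend metabolite_0 is elevated in CMD

clf = RandomForestClassifier(n_estimators=300, random_state=1).fit(X, y)
top = np.argsort(clf.feature_importances_)[::-1][:5]
for idx in top:
    print(metabolites[idx], round(clf.feature_importances_[idx], 3))
```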

Interestingly, other researchers have found that estrogen may have a role in the development of CMD, as evidenced by hormone-replacement therapies which decrease CMD risk up to 30%. "Our observations further indicate that estrogen is involved because we know that it improves the function of nitric oxide," Madak-Erdogan said. "Since postmenopausal women have lower levels of estrogen, it would explain why this condition is more prevalent in these populations."

The researchers are trying to identify more biomarkers, such as proteins, that can be used to detect CMD. Additionally, they are testing more women to validate their findings. "This study was done with patients in Turkey, so we don't know if the same biomarkers will be present in the U.S. We want to look at bigger populations to see if we can combine the data to find efficient signatures for CMD," Madak-Erdogan said.

The study "Identification of circulating diagnostic biomarkers for coronary microvascular disease in postmenopausal women using machine-learning techniques" was published in Metabolites and can be found at https://doi.org/10.3390/metabo11060339.

Credit: 
Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign

International team develops predictive tool to help mitigate COVID-19 in Africa

Image: The modeling tool incorporates current case data, population, economic status, current mitigation efforts and meteorological sensing from satellites to project how COVID-19 might spread in and among African countries. (Andrew Geronimo, Penn State)

The virus that gives rise to COVID-19 is the third coronavirus to threaten humanity in the past two decades. It also happens to move more efficiently from person to person than either SARS or MERS did. The first African case of COVID-19 was diagnosed in Egypt in mid-February of 2020. Four weeks later, the first lockdowns began across Africa. Steven Schiff, Brush Chair Professor of Engineering at Penn State, who already had established research partnerships in Uganda, saw an opportunity for his team to apply what they were learning from their ongoing efforts to track and control infectious disease and provide countries such as Uganda with more information to help guide policy to mitigate the viral pandemic.

The result was a multi-country collaboration to develop a surveillance modeling tool that provides a weekly projection of expected COVID-19 cases in all African countries, based on current case data, population, economic status, current mitigation efforts and meteorological sensing from satellites. Developed in collaboration with Uganda's National Planning Authority (NPA), the country's senior organization for development and economic planning, the tool's COVID-19 projections use openly available data to provide a projection of cases, as well as lower and upper ranges to help the country decide if mitigation policies need to be implemented or modified.

The researchers published their approach today (June 29) in the Proceedings of the National Academy of Sciences of the United States of America. The project was funded in part by the National Institutes of Health Director's Transformative Research Award, a grant awarded to Schiff in 2018 for his "high-risk, high-reward" approach to predictive, personalized public health (P3H).

Prediction guiding prevention in the face of a pandemic

"When the COVID-19 pandemic began, we had this unusual team of scientists hard at work on implementing P3H in Africa, and we thought that we had much we could contribute toward the fight against this new virus," said Schiff, who founded the Penn State Center for Neural Engineering and serves as a professor of engineering science and mechanics in the College of Engineering and of neurosurgery in the College of Medicine.

The team includes Paddy Ssentongo, assistant research professor of engineering science and mechanics. Ssentongo is originally from Uganda, where he earned a medical degree before moving to Penn State to complete a master of public health and a doctorate in epidemiology. He graduated this year.

"This pandemic has shown us that we need to put more emphasis on global public health -- especially in places with fragile health care systems, including many countries in Africa," Ssentongo said. "If we wait for people to get sick, we're already losing. The best thing we can do is prevention."

The researchers reached across disciplines to bring in experts -- from epidemiologists to meteorologists to economists -- on every factor influencing viral spread.

"We pulled together a large team to tackle what was necessary," said Schiff, who is also a researcher in the Penn State Neuroscience Institute. "The team consists of 19 people across four countries, plus many more individuals who contributed through discussions and support."

The complexity of mitigation

Equally important as understanding the number and location of people with active cases, according to Schiff, is understanding the importance of weather, geography and other factors, especially in developing countries where many people live and work in more exposed conditions than do people in industrialized countries.

"If a coastal country closes its borders, landlocked Uganda is likely going to see cases go up because they depend on the coastal countries for imports -- without the imports, people will move around and interact more to find work and food," Schiff said, noting that such changes in movement may create shifts in projections of new cases from neighboring countries versus internal cases. "You need information gathered in real-time on the virus, such as testing and lockdowns, as well as the other influencing factors such as the varied economic security of different countries and their health systems. Our strategy synthesizes all of these data across Africa to make surprisingly good projections of the expected number of cases based on how these factors interact and influence COVID transmission in the population."

Abraham J. B. Muwanguzi, paper co-author and manager of the science and technology department at the NPA, also serves as the principal investigator in Uganda on Schiff's NIH grant.

"We're working closely with the Ministry of Health to use the model in analyzing how the COVID trends are moving," Muwanguzi said. "In September and October of 2020, at the peak of COVID cases, the model projected an increase in cross-border cases, prompting the government to close our border. We had fewer cases than projected because we were able to mitigate a predicted source that was captured well in the model."

Muwanguzi also noted that the tool not only helps provide data for mitigation policies, but it also helps the country plan how to use its resources.

"For example, in March and April of this year, the model projected a tremendous drop in cases," Muwanguzi said. "Our hospital centers started emptying out -- there really were fewer cases. We could then scale down operations and reappropriate resources to other areas of need."

Yet, on June 18, Uganda entered a 42-day lockdown after the daily number of new cases increased from fewer than a hundred at the end of May to nearly 2,000. The week after the lockdown started, the model projected 11,222 new cases would be reported if no mitigation efforts were put in place.

"Unlike the previous wave where factors influencing the spread were mostly from outside the country, the current wave is influenced by internal factors," said Joseph Muvawala, executive director of NPA, in a column published by New Vision, a national newspaper in Uganda. "With these statistics, a total lockdown was inevitable, irrespective of the known economic consequences; human life is far too precious to lose."

According to Muvawala's column, the projected increases have helped Uganda better prepare their hospital centers by procuring enough supplies and planning to avoid overwhelming hospitals and health care workers.

However, Ssentongo warned, the model is only as good as the data provided to it.

"We hope other countries in Africa will not only use this tool, but also collaborate to make sure they are integrating data in terms of testing and reporting cases," Ssentongo said. "The tool is a roadmap to tell a country how the pandemic is evolving and where the country is going. It's successful if the country sees the projections, implements mitigation efforts and sees a lower number of actual cases."

Global benefit of global collaboration

According to Schiff, their findings clearly demonstrate the advantages of inter-country cooperation in pandemic control.

"This is a crisis that no single country can fully manage on its own," Schiff said.

The researchers plan to continue updating the tool with more information as it becomes available, as well as implement data regarding vaccinations as they become more available in Africa. It is available freely online.

"One of the limitations of doing science is that you can do clever work, publish in a good journal that is reviewed by your peers, but it is still difficult to translate the work into effective policy," Schiff said. "We wanted to implement this tool to do good and help save lives. We could never have accomplished this without the close collaboration with our African colleagues in Uganda. It was critical to make sure this was a framework that people who make policy can use and apply in their work -- that's what makes this valuable."

Credit: 
Penn State

Research lays groundwork for restoring lost oral functions with pacemaker-like devices

Image: Top and bottom view of the retainer containing the electrodes for intraoral electrical stimulation. (Dr. Hangue Park, Texas A&M Engineering)

Even the mundane act of swallowing requires a well-coordinated dance of more than 30 muscles of the mouth. The loss of function of even one of these, due to disease or injury, can be extremely debilitating. For people affected by such losses, nerve stimulation offers a ray of hope for regaining some of their lost oral function.

In a new study, researchers at Texas A&M University have delineated the minimum size of electrical currents needed to provide sensation in different parts of the mouth. The researchers said their study is a first but vital step toward building electrical stimulation implants that can restore essential intraoral functions that are lost due to nerve or brain damage.

The results of the study are published in the Institute of Electrical and Electronics Engineers' (IEEE) Transactions on Biomedical Engineering.

Many essential bodily functions are coordinated by the nervous system via sensorimotor feedback loops. As the name suggests, these neural circuits involve the brain interpreting incoming signals from sensory nerves and then commanding the motor nerves to execute a certain movement. So, for example, sensorimotor loops play a vital role in voluntary functions, like walking or holding an object, and involuntary movements, like sneezing or blinking.

Within the mouth, also referred to as the intraoral cavity, there is a rich supply of both sensory and motor nerves. In particular, sensorimotor nerves in the soft palate and tongue coordinate several intraoral movements related to swallowing, speech and respiration. And so, damage to either the sensory or motor nerve fibers due to neurotrauma or disease can compromise these essential functions, reducing the quality of life of those afflicted.

Electrical nerve stimulation might help jumpstart the nerves into action, much like how a pacemaker can electrically stimulate nerves in the heart, causing the heart muscle to contract. But unlike a pacemaker, the details on the frequency and amplitude of the electrical currents needed for proper stimulation of different parts of the mouth have not been investigated.

"Electrical stimulation can modulate nerve currents or action potentials, which are the mode of communication to and from the brain," said Dr. Hangue Park, assistant professor in the Department of Electrical and Computer Engineering. "And so, electrical stimulation should be carefully applied, because if not, then it might cause undesirable effects, or it might not stimulate anything at all."

To investigate the minimum stimulation currents needed, Park and his team inserted tiny metal electrodes into a standard dental retainer. These electrodes were positioned in subjects to stimulate either their soft palate or the side and tip of the tongue, which receive a rich supply of sensory nerves. For each of these locations, the researchers slowly changed the amplitude of the stimulation current, keeping the frequency fixed. Then, subjects were asked to report when they just began feeling a sensation and when the sensation was uncomfortable. Next, they repeated the same experiment for a higher frequency of current.

After compiling their data, the team determined the average perception and discomfort thresholds for the tongue and soft palate. In addition, they produced an equivalent circuit of the intraoral cavity to duplicate the electrical properties of that area. This circuit, the researchers said, can help to further study the effects of electrical stimulation offline without requiring human subjects.
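Equivalent circuits like the one the researchers describe are commonly approximated offline as a resistor in series with a parallel resistor-capacitor pair, driven by a current pulse. The sketch below simulates that generic textbook model; the component values and pulse parameters are assumptions for illustration and are not taken from the paper, which derives its circuit from measured data.

```python
# Minimal sketch of a generic electrode-tissue equivalent circuit:
# series resistance Rs plus a parallel Rp-Cp branch, driven by a constant
# current pulse and integrated with a simple Euler step. Values are assumed.
Rs, Rp, Cp = 1e3, 10e3, 1e-6        # assumed series R (ohm), parallel R (ohm), C (F)
I, dt, T = 0.5e-3, 1e-5, 5e-3       # 0.5 mA pulse, 10 us time step, 5 ms duration

v_cap = 0.0
trace = []
for _ in range(int(T / dt)):
    # charge the parallel RC branch with whatever current does not leak through Rp
    v_cap += dt * (I - v_cap / Rp) / Cp
    trace.append(I * Rs + v_cap)    # total electrode voltage = series drop + RC voltage

print(f"Voltage after {T*1e3:.0f} ms pulse: {trace[-1]:.2f} V")
```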

The researchers noted that their next steps would be to electrically stimulate the intraoral region and investigate how these stimulations change chewing, swallowing and other behaviors.

"Sensorimotor systems can be extremely vulnerable to damage due to neural defects, aging and neurodegenerative diseases," said Park. "In this study, we have begun to lay the groundwork for electrically stimulating parts of the mouth that control involuntary and voluntary movements. Our work is a seminal study and it is important so that we can, in the near future, help people that face enormous challenges doing everyday tasks that we take for granted."

Credit: 
Texas A&M University

Assessment tool helps future pharmacists prepare for work in the community

A recent University of Arizona College of Pharmacy study suggests that Objective Structured Clinical Examinations (OSCEs) may be a valuable means of assessing clinical skills while providing learning experiences for pharmacy students in community pharmacy settings. While OSCEs were designed to assess health care professionals in a clinical setting, there was limited data on their use in testing skills required in community pharmacies, until now.

For pharmacists working in retail, guiding patients on the use of over-the-counter (OTC) drugs is a common part of the job. According to a recent survey from the American Pharmacists Association (APhA), pharmacists make an average of 29 OTC recommendations each week, and approximately 81% of consumers purchase an OTC product their pharmacists recommended. As the profession evolves toward providing more patient care services, pharmacy curricula must keep pace.

Self-Care Pharmacotherapeutics, a course required by the UArizona Doctor of Pharmacy program, teaches students the appropriate use of medications for self-care inquiries including selection of medications, appropriate dosing, and analysis of safety in a community pharmacy setting. But there was limited information on how best to assess the skills taught during this training. Seeking to fill this gap, a study by College of Pharmacy Assistant Professors Bernadette Cornelison, PharmD, MS, BCPS, and Beth Zerr, PharmD, BCACP, evaluated the use of a community pharmacy-based OSCE in assessing pharmacy students in their first year. The analysis found that students and facilitators believed the OSCE tested the skills needed to provide care in a community setting.

"We teach the self-care therapeutics course in the first semester of the first year in pharmacy school," explained Dr. Cornelison. "We felt it was important to innovate and evaluate new ways of teaching that would educate the students and, hopefully, help them retain information."

The study, published in the Journal of the American College of Clinical Pharmacy, also found that the design of the simulation accurately reflected the amount of time a student intern would have to complete the Pharmacists' Patient Care Process (PPCP). This finding further demonstrates that the PPCP can be applied to patients even when time seems limited. The standardization of this process is an important step in advancing pharmacists as recognized patient care providers across the country.

Drs. Cornelison and Zerr say the results make a strong case for fully implementing OSCEs in program curricula and are hopeful other colleges of pharmacy will follow suit.

Credit: 
University of Arizona Health Sciences

Keep your friends close, cortisol levels low for life

image: Led by Si On Yoon (left) and Michelle Rodrigues (right), an interdisciplinary team at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign determined that older adult women converse more effectively with strangers than their younger counterparts; additionally, communicating with female friends decreases stress hormone levels for women across the lifespan.

Image: 
Headshots provided courtesy of Yoon and Rodrigues.

Directing a meeting, dialing up an old acquaintance, dictating the perfect tuna salad sandwich across a drive-through window. For business and for pleasure, human beings are in constant communication.

Our proclivity for socialization is lifelong, equally prominent in the lives of adolescents and adults. A recent study determined key differences in the ways that various age groups communicate, as well as one conversational component that stands the test of time: friendship. Specifically, bonds between individuals who identify as female.

Led by former Beckman Institute postdoctoral researchers Michelle Rodrigues and Si On Yoon, an interdisciplinary team evaluated how interlocutors’ age and familiarity with one another impacts a conversation, reviewing the interaction’s overall effectiveness and stress responses generated as a result.

The study, titled “What are friends for? The impact of friendship on communicative efficiency and cortisol response during collaborative problem solving among younger and older women,” was published in the Journal of Women and Aging in May 2021.

Two hypotheses form the foundation of this female-focused study. The first is the tend-and-befriend hypothesis, which challenges the traditionally masculine “fight-or-flight” dichotomy.

“Women have evolved an alternative mechanism in response to stress,” said Rodrigues, who is currently an assistant professor in the Department of Social and Cultural Sciences at Marquette University. “In order to deal with stress, women can befriend female peers.”

The team also tested the socio-emotional selectivity hypothesis, which postulates a social “pruning” as humans advance in age and pursue more intimate, higher-quality circles of friends.

The introduction of age as a variable is novel in the field and stems from an interdisciplinary Beckman collaboration.

“I was working with several different groups in several different disciplines, coming from the perspective of studying friendship but having previously done research on adolescent girls, but not older women,” Rodrigues said.

She combined forces with then-Beckman-postdoc Si On Yoon, who was studying the cognitive mechanisms of natural conversation across the lifespan, including healthy younger and older adults.

“My research program was focused on language measures in social interactions, and I was glad to work with Dr. Rodrigues to develop an integrative approach including both language processing and physiological measures to study social interactions,” said Yoon, who is currently an assistant professor in the Department of Communication Sciences and Disorders at the University of Iowa.

The interdisciplinary team merged both theories into a single query: Across women’s lifespans, how are the tendencies to “tend and befriend” as well as socially select reflected in their communication?

They tested a pool of 32 women: 16 “older adults” aged 62-79, and 16 “younger adults” aged 18-25. Each participant was either paired with a friend (a “familiar” conversation partner) or a stranger (“unfamiliar”).

The partnerships underwent a series of conversational challenges, wherein the participant instructed her partner to arrange a set of tangrams in an order that only the instructing participant could see. The catch? The shapes were abstract, their appearances purposefully difficult to describe.

“You could look at one [tangram] and say, ‘This looks like a dog.’ Or, you could say, ‘This looks like a triangle, with a stop sign, and a bicycle wheel,’” Rodrigues said.

This exercise helped quantify each conversation’s efficiency: partners who achieved the desired tangram arrangement in fewer words were considered more efficient, and pairs who needed more words to complete the task were considered less efficient.
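As a rough illustration of that efficiency measure, the sketch below scores a pair by the number of words the director uses per tangram placed correctly. The utterances, counts and scoring function are hypothetical, not the study's published analysis code.

```python
# Hypothetical illustration of the word-based efficiency measure: pairs that get
# the tangrams into the right order using fewer words score as more efficient.

def efficiency(director_utterances: list[str]) -> float:
    """Mean number of words the director needs per correctly placed tangram."""
    return sum(len(u.split()) for u in director_utterances) / len(director_utterances)

# One hypothetical utterance per tangram, for a pair of friends and a pair of strangers.
friend_pair = [
    "this one looks like a dog",
    "the stop sign one",
]
stranger_pair = [
    "a big triangle on top with something like a bicycle wheel at the bottom left",
    "two squares stacked with a thin diamond sticking out to the right side",
]

print(f"friends:   {efficiency(friend_pair):.1f} words per tangram")    # fewer words
print(f"strangers: {efficiency(stranger_pair):.1f} words per tangram")  # more words
```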

The researchers found that while the younger adults communicated more efficiently than their older counterparts when paired with friends, they communicated less efficiently when paired with strangers; the older adults, by contrast, demonstrated conversational dexterity, quickly describing the abstract tangrams to friends and strangers alike.

“A referential communication task like this requires that you see where the other person is coming from. It seems like the younger adults are a little more hesitant in trying to do that, whereas the older adults have an easier time doing that with strangers,” Rodrigues said.

This was not predicted by the socio-emotional selectivity hypothesis, which anticipated a correlation between age and social isolation.

“Even though older adults choose to spend more time with people who matter to them, it’s clear that they have the social skills to interact with unfamiliar people if and when they choose to,” Rodrigues said.

Rodrigues' team also measured salivary cortisol to quantify and compare participants’ stress levels throughout the testing process.

“When you experience something stressful, if you have a stress response system that’s working as it should, the result is an elevated amount of cortisol, our primary stress hormone, which then tells our bodies to release glucose into our bloodstreams," she said. "That’s reflected in our saliva about 15 to 20 minutes after we experience it. If we see a rise in salivary cortisol from an individual’s baseline levels, that indicates that they are more stressed than they were at the time of the earlier measurements."
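A minimal sketch of that baseline comparison, using hypothetical cortisol values rather than data from the study, might look like this:

```python
# Hypothetical illustration of the logic in the quote above: a rise in salivary
# cortisol from an individual's baseline sample indicates a stress response.
# The concentrations below are made up for illustration.

def cortisol_reactivity(baseline_ug_dl: float, post_task_ug_dl: float) -> float:
    """Change in salivary cortisol (ug/dL) from baseline, sampled ~15-20 min after the task."""
    return post_task_ug_dl - baseline_ug_dl

print(cortisol_reactivity(0.12, 0.19))  # positive change -> elevated stress
print(cortisol_reactivity(0.12, 0.11))  # little or no change -> no stress response
```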

Across both age groups, those working with familiar partners had consistently lower cortisol levels than those working with unfamiliar partners.

“A lot of the research on the tend-and-befriend hypothesis has only focused on young women, so it’s great to have these results that pull that out to the end of life. We can see that friendship has that same effect throughout the lifespan. Familiar partners and friendship buffer stress, and that’s preserved with age,” Rodrigues said.

Credit: 
Beckman Institute for Advanced Science and Technology

Slowing down grape ripening can improve berry quality for winemaking

Wine grapes are particularly finicky when it comes to their environment. For instance, heatwaves and droughts lead to earlier berry ripening and lackluster wine. And these types of episodes are expected to intensify as Earth's climate changes. Now, researchers reporting in ACS' Journal of Agricultural and Food Chemistry have tweaked growing conditions for Cabernet Sauvignon grapes to slow down their ripening, which increased the levels of compounds associated with wine's characteristic floral and fruity notes.

As grapes ripen and change color from light green to deep red, sugars and aroma compounds accumulate in the berries. But when they ripen quickly because of heat or water stress, the resulting fruits produce a less desirable wine with more alcohol, a duller color and a lingering taste of cooked fruit. To counteract these negative effects of climate change on wine quality, scientists have been testing different ways to grow the plants. Previous research has shown that reducing the crop on the vines can speed up grape ripening, while more intense irrigation later in the growing season can delay the process. Christopher Ford and colleagues wanted to examine the impacts of these techniques on the chemical components that contribute to the berries' quality.

The researchers grew Cabernet Sauvignon wine grapes at a commercial vineyard in the San Joaquin Valley in California. They then either removed a portion of the clusters on the vines, irrigated the plants more during the later growing season, did both or did neither, and collected grapes throughout the ripening period. The plants with the fewest berry clusters had the fastest increase in sugar content and ripened earliest of all the tested conditions. In contrast, the plants that were both thinned and watered more had the slowest rate of sugar accumulation. The researchers found that slowing down grape ripening decreased six-carbon aldehydes and alcohols and 2-isobutyl-3-methoxypyrazine -- compounds associated with green and vegetal wine notes -- and increased norisoprenoids and terpenes -- compounds associated with pleasant floral and fruity notes. The longer growing time improved the quality of grapes for winemaking, the researchers explained, but these adaptation strategies should be monitored over several years before changes are made to current practices.
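As a rough illustration of how ripening speed can be compared across treatments, the sketch below fits a line to sugar content (degrees Brix) over time for two treatment series and compares the slopes. The treatment names, sampling days and Brix values are hypothetical, not the study's measurements.

```python
# Hypothetical comparison of sugar-accumulation rates across treatments:
# fit a straight line to degrees Brix over time and compare the slopes.

import numpy as np

days = np.array([0, 10, 20, 30, 40])  # days after onset of ripening (hypothetical)

brix = {
    "crop thinning only":                np.array([10.0, 14.5, 18.5, 22.0, 24.5]),
    "thinning + late-season irrigation": np.array([10.0, 12.5, 15.0, 17.5, 20.0]),
}

for treatment, values in brix.items():
    slope = np.polyfit(days, values, 1)[0]  # degrees Brix gained per day
    print(f"{treatment}: {slope:.2f} Brix/day")
```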

Credit: 
American Chemical Society