DNA base editing induces substantial off-target RNA mutations

In a study published in Nature on June 10, researchers from Dr. YANG Hui's lab at the Institute of Neuroscience of the Chinese Academy of Sciences (CAS), together with collaborators from the CAS-MPG Partner Institute for Computational Biology and Sichuan University, demonstrated that DNA base editors generate tens of thousands of off-target RNA single-nucleotide variants (SNVs), and that these off-target SNVs can be eliminated by introducing point mutations into the deaminases.

This study revealed a previously overlooked risk of DNA base editors and provided a solution to the problem by engineering the deaminases.

DNA base-editing methods enable direct correction of point mutations in genomic DNA without generating double-strand breaks (DSBs), but potential off-target effects have limited their application. Adeno-associated viruses (AAVs) are the most common delivery system for DNA base-editing gene therapies. Since these viruses sustain long-term gene expression in vivo, the extent of potential RNA off-target effects induced by DNA base editors is of great concern for clinical application.

Several previous studies have evaluated off-target mutations in genomic DNA by DNA base editors. Meanwhile, the deaminases integral to commonly used DNA base editors often exhibit RNA binding activities. For example, the cytosine deaminase APOBEC1 used in cytosine base editors (CBEs) was found to target both DNA and RNA, and the adenine deaminase TadA used in adenine base editors (ABEs) was found to induce site-specific inosine formation on RNA. However, any potential RNA mutations caused by DNA base editors had not been evaluated.

To evaluate the off-target effects of DNA base editors at the RNA level, the researchers counted the off-target RNA SNVs in each replicate of CBE- or ABE-treated cells, and then explored whether these SNVs could be eliminated by engineering the deaminases of the DNA base editors.

They transfected one type of CBE, BE3 (APOBEC1-nCas9-UGI), or one type of ABE, ABE7.10 (TadA-TadA*-nCas9), together with GFP, with or without single-guide RNA (sgRNA), into cultured HEK293T cells. After validating the high on-target DNA-editing efficiency of both BE3 and ABE7.10 in these cells, they performed RNA-seq on the samples at an average depth of 125X and quantitatively evaluated the RNA SNVs in each replicate.

The on-target editing efficiency was evaluated in each replicate of the CBE- or ABE-treated cells to guarantee efficient editing. Then the number of off-target RNA SNVs in CBE- and ABE-treated groups was compared with the GFP-only control group. They found strikingly higher numbers of RNA SNVs in DNA base editor-treated cells.
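The per-replicate comparison described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' pipeline: RNA variants that are absent from the matched genomic DNA of the same sample are treated as candidate off-target RNA edits, and the counts in treated replicates are compared against the GFP-only control. The variant tuples and sample names are invented for the example.

```python
# Hypothetical sketch of counting off-target RNA SNVs per replicate.
# A variant call is represented as a (chromosome, position, ref, alt) tuple.

def count_off_target_rna_snvs(rna_variants, dna_variants):
    """Count RNA SNVs not explained by the sample's own genomic DNA."""
    dna_set = set(dna_variants)
    return sum(1 for v in rna_variants if v not in dna_set)

# Toy data: one base-editor-treated replicate and one GFP-only control.
treated_rna = [("chr1", 100, "C", "T"), ("chr1", 250, "C", "T"),
               ("chr2", 500, "A", "G")]
control_rna = [("chr1", 100, "C", "T")]
genomic_dna = [("chr1", 100, "C", "T")]  # germline variant, not an RNA edit

print(count_off_target_rna_snvs(treated_rna, genomic_dna))  # 2
print(count_off_target_rna_snvs(control_rna, genomic_dna))  # 0
```

In the study this comparison, repeated across replicates, is what revealed the strikingly higher SNV counts in editor-treated cells.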

Furthermore, the researchers found that the mutation bias in BE3- or ABE7.10-treated cells was the same as that of APOBEC1 or TadA, respectively, indicating the off-target effects were caused by the overexpression of DNA base editors. They also identified CBE- and ABE-specific motifs and genetic regions of these off-target RNA SNVs.

To eliminate the RNA off-target activity of base editors, they examined the effect of introducing point mutations into APOBEC1 or TadA. Three high-fidelity variants, BE3 (W90Y+R126E), BE3 (hA3A R128A) and BE3 (hA3A Y130F), reduced off-target RNA SNVs to the baseline level. Similarly, an ABE variant, ABE7.10 (F148A), completely eliminated the off-target effects.

By introducing point mutations into the deaminases, this study obtained high-fidelity variants of both CBEs and ABEs and demonstrated a rational-engineering approach to increasing the specificity of base editors.

Credit: 
Chinese Academy of Sciences Headquarters

Combating mosquito-borne diseases with bacteria

image: Wolbachia bacteria (stained red) inside mosquito cells (with nuclei stained blue).

Image: 
Cassandra Koh / Monash University

Viruses, spread through mosquito bites, cause human illnesses such as dengue fever, Zika and yellow fever. A new control technique harnesses a naturally occurring bacterium called Wolbachia that blocks replication of viruses and breaks the cycle of mosquito-borne disease, according to an international team of researchers.

"Wolbachia is present in around 50 percent of all insects," said Beth McGraw, professor and Huck Scholar in Entomology at Penn State, who did this research while at Monash University. "Interestingly it is not present in some of the major mosquito vectors (insects that transmit pathogens). After researchers put Wolbachia into mosquitoes, they found that, quite excitingly, Wolbachia effectively vaccinates mosquitoes, preventing viruses from replicating."

Spread by Aedes aegypti mosquitoes, dengue virus affects millions of people each year. Symptoms include fever, body aches and nausea, although a more severe version, known as dengue hemorrhagic fever, can be fatal.

In the tropics and subtropics where Ae. aegypti resides, several large releases of Wolbachia are underway to test whether Wolbachia can reduce the incidence of human disease.

In a paper published today in Virus Evolution, McGraw and her team report that dengue virus failed to evolve resistance to Wolbachia in controlled lab-based experiments. These findings show promise for the long-term efficacy of Wolbachia following field release.

"I am continually surprised by Wolbachia," said McGraw. "I thought we would get dengue variants that would evolve resistance. Wolbachia is doing a better job than I expected at controlling virus replication in cells."

The researchers took dengue virus and infected mosquito cells that either had Wolbachia or were free of bacteria. After five days, they collected the viruses that had been released from the cells and used them to infect fresh cells.

"Dengue takes over the machinery of the host cells, makes lots of copies of itself, and then it buds or bursts out of the cell," explained McGraw.

After nine rounds of passaging the virus through mosquito cells, the team found that the amount of virus released was stable in the Wolbachia-free cells. However, in the presence of Wolbachia, virus levels crashed -- and in some cases, disappeared completely.

Dengue viruses grown with Wolbachia were also less effective at infecting mosquito cells and had reduced ability to replicate, compared to viruses grown without the bacterium.

Although this is good news for the control of dengue and other mosquito-transmitted diseases, the researchers note the study has limitations. The researchers used mosquito cells -- which may not reflect what happens within the whole insect. And outside the lab, where mosquito populations are much larger, there may be more opportunities for the virus to develop resistance to Wolbachia.

"I think our study suggests that the evolution of resistance to Wolbachia in the virus is challenging," said McGraw. "I don't think it's a guarantee that the virus is not going to evolve under field conditions because the natural system is much more complex. The real experiment is being done in the field right now, because Wolbachia has been released into communities in Australia, Indonesia and Brazil, among others. Monitoring in release areas will be needed to test for the emergence of resistance in the virus."

Other control methods for dengue have largely been unsuccessful. Because Ae. aegypti is active during the day, bed nets are ineffective at reducing mosquito bites. Spraying of insecticides to control the mosquito and removing standing-water breeding sites have also been difficult to implement in urban environments where the mosquito thrives.

Wolbachia is an attractive control option because it blocks the replication of many disease-causing viruses. It is also self-spreading because of a curious effect in which Wolbachia-carrying male mosquitoes cannot reproduce successfully with Wolbachia-free females. According to McGraw, this means that these males prevent Wolbachia-free females from reproducing. Because the bacterium is transmitted from mother to offspring, each generation contains successively more mosquitoes carrying Wolbachia.
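The self-spreading dynamic described above can be illustrated with a deliberately simplified model (not taken from the study): assume every infected mother passes Wolbachia to all her offspring, matings between uninfected females and infected males produce no offspring, and mating is random. The infected fraction p then rises every generation.

```python
# Minimal illustrative model of Wolbachia spread via cytoplasmic
# incompatibility. With random mating, offspring come either from infected
# mothers (fraction p, all offspring infected) or from uninfected mothers
# that happened to mate with uninfected fathers (fraction (1-p)*(1-p)), so:
#   p_next = p / (p + (1 - p)**2)

def next_generation(p):
    """Infected fraction after one generation of random mating."""
    return p / (p + (1.0 - p) ** 2)

p = 0.10  # start with 10% of mosquitoes carrying Wolbachia
for gen in range(1, 11):
    p = next_generation(p)
    print(f"generation {gen}: {p:.2f}")
```

Under these idealized assumptions the infected fraction climbs toward 100%, which is the intuition behind field releases: a modest initial release can sweep through a wild population.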

Researchers are still unsure exactly how Wolbachia reduces virus replication in the mosquito.

"We think it might have to do with competition between Wolbachia and the virus for physical space (inside the cell) or for nutrition they both need from the mosquito," said McGraw. "Or it could be that Wolbachia is increasing the immune capacity of the mosquito. There are a whole range of theories, none of which are entirely satisfying."

Credit: 
Penn State

Study finds FDA dermatology advisors receive payments following drug approvals

A team of researchers led by a member of the University of Colorado School of Medicine faculty at the Anschutz Medical Campus examined post-advisory financial relationships between pharmaceutical companies and the U.S. physicians who advised FDA committees during dermatologic drug approvals. Critics of these industry-physician relationships claim such payments could incentivize advisors to alter their voting habits.

The findings are published in a research letter in the Journal of the American Academy of Dermatology.

"It's known from previous studies that financial payments to FDA advisors can take place after a drug is approved but this is the first time we've researched and seen that this trend spans to the dermatology field," said Robert Dellavalle, MD, PhD, professor of dermatology and public health at the University of Colorado School of Medicine.

Dellavalle adds, "It's hard to control post-advisory financial relationships since it's not on the record going into the committee and top doctors can be paid as ongoing academic advisors for a variety of reasons. Regardless, financial conflicts of interest in medical research are important to discuss and monitor."

Physician advisors serve as external experts in determining whether a new medical therapy is fit for the U.S. market. Of the advisors analyzed, 54 percent received at least one payment from pharmaceutical companies. Twenty-seven percent accepted more than $1,000, 15 percent accepted more than $50,000 and nine percent took more than $100,000. The advisors received a mean of more than $47,000. For the majority of the drugs examined, payments from competitors outnumbered payments from manufacturers.

The study analyzed data from Open Payments, a national transparency program that collects and publishes information about financial relationships between the health care industry (i.e., drug and device companies) and providers (i.e., physicians and teaching hospitals). The study focused on payments made to U.S. physicians who advised FDA committees during the approval of ten dermatologic therapies.

Credit: 
University of Colorado Anschutz Medical Campus

Clarifying the economic value of adjusting power consumption

Since the output of renewable energy sources such as photovoltaic generation fluctuates, the power system can be viewed as a large-scale complex system with uncertainty, and an energy management system is needed to stabilize the balance of electricity supply and demand. In recent years, energy management systems have been actively researched against the background of electricity market liberalization and the spread of smart meters that make power consumption visible. Koichi Kobayashi, associate professor at Hokkaido University, Shun-ichi Azuma, professor at Nagoya University, Nobuyuki Yamaguchi, associate professor at Tokyo University of Science, and colleagues developed demand response analysis and control technologies focusing on time-varying power generation costs.

Demand response is one method used in energy management systems. It refers to consumers reducing their power consumption or changing their consumption pattern when the supply-demand balance is tight, in response to electricity prices or incentive payments (rewards). However, its cost-effectiveness has not been clarified.

The introduction of "aggregators" that control the power consumption of consumers has attracted much attention. In this framework, aggregators trade between electric power companies and consumers, instead of consumers trading directly with the companies. Aggregators manage hundreds of consumers and adjust their power consumption in response to requests from electric companies, making control of the whole power system easier.

During a day, the cost-effectiveness of demand response fluctuates with the supply and demand of electricity, and this fluctuation is expected to grow as renewable energy spreads. Demand response has so far aimed at maintaining the supply-demand balance, and its cost-effectiveness has received little attention. In the future, however, it will be important to evaluate the economic value of demand response, considering the power generation cost and the adjustment cost (the cost required to adjust power consumption) at each time, and to develop control strategies that maximize that value.

For demand response to produce economic value, the unit price of power generation must fluctuate substantially during the day. If the difference between the highest and lowest generation costs is large compared to the adjustment cost, demand response produces economic value. More specifically, this research derived the condition that demand response produces economic value if the difference between the highest and lowest price is more than twice the adjustment cost. Because the condition is simple, it can also serve as a guide for calculating rewards to consumers.
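The derived condition is simple enough to state as a one-line check. The sketch below uses invented hourly prices; the intuition is that shifting a unit of consumption from the peak-price hour to the cheapest hour incurs the adjustment cost twice (once to reduce, once to increase), so the price gap must exceed twice that cost.

```python
# Toy check of the condition derived in the study: demand response produces
# economic value when the gap between the highest and lowest unit generation
# cost exceeds twice the adjustment cost. Prices here are illustrative.

def demand_response_pays(prices, adjustment_cost):
    """True if shifting a unit of consumption from the most expensive hour
    to the cheapest hour more than covers adjusting twice."""
    return max(prices) - min(prices) > 2 * adjustment_cost

hourly_prices = [8, 10, 25, 14, 9]  # unit generation cost per hour

print(demand_response_pays(hourly_prices, adjustment_cost=5))  # True: 25-8=17 > 10
print(demand_response_pays(hourly_prices, adjustment_cost=9))  # False: 17 < 18
```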

Next, to maximize the economic value, the researchers developed a control method for demand response based on model predictive control, in which the optimal control strategy is found by prediction via a mathematical model. In simulations, the effectiveness of the proposed method was demonstrated using data from the Japan Electric Power Exchange as forecast values for the power generation cost and power consumption.
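The model-predictive idea can be sketched in miniature. This is an illustrative toy, not the authors' method: at each hour the controller searches over small consumption adjustments for the next few hours, keeps total energy unchanged (shifting rather than shedding load), picks the plan with the lowest forecast cost, and applies only the plan's first step before re-planning. All prices, demands and cost parameters below are invented.

```python
from itertools import product

# Illustrative receding-horizon (model predictive) demand response controller.

H = 3            # prediction horizon (hours)
ADJ_COST = 1.0   # cost per unit of adjusted consumption
STEPS = (-1, 0, 1)

def best_first_action(price_forecast, demand_forecast):
    """Return the first adjustment of the cheapest energy-conserving plan."""
    best_plan, best_cost = None, float("inf")
    for plan in product(STEPS, repeat=H):
        if sum(plan) != 0:          # shift energy within the horizon, don't shed it
            continue
        cost = sum(p * (d + u) + ADJ_COST * abs(u)
                   for p, d, u in zip(price_forecast, demand_forecast, plan))
        if cost < best_cost:
            best_plan, best_cost = plan, cost
    return best_plan[0]

prices = [30, 10, 10, 30, 10]   # forecast unit generation cost per hour
demand = [2, 2, 2, 2, 2]        # forecast baseline consumption per hour

schedule = [best_first_action(prices[t:t + H], demand[t:t + H])
            for t in range(len(prices) - H + 1)]
print(schedule)  # [-1, 0, 0]: cut consumption in the expensive first hour
```

Re-planning at every step with fresh forecasts is what distinguishes model predictive control from a fixed schedule: forecast errors are corrected as new data arrives.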

Credit: 
Japan Science and Technology Agency

Dramatic change in ancient nomad diets coincides with expansion of networks across Eurasia

image: Map of millet and wheat/barley consumption over time: a) 1000-500 cal BC, b) 500-200 cal BC, and c) 200 BC-AD 400.

Image: 
I. Reese and A. R. Ventresca Miller, 2017

A meta-analysis of dietary information recorded in the bones of ancient animals and humans recovered from sites scattered across the Eurasian steppe, from the Caucasus region to Mongolia, demonstrates that pastoralists spread domesticated crops across the steppe through their trade and social networks. Researchers from Kiel University sifted through previously published stable isotopic data and applied new quantitative analyses that calibrate human dietary intake against environmental inputs. The results have allowed them to better isolate the timing of the incorporation of agricultural products into the diets of pastoral nomads and, crucially, link burgeoning socio-political networks to this dietary transformation.

Through a big data project that explored over a thousand stable isotope data points, researchers were able to find evidence for an early transition to agriculture - based on dietary intake across Eurasia. "Our understanding of the pace of crop transmission across the Eurasian steppe has been surprisingly unclear due in part to a focus on the excavation of cemeteries, rather than settlements where people threw out their food," says Alicia Ventresca Miller, lead author, formerly of Kiel University and currently at the Max Planck Institute for the Science of Human History. "Even when settlement sites are excavated, the preservation of carbonized seed remains is often poor. This is what makes stable isotope analyses of human remains from this region so valuable - it provides direct insights into the dietary dynamics of ancient pastoralists who inhabited diverse environments."

Millet spreads across the Eurasian steppe

Millet, originally domesticated in China, appears to have been occasionally consumed at low levels by pastoralists inhabiting the far-flung regions of Siberia and southeastern Kazakhstan, possibly as early as the late third millennium BC. This initial uptake of millet coincided with the expansion of trans-regional networks across the steppe, when objects and ideas were first regularly exchanged over long distances.

However, it was not until a thousand years later that millet became a regular feature of pastoralist diets. This timing coincides with the intensification of complex political structures at the transition to the Iron Age. Burgeoning socio-political confederations drove a marked increase in the exchange of costly prestige goods, which strengthened political networks - and facilitated the transfer of cultigens.

Wheat and Barley in the Trans-Urals

Despite taking part in these political networks, groups in the Trans-Urals invested in wheat and barley farming rather than millet. A dietary focus on wheat and barley may have been due to different farming techniques, greater water availability, or a higher value on these cultigens. "Our research suggests that cultigens were converted from a rare luxury during the Bronze Age to a medium demarcating elite participation in political networks during the Iron Age," states Cheryl Makarewicz of Kiel University.

Regional variation in millet consumption

While herding of livestock was widespread, not all regions adopted millet. In southwest Siberia, dietary intake was focused on pastoral animal products and locally available wild plants and fish. In contrast, the delayed adoption of millet by populations in Mongolia during the Late Iron Age coincides with the rise of the Xiongnu nomadic empire. "This is particularly interesting because it suggests that communities in Mongolia and Siberia opted out of the transition to millet agriculture, while continuing to engage with neighboring groups," explains Ventresca Miller.

This study shows the great potential of using the available isotope record to provide evidence for human dietary intake in areas where paleobotany is understudied. Further research should clarify which grains, for example broomcorn or foxtail millet, were fundamental to the shift in dietary intake, and how networks of exchange linked different regions.

Credit: 
Kiel University

'Green Revolution' in RNAi tools and therapeutics

In the "Green Revolution" beginning in the early 1950s, the extensive cultivation of dwarf rice varieties helped solve the food problem in developing countries. Today, chronic infection with hepatitis B virus (HBV) is a major public health problem: according to the World Health Organization (WHO), an estimated 257 million people are chronically infected with hepatitis B. Recent studies have shown that the expression level of the hepatitis B virus surface antigen gene (HBsAg) correlates with the occurrence of hepatocellular carcinoma (HCC) and with fibrosis severity in transgenic mice and HBV-infected patients; HBsAg has therefore become a promising target for drug design in the treatment of hepatitis B.

In a study recently published in Biomaterials, Dr. Zhi Hong and Dr. Chen-Yu Zhang from Nanjing University and their collaborators report that small silencing RNA sequences against HBsAg produced in edible lettuce (Lactuca sativa L.) can specifically bind their targets and inhibit gene expression in p21-HBsAg knock-in transgenic mice at a relatively low dose compared to synthetic siRNAs. More importantly, continuous administration of the amiRNA-containing decoction relieved liver injury in the transgenic mice without additional adverse effects, even after 15 months of treatment.

This work utilizes the plant's endogenous microRNA biogenesis machinery to produce methylated short interfering sequences, increasing the stability of the target siRNAs while reducing production costs. It therefore not only provides an affordable treatment strategy for chronic hepatitis B patients in developing countries, but also reduces the required dose of RNAi drugs, minimizing the potential side effects of RNAi therapy and allowing administration over a relatively long period or in conjunction with other antiviral drugs.

For patients in the immune-tolerant phase or resistant to conventional antiviral treatment, this RNAi-based therapy may effectively reduce the risk of liver injury through daily consumption of a vegetable decoction containing HBsAg-silencing RNAs.

Taking the long view, this method may also be applicable to the treatment of hepatitis C or other infectious diseases, since engineered plants offer an effective, less toxic and financially viable way to produce short interfering sequences. Plant-derived siRNAs may well bring a "green revolution" in RNAi tools and therapeutics.

Looking back, the Green Revolution brought us a richer food supply. At the same time, our daily food may also be changing us, with the small RNAs we take in from food playing an important role.

Credit: 
Nanjing University School of Life Sciences

What's your poison? Scrupulous scorpions tailor venom to target

Replenishing venom takes time and energy - so it pays to be stingy with stings.

According to researchers at the Australian Institute of Tropical Health and Medicine, scorpions adapt their bodies, their behavior and even the composition of their venom for efficient control of prey and predators.

Writing in Frontiers in Ecology and Evolution, they say it's not just the size of the stinger, but also how it's used that matters.

Stingy stingers

"Scorpions can store only a limited volume of venom, which takes time and energy to replenish after use," says lead author Edward Evans. "Meanwhile, the scorpion has a reduced capacity to capture prey or defend against predators, so the costs of venom use are twofold."

As a result, over 400 million years of evolution, scorpions have developed a variety of strategies to minimize venom use.

The most obvious of these is to avoid using venom at all.

"Research has shown the lighter, faster male specimens of one species are more likely to flee from danger compared to the heavier-bodied females, rather than expend energy using toxins," notes Evans. "Others - particularly burrowing species - depend instead on their large claws or 'pedipalps', and have a small, seldom-used stinging apparatus."

When immobility, threat or lively prey forces venom use, scorpions can adjust the volume they inject - both within each sting and through the application of multiple stings.

"Scorpions can hold prey in their pedipalps and judiciously apply stings, just until it stops struggling."

At the other extreme, when the survival stakes are high some species abandon precision and spray their venom through the air.

"Spraying venom defensively is potentially wasteful but can avoid dangerous close contact with predators such as grasshopper mice, which disarm scorpions by biting off their tails."

Venom versatility

Scorpions can also tailor the composition of their venom to a target - both on-the-fly, and more precisely over weeks of exposure.

For starters, any given sting has three levels: dry, prevenom or venom.

As a light deterrent, a scorpion may sting with no venom at all. A 'wet' sting begins with clear, salty prevenom - essentially a "stun" setting - and might go no further.

"Research on prevenom suggests it contains an extremely high potassium salt concentration, which may cause quick paralysis in insects and pain in vertebrates," says Evans. "It seems to regenerate quickly and presumably at a low metabolic cost."

If things get heavy, the scorpion can go on to inject or spray a thick, milky, protein-rich venom.

"Venom injection is reserved for more active, persistent or sizeable targets. It is more toxic, but once spent can take weeks to replenish - leaving the scorpion vulnerable and with limited prey options."

Recent work by the James Cook University group suggests that scorpions can make more personalized changes to venom composition, in response to extended periods of predator exposure.

"Repeated encounters with a surrogate vertebrate predator - a taxidermied mouse - over a six week period led the scorpion Hormurus waigiensis to produce a higher relative abundance of a particular group of toxins, including some with vertebrate predator-specific activity," explains senior author Dr. David Wilson.

How exactly the change occurs remains unknown, however.

"Future work is needed to investigate how far observed changes in venom composition and use are due to adaptive responses - and to identify the precise stimuli for change," Wilson and Evans conclude.

Credit: 
Frontiers

The common wisdom about marketing cocreated innovations is wrong

Researchers from the University of Hong Kong, University of Tennessee, University of British Columbia, and Arizona State University published a new paper in the Journal of Marketing that seeks the optimal strategy for communicating the value of cocreated innovations in order to drive consumer purchase and acceptance in the marketplace.

The study forthcoming in the July issue of the Journal of Marketing, titled "Successfully Communicating a Cocreated Innovation," is authored by Helen Si Wang, Charles Noble, Darren Dahl, and Sungho Park.

Online platforms make it easy and inexpensive for companies to run contests, gather customers' ideas, and commercialize the most promising ideas into finished products. This is a key reason cocreation has been adopted as a key innovation strategy by nearly 78% of large companies. Thus far, however, the strategy has yielded disappointing results. One of the most heralded cocreation firms, Quirky, withdrew 70% of its 500-plus cocreated innovations between 2009 and 2014 because of stagnant sales and filed for bankruptcy thereafter. And at Apple's App Store, 80% of the apps do not generate enough revenue to survive for more than a few months.

Is the cocreation model a legitimate strategy to drive innovation and adoption of resulting products--or is it flawed by design? Marketing communications is often regarded as one of the major influences on innovation adoption, and creators typically take two approaches to marketing new products. They either share a consumer creation or genesis story (also called user-generated content or UGC) or use more traditional, firm-generated content (FGC) that often stresses a product's features and benefits. This research shows that it is wise to combine these strategies, but with an interesting twist on conventional advertising wisdom.

When sharing a genesis story, creators tend to take one of two tacks: 1) an approach-oriented message about how they achieved new or desired outcomes; or 2) an avoidance-oriented message that promises to help users avoid unpleasant or undesirable outcomes they themselves experienced. Advertising best practice stresses that a firm should use consistent messaging to communicate with customers.

This practice does not hold up to scrutiny in the area of cocreated products, however. Instead, the researchers found that a mixed or "mismatch" communication strategy works best to speed individual and mass consumer adoption. A mismatch communication strategy means that if the product creator's claim is approach-oriented, the firm should use an avoidance-oriented message, and vice versa.

As an example, for the cocreated Starbucks® Doubleshot Energy Mexican Mocha Coffee Drink, the creators' authentic message was approach-oriented and focused on "Embracing winter... fueling me with all of the winter warmth and energy I want." When the researchers combined this with an avoidance firm message, "What the world can't miss this winter... say bye-bye to the winter chill and blues" to make a mismatch strategy, adoption levels increased compared to when the approach firm message was used--"What the world desires this winter... makes you embrace all the winter warmth and joy."

Key findings from five studies include:

Products using a mismatch strategy were adopted 56.1% of the time, compared to 26.3% for those using matching communication strategies.

This approach works best with low-expertise consumers who reference their own life stories when buying and using goods. High-expertise consumers are less motivated by this approach.

Firms using a mismatch communication strategy are 10% more likely to experience early takeoff, which is critical to the mass adoption of the innovation.

"This research offers important implications for managers and companies seeking to leverage the creative power of the crowd in developing innovations," says Wang. Noble adds, "Our findings challenge the conventional wisdom in many marketing campaigns. If you want takeoff, mismatch your message with the innovator creator's message."

Credit: 
American Marketing Association

Cyber of the fittest: Researchers develop first cyber agility framework to measure attacks

image: Cyber of the fittest: UTSA develops 1st framework to measure the evolution of cyber attacks.

Image: 
Photo courtesy of UTSA

(June 7, 2019) - For more than a year, GozNym, a gang of five Russian cyber criminals, stole login credentials and emptied bank accounts from unaware Americans. To detect and quickly respond to escalating cyber-attacks like these, researchers at The University of Texas at San Antonio (UTSA) have developed the first framework to score the agility of cyber attackers and defenders. The cyber agility project was funded by the Army Research Office.

“Cyber agility isn’t just about patching a security hole, it’s about understanding what happens over time. Sometimes when you protect one vulnerability, you expose yourself to 10 others,” said computer science alumnus Jose Mireles ’17, who now works for the U.S. Department of Defense and co-developed this first known framework as part of his UTSA master’s thesis. “In car crashes, we understand how to test for safety using the rules of physics. It is much harder to quantify cybersecurity because scientists have yet to figure out what are the rules of cybersecurity. Having formal metrics and measurement to understand the attacks that occur will benefit a wide range of cyber professionals.”

To develop a quantifiable framework, Mireles collaborated with fellow UTSA student Eric Ficke, researchers at Virginia Tech, U.S. Air Force Research Laboratory, and the U.S. Army Combat Capabilities Development Command Army Research Laboratory (CCDC ARL). The project was conducted under the supervision of UTSA Professor Shouhuai Xu, who serves as the director of the UTSA Laboratory for Cybersecurity Dynamics.

Together, they used a honeypot—a computer system that lures real cyber-attacks—to attract and analyze malicious traffic according to time and effectiveness. As both the attackers and the defenders created new techniques, the researchers were able to better understand how a series of engagements transformed into an adaptive, responsive and agile pattern or what they called an evolution generation.

The framework proposed by the researchers will help government and industry organizations visualize how well they out-maneuver attacks. This groundbreaking work will be published in an upcoming issue of IEEE Transactions on Information Forensics and Security, a top cybersecurity journal.

“The cyber agility framework is the first of its kind and allows cyber defenders to test out numerous and varied responses to an attack,” said Xu. “This is an outstanding piece of work as it will shape the investigation and practice of cyber agility for the many years to come.”

"The DoD and US Army recognize that the Cyber domain is as important a battlefront as Ground, Air and Sea," said Purush Iyer, Ph.D. division chief, network sciences at Army Research Office, an element of CCDC ARL. "Being able to predict what the adversaries will likely do provides opportunities to protect and to launch countermeasures."

Mireles added, “A picture or graph in this case is really worth more than 1,000 words. Using our framework, security professionals will recognize if they’re getting beaten or doing a good job against an attacker.”

UTSA is home to the nation’s top cybersecurity program, an interdisciplinary approach that spans three colleges: the College of Business, College of Engineering and College of Sciences. Research centers and outreach programs provide UTSA students and faculty with additional opportunities to explore the various facets of this high demand and ever-changing field.

The Department of Computer Science, housed in the UTSA College of Sciences, offers bachelor’s, master’s and doctoral degree programs that support more than 1,360 undergraduate students and 68 graduate students. Its major research units include the UTSA Institute for Cyber Security, which operates the FlexCloud and FlexFarm laboratories dedicated to both basic and applied cybersecurity research, and the UTSA Center for Infrastructure Assurance and Security (CIAS), which focuses on the cybersecurity maturity of cities and communities while conducting national cyber defense competitions for high school and college students.

San Antonio is home to one of the largest concentrations of cybersecurity experts and industry leaders outside Washington, D.C., which uniquely positions the city and UTSA to lead the nation in cybersecurity research and workforce development.

Credit: 
University of Texas at San Antonio

Exposure to videos of race-based violence online may be spurring mental-health issues

Social media-based movements like #BlackLivesMatter and #SayHerName have taken off over the past decade as a response to highly scrutinized police shootings of African American people. Recordings from body cameras or bystanders are frequently posted online and shared by activists and others as a way to press for police accountability.

But those videos may also have deleterious effects on the mental health of young members of the same racial communities as the victims in those shootings, suggests a new study published today in the Journal of Adolescent Health.

Previous research has linked exposure to violent media with trauma, and other research has connected actual police killings in a given region to poor mental health in same-race communities. Study authors say this study is the first to explore the relationship between repeated youth exposure to traumatic events online and mental health.

"Increased exposure to traumatic events online, whether they involve members of one's own racial-ethnic group or those of other racial-ethnic groups, is related to poor mental health outcomes," said lead author Brendesha Tynes, an associate professor of education and psychology at the USC Rossier School of Education.

Data were collected from a nationally representative sample of 302 Black and Hispanic adolescents ages 11-19. African American and Hispanic participants were asked about police shootings, immigrants being detained by federal border agents, and beatings.

Study participants reported the frequency of their exposure to traumatic events online, depressive symptoms, PTSD symptoms, and other demographic information.

Though not establishing causality, the researchers' findings showed that Hispanic participants reported significantly more depressive symptoms than African American participants. Female participants reported significantly more depressive and PTSD symptoms than male participants. This was true for teens who viewed violence involving both African American and Hispanic individuals.

"The study shows that the increase in depressive and PTSD symptoms crosses racial and ethnic lines - in other words, the mental health of both African American and Latinx teens may be linked to viewing any racial violence, not just that which depicts their own racial or ethnic group," Tynes said.

The Pew Research Center's most recent (2018) survey of adolescent technology usage shows that 45 percent of youth report they are online "almost constantly."

Given such high internet use, the researchers suggested that mental health professionals and educators have conversations with young people of color about their exposure to racial violence online, and that those professionals should also take steps to improve their own cultural competency.

"The videos of these injustices should be public and people should continue to record and post them," Tynes said. "The findings show that mental health problems are exacerbated with exposure, so viewers should be mindful of their viewing practices, auto-play settings and how they think about the event after they've seen it. They should exhaust all technological, personal and community resources to protect themselves and thrive in the face of these seemingly ubiquitous events."

Credit: 
University of Southern California

What if dark matter is lighter? Report calls for small experiments to broaden the hunt

image: Junsong Lin, an affiliate in Berkeley Lab's Physics Division and UC Berkeley postdoctoral researcher, holds components of a low-mass dark matter detector that is now in development at UC Berkeley.

Image: 
Marilyn Chung/Berkeley Lab

The search for dark matter is expanding. And going small.

While dark matter abounds in the universe - it is by far the most common form of matter, making up about 85 percent of the universe's total - it also hides in plain sight. We don't yet know what it's made of, though we can witness its gravitational pull on known matter.

Theorized weakly interacting massive particles, or WIMPs, have been among the cast of likely suspects comprising dark matter, but they haven't yet shown up where scientists had expected them.

Casting many small nets

So scientists are now redoubling their efforts by designing new and nimble experiments (https://newscenter.lbl.gov/2019/06/10/small-dark-matter-experiments-broaden-hunt/) that can look for dark matter in previously unexplored ranges of particle mass and energy, and using previously untested methods. The new approach, rather than relying on a few large experiments' "nets" to try to snare one type of dark matter, is akin to casting many smaller nets with much finer mesh.

Dark matter could be much "lighter," or lower in mass and slighter in energy, than previously thought. It could be composed of theoretical, wavelike ultralight particles known as axions. It could be populated by a wild kingdom filled with many species of as-yet-undiscovered particles. And it may not be composed of particles at all.

Momentum has been building for low-mass dark matter experiments, which could expand our current understanding of the makeup of matter as embodied in the Standard Model of particle physics, noted Kathryn Zurek, a senior scientist and theoretical physicist at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab).

Zurek, who is also affiliated with UC Berkeley, has been a pioneer in proposing low-mass dark matter theories and possible ways to detect it.

"What experimental evidence do we have for physics beyond the Standard Model? Dark matter is one of the best ones," she said. "There are these theoretical ideas that have been around for a decade or so," Zurek added, and new developments in technology - such as new advances in quantum sensors and detector materials - have also helped to drive the impetus for new experiments.

"The field has matured and blossomed over the last decade. It's become mainstream - this is no longer the fringe," she said. Low-mass dark matter discussions have moved from small conferences and workshops to a component of the overall strategy in searching for dark matter.

She noted that Berkeley Lab and UC Berkeley, with their particular expertise in dark matter theories, experiments, and cutting-edge detector and target R&D, are poised to make a big impact in this emerging area of the hunt for dark matter.

Report highlights need to search for "light," low-mass dark matter

Dark matter-related research by Zurek and other Berkeley Lab researchers is highlighted in a DOE report, "Basic Research Needs for Dark Matter Small Projects New Initiatives", based on an October 2018 High Energy Physics Workshop on Dark Matter. Zurek and Dan McKinsey, a Berkeley Lab faculty senior scientist and UC Berkeley physics professor, served as co-leads on a workshop panel focused on dark matter direct-detection techniques, and this panel contributed to the report.

The report proposes a focus on small-scale experiments - with project costs ranging from $2 million to $15 million - to search for dark matter particles that have a mass smaller than a proton. Protons are subatomic particles within every atomic nucleus that each weigh about 1,836 times more than an electron.

This new, lower-mass search effort will have "the overarching goal of finally understanding the nature of the dark matter of the universe," the report states.

In a related effort, the U.S. Department of Energy this year solicited proposals for new dark matter experiments, with a May 30 deadline, and Berkeley Lab participated in the proposal process, McKinsey said.

"Berkeley is a dark matter mecca" that is primed for participating in this expanded search, he said. McKinsey has been a participant in large direct-detection dark matter experiments including LUX and LUX-ZEPLIN and is also working on low-mass dark matter detection techniques.

3 priorities in the expanded search

The report highlights three major priority research directions in searching for low-mass dark matter that "are needed to achieve broad sensitivity and ... to reach different key milestones":

1. Create and detect dark matter particles below the proton mass and associated forces, leveraging DOE accelerators that produce beams of energetic particles. Such experiments could potentially help us understand the origins of dark matter and explore its interactions with ordinary matter, the report states.

2. Detect individual galactic dark matter particles - down to a mass measuring about 1 trillion times smaller than that of a proton - through interactions with advanced, ultrasensitive detectors. The report notes that there are already underground experimental areas and equipment that could be used in support of these new experiments.

3. Detect galactic dark matter waves using advanced, ultrasensitive detectors with emphasis on the so-called QCD (quantum chromodynamics) axion. Advances in theory and technology now allow scientists to probe for the existence of this type of axion-based dark matter across the entire spectrum of its expected ultralight mass range, providing "a glimpse into the earliest moments in the origin of the universe and the laws of nature at ultrahigh energies and temperatures," the report states.

This axion, if it exists, could also help to explain properties associated with the universe's strong force, which is responsible for holding most matter together - it binds particles together in an atom's nucleus, for example.
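For a sense of the mass scales involved in priority 2 above, a quick back-of-the-envelope conversion (my own arithmetic, not a figure from the report) expresses "about 1 trillion times smaller than a proton" in the electronvolt units particle physicists use:

```python
# Proton rest energy is about 938.3 MeV/c^2; a particle a trillion times
# lighter therefore sits around a millielectronvolt (meV).

PROTON_MASS_EV = 938.3e6   # proton rest energy in eV/c^2

target_ev = PROTON_MASS_EV / 1e12
print(f"{target_ev * 1e3:.2f} meV")  # ~0.94 meV
```

Detecting interactions from particles that light is far below the energy thresholds of traditional WIMP detectors, which is why the report emphasizes new sensor technologies.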

Searches for the traditional WIMP form of dark matter have increased in sensitivity about 1,000-fold in the past decade.

Berkeley scientists are building prototype experiments

Berkeley Lab and UC Berkeley researchers will at first focus on liquid helium and gallium arsenide crystals in searching for low-mass dark matter particle interactions in prototype laboratory experiments now in development at UC Berkeley.

"Materials development is also part of the story, and also thinking about different types of excitations" in detector materials, Zurek said.

Besides liquid helium and gallium arsenide, the materials that could be used to detect dark matter particles are diverse, "and the structures in them are going to allow you to couple to different dark matter candidates," she said. "I think target diversity is extremely important."

The goal of these experiments, which are expected to begin within the next few months, is to develop the technology and techniques so that they can be scaled up for deep-underground experiments at other sites that will provide additional shielding from the natural shower of particle "noise" raining down from the sun and other sources.

McKinsey, who is working on the prototype experiments at UC Berkeley, said that the liquid helium experiment there will seek out any signs of dark matter particles causing nuclear recoil - a process through which a particle interaction gives the nucleus of an atom a slight jolt that researchers hope can be amplified and detected.

One of the experiments seeks to measure excitations from dark matter interactions that lead to the measurable evaporation of a single helium atom.

"If a dark matter particle scatters (on liquid helium), you get a blob of excitation," McKinsey said. "You could get millions of excitations on the surface - you get a big heat signal."

He noted that atoms in liquid helium and crystals of gallium arsenide have properties that allow them to light up or "scintillate" in particle interactions. Researchers will at first use more conventional light detectors, known as photomultiplier tubes, and then move to more sensitive, next-generation detectors.

"Basically, over the next year we will be studying light signals and heat signals," McKinsey said. "The ratio of heat to light will give us an idea what each event is."
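The heat-to-light discrimination McKinsey describes can be illustrated with a toy classifier. This is a hedged sketch of the general idea only - the cut value and event readings below are invented, not parameters from the Berkeley experiments:

```python
# Toy event discrimination: nuclear recoils tend to deposit relatively
# more energy as heat per unit of scintillation light than electron
# recoils do, so the heat/light ratio separates the two populations.

def classify_event(heat, light, ratio_cut=5.0):
    """Crude classifier: label an event by its heat-to-light ratio.
    ratio_cut is an arbitrary illustrative threshold."""
    ratio = heat / light
    return "nuclear-recoil-like" if ratio > ratio_cut else "electron-recoil-like"

# (heat signal, light signal) in arbitrary units, invented for illustration
events = [(60.0, 5.0), (12.0, 8.0)]
labels = [classify_event(h, l) for h, l in events]
print(labels)  # ['nuclear-recoil-like', 'electron-recoil-like']
```

In a real detector the two populations are bands rather than a single cut, but the principle - using the ratio of two readout channels to identify each event - is the same.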

These early investigations will determine whether the tested techniques can be effective in low-mass dark matter detection at other sites that provide a lower-noise environment. "We think this will allow us to probe much lower energy thresholds," he said.

New ideas enabled by new technology

The report also notes a wide variety of other approaches to the search for low-mass dark matter.

"There are tons of different, cool technologies out there" even beyond those covered in the report that are using or proposing different ways to find low-mass dark matter, McKinsey said. Some of them rely on the measurement of a single particle of light, called a photon, while others rely on signals from a single atomic nucleus or an electron, or a very slight collective vibration in atoms known as a phonon.

Rather than ranking existing proposals, the report is intended to "marry the scientific justification to the possibilities and practicalities. We have motivation because we have ideas and we have the technology. That's what's exciting."

He added, "Physics is the art of the possible."

Credit: 
DOE/Lawrence Berkeley National Laboratory

Last-ditch attempt to warn of coalmine harm

image: Scientists warn the Doongmabulla Springs Complex could be permanently damaged if the mine goes ahead.

Image: 
Coast and Country

Groundwater experts from around Australia have repeated calls for further investigations into the potential effects on heritage groundwater reserves in central Queensland if the giant Adani Carmichael coalmine gets the final regulatory go-ahead.

Concerns that the ancient Doongmabulla Springs face a 'reasonable threat of extinction' from Adani's proposed Galilee Basin coalmine are raised in a new position paper, which echoes previous research by CSIRO and Geoscience Australia.

The Queensland Government is due to rule on the groundwater hurdle this week after clearing the way to another environmental concern, supporting Adani's proposed management plan for the endangered black-throated finch.

Experts from Flinders University, RMIT, Monash and Latrobe universities say their report, 'Deficiencies in the scientific assessment of the Carmichael Mine impacts to the Doongmabulla Springs' - now before the Queensland Government - highlights problems with Adani's own claims that the springs are safeguarded by "an impervious layer, restricting water from flowing between the underground aquifers".

"Adani has not properly examined the link between the mine's groundwater drawdown and impacts to the Doongmabulla Springs, which is a fundamental requirement of the Carmichael mine's approvals," says Flinders University Professor of Hydrogeology Adrian Werner, a founding member of National Centre for Groundwater Research and Training.

Instead, Professor Werner - along with Flinders Associate Professor Andy Love, Dr Eddie Banks and Dr Dylan Irvine, Associate Professor Matthew Currell from RMIT University, Professor Ian Cartwright from Monash University and Associate Professor John Webb from Latrobe University - warns the springs face a "plausible threat of extinction".

"Six years of expert advice that the science is flawed does not seem to have resolved critical shortcomings, which have persisted through several iterations of Adani's environmental management plans," says Professor Werner.

"With the deadline for approval approaching, we are compelled to reiterate concerns that flaws in Adani's scientific methods, modelling results, and the proposed 'adaptive management' approach have the potential to seriously mislead decision-makers," he says, pointing to the 2013 Independent Expert Scientific Committee report, the Land Court case of 2014-15 and this year's CSIRO review.

Professor Werner says: "We hope that our report assists the Queensland Government by highlighting the significant risk that the Carmichael Mine will cause the Doongmabulla Springs to become extinct, and will impact other groundwater-dependent ecosystems and water users to a greater degree than has so far been suggested by Adani."

The report pinpoints four areas where Adani's investigation and environmental management strategies do "not stack up against the science":

Adani appears likely to have significantly underestimated future impacts to the Doongmabulla Springs Complex.

Should the Carmichael Mine cause springs within the Doongmabulla complex to cease flowing, the impact could be permanent.

Adani's safeguard against the impacts, namely 'adaptive management', is unsuitable and unlikely to protect the springs from extinction.

Cumulative impacts to the Springs that may result from other mining activities in the Galilee Basin have not been adequately considered.

Credit: 
Flinders University

Engineers use graph networks to accurately predict properties of molecules and crystals

image: This is a schematic illustration of MEGNet models.

Image: 
Chi Chen/Materials Virtual Lab

Nanoengineers at the University of California San Diego have developed new deep learning models that can accurately predict the properties of molecules and crystals. By enabling almost instantaneous property predictions, these deep learning models provide researchers the means to rapidly scan the nearly-infinite universe of compounds to discover potentially transformative materials for various technological applications, such as high-energy-density Li-ion batteries, warm-white LEDs, and better photovoltaics.

To construct their models, a team led by nanoengineering professor Shyue Ping Ong at the UC San Diego Jacobs School of Engineering used a new deep learning framework called graph networks, developed by Google DeepMind, the brains behind AlphaGo and AlphaZero. Graph networks have the potential to expand the capabilities of existing AI technology to perform complicated learning and reasoning tasks with limited experience and knowledge--something that humans are good at.

For materials scientists like Ong, graph networks offer a natural way to represent bonding relationships between atoms in a molecule or crystal and enable computers to learn how these relationships relate to their chemical and physical properties.

The new graph network-based models, which Ong's team dubbed MatErials Graph Network (MEGNet) models, outperformed the state of the art in predicting 11 out of 13 properties for the 133,000 molecules in the QM9 data set. The team also trained the MEGNet models on about 60,000 crystals in the Materials Project. The models outperformed prior machine learning models in predicting the formation energies, band gaps and elastic moduli of crystals.

The team also demonstrated two approaches to overcome data limitations in materials science and chemistry. First, the team showed that graph networks can be used to unify multiple free energy models, resulting in a multi-fold increase in training data. Second, they showed that their MEGNet models can effectively learn relationships between elements in the periodic table. This machine-learned information from a property model trained on a large data set can then be transferred to improve the training and accuracy of property models with smaller amounts of data--this concept is known in machine learning as transfer learning.
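The core representation described above - atoms as graph nodes, bonds as edges, with node states updated from bonded neighbors - can be sketched in a few lines. This is an illustration of the graph-network idea only, not the published MEGNet architecture; the features and the averaging update below are invented for clarity:

```python
# Toy molecular graph: water (H2O). Node features here are just atomic
# numbers; a real model would use learned embeddings and learned update
# functions rather than a plain average.

nodes = [8.0, 1.0, 1.0]    # O, H, H
edges = [(0, 1), (0, 2)]   # the two O-H bonds (undirected)

def message_pass(nodes, edges):
    """One update round: each node averages its own feature with those
    of its bonded neighbors, propagating local chemical environment."""
    neighbors = {i: [] for i in range(len(nodes))}
    for a, b in edges:
        neighbors[a].append(b)
        neighbors[b].append(a)
    return [
        (nodes[i] + sum(nodes[j] for j in neighbors[i])) / (1 + len(neighbors[i]))
        for i in range(len(nodes))
    ]

updated = message_pass(nodes, edges)
print(updated)  # ≈ [3.33, 4.5, 4.5]
```

Stacking several such rounds lets information flow across the whole molecule or crystal, after which the node (and edge) states are pooled into a single vector used to predict a property such as formation energy or band gap.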

Credit: 
University of California - San Diego

Marijuana and fertility: Five things to know

For patients who smoke marijuana and their physicians, "Five things to know about ... marijuana and fertility" provides useful information for people who may want to conceive. The practice article is published in CMAJ (Canadian Medical Association Journal).

Five things to know about marijuana and fertility:

1. The active ingredient in marijuana, tetrahydrocannabinol (THC), acts on the receptors found in the hypothalamus, pituitary and internal reproductive organs in both males and females.

2. Marijuana use can decrease sperm count. Smoking marijuana more than once a week was associated with a 29% reduction in sperm count in one study.

3. Marijuana may delay or prevent ovulation. In a small study, ovulation was delayed in women who smoked marijuana more than 3 times in the 3 months before the study.

4. Marijuana may affect the ability to conceive in couples with subfertility or infertility but does not appear to affect couples without fertility issues.

5. More, and better quality, research is needed into the effects of marijuana on fertility.

Credit: 
Canadian Medical Association Journal

How to improve care for patients with disabilities? We need more providers like them

image: Bonnielin Swenor, Ph.D., M.P.H.

Image: 
Johns Hopkins Medicine

It is common for patients to prefer seeking care from a clinician similar to them -- such as of the same gender, ethnicity and culture -- who can relate to their experiences and make treatment plans that work better for their lives. To meet these patient preferences and improve quality of care, a clinician workforce that matches the diversity of the general population is needed. However, when it comes to patients with disabilities, the chance of getting a clinician "like them" is extremely low, which may lead to patients' reluctance to seek care or follow prescribed interventions and treatments. Meanwhile, without enough scientists with disabilities bringing their perspectives to patient-centered research, the ability to improve care for patients with disabilities is limited.

Why is the representation of people with disabilities so limited in the biomedical workforce?

Bonnielin Swenor, Ph.D., M.P.H., associate professor of ophthalmology at the Johns Hopkins Wilmer Eye Institute and associate professor of epidemiology at the Johns Hopkins Bloomberg School of Public Health, is working to solve this disparity.

Living with low vision herself, Swenor experiences difficulties in many aspects of her life, but devotes her time to researching how to help patients like herself, and assuring those patients that there are ways to overcome the hardships and pursue their goals. Swenor sees herself as more a patient than a researcher, and uses her unique perspective to formulate patient-centered research questions to bring better care to people with visual impairment. She believes that more people with disabilities like her are needed in the biomedical workforce.

In a new editorial published on May 30 in the New England Journal of Medicine, Swenor and Lisa Meeks, a collaborator from University of Michigan Medical School, address barriers to an inclusive workforce and propose a roadmap to guide academic medical institutions toward creating a work environment more inclusive for people with disabilities.

"Although more institutions are embracing diversity and inclusion, people with disabilities still face barriers in pursuing and getting support in their careers," says Swenor. "We are providing employers with recommendations to enhance inclusion of persons with disabilities in these settings."

In the NEJM piece, Swenor and Meeks recommend that academic medical centers include people with disabilities in their diversity efforts, develop centralized ways to pay for accommodations that might be required, and take other actions that would encourage more students with disabilities to pursue careers in medicine.

Credit: 
Johns Hopkins Medicine