Ultra-secure form of virtual money proposed

A new type of money that allows users to make decisions based on information arriving at different locations and times, and that could also protect against attacks from quantum computers, has been proposed by a researcher at the University of Cambridge.

The theoretical framework, dubbed 'S-money', could ensure completely unforgeable and secure authentication, and allow faster and more flexible responses than any existing financial technology, harnessing the combined power of quantum theory and relativity. In fact, it could conceivably make it possible to conduct commerce across the Solar System and beyond, without long time lags, although commerce on a galactic scale is a fanciful notion at this point.

Researchers aim to begin testing its practicality on a smaller, Earth-bound scale later this year. S-money requires very fast computations, but may be feasible with current computing technology. Details are published in the Proceedings of the Royal Society A.

"It's a slightly different way of thinking about money: instead of something that we hold in our hands or in our bank accounts, money could be thought of as something that you need to get to a certain point in space and time, in response to data that's coming from lots of other points in space and time," said Professor Adrian Kent, from Cambridge's Department of Applied Mathematics and Theoretical Physics, who authored the paper.

The framework developed by Professor Kent can be thought of as secure virtual tokens generated by communications between various points on a financial network, which respond flexibly to real-time data across the world and 'materialise' so that they can be used at the optimal place and time. It allows users to respond to events faster than familiar types of money, both physical and digital, which follow definite paths through space.

The tokens can be securely traded without delays for cross-checking or verification across the network, while eliminating any risk of double-trading. One way of guaranteeing this uses the power of quantum theory, the physics of the subatomic world that Einstein famously dismissed as "spooky".

The user's privacy is maintained by protocols such as bit commitment, which is a mathematical version of a securely sealed envelope. Data are delivered from party A to party B in a locked state that cannot be changed once sent and can only be revealed when party A provides the key - with security guaranteed, even if either of the parties tries to cheat.
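
As a rough illustration of that sealed-envelope idea, the sketch below shows a simplified classical, hash-based commitment: party A commits to a value and later reveals it, and party B can verify that nothing was changed in the meantime. This is only an analogy under classical assumptions; the protocols in the paper instead rely on quantum theory and relativistic signalling constraints.

```python
import hashlib
import secrets

# Simplified classical bit-commitment sketch (illustrative analogy only; the
# paper's S-money protocols use quantum theory and relativity, not hashing).

def commit(value: bytes) -> tuple[bytes, bytes]:
    """Party A 'seals the envelope': publish a hash, keep the key secret."""
    key = secrets.token_bytes(32)                    # random nonce held by A
    commitment = hashlib.sha256(key + value).digest()
    return commitment, key

def reveal(commitment: bytes, key: bytes, value: bytes) -> bool:
    """Party B checks the revealed value against the sealed commitment."""
    return hashlib.sha256(key + value).digest() == commitment

commitment, key = commit(b"deliver token at node 7")      # A sends the commitment to B
print(reveal(commitment, key, b"deliver token at node 7"))  # True: value unchanged
print(reveal(commitment, key, b"deliver token at node 9"))  # False: tampering detected
```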

Other researchers have developed theoretical frameworks for 'quantum' money, which is based on the strange behaviour of particles at the subatomic scale. While using quantum money for real world transactions may be possible someday, according to Kent, at the moment it is technologically impossible to keep quantum money secure for any appreciable length of time.

"Quantum money, insofar as it's currently understood, would require long-term storage of quantum states, or quantum memory," said Kent. "This would require an awful lot of resources, and even if it becomes technologically feasible, it may be incredibly expensive."

While the S-money system requires large computational overhead, it may be feasible with current computer technology. Later this year, Kent and his colleagues hope to conduct some proof-of-concept testing working with the Quantum Communications Hub, of which the University of Cambridge is a partner institution. They hope to understand how fast S-money can be issued and spent on a network using off-the-shelf technologies.

"We're trying to understand the practicalities and understand the advantages and disadvantages," said Kent.

Patent applications for the research have been filed by Cambridge Enterprise, the University's commercialisation arm.

Credit: 
University of Cambridge

Arsenic in drinking water may change heart structure

DALLAS, May 7, 2019 - Drinking water that is contaminated with arsenic may lead to thickening of the heart's main pumping chamber in young adults, a structural change that increases the risk for future heart problems, according to new research in Circulation: Cardiovascular Imaging, an American Heart Association journal.

"People drinking water from private wells, which are not regulated, need to be aware that arsenic may increase the risk for cardiovascular disease. Testing those wells is a critical first step to take action and prevent exposure," said Gernot Pichler, M.D., Ph.D., M.Sc., lead author of the study and medical specialist for Internal Medicine, Department of Cardiology at Hospital Hietzing/Heart Center Clinic Floridsdorf in Vienna, Austria, scientific collaborator at INCLIVA Health Research Institute in Valencia, Spain, and a visitor scholar in the Department of Environmental Health Sciences at Columbia University in New York City.

People are most frequently exposed to arsenic, a toxic metalloid, through drinking water in areas where groundwater is contaminated, including many American Indian tribal communities and other rural and suburban communities in the United States. Several previous studies have shown that arsenic exposure raises the risk of heart disease and its risk factors, including high blood pressure and diabetes. This is the first study to examine this question in young American Indians in Oklahoma, Arizona and North and South Dakota.

Here, researchers reviewed data from the Strong Heart Family Study, a study evaluating cardiovascular risk factors among American Indians. Arsenic exposure was measured in urine samples from 1,337 adults (average age 30.7 years, 61% female) and the size, shape and function of their hearts were assessed using ultrasound (echocardiography). None of the participants had diabetes or heart disease at the start of the five-year study.

Overall, arsenic exposure was higher than in the general United States population, but lower than that found in other studies conducted in Mexico and Bangladesh. With a two-fold increase in arsenic in the urine (this per-doubling scaling is sketched below), the researchers found:

47% greater chance of thickening of the heart's main pumping chamber (left ventricle) in the group as a whole; and

58% greater chance of thickening of the left ventricle in participants with elevated or high blood pressure (blood pressure at least 120/80 mm Hg or using pressure-lowering medication).
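
To make the "per two-fold increase" wording concrete, here is a small illustrative calculation (a simplified, hypothetical sketch, not the authors' actual statistical model): when exposure is analysed on a log2 scale, a single coefficient describes the change in odds for each doubling of urinary arsenic, and the reported 47% and 58% figures correspond to odds ratios of about 1.47 and 1.58 per doubling.

```python
import math

# Illustrative only: maps the reported "per two-fold increase" findings onto
# per-doubling odds ratios; this is not the study's actual regression model.
or_whole_group = 1.47   # 47% greater chance of left-ventricle thickening, whole group
or_elevated_bp = 1.58   # 58% greater chance in the elevated/high blood pressure subgroup

# On a log2 exposure scale, the coefficient is the log-odds change per doubling.
beta_whole = math.log(or_whole_group)
beta_bp = math.log(or_elevated_bp)
print(f"log-odds change per doubling (whole group): {beta_whole:.3f}")

# Under this multiplicative model, a 4-fold difference in urinary arsenic
# (two doublings) corresponds to squaring the per-doubling odds ratio.
print(f"odds factor for a 4-fold exposure difference: {or_whole_group ** 2:.2f}")
```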

"The stronger association in subjects with elevated blood pressure suggests that individuals with pre-clinical heart disease might be more prone to the toxic effects of arsenic on the heart," Pichler said.

The study is limited by having only one measure of arsenic exposure and by the lack of long-term follow-up of the participants. Although this study was performed in tribal populations in the northern, central and southwestern United States, the results are likely to be generalizable to millions of people in other rural locations exposed to low or moderate levels of arsenic in their water, according to Pichler.

"The study raises the question of whether the changes in heart structure are reversible if exposure is reduced. Some changes have occurred in water sources in the study communities, and it will be important to check the potential health impact of reducing arsenic exposure," Pichler said.

"Observational studies can stimulate future research on genetic, environmental and clinical factors to shed light on the relationship between arsenic and cardiovascular disease," said editorial author, Rajiv Chowdhury, M.B.B.S., Ph.D., Cardiovascular Epidemiology Unit, Department of Public Health and Primary Care, University of Cambridge, United Kingdom "These studies are important since cardiovascular disease remains the single leading cause of adult premature death worldwide and millions of individuals globally are exposed to arsenic and other metal contaminants."

Credit: 
American Heart Association

Highly qualified staff at state preschools overcome private sector staffing advantage

Better-qualified staff maintain the quality of state-funded preschools, making up for the larger number of children per staff member in comparison to private and voluntary settings, finds a new study by researchers at the University of Oxford. The researchers also show that the quality of private early years settings can be predicted by staff qualifications and, for voluntary settings, by an in-house training plan and a better staff-to-child ratio.

In the study, published in Frontiers in Education, the researchers also compare data from before and after substantial policy change in the UK between 1999 and 2014, which was aimed at increasing the uptake and improving the quality of early years education and care. The comparison indicates that such policy changes can have powerful effects in improving preschool and nursery settings for 3 to 4 year olds, with implications for long-term child, and potentially adult, well-being.

"A better staff to child ratio leads to improvements in quality, but staff qualifications and training is the most important factor," says Edward Melhuish, a Professor of Human Development at the University of Oxford. "While there is still a long way to go, the evidence suggests that the policy changes in the UK have led to higher-quality early childhood education and care."

Staff training and qualifications matter

Substantial policy changes, influenced by research highlighting the benefits of quality education and care for preschool children, have been implemented in the UK since 1999. These changes aimed to increase uptake through state-funded provision and improve the quality of teaching, the curriculum and the experiences of the child by enhancing the training and qualifications of staff.

"We wanted to understand how policy changes might affect the everyday experiences of children in ways that might benefit their long-term development," explains Prof. Melhuish. "We used observations of nearly 600 early childhood education and care settings in England and collected information on training, qualifications, ratios and other factors through staff interviews."

The researchers found that factors predicting the quality of a setting differed, depending on how they were funded and managed. Staff qualifications predicted quality at private (for profit) settings, whereas at voluntary settings, where staff qualifications were similar, a staff-training plan and lower numbers of children per staff member were linked to higher quality. State-funded settings tended to have higher quality ratings and it is thought the presence of highly qualified staff maintained this quality despite less-favorable child to staff ratios.

"Our study shows that having well-trained and qualified staff increases the quality of education and care in a child's early years. Also, better staff to child ratios mean staff can spend more time in one-to-one interaction with children and this is very beneficial," explains Prof. Melhuish.

Government policy could make a real difference

The comparison of data sets from 1998-1999 and 2014-2015, which were before and after a period of substantial policy change in the UK, revealed that the quality of early years education and care has risen significantly over this time. It is hoped the findings from this study can provide important indications about ways that child development may be enhanced through policy change.

"The research and evidence-based policy approach in the UK has lessons for other countries, as acknowledged by international organizations such as the Organization for Economic Co-operation and Development. Existing evidence would lead us to expect that these changes will have long-term benefits for the population and future economic development of the country, as economic development in the modern world is increasingly dependent on the education of the workforce."

Future work should focus on enhancing staff training, suggests Melhuish.

"There is a need to enhance staff qualifications and in-service professional development, because training on the job is so effective. So much existing training is inadequate and based on ideology rather than evidence of what actually helps children's development."

Credit: 
Frontiers

Contracts give Coca-Cola power to 'quash' health research, paper alleges

A study of over 87,000 documents obtained through Freedom of Information requests has revealed a contract mechanism that could allow Coca-Cola to "quash" findings from some of the health research it funds at public universities in the US and Canada.

The study, published today in the Journal of Public Health Policy, identified several clauses in legal documents that give the company early sight of any findings, combined with the right to "terminate without reason" and walk away with the data and intellectual property.

Taken together, these clauses could suppress "critical health information", and indeed may have done so already, according to the study's authors. Much of the research Coca-Cola supports is in the fields of nutrition, physical inactivity and energy balance.

The authors argue that the clauses contravene Coca-Cola's commitments to transparent and "unrestricted" support for science, which came after criticism of the opaque way some major food corporations fund health research.

Researchers from the University of Cambridge, London School of Hygiene and Tropical Medicine, University of Bocconi, and US Right to Know, call on corporate funders to publish lists of terminated studies. They say scientists should publish agreements with industry to reassure the public that findings are free from influence.

"It is certainly true that the contracts we have found allow for unfavourable developments or findings to be quashed prior to publication," said lead author Dr Sarah Steele, a policy researcher from Cambridge's Department of Politics and International Studies.

"Coca-Cola have declared themselves at the forefront of transparency when it comes to food and beverage giants funding health research. In fact, our study suggests that important research might never see the light of day and we would never know about it.

"We are already hearing accusations from experts in nutrition that the food industry is copying tactics from big tobacco's playbook. Corporate social responsibility has to be more than just shiny websites stating progressive policies that get ignored."

Consumption of high calorie, low nutrient food and drink is believed to be a major factor in the childhood obesity epidemic. Last year, the UK government introduced a "sugar tax" on many soft drinks, including Coca-Cola's flagship product.

US Right to Know, a non-profit consumer and public health research group, submitted 129 FOI requests between 2015 and 2018 relating to academics at North American institutions who received Coca-Cola funding.

The research team combed through the vast tranche of resulting documents and discovered five research agreements made with four universities: Louisiana State University, University of South Carolina, University of Toronto and the University of Washington.

The funded work includes "energy flux and balance" studies and research on beverage intake during exercise. Coca-Cola's own transparency website declares that scientists retain full control over their research and the company has no right to prevent publication of results.

However, while contracts show Coca-Cola does not control day-to-day conduct, the company retains various rights throughout the process. These include the right to receive updates and comment on findings prior to research publication, and the power to terminate studies early without reason.

The documents yielded by the FOI requests contained no firm examples of Coca-Cola suppressing unfavourable research, although the study authors say "what is important is that the provision exists". All documents relating to the contracts are now accessible on the US Right to Know website.

Emails show one scientist expressing uncertainty over his study termination ("...they have not communicated with us in several months") and concern over intellectual property.

Another scientist is seen arguing that his contract is "very restrictive for an 'unrestricted grant'".

"These contracts suggest that Coke wanted the power to bury research it funded that might detract from its image or profits," said Gary Ruskin, co-director of US Right to Know.

"With the power to trumpet positive findings and bury negative ones, Coke-funded science seems more like an exercise in public relations."

The researchers acknowledge that the food and beverage industry may be updating research contracts in line with new public commitments, but without seeing those contracts it is hard to know.

They say their Coca-Cola case study suggests a continued lack of transparency that should be remedied with "hard" information on funding, rather than relying on self-reported conflicts of interest.

"Journals should require authors of funded research to upload the research agreements for studies as appendices to any peer-reviewed publication," said Steele.

"The lack of robust information on input by industry and on studies terminated before results are published, makes it impossible to know how much of the research entering the public domain reflects industry positions."

Credit: 
University of Cambridge

Developed countries benefit economically from counterterrorism efforts

A new study in Oxford Economic Papers suggests that developed countries may see significant economic gains from their efforts to combat terrorist threats. Developing countries, in contrast, appear to suffer economically from counterterrorism efforts.

Major trading countries like the United States or trading blocs like the European Union are targets of terrorist organizations. Typically, these groups, such as al-Qaida or ISIS, locate in developing countries that lack the resources to stop them from operating. Over the last two decades, this resource scarcity has often been compounded by radical ideologies that are more easily implanted among disaffected people, thereby supplying terrorist recruits. As a consequence, terrorist hotbeds arise in remote and difficult-to-govern areas such as Afghanistan, Pakistan, Somalia, Syria, and Yemen. To protect against such attacks, targeted countries deploy defensive counterterrorism measures at home, which deflect attacks abroad. In addition, terrorism disrupts the production of goods and services in an economy. These production considerations affect the global supply and demand of goods, thus changing trade patterns and the prices of imports and exports.

With their limited means, terrorist organizations target both types of countries. Greater defensive counterterrorism by either country reduces terrorism at home, but potentially increases it in the other country as the terrorist group redirects its attacks. Such defensive measures may take the form of enhanced border security and greater surveillance.

Counterterrorism also limits the production of manufactured goods by drawing on closely related resources. Guns, surveillance cameras, helicopters, police vehicles, communication grids, and other manufactured goods are required for effective defensive counterterrorism efforts. Defensive measures also require labour in terms of guards and police, who must have equipment to protect potential targets and to coordinate defensive operations.

This paper investigates the interplay of trade and terrorism under free trade between a developed nation (e.g., the United States) that exports a manufactured good to, and imports a primary product from, a developing nation (e.g., Pakistan). Terrorist organizations target both types of nations and reduce their attacks in response to a nation's defensive counterterrorism efforts. This leads the developed nation to raise its counterterrorism efforts, thus compounding its overprovision of these measures. By contrast, developing nations limit their defensive countermeasures because of the falling price of their exports, the primary products they sell to developed countries.

When deciding which defensive counterterrorism measures to pursue, the developing country must weigh trade loss against its gain from containing terrorism at home. The opposite is true for the developed country, whose independent defensive choice not only augments the country's trade position as manufactured goods, which it produces, become relatively more expensive, but also deflects potential attacks abroad. Thus, the developed country has an incentive to increase its defensive measures. This asymmetry is a novel finding. The prognosis for global welfare thus is better if the developing country is more afflicted by terrorism, so that its initial overprovision is relatively greater than that of the developed country.

Next, consider proactive counterterrorism measures which limit terrorist resources and prowess and are usually undersupplied by targeted countries. The developed country is now incentivized to increase its proactive efforts relative to the small-country case. Since it both produces and benefits from manufactured security goods, it will benefit from producing more of them. The developing country, however, is incentivized to decrease its substandard proactive efforts. However, the global welfare implications are now different as the developed country does more to address its underprovision for the sake of both security and economic gains, which can improve global welfare.

The longer-term impact of these measures is to make exported primary goods cheaper. This asymmetry between the proactive measures of the targeted countries highlights how trade adds a novel consideration that results in somewhat more optimistic welfare findings.

In terms of welfare consequences, the asymmetry between how developing and developed countries prioritize their counterterrorism choices now switches, with the developed country improving its efficiency. Since the developed country is generally the main supplier of proactive counterterrorism measures, the trade effects of counterterrorism measures are likely to improve global welfare.

"This article shows that counterterrorism must be investigated in a way that accounts for subtle, but important, trade consequences," said one of the paper's authors, Todd Sandler.

Credit: 
Oxford University Press USA

Study reveals final fate of levitating Leidenfrost droplets

image: A new study shows the ultimate fate of Leidenfrost droplets, liquid drops that levitate above very hot surfaces. Larger drops explode violently with an audible crack. Smaller ones simply shrink and fly away.

Image: 
Lyu/Mathai

PROVIDENCE, R.I. [Brown University] -- Splash some water on a hot skillet, and you'll often see the droplets sizzle and quickly evaporate. But if you really crank up the heat, something different happens. The droplets stay intact, dancing and skittering over the surface in what's known as the Leidenfrost effect. Now a team of researchers has detailed how these Leidenfrost droplets meet their ultimate fate.

In a paper published in Science Advances, the team shows that Leidenfrost droplets that start off small eventually rocket off the hot surface and disappear, while larger drops explode violently with an audible "crack." Whether the droplet finally explodes or escapes depends on its initial size and the amount of solid contaminants -- ambient dust or dirt particles -- the droplet contains.

In addition to explaining the cracking sound that Johann Gottlob Leidenfrost reported hearing in 1756 when he documented the phenomenon, the findings could prove useful in future devices -- cooling systems or particle transport and deposition devices -- that may make use of the Leidenfrost effect.

"This answers the 250-year-old question of what produces this cracking sound," said Varghese Mathai, a postdoctoral researcher at Brown University and the study's co-lead author. "We couldn't find any prior attempts in the literature to explain the source of the crack sound, so it's a fundamental question answered."

The research, published in Science Advances, was a collaboration between Mathai at Brown, co-lead author Sijia Lyu from Tsinghua University and other researchers from Belgium, China and the Netherlands.

In the years since Leidenfrost observed this peculiar behavior in water droplets, scientists have figured out the physics of how the levitation phenomenon occurs. When a liquid drop comes into contact with a surface that's well beyond the liquid's boiling point, a cushion of vapor forms beneath the droplet. That vapor cushion supports the drop's weight. The vapor also insulates the drop and slows its rate of evaporation while enabling it to glide around as if it were on a magic carpet. For water, this happens when it encounters a surface in excess of around 380 degrees Fahrenheit. This Leidenfrost temperature varies for other liquids like oils or alcohol.

A few years back, a different research team observed the ultimate fate of tiny Leidenfrost drops, showing that they steadily shrink in size and then suddenly launch off the surface and disappear. But that didn't explain the cracking sound Leidenfrost heard, and no one had done a detailed study to see where that sound came from.

For this new study, the researchers set up cameras at recording speeds up to 40,000 frames per second and sensitive microphones to observe and listen to individual drops of ethanol on surfaces above the Leidenfrost temperature. They found that when the droplets started out relatively small, they behaved in the way that the previous researchers had observed -- shrinking and then escaping. At a certain point, when these droplets become sufficiently small and lightweight, the vapor flow around them suddenly flings them into the air, where they finally disappear.

But when droplets start out a millimeter in diameter or larger, the study showed, something very different happens. The larger drops steadily shrink, but they don't get small enough to fly away. Instead, the larger droplets steadily sink toward the hot surface below. Eventually the droplet makes contact with the surface, where it explodes with an audible crack. So why don't those larger droplets shrink down enough to take flight like the droplets that start out smaller? That, the researchers say, is a matter of contaminants.

No liquid is ever perfectly pure. All liquids carry tiny particle contaminants -- dust and other particles that influence the Leidenfrost process. As droplets shrink, the concentration of particle contaminants within them increases. That's especially true for drops that start out larger, because they have a higher absolute number of particles to begin with. So for drops that start out large, the researchers surmised, the concentration of contaminants can become so high that the particles accumulate into a solid shell along the droplet's surface. That shell cuts off the supply of vapor that forms the cushion beneath. As a result, the droplet sinks toward the hot surface beneath and explodes on contact.
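
A back-of-the-envelope sketch of that argument, using invented numbers purely for illustration: the number of contaminant particles in a drop stays fixed while its volume shrinks, so the concentration grows roughly as the inverse cube of the radius. A drop that starts out larger carries more particles, so it can reach a shell-forming concentration before it ever shrinks to the size at which clean drops fly away.

```python
import math

# Illustrative scaling only; the thresholds below are invented, not measured values.
SHELL_CONC = 5e4        # hypothetical concentration (particles per mm^3) at which a shell forms
TAKEOFF_RADIUS = 0.05   # hypothetical radius (mm) at which a drop becomes light enough to fly off

def fate(initial_radius_mm: float, initial_conc_per_mm3: float) -> str:
    """Shrink the drop and report whether it explodes (shell forms) or takes off."""
    # Particle count is fixed as the drop evaporates.
    n_particles = initial_conc_per_mm3 * (4 / 3) * math.pi * initial_radius_mm ** 3
    r = initial_radius_mm
    while r > TAKEOFF_RADIUS:
        r *= 0.99  # evaporation step
        conc = n_particles / ((4 / 3) * math.pi * r ** 3)  # concentration grows ~ 1/r^3
        if conc >= SHELL_CONC:
            return f"explodes at r = {r:.3f} mm (contaminant shell forms)"
    return "shrinks below take-off size and flies away"

print("0.3 mm drop:", fate(0.3, 100.0))   # starts small: escapes before the shell can form
print("1.0 mm drop:", fate(1.0, 100.0))   # starts large: shell forms first, drop sinks and explodes
```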

To test this idea, the researchers observed liquid droplets that had different levels of contamination with titanium dioxide microparticles. They found that as the contaminant level increased, so did the average size of the droplets at the moment of explosion. The researchers were also able to image the contaminant shells among the explosion debris.

Taken together, the evidence suggests that even minute quantities of contaminants play a key role in determining the fate of Leidenfrost droplets. The finding could have practical applications beyond just explaining the cracking sound that Leidenfrost first reported.

Recent research has shown that the direction in which Leidenfrost drops move can be controlled. That could make them useful as levitating particle carriers in microelectronic fabrication processes. There's also the possibility of using Leidenfrost drops in heat exchangers that are designed to keep electronic components at specific temperatures.

"You can use these contaminants to change the lifetime of a Leidenfrost droplet," Mathai said. "So you can figure out in principle where it's going to deposit the particles, or control how long the heat transfer persists by fine-tuning the amount of contaminants."

The research results could potentially be used to develop new purity-testing methods for water and other liquids, because the size at which droplets explode is so closely linked to their contaminant load.

Credit: 
Brown University

Researchers find protein that suppresses muscle repair in mice

image: From left, U. of I. cell and developmental biology professor Jie Chen and graduate students Kook Son and Jae-Sung You discovered a new role for LRS in muscle repair. The study was conducted in mice.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- Researchers report that a protein known to be important to protein synthesis also influences muscle regeneration and regrowth in an unexpected manner. The discovery, reported in the Journal of Clinical Investigation, could one day lead to new methods for treating disorders that result in muscle weakness and loss of muscle mass, the researchers said.

Scientists have long studied leucine tRNA-synthetase, or LRS, for its role in protein synthesis, said University of Illinois cell and developmental biology professor Jie Chen, who led the research.

"In the last 5-10 years, scientists have begun to realize that LRS and other proteins like it have functions independent of protein synthesis," Chen said. "Previously, my lab and other labs discovered that one of such functions of LRS is that it can regulate cell growth. Our new study is the first report of its function in muscle regeneration."

Chen and her colleagues used mammalian cell cultures and mice in the new study. They compared the speed of muscle repair in mice with normal and lower-than-normal LRS levels. They discovered that mice with lower levels of LRS in their tissues recovered from muscle injury much more quickly than their counterparts with normal LRS levels.

A 70% reduction of LRS proteins in the cell does not affect protein synthesis, Chen said.

"But lower levels do positively influence muscle regeneration," she said. "We saw that, seven days after injury, the repaired muscle cells are bigger when LRS is lower."

While it is not possible to lower LRS in human subjects, the researchers sought another method to block its effects.

Chen and her colleagues further unraveled the exact molecular mechanism by which LRS influences muscle regeneration. This led them to hypothesize that a nontoxic inhibitor that their collaborators in South Korea previously developed would block the effect of LRS on muscle cells without interfering with its role in protein synthesis.

"We showed that this inhibitor works both in mammalian cells and in mice," Chen said. Muscle repair occurred more rapidly - and the regenerated muscles were stronger - when the inhibitor was present.

As the science progresses, researchers are gaining greater insights into the multifunctionality of proteins once thought to have only a single role in cells, Chen said.

"We now understand that 'protein moonlighting,' where one protein does many different things in the cell, is the norm," she said.

Chen and her colleagues are investigating the effect of LRS on older mice, which tend to rebuild their muscles more slowly and have less muscle tone than younger mice.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Mechanics, chemistry and biomedical research join forces for noninvasive tissue therapy

image: Jeffrey Moore, left, King Li, postdoctoral researcher Gun Kim and graduate student Abigail Halmes have collaborated to develop an ultrasound-activated synthetic molecule that can emit light deep inside biologic tissue for a variety of medical uses and therapies.

Image: 
Photo by Fred Zwicky

CHAMPAIGN, Ill. -- A fortuitous conversation between two University of Illinois scientists has opened a new line of communication between biomedical researchers and the tissues they study. The new findings, reported in the Proceedings of the National Academy of Sciences, show that high-intensity focused ultrasound waves can penetrate biological tissue to activate molecules able to perform specific tasks.

The research, conducted in vitro and in mice, addresses the challenges of noninvasive access to deep tissue for therapeutic purposes without causing permanent damage. The study successfully demonstrates the ability to trigger chemical reactions on demand, in a very targeted manner while using a technology already approved for medical use.

"In the broadest sense, we are trying to develop remote-controlled systems that can eventually be used in biomedical applications," said King Li, the dean of the Carle Illinois College of Medicine, a researcher at the Beckman Institute for Advanced Science and Technology at Illinois and a study co-author.

"I learned that King was interested in finding a way to remotely activate genes using light - a field called optogenetics," said Jeffrey Moore, the director of the Beckman Institute, a chemistry professor and a study co-author. "This presented a great opportunity to tell him about my research in synthetic polymer chemistry and mechanics."

Moore studies synthetic molecules called mechanophores that respond to force by changing color or generating light - something he believed could harness the mechanical force of an ultrasound wave and trigger a chemical reaction that emits light. The concept is exactly what Li was seeking.

Light cannot travel through opaque material, but ultrasound waves - which have a well-documented safety record - can, the researchers said.

"Light has a limited penetration range in opaque materials, including living tissues," Li said. "The ability to use ultrasound to penetrate opaque materials and then trigger mechanophores to produce light deep within these materials will open up many possibilities for applications such as gene activation."

Although the researchers have successfully demonstrated remote generation of light in biologic tissue without causing damage, the intensity of that light is still not enough for optogenetic applications.

"We are getting close," Moore said. "When we completed the study, we were within about a factor of 10 of the light intensity needed to switch on genes, but now we are closer to a factor of two."

The interdisciplinary team of study co-authors, which includes electrical and computer engineering professor Michael Oelze and Beckman Institute researchers Gun Kim, Vivian Lau and Abigail Halmes, continues to refine the technique and seek other biomedical applications.

"This combination of high-intensity focused ultrasound and mechanophores can be utilized for many applications, and light production is only the beginning," Li said. "We are already actively exploring other applications."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Evaluating the impact of user interface changes on time spent searching for prior exams

Leesburg, VA, May 5, 2019--The upgraded picture archiving and communication system (PACS) improves on the prior system by reducing the time spent searching for prior studies and the total time spent reading studies, according to a study to be presented at the ARRS 2019 Annual Meeting, set for May 5-10 in Honolulu, HI.

Penn State Health introduced a PACS update whose search dictionary covers a wider range of examination types. The previous system sorted studies by modality and body system. The upgraded search system allowed modality and body part to further narrow the examination type. The newer dictionary also allowed the system to automatically open a user-defined number of prior studies specific to the body part being imaged. The study was conducted to evaluate whether these changes reduced the time radiologists spent searching for prior studies for each examination.

Attending radiologists were video recorded using both systems. 219 studies (204 plain film) using the older PACS were evaluated, and 110 studies (98 plain film) after the system upgrade. The average time radiologists spent searching for a prior study improved from 6.1 seconds to 4.6 seconds, with the median improving from 4 seconds to 2. The average time per study interpretation went from 137 seconds using the old system to 113 on the upgraded system. The maximum time spent searching was reduced by almost half, from 40 to 25 seconds.
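
For context, the relative improvements implied by those timings work out as follows (simple arithmetic on the figures reported above):

```python
# Percentage reductions implied by the reported timings (illustrative arithmetic only).
timings = {
    "mean search time (s)":         (6.1, 4.6),
    "median search time (s)":       (4.0, 2.0),
    "mean interpretation time (s)": (137.0, 113.0),
    "max search time (s)":          (40.0, 25.0),
}

for metric, (old, new) in timings.items():
    reduction = 100 * (old - new) / old
    print(f"{metric}: {old:g} -> {new:g}  ({reduction:.0f}% reduction)")
```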

The results indicate that the upgraded PACS system saved attending radiologists time searching for prior studies and significantly reduced their total time spent reading studies.

Credit: 
American Roentgen Ray Society

DNA test is an effective cervical cancer screening tool for women in low-income countries

image: An inexpensive DNA-based test for human papillomavirus (HPV) can effectively be deployed in low- and middle-income countries such as Honduras, where prevalence of cervical cancer is high due to lack of screening resources.

Image: 
Tsongalis Laboratory

LEBANON, NH - Cervical cancer is a major issue in low- and middle-income countries due to the lack of adequate screening such as routine Pap smear testing. These countries have high incidences of cervical cancer linked to human papillomavirus (HPV). Due to lack of resources for cancer screenings, these countries account for 85% of all cervical cancer cases.

A group of researchers from Dartmouth's Norris Cotton Cancer Center, led by Gregory Tsongalis, PhD, has introduced an inexpensive DNA-based testing protocol for HPV in Honduras. The team found that of 1,732 women screened, 28% were positive for a high-risk HPV type and, of those, 26% had more than one HPV infection. Results also showed that the most common HPV genotypes detected during testing were different than those commonly found in the United States. Their findings, "Screening for Human Papillomavirus in a Low- and Middle-Income Country," are newly published in ASCO's Journal of Global Oncology.
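
In absolute terms, those percentages translate to roughly the following counts (illustrative arithmetic on the reported figures):

```python
# Approximate counts implied by the reported percentages (illustrative arithmetic only).
screened = 1732
hr_positive = round(screened * 0.28)              # women positive for a high-risk HPV type
multiple_infections = round(hr_positive * 0.26)   # of those, women with more than one HPV infection

print(f"high-risk HPV positive: ~{hr_positive} of {screened}")
print(f"with more than one HPV infection: ~{multiple_infections}")
```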

"We have shown that cervical cancer screening can be implemented in low-resource settings using this method, and that women are very interested and engaged in testing and follow-up clinic visits when necessary," says Tsongalis. "This study also identified something we were not expecting and that is a very significant difference in the types of high-risk HPV that we were detecting."

Such findings could have profound implications for vaccination programs. "The causes of cervical cancer, while viral in nature, are not always the same type of virus and that could impact aggressiveness of disease, vaccinations and therapies," says Tsongalis.

The team would like to use their findings to guide studies of actual cervical cancer tissue and also to formulate therapeutic vaccine trials. "Being able to screen individuals who have never been tested before and studying the impact of the testing on their healthcare as well as our understanding of the biology of the disease is most exciting," says Tsongalis.

Credit: 
Dartmouth Health

Study asks patients' input to improve the hospital experience

image: Luci Leykum, M.D., of the Long School of Medicine at UT Health San Antonio, led the i-HOPE Study that surveyed 499 stakeholders, resulting in 11 priority research questions that can be used to stimulate the national conversation on changes patients desire to see in hospital care.

Image: 
UT Health San Antonio

SAN ANTONIO - American hospitals engage in continuous quality and safety improvement, but information remains scarce on what patients, families and caregivers themselves most want to change about their hospital experiences.

The i-HOPE Study, led by Luci Leykum, M.D., M.B.A., M.Sc., of UT Health San Antonio, sought to give patients, families and other stakeholders a voice in setting priorities for improving hospital care. Eight hospitalist researchers and their patient partners conducted the study, in which 499 patients, caregivers, health care providers and researchers stated their priority unanswered questions to improve hospital care. Respondents included 244 patients and caregivers. Forty-seven organizations partnered with the Society of Hospital Medicine to conduct the study.

Out of nearly 800 submitted questions, 11 were identified as top priorities. Topics included shared decision-making, patient-provider communication, care transitions, telemedicine and confusion about medications. "If answered, these questions could lead to significant improvements in hospitalization," Dr. Leykum said.

Two-way communication

The top-ranked question is, "What interventions ensure that patients share in decision making regarding their goals and plans of care?" Studies before i-HOPE showed that while physicians were skilled at providing health information, they were less skillful at seeking feedback from patients, assessing patients' level of understanding, or meaningfully incorporating patient preferences into treatment plans.

Communication between physician and patient is crucial throughout a patient's hospital stay, from discussing treatment options to making joint decisions to knowing who to call after discharge, the study authors wrote. "Relationships between patients, caregivers and providers are critical for effective solutions and represent an important area for improvement," Dr. Leykum said. "i-HOPE showed this."

Committed team of diverse voices

The study has limitations. For example, although patients, caregivers and patient and family advisory councils were included from across the country, they may not be representative of all patients because the i-HOPE group of investigators is already engaged in improving health care delivery.

The study also has strengths. Questions were identified and prioritized by "a diverse group of voices and perspectives that typically are not included when prioritizing hospital research and improvement efforts," the authors wrote. The innovative partnership between researchers, patients, caregivers and stakeholders ensures the relevance of the results.

Driving the national conversation

"We hope that patients and caregivers will use our results to advocate for research and improvement in areas that matter the most to them," the authors noted. They also hope the results will drive a national conversation about how best to address the priority areas. Details on how the study was conducted are available at i-HOPE Study. The Twitter handle is @iHOPEstudy. The 11 priority questions are listed here.

"We invite patients and caregivers to have their seat at the table," Dr. Leykum said.

Credit: 
University of Texas Health Science Center at San Antonio

This artificial brain synapse is fast, efficient and durable

image: An array of artificial synapses designed by researchers at Stanford and Sandia National Laboratories can mimic how the brain processes and stores information.

Image: 
Armantas Melianas and Scott Keene

The brain's capacity for simultaneously learning and memorizing large amounts of information while requiring little energy has inspired an entire field to pursue brain-like - or neuromorphic - computers. Researchers at Stanford University and Sandia National Laboratories previously developed one portion of such a computer: a device that acts as an artificial synapse, mimicking the way neurons communicate in the brain.

In a paper published online by the journal Science on April 25, the team reports that a prototype array of nine of these devices performed even better than expected in processing speed, energy efficiency, reproducibility and durability.

Looking forward, the team members want to combine their artificial synapse with traditional electronics, which they hope could be a step toward supporting artificially intelligent learning on small devices.

"If you have a memory system that can learn with the energy efficiency and speed that we've presented, then you can put that in a smartphone or laptop," said Scott Keene, co-author of the paper and a graduate student in the lab of Alberto Salleo, professor of materials science and engineering at Stanford who is co-senior author. "That would open up access to the ability to train our own networks and solve problems locally on our own devices without relying on data transfer to do so."

A bad battery, a good synapse

The team's artificial synapse is similar to a battery, modified so that the researchers can dial up or down the flow of electricity between the two terminals. That flow of electricity emulates how learning is wired in the brain. This is an especially efficient design because data processing and memory storage happen in one action, rather than a more traditional computer system where the data is processed first and then later moved to storage.

Seeing how these devices perform in an array is a crucial step because it allows the researchers to program several artificial synapses simultaneously. This is far less time consuming than having to program each synapse one-by-one and is comparable to how the brain actually works.

In previous tests of an earlier version of this device, the researchers found their processing and memory action requires about one-tenth as much energy as a state-of-the-art computing system needs in order to carry out specific tasks. Still, the researchers worried that the sum of all these devices working together in larger arrays could risk drawing too much power. So, they retooled each device to conduct less electrical current - making them much worse batteries but making the array even more energy efficient.

The 3-by-3 array relied on a second type of device - developed by Joshua Yang at the University of Massachusetts, Amherst, who is co-author of the paper - that acts as a switch for programming synapses within the array.

"Wiring everything up took a lot of troubleshooting and a lot of wires. We had to ensure all of the array components were working in concert," said Armantas Melianas, a postdoctoral scholar in the Salleo lab. "But when we saw everything light up, it was like a Christmas tree. That was the most exciting moment."

During testing, the array outperformed the researchers' expectations. It performed with such speed that the team predicts the next version of these devices will need to be tested with special high-speed electronics. After measuring high energy efficiency in the 3-by-3 array, the researchers ran computer simulations of a larger 1024-by-1024 synapse array and estimated that it could be powered by the same batteries currently used in smartphones or small drones. The researchers were also able to switch the devices over a billion times - another testament to their speed - without seeing any degradation in their behavior.

"It turns out that polymer devices, if you treat them well, can be as resilient as traditional counterparts made of silicon. That was maybe the most surprising aspect from my point of view," Salleo said. "For me, it changes how I think about these polymer devices in terms of reliability and how we might be able to use them."

Room for creativity

The researchers haven't yet submitted their array to tests that determine how well it learns but that is something they plan to study. The team also wants to see how their device weathers different conditions - such as high temperatures - and to work on integrating it with electronics. There are also many fundamental questions left to answer that could help the researchers understand exactly why their device performs so well.

"We hope that more people will start working on this type of device because there are not many groups focusing on this particular architecture, but we think it's very promising," Melianas said. "There's still a lot of room for improvement and creativity. We only barely touched the surface."

Credit: 
Stanford University

Industry-ready process makes plastics chemical from plant sugars

MADISON, Wis. -- Developing renewable, plant-based alternatives for petroleum-derived chemicals is a major piece of the effort to transition away from a fossil-fuel based economy toward a more sustainable and environmentally friendly bio-based economy. But integration of novel and unproven technology into existing industrial systems carries an element of risk that has made commercialization of such advances a significant challenge.

In new research, published recently in the journal Energy and Environmental Science, a team from the Great Lakes Bioenergy Research Center and the University of Wisconsin-Madison describes an efficient and economically feasible process for producing HMF -- 5-hydroxymethylfurfural, a versatile plant-derived chemical considered crucial for building a renewable economy.

What's more, the process is simple and compatible with the existing infrastructure in the high fructose corn syrup industry, the researchers show.

"We integrated into a current process to reduce the initial risk quite a bit and decrease the initial capital required to put things on the ground to prove the technology," says Ali Hussain Motagamwala, who led the project while a UW-Madison graduate student in chemical and biological engineering.

HMF can be used to make a wide range of chemicals, plastics and fuels. It is an appealing candidate for commercialization in part because there is already an established market for many of the products made with HMF. One is a fully plant-derived version of polyethylene terephthalate (PET), the common plastic used to make beverage bottles and other food packaging. For example, Coca-Cola, Danone, and BASF have already invested in the production of furandicarboxylic acid, an HMF-derived chemical used to make 100 percent bio-based plastic bottles.

To date, however, HMF's use has been limited by its high production cost. Bio-based plastics are currently more expensive than their petroleum counterparts, largely due to the scale of the existing manufacturing processes.

"There is a demand for sustainable alternatives. The question is, how cost-competitive can we be with petroleum-based products?" says Motagamwala.

UW-Madison chemical and biological engineering Professor James A. Dumesic, senior author of the paper, has been working for more than two decades on technologies to sustainably and economically produce HMF from biomass-derived sugars.

"We have known for many years that HMF is a platform molecule with tremendous potential, but it has been an ongoing challenge to produce HMF in a cost-effective manner from sustainable carbohydrate resources," Dumesic says. "Our early work focused on the use of special solvent systems to produce HMF from fructose with high yields."

The problem has always been the solvent in which HMF has been produced.

"The solvents that are generally used are expensive themselves, and separation of the solvent and product makes the process even more expensive," Motagamwala says. "Now we have shown that we can make HMF in really high yield -- close to 95% -- with an inexpensive solvent system that can be removed very easily."

The GLBRC team's process dehydrates fructose to HMF using a solvent system composed of just acetone and water, with a stable solid acid catalyst. In addition to being cheap and readily accessible, the solvents are environmentally benign and easy to separate from the resulting HMF.

"One of the best things about the new process is that all the unit operations used are simple and are currently employed in the industry," Motagamwala says. That means a lower capital investment and less risk than is generally associated with unproven technologies.

The researchers conducted a techno-economic assessment to evaluate the feasibility of deploying the new process. It shows that a minimum selling price for HMF of $1,710 per ton will achieve a 25% return on investment -- a comfortable percentage intended to build in some reassurance of profitability.

The largest factor in determining that price is the cost of the feedstock -- in this case, fructose. That means a company within the corn industry would already control the biggest cost in the system. It also means that when high fructose corn syrup supply is higher than demand, the industry could shunt excess fructose into HMF as a separate, high-value product stream.

The researchers also demonstrated how the process can be expanded to use glucose as a feedstock. Because glucose can be readily produced from biomass, it is cheaper and more abundant than fructose. However, HMF production from glucose requires an extra processing step and additional infrastructure.

Proving the technology using fructose is the first step, according to Motagamwala.

"The long-term implementation of the process is to get plants starting from glucose, which will drive the cost even lower," he says.

Credit: 
University of Wisconsin-Madison

Study finds that collaborating with business contributes to academic productivity

Interaction between universities and companies in Brazil has societal, economic and environmental impacts, as well as positive effects on academic productivity. Researchers and research groups who collaborate with business organizations are scientifically more productive. The intellectual and scientific impacts of the partnership are positive.

This is the main finding of a study conducted by Renato de Castro Garcia, a professor at the University of Campinas Economics Institute (IE-UNICAMP), and presented to the 8th Annual Meeting of the Global Research Council (GRC).

The GRC summit was attended by heads of research funding agencies from dozens of countries around the world. Organized by the São Paulo Research Foundation (FAPESP), Argentina's National Scientific and Technical Research Council (CONICET) and the German Research Foundation (DFG), the meeting took place on May 1-3, 2019, in São Paulo, Brazil.

The study is published in the journal Science and Public Policy. Its findings are based on a questionnaire answered by 1,005 researchers and representatives of research centres who reported collaboration with firms to Brazil's National Council for Scientific and Technological Development (CNPq). The data are for 2002-08.

"We divided the researchers into those who interacted regularly and those who interacted only once with business organizations. We found that commercial factors were important for both groups. However, those who interacted regularly saw intellectual benefits such as new ideas for projects or scientific publications as most important," Garcia said.

Garcia co-edited the book Estudos de caso da interação universidade empresa no Brasil ("Case studies on university-business interaction in Brazil") with Márcia Rapini of the Federal University of Minas Gerais (UFMG) and Silvio Cário of the Federal University of Santa Catarina. Focusing on studies of academia-industry interaction conducted in several countries, the book is available for download free of charge.

"In Brazil the sectors that interact with universities are often not those considered science-intensive or close to the knowledge frontier, such as electronics, pharmaceuticals or aerospace, for example," said Rapini, who took part in the GRC's annual meeting.

An example can be found in Minas Gerais, she added, where mining and steelmaking are at the forefront of interaction with universities. "These are traditional, well-established industries that focus on export. We observed similar examples in all Brazilian states," she said.

Commenting further on the mining and steelmaking example, Rapini noted that these industries are obliged to interact with universities by law. "It's not spontaneous interaction. This made us realize that interaction is defined by the existence of demand on the part of the firm," she said.

Another finding she highlighted was that interaction occurs in firms with their own internal research and development (R&D) departments. "When a firm produces knowledge internally, it tends to want to reach out to academia. Firms that merely survive don't produce knowledge. This was a lesson we learned. If the firm doesn't want to do it, it doesn't happen. If the basic demand isn't there, there won't be any interaction," Rapini said.

In areas where it is possible to do basic research and publish articles, interaction is more evident. "There are areas in which partnerships occur because, without interacting with industry, firms or productive agricultural establishments, the researcher can't do the research or know whether the product developed can be mass-produced or even if it's economically viable," Rapini said.

The book has three levels of analysis: sectoral studies, knowledge areas and studies of firms.

"The book was made possible by a team who brought actual case studies from each state. Some chapters analyze partnerships between universities and non-traditional industries. We obtained different results from those reported in studies conducted elsewhere, mainly in developed countries. We learned a great deal about our own reality," Rapini said.

The book also shows, Rapini added, that focusing excessively on cooperation with business may lead research centres to ignore opportunities to partner with other sectors, such as NGOs or government. "These partnerships can have a major societal and economic impact in developing countries and should be highly valued," she said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

RIT professor develops microfluidic device to better detect Ebola virus

A faculty-researcher at Rochester Institute of Technology has developed a prototype micro device with bio-sensors that can detect the deadly Ebola virus. With this type of device, those infected can be treated earlier, and the early detection process can potentially decrease the spread of infections.

Ke Du, a faculty-researcher in RIT's Kate Gleason College of Engineering, developed a microfluidic device that utilizes CRISPR gene-editing technology to monitor and detect the nucleic acid markers that indicate Ebola virus. The virus is highly contagious and there is limited treatment once an individual has been diagnosed, he said. There are several prominent strains of Ebola, and his research team has focused on the EBOV strain, which has a high mortality rate.

"If an individual travels from one infected community to another, they can easily spread the epidemic. That is why before any symptoms of Ebola, such as cough or fever present, individuals can take a blood test before being allowed to travel," said Du, an assistant professor of mechanical engineering. He leads a multidisciplinary team of engineers and biochemists developing a rapid point-of care system and biochemistry array for in-field pathogen diagnosis. According to early results, the team has found that the Ebola RNA in test environments can be detected within five minutes by combining automated sample processing, fluorescence sensing and a unique CRISPR-Cas13a assay originated from a bacterial adaptive immune system.

The microfluidic device is an automated and small chip with a highly sensitive fluorescence sensing unit embedded into the device. Physicians take patient samples and add them into the device where Ebola RNA can be seen by activating the CRISPR mechanism. Du is also developing a device that could detect multiple virus strains from Ebola to influenza and zika, for example.

Du's research was published in the April 2019 issue of ACS Sensors. The article, "Rapid and Fully Microfluidic Ebola Virus Detection with CRISPR-Cas13a," features an international and multidisciplinary team assessing the use of CRISPR gene-editing technology to improve virus detection. The group members are from the University of California, Berkeley; Tsinghua Berkeley Shenzhen Institute (China); Dong-A University (Korea); Texas Biomedical Research Institute; and Boston University.

"For this work, we are trying to develop a low-cost device that is easy to use especially for medical personnel working in developing countries or areas where there are outbreaks. They'd be able to bring hundreds of these devices with them for testing, not just one virus or bacteria at one time, but many different kinds," he explained.

Researchers have tried for the past 40 years to develop an effective Ebola vaccine. Early detection remains an important strategy for controlling outbreaks, such as the most recent one in the Congo, where more than 1,000 individuals have died, according to the Centers for Disease Control.

"If you look at this like influenza, and people don't look at it as a virus which also can kill people each year. Some strains may not be as deadly as Ebola, but we know that infectious diseases, regardless of the type, are problems that can threaten the public," Du said. "I grew up in China and experienced the 2002-2004 SARS outbreak. I have seen many people lose their relatives and friends because of infectious diseases. If we can have early detection systems to help screen for all types of diseases and patterns, this can be very useful because it can provide information to medical doctors and microbiologists to help develop the vaccines, and early detection and identification can control and even prevent outbreaks."

Credit: 
Rochester Institute of Technology