Culture

Paper sensors remove the sting of diabetic testing

video: A KAUST-led group has developed a technique that enables biologically active enzymes to survive the rigors of inkjet printing.

Image: 
2018 KAUST

A technique that enables biologically active enzymes to survive the rigors of inkjet printing presents a promising alternative to routine blood screening exams faced by diabetic patients. The KAUST-led team used this approach to make disposable devices that can measure glucose concentrations in human saliva.

Strips of pH-sensitive paper are commonly used to test whether a liquid is acidic or alkaline. Researchers are now working to apply similar principles to create paper sensors that quickly indicate disease biomarkers. Key to this approach is replacing traditional electronic circuitry in the sensors with low-cost plastics that can be manufactured quickly and in large quantities.

Bioscientist Sahika Inal collaborated with electrical engineer Khaled Salama and materials scientist Derya Baran to use inkjet technology to produce sensors sensitive to small sugar concentrations in biofluids.

Utilizing a commercial ink made from conducting polymers, the team printed microscale electrode patterns onto glossy paper sheets. Next, they printed a sensing layer containing an enzyme, glucose oxidase, on top of the tiny electrodes. The biochemical reaction between available glucose and the enzyme creates electrical signals easily correlated to blood sugar levels.
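As a rough sketch of how such an electrochemical readout could be turned into a sugar level, the snippet below fits a linear calibration curve to hypothetical current measurements and inverts it for an unknown sample. The numbers, units, and the assumption of a linear enzymatic response are all illustrative, not taken from the study.

```python
# Hypothetical amperometric readout for an enzymatic glucose sensor.
# Assumes the current (microamps) grows roughly linearly with glucose
# concentration (mM) over the working range; all values are illustrative.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Calibration standards: known glucose (mM) vs. measured current (uA).
glucose_mM = [0.0, 0.2, 0.4, 0.8, 1.6]
current_uA = [0.05, 0.41, 0.79, 1.52, 3.10]
slope, intercept = fit_line(glucose_mM, current_uA)

def current_to_glucose(i_uA):
    """Invert the calibration to estimate glucose from a raw reading."""
    return (i_uA - intercept) / slope

reading = 1.10  # current measured from a saliva sample, uA
print(f"estimated glucose: {current_to_glucose(reading):.2f} mM")
```

In a real device, interfering species and sensor drift would complicate this simple picture, which is part of what the membrane coating described below is meant to address.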

"Paper is porous, which makes it challenging to print conducting and biological inks that are dissolved in water," says Eloise Bihar, a postdoctoral researcher at KAUST and the first author of the study. "Printing the enzyme is tricky as well--it's sensitive to variations of temperature, the voltage applied at the cartridge, and the pH of the ink."

After optimizing the enzyme-printing conditions, the researchers had another obstacle to tackle. While fluids, such as sweat or saliva, contain enough sugar for monitoring purposes, they also contain molecules, such as ascorbic acid, that interfere electrically with conducting polymers. Coating the sensor with a Nafion polymer membrane that repels the negative charges present in most interfering species enabled measurement of only the relevant glucose levels in saliva samples from volunteers.

Experiments showed the top coating gave the sensor an unprecedented shelf life--the enzyme could be kept alive and active for a month if stored in a sealed bag. These results are encouraging the team to expand the capabilities of this approach by incorporating different enzymes into the sensing layer.

"Optimization never ends in engineering, so we are trying to make this system more robust to detect other metabolites in biofluids," says Inal. "We are also looking to integrate printed and self-powered energy devices into the sensors, giving us a more user-friendly platform that eliminates external batteries or wires."

Credit: 
King Abdullah University of Science & Technology (KAUST)

Researchers detect age-related differences in DNA from blood

PROVIDENCE, R.I. [Brown University] -- Researchers have discovered age- and health-related differences in fragments of DNA found floating in the bloodstream (not inside cells) called cell-free DNA (cfDNA). These differences could someday be used to determine biological age -- whether a person's body functions as older or younger than their chronological age, the researchers say.

In a proof-of-concept study, researchers extracted cfDNA from blood samples from people in their 20s, people in their 70s, and healthy and unhealthy centenarians. The team led by Nicola Neretti, an assistant professor of molecular biology, cell biology and biochemistry at Brown University, detected differences in how the DNA was packaged in the four groups.

The findings were published on Friday, Dec. 21, in the journal Aging Cell.

Specifically, they found nucleosomes -- the basic unit of DNA packaging in which a segment of DNA is wrapped around a protein core -- were well-spaced in the DNA of the volunteers in their 20s but were less regular in the older groups, especially the unhealthy centenarians, Neretti said. Additionally, the signal from nucleosome spacing for the healthy centenarians was more similar to the signal from the people in their 20s than people in their 70s.

Nucleosome packing is one aspect of the epigenome -- the collection of heritable changes that affect gene expression or activity without affecting the DNA sequence, or genome.

"Among other traits, healthy centenarians preserve the epigenomic profile of younger individuals," Neretti said. "As with anything in aging, many things work together, and it is not clear what the cause or the effect is. With our cfDNA test, we hope to gain understanding of these epigenetic changes and what they mean."

Message in a bottle

Scientists first found cfDNA in the blood of cancer patients, and the fragments can be useful for diagnosing cancer. Earlier research has found that cfDNA is produced by dying cells, and as the cells die, the DNA is cut in between nucleosomes, Neretti said.

The team at Brown used next-generation sequencing of the cfDNA combined with complex computational analysis to reconstruct the pattern of nucleosome spacing in different regions of the genome -- both areas that are typically open for expressing genes as well as areas that are normally tightly packed. The cfDNA extraction and sequencing processes were developed in collaboration with Ana Maria Caetano Faria from the Universidade Federal de Minas Gerais in Brazil.
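The idea behind that analysis can be illustrated with a toy model: if cfDNA is cut between nucleosomes, fragment midpoints should recur at a regular genomic period, and that period can be recovered from an autocorrelation of the coverage signal. The sketch below uses synthetic data and is only a loose illustration of the principle, not the team's actual pipeline.

```python
# Toy reconstruction of nucleosome repeat length from cfDNA fragment
# midpoints. Synthetic data only; a stand-in for the real analysis.
import random

random.seed(0)
SPACING = 190        # true repeat length (bp) used to simulate the data
GENOME = 20000       # length of the toy genomic region

# Simulate fragment midpoints clustered on a ~190 bp grid, mimicking
# cuts made between regularly spaced nucleosomes as cells die.
midpoints = [
    int(k * SPACING + random.gauss(0, 15)) % GENOME
    for k in range(GENOME // SPACING)
    for _ in range(50)
]

# Coverage track: number of fragment midpoints at each base.
coverage = [0] * GENOME
for m in midpoints:
    coverage[m] += 1

mean = sum(coverage) / GENOME
dev = [c - mean for c in coverage]
den = sum(d * d for d in dev)

def autocorr(lag):
    """Circular autocorrelation of the coverage track at a given lag."""
    return sum(dev[i] * dev[(i + lag) % GENOME] for i in range(GENOME)) / den

# The lag (within a plausible repeat-length window) with the highest
# autocorrelation estimates the dominant nucleosome spacing.
best_lag = max(range(120, 260), key=autocorr)
print("estimated nucleosome repeat length:", best_lag, "bp")
```

In the irregular packing reported for the older groups, the periodic signal would be weaker and the autocorrelation peak correspondingly blunted.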

"cfDNA is somewhat like a message in a bottle that captures what the cell looked like, epigenetically speaking, before it died," Neretti said. "A lot of cellular machinery is involved in maintaining nucleosome spacing, and these components can go downhill as you age. The nucleosomes don't move apart or become more dense themselves. The nucleosome spacing is just the read-out of the changes of that machinery."

However, he added, changes in nucleosome packing produce changes in the accessibility of different parts of the genome, which leads to even more things going awry, including the freeing of normally locked-down genetic elements called transposons.

The team did detect a reduction in cfDNA signals at the beginning of two common transposons with increasing age. This suggests that these transposons are less locked-down in the unhealthy centenarians and people in their 70s and thus more likely to be "copying and pasting" themselves into the genome, causing genetic mayhem.

Future work

The study only analyzed the cfDNA of 12 individuals from Bologna, Italy -- three from each group. The samples were collected by collaborator Claudio Franceschi, from the University of Bologna. A larger study is needed to gain the information necessary to use these epigenetic markers to predict biological age, Neretti said. However, because the cfDNA test uses easy-to-collect blood instead of invasive tissue samples, he thinks it will be straightforward to expand the proof-of-concept study.

"Ideally, you would like to track a population of individuals over 20 or 30 years to see how each individual's epigenome changes, and the rate of change, as they age," he said. A large study could allow the association of epigenomic differences with health conditions, lifestyles or diets, he added.

Meanwhile, the research team is refining the test.

They are working to optimize the process of extracting cfDNA from blood. In mice, they can reliably get the amount of cfDNA they need from a quarter teaspoon of blood. Neretti thinks that they don't need to sequence the whole genome to detect the age- and health-related epigenetic changes. For this study, they did whole-genome sequencing, but he expects that sequencing 2 to 5 percent of the genome could be sufficient.

In addition to refining the nucleosome positioning analysis, the researchers would like to study another kind of epigenetic marker -- DNA methylation patterns -- in the cfDNA, Neretti said. This would provide additional information, including markers that can indicate what tissue the cfDNA came from. Determining the sources of cfDNA at different ages -- or what tissues are experiencing a lot of cell death -- could provide insights into the aging process.

Better understanding the epigenetic changes of the aging process could aid in developing treatments for age-associated disorders or someday be used to determine whether your body is aging faster or slower than typical, Neretti added.

Credit: 
Brown University

Bidi smoking costs India an annual $11 billion in ill health and early death

Bidi smoking cost India 805.5 billion rupees in ill health and early deaths in 2017 alone, finds research published in the journal Tobacco Control.

The poor already bear the brunt of these costs, and unhindered use of bidi tobacco threatens to push even more households into poverty, the researcher warns.

Bidi is very popular in India, accounting for most (81%) of the tobacco smoked, with 72 million regular users over the age of 15.

Although bidi contains less tobacco than conventional cigarettes, the nicotine content is significantly higher. And the relatively low burn point forces smokers to breathe in more of the harmful chemicals produced.

Bidi smoking is implicated in several types of cancer, tuberculosis, and various long term lung conditions. But despite its impact on the nation's health, it has been taxed at a rate that is a fraction of that applied to cigarettes, says the researcher.

The financial toll taken by bidi smoking in India has never been calculated. To try and put this right, the researcher drew on several sources of national and international data to estimate the direct and indirect costs of treating the ill health and early deaths attributable to the habit among 30-69 year olds in 2017.

His calculations revealed that bidi smoking cost India INR 805.5 billion (US$12.4 billion) in terms of ill health and early deaths.

Direct costs--tests, drugs, doctors' fees, hospital stays, and transport--make up around a fifth of this total (just under 21%; INR 168.7 bn), with the remainder made up of indirect costs--accommodation for relatives/carers and loss of household income (INR 636.8 bn).

Given that around one in four 30 to 69 year old men smokes bidi, the habit takes a disproportionate toll on the nation's men, says the researcher.

These figures amount to around 0.5 per cent of India's gross domestic product (GDP) and more than 2 per cent of its total health spend, he calculates. Yet the tax revenue derived from bidi smoking came to just INR 4.17 billion in 2016-17.

Nearly one in five households in India faces "catastrophic expenditures" due to healthcare costs, the researcher points out, with more than 63 million people pushed into poverty, as a result.

"Diseases associated with bidi smoking add to this, potentially pushing more people into poverty," he writes, suggesting that about 15 million face poverty because of spending on tobacco and associated health costs.

"Expenditure on tobacco also crowds out expenditure on food and education in India, especially among the poor," he adds.

"Despite overwhelming evidence on the effectiveness of taxing tobacco products, taxation as a tool to regulate bidi smoking has been highly underutilised in India," he insists, calling for a tax hike on bidi tobacco to halt its unfettered consumption.

"Allowing bidi consumption to continue unhindered would make income distribution even more regressive, as the poor will continue to bear a disproportionately large share of economic costs from bidi smoking due to their higher bidi smoking prevalence," he concludes.

Credit: 
BMJ Group

NHS trusts struggling to produce Brexit plans amid continuing uncertainty

NHS trusts are struggling to produce contingency plans for Brexit because of the continuing uncertainty about the UK's future relationship with the European Union, reveals an investigation published by The BMJ today.

Many have been unable to accurately forecast how crucial areas such as supply chains, medicines, and workforce will be affected after the 29 March exit deadline.

The BMJ sent Freedom of Information (FOI) requests to all 231 NHS trusts in England and 26 health boards across Scotland, Wales, and Northern Ireland and received 182 responses (a 71% response rate) - 161 from NHS trusts and 21 from health boards.

The analysis found that only 9% of English trusts (15 out of 161 that responded) have established a committee or body to oversee preparations for Brexit. Out of the 21 health boards in Wales, Scotland, and Northern Ireland that responded (out of a total of 26), 14 have set up a committee.

The BMJ also asked trusts and health boards to disclose any current risk assessment related to Brexit.

Only a quarter (26%) of those that responded (47 out of 182) were able to disclose this information, with a number saying they were still assessing the risk. Those that have been done are largely thin on detail and similar risks have often been assessed differently from trust to trust.

Saffron Cordery, deputy chief executive of NHS Providers, the body representing NHS trusts in England, told The BMJ: "All of the uncertainty has just exacerbated an already difficult situation. Trusts have planned as far as they can, but so much of this is reliant on central government action."

The Department of Health and Social Care, which is overseeing central coordination of risk areas such as medicines, food, medical devices, and clinical consumables, has said trusts are responsible for their own contingency activity. On Monday 17 December, the health secretary Matt Hancock told the BBC's Newsnight that the Department had instituted "full no-deal planning" for the NHS.

Hancock has sought to reassure MPs that NHS supplies, workforce, and medicines regulation will be secure in the event of a no deal "if everybody does everything they need to do." But with the terms of Brexit still uncertain, much of the detail of what trusts actually "need to do" is not clear.

Trusts have drawn up lists of contracts that could be affected by a "no deal," but most have been unable to move beyond basic scenario planning for Brexit.

The investigation did find that some trusts and health boards are taking action to support their EU staff, including paying for them to achieve settled status, while others have issued instructions not to stockpile medicines or write longer prescriptions for patients in the weeks leading up to Brexit, as requested by the Government.

For example, Royal United Hospital Bath NHS Trust, which has set up a Brexit committee, said it would advise doctors not to overprescribe, but it said that some products, such as furosemide and EpiPens, were "already in short supply."

Commenting on the findings, Martin McKee, professor of European Public Health at the London School of Hygiene and Tropical Medicine, said: "The picture painted by these responses is extremely concerning. It is clear that any form of Brexit will have profound implications for the NHS."

He added: "Even though ministers have been unable to provide reassurance that patients will not die as a result of their policies, they have been unable to offer any useful guidance for trusts. It is inconceivable that the NHS will be prepared for anything other than a situation that, in effect, continues the current arrangements by the end of March 2019."

Credit: 
BMJ Group

Elegant trick improves single-cell RNA sequencing

ITHACA, N.Y. - Droplet microfluidics has revolutionized single-cell RNA sequencing, offering a low-cost, high-throughput method for single-cell genomics. However, this method has been limited in its ability to capture complete RNA transcription information.

Researchers at Cornell - led by Iwijn De Vlaminck, assistant professor in the Meinig School of Biomedical Engineering - have come up with an elegant, low-cost method that solves that problem. And not only does it push single-cell genomics forward, it may allow for new avenues for studies of infection and immune biology.

"Simultaneous Multiplexed Amplicon Sequencing and Transcriptome Profiling in Single Cells" was published recently in Nature Methods. Postdoctoral researcher Mridusmita Saikia and doctoral student Philip Burnham, both of the De Vlaminck lab, are lead authors.

Also contributing were Charles Danko, assistant professor at the Baker Institute for Animal Health in the College of Veterinary Medicine, and John Parker, associate professor of virology in the Baker Institute.

In 2015, researchers from Harvard University and the Massachusetts Institute of Technology introduced Drop-seq, a method to simultaneously and efficiently characterize the identities of thousands of cells, using nanoliter-scale droplets and attaching a unique identifier to each cell's RNA.

"Those technologies are very popular because they've lowered the cost of these types of analyses and sort of democratized them, made them very cheap and easy to do for many labs," De Vlaminck said.

The drawback, however, is that they can only identify a certain type of messenger RNA (mRNA) molecule, which limits the potential scope of analyses. Messenger RNA carries the genetic information copied from DNA in the process of transcription.

De Vlaminck and his collaborators have come up with a simple, inexpensive twist to the existing Drop-seq protocol, and call their new method DART-seq (droplet-assisted RNA targeting by single-cell sequencing).

In Drop-seq, individual cells are encapsulated with labeled microparticles that initiate reverse transcription of cellular mRNA. The De Vlaminck group devised an effective method to enzymatically customize the beads prior to performing conventional Drop-seq analysis, which allows for the recovery and analysis of a greater variety of molecules than are available through Drop-seq sequencing.

In addition, this technology can identify virus-infected cells and quantify viral and host gene expression, thus enabling examination of the host response to infection at the single-cell level.

"A single virus species can be very diverse, and that diversity permits them to do extraordinary things," Burnham said. "So if you can zoom down to the single-cell level, you can actually see how minor changes in the virus cause a potentially huge change in how the cell reacts to that small mutation."

Saikia, who has a dual appointment with the veterinary college, thinks DART-seq will also help inform new approaches to cancer therapy.

"Cancer cells are a very heterogeneous population," she said, "and when you don't look at them at the single-cell level, you often miss important information. So our technology also allows that."

Credit: 
Cornell University

Police interactions linked to increased risk of client violence for female sex workers

The more abusive interactions street-based female sex workers (FSWs) have with police, the higher their risk of violence at the hands of clients, a new study by researchers at the Johns Hopkins Bloomberg School of Public Health suggests. The findings point to the need for interventions that address relationships between FSWs and police to help alleviate negative impacts on FSW work environments, the authors say.

The findings will be published Dec. 20 in American Journal of Public Health.

"It's no secret that street-based sex work can be dangerous in the context of criminalization," says senior author Susan Sherman, PhD, a professor in the Bloomberg School's Department of Health, Behavior and Society. Research suggests that, globally, FSWs experience considerable work-related violence, including physical, verbal and sexual abuse, robbery, kidnapping and murder. (Over their lifetime, the prevalence rate for work-related violence among FSWs is 45 to 75 percent.) But, she adds, the role of the police in this violence against this vulnerable population hasn't been well understood.

The study, which involved 250 FSWs, found that FSWs frequently interact with the police--not only because of the illegal nature of their jobs, but because many are engaged in the drug market, says Katherine Footer, MSc, assistant scientist at the Bloomberg School and the paper's lead author.

An analysis of the study data showed that all participating FSWs had experienced police interactions in their lifetimes, with nearly half having weekly encounters and about one in 10 having daily police encounters. These interactions ranged from regular patrol or enforcement activities, such as asking the women to move along or performing a routine stop, to abusive, such as verbal or emotional harassment or sexual harassment or assault. Excluding arrest, 92 percent had experienced at least one patrol/enforcement activity, and 78 percent had experienced at least one abusive encounter in their lifetime.

The research is part of the SAPPHIRE study, short for Sex Workers and Police Promoting Health in Risky Environments. Launched earlier this year by a team at the Bloomberg School, SAPPHIRE's research will examine the type, frequency and associations of police practices with FSWs' work-related risks, including violence.

For this study, the researchers gathered data from 250 Baltimore cis-gender FSWs recruited between April 2016 and January 2017. The researchers found these volunteers by waiting at specific locations they identified using a combination of publicly available arrest data, emergency services calls, a "johns" website and 300 hours of police ride-alongs. When potential study participants came in contact with study staff, the researchers asked a few screening questions to confirm that they were street-based FSWs and invited those who qualified to participate.

These volunteers then answered a series of questions administered by interviewers on a variety of topics, including sociodemographic factors, their history of sex work and drug use, the nature and frequency of their interactions with police and any client-related violence. Participants also underwent testing for HIV and common sexually transmitted infections.

Drug use appeared to boost the frequency and type of police encounters; for example, 42 percent of daily heroin users reported that at least one patrol/enforcement activity occurred weekly and 14 percent reported that at least one abusive encounter occurred over the same time frame, compared to 25 percent and 5 percent, respectively, among other participants.

Client violence was also common among participants--more than half had experienced physical or sexual client violence in the past three months. The researchers found that for every additional type of patrol/enforcement practice experienced, FSWs had 1.3 times the odds of experiencing client violence. In addition, each additional type of abusive police interaction was associated with a 30 percent increase in the odds of experiencing client violence.
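For readers unused to odds ratios: the reported 1.3 multiplies the odds, not the probability, as in a logistic-type model, and the effect compounds with each additional exposure type. The sketch below shows how that compounding translates into probabilities under an assumed, purely illustrative baseline risk (the 30% figure is not from the study).

```python
# How a per-unit odds ratio compounds across exposure counts. The 30%
# baseline probability of client violence is assumed for illustration
# only; it is not a figure from the study.
OR_PER_TYPE = 1.3     # odds ratio per additional patrol/enforcement type

def prob_after(n_types, p0=0.30, oratio=OR_PER_TYPE):
    """Probability of client violence after n_types exposure types."""
    odds = (p0 / (1 - p0)) * oratio ** n_types  # odds multiply per unit
    return odds / (1 + odds)

for n in range(4):
    print(n, "types ->", round(prob_after(n), 3))
```

Note that on the odds scale the effect is constant per exposure type, while the implied change in probability shrinks as the baseline risk rises.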

Although the study didn't examine the exact mechanism of police interactions associated with client violence, Sherman says, the build-up of frequent negative interactions accumulated over months and years promotes mistrust or fear of the police. Studies have previously linked this mistrust to different types of riskiness that might boost the odds of client violence, including rushing of client negotiations and moving to unfamiliar or unsafe areas.

"Criminalization and social marginalization place female sex workers in persistent positions of vulnerability," says Footer. "The fact that we found these two factors to be conjoined speaks to the clustering of the incredible risks to these women."

Decriminalizing sex work would help alleviate the pressure that intensifies this risk, Sherman adds. But in an environment of continued illegality, there are still structural interventions that could help improve sex worker safety, she says. For example, she and her colleagues helped launch a drop-in center where FSWs can not only take care of basic needs, such as showering or doing their laundry, but also seek assistance in filing complaints about abusive interactions with the police. They have also developed a "know your rights" brochure and talk to women about how to protect themselves in police interactions.

The paper underscores the impact police have on FSWs' working environment and their responsibility to reduce harmful behaviors. On the police side, Sherman and her colleagues suggest developing training specifically to address how police treat FSWs and having an ombudsman in police departments specifically for sex workers to make sure their voices are heard. Most importantly, the authors note that it is critical to modify the enabling environment--including legal enforcement approaches and abuses--that may contribute to a climate of impunity around client-perpetrated violence and police misconduct.

Credit: 
Johns Hopkins Bloomberg School of Public Health

Study examines head impacts, changes in eye function in high school football players

Bottom Line: Head impacts in youth sports, even when they don't cause symptoms of concussion, are a public health concern because these so-called subconcussive head impacts may result in long-term neurological issues if they are sustained repeatedly. This study looked at changes in measurements of near point of convergence (NPC), which is the distance from your eyes to where both eyes can focus without double vision, in 12 high school football players at 14 different times during a season. The NPC measurement matters because it has been shown to detect damage to neurons before symptoms appear. The frequency and magnitude of head impacts from all practices and games also were measured. Study findings suggest NPC values worsened with subconcussive head impacts, and that impaired NPC didn't rapidly recover. However, NPC values began to return to normal in midseason while players continued to incur head impacts, suggesting the system controlling eye movements may develop tolerance to recurrent subconcussive head impacts. The findings of this study may not be generalizable because of its small size.

Credit: 
JAMA Network

Hold the fries! How calorie content makes you rethink food choices

image: A cheeseburger, French fries and cherry cheesecake were among types of food images included in the study. (These are not the actual images the researchers used).

Image: 
Photo courtesy of Pixabay: https://pixabay.com/en/beef-bread-bun-burger-cheese-1239198/

Seeing pictures of food with calorie information not only makes food less appetizing but it also appears to change the way your brain responds to the food, according to a Dartmouth-led study published in PLOS ONE. When food images appeared with the calorie content, the brain showed decreased activation of the reward system and increased activation in the control system. In other words, foods that you might otherwise be inclined to eat became less desirable once the calorie content was displayed.

The study is the first of its kind to examine how your brain makes food choices when calorie information is presented. The results are timely given that earlier this year, certain food chain establishments had to comply with the U.S. Food & Drug Administration's menu labeling law requiring the disclosure of calorie information on menus and menu boards. In addition, according to the Centers for Disease Control and Prevention, obesity affected nearly 40 percent of U.S. adults in 2015-16.

"Our findings suggest that calorie-labeling may alter responses in the brain's reward system when considering food options. Moreover, we believe that nutritional interventions are likely to be more successful if they take into account the motivation of the consumer, including whether or not they diet," says first author Andrea Courtney, who was a graduate student in the department of psychological and brain sciences at Dartmouth at the time of the study and is currently a postdoctoral student at the Stanford Social Neuroscience Lab at Stanford University.

For the study conducted at Dartmouth, 42 undergraduate students (ages 18 to 22) viewed 180 food images without calorie information followed by images with calorie information and were asked to rate their desire to eat the food while in a functional magnetic resonance imaging (fMRI) scanner. The images were obtained from either the food-pics database or popular fast food restaurant websites that post calorie information. The 22 dieters and 20 non-dieters viewed the same set of images, including foods such as a cheeseburger, a side of French fries or a slice of cherry cheesecake. On a scale from 1 to 4 (1 = not at all, 4 = very much), they indicated how likely they would be to eat the food in the dining hall.

While dieters and non-dieters alike rated calorie-labeled foods as less appetizing, this effect was strongest among dieters. Further, the researchers analyzed responses in two brain regions that motivate eating behavior: the nucleus accumbens (NAcc) and the orbitofrontal cortex (OFC). Although all participants showed a decrease in activation in these areas when calorie information was present, dieters showed more similar activation patterns in the left OFC for calorie-labeled and unlabeled foods. This finding suggests that dieters may consider calorie information even when it isn't explicitly present and builds on previous research suggesting that the presence of health cues can lead to healthier food decisions.

"In order to motivate people to make healthier food choices, policy changes are needed that incorporate not only nutritional information, including calorie content, but also a public education component, which reinforces the long-term benefits of a healthy diet," added senior author Kristina Rapuano, who was a graduate student in the department of psychological and brain sciences at Dartmouth at the time of the study and is currently a postdoctoral student at the Fundamentals of the Adolescent Brain Lab at Yale University.

Credit: 
Dartmouth College

Genetic study reveals how citrus became the Med's favorite squeeze

image: Buddha's Hands -- citron study highlights historic path of domestication.

Image: 
Mathulack Photography

Genetic detective work has illuminated the important role of Jewish culture in the widespread adoption of citrus fruit by early Mediterranean societies.

The fascinating find came to light during an investigation into a bizarre acidless mutation that makes citrus juice 1000 times less acidic.
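The "1000 times less acidic" figure has a tidy interpretation on the pH scale: since pH is the negative base-10 logarithm of hydrogen-ion concentration, a 1000-fold drop corresponds to a rise of exactly 3 pH units. A quick check, using an illustrative starting concentration:

```python
import math

# pH = -log10([H+]), so a 1000-fold drop in hydrogen-ion concentration
# raises pH by log10(1000) = 3 units. The starting value is illustrative.
acidic_H = 10 ** -2.5          # sour citron juice, roughly pH 2.5
acidless_H = acidic_H / 1000   # "1000 times less acidic"

pH_acidic = -math.log10(acidic_H)
pH_acidless = -math.log10(acidless_H)
print(round(pH_acidic, 1), "->", round(pH_acidless, 1))
```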

John Innes Centre researchers used genetic analysis to trace the acidless mutations in citron, the first citrus species to be cultivated in the Mediterranean.

"Some people thought that this was a recent mutation that originated in Corsica, or somewhere in the Mediterranean, but we have found that this is not new. It's an ancient mutation that is present in Chinese fingered citrons known as Buddha's Hands and those used in the Sukkot Jewish ritual," explains Dr Eugenio Butelli of the John Innes Centre and first author of the paper.

The acidless mutations have captivated botanists and breeders for centuries and appear in many citrus varieties including citron, sweet lime, limetta, lemon and sweet orange.

Acidless citrus fruit have also lost the ability to produce anthocyanin pigments, which give a blush of dark red to leaves, flowers and, sometimes, flesh.

The researchers identified a gene, which they called Noemi, as the key factor behind the regulation of fruit acidity. Analysis also revealed that this gene works in partnership with another, named Ruby, to control anthocyanin production.

The study identified specific mutations affecting the Noemi gene in several acidless citrus species and hybrids. These acidless fruits are often referred to as sweet or insipid because of their reduced acidity; they include the highly prized citrons (Etrog in Hebrew) used in the Jewish harvest festival of Sukkot.

One of these mutations matched those found in fingered citron varieties first cultivated in China 3300 years ago. This confirmed that this mutation originated before the arrival of citron into the Mediterranean.

Further analysis revealed that the same ancient Noemi allele characteristic of the acidless trait was present in the Yemen citron, an ancient variety traditionally used in the Sukkot ritual since the time of the destruction of the First Temple in 587 B.C.E. Another variety traditionally used in the Sukkot ritual, the Greek citron, also bore the same genetic hallmark.

The analysis suggests that the authentic Jewish Etrog used ritually was an acidless one, an idea supported by a reference to "sweet citron" in the Jewish legal text, the Talmud, dating from 200 C.E.

The study which appears in Current Biology illuminates the path of domestication of citron. It supports the view that the spread of citron in Mediterranean regions was facilitated by its adoption in Jewish culture as an important religious symbol. Some scholars speculate that Jews in exile in Babylonia brought the citron back to Palestine.

Why was this sweet, or insipid, citrus, with its plain white flowers and leaves drained of colour, the chosen fruit?

"Citron was first cultivated for its medicinal properties in China and its rind was used as a medicinal product, not as a food," explains Professor Cathie Martin of the John Innes Centre and a co-author on the study.

"By the time it reached the Mediterranean in Roman times, citron was a luxury item used for its fragrance to keep linen fresh. The presence of white flowers in the acidless mutation seems important because they are a symbol of purity and we speculate that there was a strong selection for the loss of anthocyanins, which normally add colour to leaves and flowers."

Citron is one of four primary species that make up the citrus genus, a complex group of flowering plants with notable nutritional, medicinal and aromatic value. Despite becoming one of the world's most economically important fruit crops, its history of evolution and domestication has remained obscure until recently.

The characterisation of Noemi provides researchers with an important genetic marker opening a fascinating landscape for genetic analysis of seeds found amid the burials of the ancient world and fossil remains from even further back in time.

The study also gives researchers the information they need to develop fruit of the future - to modulate their level of acidity and to increase their content of health-protecting anthocyanin compounds.

"If you could introduce these mutations stably in lemon, for example, you could make lemonade which does not need so much added sugar in it, making it healthier to drink and better for growing teeth," explains Professor Martin.

Credit: 
John Innes Centre

In a black hole, you might have to go beyond Einstein

image: An artist's depiction of loop quantum gravity effects in a black hole. The bottom half of the image depicts the black hole, which, according to general relativity, traps everything including light. Loop quantum gravity, a theory that extends Einstein's general relativity using quantum mechanics, overcomes this tremendous pull and liberates everything shown in the top half of the image, thus solving the fundamental problem of the black hole singularity.

Image: 
A. Corichi and J. P. Ruiz.

When stars collapse, they can create black holes, which appear throughout the universe and are therefore important to study. Black holes are mysterious objects with an outer edge, called an event horizon, that traps everything including light. Einstein's theory of general relativity predicts that once an object falls inside an event horizon, it ends up at the center of the black hole, called a singularity, where it is completely crushed. At this singularity, gravitational attraction is infinite and all known laws of physics, including Einstein's theory, break down. For decades, theoretical physicists have used complex mathematical equations to question whether singularities really exist, with little success until now. LSU Department of Physics & Astronomy Associate Professor Parampreet Singh and his collaborators, LSU Postdoctoral Researcher Javier Olmedo and Abhay Ashtekar, the Eberly Professor of Physics at Penn State, developed new mathematical equations that go beyond Einstein's theory of general relativity, overcoming its key limitation: the central singularity of black holes. This research was published recently in Physical Review Letters and Physical Review D and was highlighted by the editors of the American Physical Society.

Theoretical physicists developed a theory called loop quantum gravity in the 1990s that marries the laws of microscopic physics, or quantum mechanics, with gravity, which explains the dynamics of space and time. Ashtekar, Olmedo and Singh's new equations describe black holes in loop quantum gravity and show that the black hole singularity does not exist.

"In Einstein's theory, space-time is a fabric that can be divided as small as we want. This is essentially the cause of the singularity where the gravitational field becomes infinite. In loop quantum gravity, the fabric of space-time has a tile-like structure, which cannot be divided beyond the smallest tile. My colleagues and I have shown that this is the case inside black holes and therefore there is no singularity," Singh said.

Instead of a singularity, loop quantum gravity predicts a funnel to another branch of space-time.

"These tile-like units of geometry--called 'quantum excitations'-- which resolve the singularity problem are orders of magnitude smaller than we can detect with today's technology, but we have precise mathematical equations that predict their behavior," said Ashtekar, who is one of the founding fathers of loop quantum gravity.

"At LSU, we have been developing state-of-the-art computational techniques to extract the physical consequences of these equations using supercomputers, bringing us closer to reliably testing quantum gravity," Singh said.

Einstein's theory fails not only at the centers of black holes but also in explaining how the universe was created from the Big Bang singularity. Therefore, a decade ago, Ashtekar, Singh and collaborators began to extend physics beyond the Big Bang and make new predictions using loop quantum gravity. Using the mathematical equations and computational techniques of loop quantum gravity, they showed that the Big Bang is replaced by a "Big Bounce." But the problem of overcoming black hole singularity is exceptionally complex.

"The fate of black holes in a quantum theory of gravity is, in my view, the most important problem in theoretical physics," said Jorge Pullin, the Horace Hearne professor of theoretical physics at LSU, who was not part of this study.

Credit: 
Louisiana State University

Responsible innovation key to smart farming

Responsible innovation that considers the wider impacts on society is key to smart farming, according to academics at the University of East Anglia (UEA).

Agriculture is undergoing a technology revolution supported by policy-makers around the world. While smart technologies will play an important role in achieving improved productivity and greater eco-efficiency, critics have suggested that consideration of the social impacts is being side-lined.

In a new journal article Dr David Rose and Dr Jason Chilvers, from UEA's School of Environmental Sciences, argue that the concept of responsible innovation should underpin the so-called fourth agricultural revolution, ensuring that innovations also provide social benefits and address potentially negative side-effects.

Each of the previous revolutions was radical at the time - the first representing a transition from hunting and gathering to settled agriculture, the second relating to the British Agricultural Revolution in the 18th century, and the third to post-war productivity increases associated with mechanisation and the Green Revolution in the developing world.

The current 'agri-tech' developments come at a time when the UK government has provided £90 million of public money to transform food production in order to be at the forefront of global advanced sustainable agriculture. Many other countries are also prioritising smart agri-tech.

This, combined with private investment from organisations including IBM, Barclays, and Microsoft, means that 'Agriculture 4.0' is underway, with technologies such as Artificial Intelligence (AI) and robotics increasingly being used in farming.

Dr Rose, a lecturer in human geography, said: "All of these emergent technologies have uses in farming and may provide many benefits. For example, robotics could plug potential lost labour post-Brexit in industries such as fruit picking, while robotics and AI could enable better chemical application, saving farmers money and protecting the environment. They could also attract new, younger farmers to an ageing industry."

Writing in Frontiers in Sustainable Food Systems, Dr Rose and Dr Chilvers warn though that agri-tech could also have side-effects, bringing potential environmental, ethical, and social costs.

"In light of controversial agri-tech precedents, it is beyond doubt that smart farming is going to cause similar controversy. Robotics and AI could cause job losses or change the nature of farming in ways that are undesirable to some farmers. Others might be left behind by technological advancement, while wider society might not like how food is being produced," said Dr Rose.

"We therefore encourage policy-makers, funders, technology companies and researchers to consider the views of both farming communities and wider society. We advocate that this new agricultural tech revolution, particularly the areas funded by public money, should be responsible, considering the winners, but particularly the potential losers of change."

Dr Rose added: "This means better ways, both formal and informal, to include farmers and the public in decision-making, as well as advisors and other key stakeholders sharing their views. Wider society should be able to change the direction of travel, and ask whether we want to go there. They should be able to question and contest whether benefits to productivity should supersede social, ethical, or environmental concerns, and be able to convince innovators to change design processes.

"Responsible innovation frameworks should be tested in practice to see if they can make tech more responsible. More responsible tech saves controversy, such as that surrounding genetic modification, ensures farmers and the public are behind it, and can help to deliver on the policy objectives."

Credit: 
University of East Anglia

Women Go Lauren Bacall when competing for a man

Katarzyna Pisanski et al., writing in Proceedings of the Royal Society B: Biological Sciences, have found that women lower their voices when competing for a man.

Hardware-software co-design approach could make neural networks less power hungry

image: A UC San Diego-led team has developed hardware and algorithms that could cut energy use and time when training a neural network.

Image: 
David Baillot/UC San Diego Jacobs School of Engineering

A team led by the University of California San Diego has developed a neuroinspired hardware-software co-design approach that could make neural network training more energy-efficient and faster. Their work could one day make it possible to train neural networks on low-power devices such as smartphones, laptops and embedded devices.

The advance is described in a paper published recently in Nature Communications.

Training neural networks to perform tasks like recognizing objects, navigating self-driving cars or playing games eats up a lot of computing power and time. Large computers with hundreds to thousands of processors are typically required to learn these tasks, and training times can take anywhere from weeks to months.

That's because doing these computations involves transferring data back and forth between two separate units--the memory and the processor--and this consumes most of the energy and time during neural network training, said senior author Duygu Kuzum, a professor of electrical and computer engineering at the Jacobs School of Engineering at UC San Diego.

To address this problem, Kuzum and her lab teamed up with Adesto Technologies to develop hardware and algorithms that allow these computations to be performed directly in the memory unit, eliminating the need to repeatedly shuffle data.

"We are tackling this problem from two ends--the device and the algorithms--to maximize energy efficiency during neural network training," said first author Yuhan Shi, an electrical engineering Ph.D. student in Kuzum's research group at UC San Diego.

The hardware component is a super energy-efficient type of non-volatile memory technology--a 512 kilobit subquantum Conductive Bridging RAM (CBRAM) array. It consumes 10 to 100 times less energy than today's leading memory technologies. The device is based on Adesto's CBRAM memory technology--it has primarily been used as a digital storage device that only has '0' and '1' states, but Kuzum and her lab demonstrated that it can be programmed to have multiple analog states to emulate biological synapses in the human brain. This so-called synaptic device can be used to do in-memory computing for neural network training.

"On-chip memory in conventional processors is very limited, so they don't have enough capacity to perform both computing and storage on the same chip. But in this approach, we have a high capacity memory array that can do computation related to neural network training in the memory without data transfer to an external processor. This will enable a lot of performance gains and reduce energy consumption during training," said Kuzum.

Kuzum, who is affiliated with the Center for Machine-Integrated Computing and Security at UC San Diego, led efforts to develop algorithms that could be easily mapped onto this synaptic device array. The algorithms provided even more energy and time savings during neural network training.

The approach uses a type of energy-efficient neural network, called a spiking neural network, for implementing unsupervised learning in the hardware. On top of that, Kuzum's team applies another energy-saving algorithm they developed called "soft-pruning," which makes neural network training much more energy efficient without sacrificing much in terms of accuracy.

Energy-saving algorithms

Neural networks are a series of connected layers of artificial neurons, where the output of one layer provides the input to the next. The strength of the connections between these layers is represented by what are called "weights." Training a neural network consists of updating these weights.

Conventional neural networks spend a lot of energy to continuously update every single one of these weights. But in spiking neural networks, only weights that are tied to spiking neurons get updated. This means fewer updates, which means less computation power and time.
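The effect of updating only the spiking neurons' weights can be sketched with a toy, event-driven rule. This is an illustrative simplification, not the paper's actual learning rule: the Hebbian-style update, the learning rate, and the weight range here are all assumptions.

```python
import numpy as np

def stdp_step(weights, input_spikes, output_spikes, lr=0.01):
    """Hebbian-style update applied only to neurons that spiked.

    weights:       (n_inputs, n_neurons) synaptic weight matrix, values in [0, 1]
    input_spikes:  boolean vector saying which inputs fired this time step
    output_spikes: boolean vector saying which neurons fired this time step
    """
    active = np.flatnonzero(output_spikes)  # columns of neurons that spiked
    if active.size:
        pre = input_spikes.astype(float)[:, None]
        # Potentiate synapses from inputs that fired and depress the rest,
        # but only for the spiking neurons; every other weight is untouched.
        weights[:, active] += lr * (pre - 0.5)
        np.clip(weights, 0.0, 1.0, out=weights)
    return weights

# Only the second neuron spikes, so only its column of weights changes.
w = np.full((3, 2), 0.5)
stdp_step(w, np.array([True, False, True]), np.array([False, True]))
```

With most neurons silent on any given time step, most columns skip the update entirely, which is where the savings in computation come from.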

The network also does what's called unsupervised learning, which means it can essentially train itself. For example, if the network is shown a series of handwritten numerical digits, it will figure out how to distinguish between zeros, ones, twos, etc. A benefit is that the network does not need to be trained on labeled examples--meaning it does not need to be told that it's seeing a zero, one or two--which is useful for autonomous applications like navigation.

To make training even faster and more energy-efficient, Kuzum's lab developed a new algorithm that they dubbed "soft-pruning" to implement with the unsupervised spiking neural network. Soft-pruning is a method that finds weights that have already matured during training and then sets them to a constant non-zero value. This stops them from getting updated for the remainder of the training, which minimizes computing power.

Soft-pruning differs from conventional pruning methods because it is implemented during training, rather than after. It can also lead to higher accuracy when a neural network puts its training to the test. Normally in pruning, redundant or unimportant weights are completely removed. The downside is that the more weights you prune, the less accurately the network performs during testing. But soft-pruning keeps these weights in a low-energy setting, so they're still around to help the network perform with higher accuracy.
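The contrast between the two schemes can be sketched as follows. The maturity criterion, threshold, pruning fraction, and frozen value below are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np

def soft_prune(weights, steps_since_change, maturity_threshold=50,
               frozen_value=0.05):
    """Freeze 'matured' weights at a small constant instead of removing them.

    Here a weight counts as matured when it has gone `maturity_threshold`
    training steps without a significant change. Returns a boolean mask of
    frozen weights, which should be excluded from further updates.
    """
    frozen = steps_since_change >= maturity_threshold
    weights[frozen] = frozen_value  # low but non-zero: still helps at test time
    return frozen

def hard_prune(weights, fraction=0.4):
    """Conventional pruning: zero out the smallest-magnitude weights entirely."""
    cutoff = np.quantile(np.abs(weights), fraction)
    weights[np.abs(weights) < cutoff] = 0.0
    return weights
```

Soft-pruned weights drop out of the update loop, saving computation for the rest of training, yet keep contributing a small constant signal at test time, whereas hard-pruned weights are gone for good.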

Hardware-software co-design to the test

The team implemented the neuroinspired unsupervised spiking neural network and the soft-pruning algorithm on the subquantum CBRAM synaptic device array. They then trained the network to classify handwritten digits from the MNIST database.

In tests, the network classified digits with 93 percent accuracy even when up to 75 percent of the weights were soft pruned. In comparison, the network performed with less than 90 percent accuracy when only 40 percent of the weights were pruned using conventional pruning methods.

In terms of energy savings, the team estimates that their neuroinspired hardware-software co-design approach can eventually cut energy use during neural network training by two to three orders of magnitude compared to the state of the art.

"If we benchmark the new hardware to other similar memory technologies, we estimate our device can cut energy consumption 10 to 100 times, then our algorithm co-design cuts that by another 10. Overall, we can expect a gain of a hundred to a thousand fold in terms of energy consumption following our approach," said Kuzum.

Moving forward, Kuzum and her team plan to work with memory technology companies to advance this work to the next stages. Their ultimate goal is to develop a complete system in which neural networks can be trained in memory to do more complex tasks with very low power and time budgets.

Credit: 
University of California - San Diego

Whale research helps answer long-sought scientific question

The humpback whale and a handful of similar whale species have a feeding mechanism utterly unique in the animal kingdom.

They open their mouths to scoop in ocean water, then strain it out through plates called baleen. The baleen looks like hairy bristles on a comb and functions as a sieve, leaving krill, small fish and other food for the whale to swallow.

Scientists previously established that fetal whales develop the very beginnings of teeth. These teeth will never take a single bite; they don't even erupt from the gums the way they do in human babies. The tiny "germs" are no mystery; they're remnants of real teeth in the whale's land-dwelling, evolutionary predecessors--creatures similar to hippos and cows that lived tens of millions of years ago.

San Diego State University graduate student Agnese Lanzetti's attempt to advance understanding of the remarkable biological transformation in humpbacks (Megaptera novaeangliae) was a two-and-a-half-year quest involving her mentor, SDSU biologist Annalisa Berta, and resources in New York, Los Angeles, Austin, Texas, and Copenhagen.

Using whale fetuses obtained from museums and a markedly advanced version of CT scanning, with imaging produced in Austin, Texas, Lanzetti developed new insights into tooth germ and baleen development and other anatomical changes in the whale skull during gestation.

"I showed that the teeth are still present way past mid-gestation, which is kind of surprising," Lanzetti said. "Given the lack of research in general on this topic, it was not clearly understood. Many people thought they would lose them very early since then you have to get baleen."

New ground

Lanzetti earned undergraduate and master's degrees in geology at the University of Pisa in her native Italy. She started a joint doctoral program at SDSU in 2014 with assistance from a Fulbright scholarship, drawn by her interest in paleontology and whales and by Berta's expertise in the subject.

"It's not an area that there's a lot of research on," Lanzetti said. "It unites an important aspect of biology, which is development, and evolution."

With the slice-by-slice images offered through CT scanning, she established that the tooth germs are present in fetuses as small as seven inches. The subsequent baleen plate development--essential to the whales' survival in a marine environment--occurs over a relatively short span of time.

Lanzetti found as many as 40 germs per side in each jaw of the humpbacks. They're less than half a millimeter in size.

According to Berta, Lanzetti's results "provide solid evidence that helps answer a long-time puzzle for scientists--when and how did tooth loss occur and baleen develop in the world's largest mammals. For the first time, scientists now have a better understanding of the development of baleen, important in the evolutionary success of these whales, since it enabled them to filter enormous volumes of water and prey during bulk feeding."

Lanzetti's findings, which include other characteristics of the humpback skull, were published this fall in a journal of the American Association of Anatomists. The 3-D models resulting from her work already have been displayed in an exhibit on whale evolution in London.

The process

Hunting of humpbacks is illegal, so obtaining a new specimen for examination was impossible, as was dissection.

Lanzetti began with a fetal whale borrowed from the San Diego Natural History Museum and then obtained additional specimens from museums in Los Angeles, Berkeley and New York. Most of them date back 60 to 80 years. She traveled to Copenhagen for work on a fifth specimen, more than 100 years old and too large to be transported.

Similar research has used traditional CT scans of fetal tissue, the same computer-assisted technology used to look for cancers, by creating cross-sectional images, one slice at a time. But in a 2015 study, the tooth germs were not even visible, possibly due to their small size.

Lanzetti was the first to employ the more advanced protocol of diceCT, described by one user group as "the gold standard of magnetic resonance imaging," which produced a 3-D visualization as good as dissection.

The whale specimens first were soaked in solutions for up to two weeks, using an iodine stain that turns the sample red on the outside and enhances the contrast in the CT images. Lanzetti found a diceCT facility willing to assist at the University of Texas at Austin.

Evolutionary biology doesn't attract the same kind of research dollars that go to field work or medical breakthroughs, but Lanzetti received financial support from the American Museum of Natural History and the University of California, Riverside, SDSU's partner in her joint doctoral program.

One question the research doesn't answer is why the useless germ teeth haven't vanished after tens of millions of years of evolution. Lanzetti said experts have hypothesized the teeth and baleen originate in the same genetic pathway. Under this view, proteins in the genes that are responsible for the teeth may also be needed for development of the baleen.

Credit: 
San Diego State University

How does your garden grow in space?

Astronauts in low-Earth orbit could use a fresh salad to brighten up all those freeze-dried meals. But the microgravity space environment can affect plant growth in ways we're only beginning to understand. In research presented in a recent issue of Applications in Plant Sciences, Drs. Anna-Lisa Paul and Robert Ferl, and colleagues at the University of Florida Space Plants Lab, showed that two different transcriptomic approaches could feasibly be used to understand how subcomponents of plant organs respond to the space environment.

RNA-Seq and microarray chips are two methods of analyzing which genes are expressed (the "transcriptome") in a given tissue. Both techniques quantify mRNA transcripts, the intermediary molecule between genes and the proteins those genes encode, which provides a wealth of information about how an organism is responding to environmental cues. "When we look at tissue preserved from plants that have grown their entire lives on the space station, we can see into the underlying mechanisms they are using to physiologically adapt to that novel environment," said Dr. Paul, the corresponding author on the study. "In other words, what metabolic 'tools' they are engaging to adjust."

Both RNA-Seq and microarray chips can be adapted to use very small tissue samples, which is important considering the extreme expense of conducting spaceflight experiments. However, the two techniques differ in important ways. Microarray requires the design of a chip to probe expression of known genes, while RNA-Seq involves sequencing total RNA without regard for particular candidate RNA transcripts. In line with previous studies, the authors found that the two methods each had relative advantages. RNA-Seq picked up on expression of many more genes, including poorly known genes, while microarray detected some potentially important candidate genes relevant to spaceflight that were expressed at low levels, which RNA-Seq did not detect.

Transcriptomics can also be used to show how different tissues in the same organism respond to the same stimulus. The authors compared gene expression in two root zones, based on work by the first author, Dr. Aparna Krishnamurthy. "One of Dr. Krishnamurthy's post-doctoral projects was to optimize an approach that would allow us to look into the transcriptome of two of the most interesting regions of the root tip - the very tip, which houses the root cap and meristematic zone, and then slightly behind the tip, where elongation occurs," said Dr. Paul. "The root tip is the 'brain' of the plant root. The cells within the first half of a millimeter are responsible for sensing and processing most of the information that the plant root gathers from its environment - detecting the pull of gravity (or its lack), the direction of light, the path to water and minerals, and so on."

Understanding how plants respond to the space environment is critical to successfully providing fresh food to astronauts. "The information gained by the spaceflight and exploration research community today will guide the plant biology and habitat engineering required for the successful utilization of plants in future exploration initiatives," said Dr. Paul. In the near future, the need for a little greenery will be more pressing: plants grown in space can provide fresh, healthy food, oxygen, and a little slice of home to increasing numbers of people cooped up in space stations, on long voyages, or someday, on other planets.

Credit: 
Botanical Society of America