Culture

Microneedling improves appearance of acne scars

(Boston)--It turns out that creating tiny injuries on your face with needles can actually help decrease the appearance of acne scars.

Researchers from Boston University School of Medicine (BUSM) have found that this process, called microneedling, helps rejuvenate skin and decreases the inflammation and scarring that often plague those with acne.

The American Academy of Dermatology reports that acne is the most common skin condition in the U.S., affecting up to 50 million Americans annually. In response to the growing popularity of microneedling, the U.S. Food and Drug Administration in 2018 issued regulations on what is considered a safe, medical grade microneedling device. Even so, concerns about efficacy and safety have been raised over the years.

The researchers reviewed the scientific studies on microneedling for the treatment of acne scars published from 2009 to 2018. They analyzed 33 studies from this 10-year period examining the efficacy of microneedling alone and in combination with other topical treatments, as well as overall patient satisfaction. All 33 articles analyzed showed an improvement in the appearance of acne scars, as well as increased patient satisfaction when microneedling was used in combination with another therapy. The benefits of microneedling can also be observed under the microscope, including a decrease in inflammatory markers released by cells and an overall increase in collagen and skin-rejuvenating cell markers that help heal scarring.

"While there have been multiple smaller research studies and case reports which have shown the efficacy of microneedling with acne scarring, there has never been any consistent data and no one decided to take a step back, synthesize and look at what the evidence was telling us as a whole," explained corresponding author Neelam Vashi, MD, associate professor of dermatology at BUSM and director of the Boston University Cosmetic and Laser Center at Boston Medical Center. "With this systematic way of looking at all the data over the past decade, it is clear that microneedling works and helps reduce the appearance of acne scars for patients. Now the next step is to standardize this information and look at better ways to optimize this treatment for our patients."

According to the researchers, now that these studies have been reviewed, the gaps in the research on microneedling can be addressed, including the need for well-designed randomized controlled trials that compare microneedling to other popular minimally invasive treatments. "Microneedling works. Now it's time to evaluate how these treatments affect those with darker skin and how we can create strategies that are cost effective not only for the physician providing these services but, most importantly, for the patients who want solutions to these often debilitating scars."

Credit: 
Boston University School of Medicine

When working with animals can hurt your mental health

CHICAGO -- While it might sound like fun to work around pets every day, veterinarians and people who volunteer at animal shelters face particular stressors that can place them at risk for depression, anxiety and even suicide, according to research presented at the annual convention of the American Psychological Association.

"People who work or volunteer with animals are often drawn to it because they see it as a personal calling," said Angela K. Fournier, PhD, of Bemidji State University, who presented at the meeting. "However, they are faced with animal suffering and death on a routine basis, which can lead to burnout, compassion fatigue and mental health issues."

Veterinarians in particular are at high risk for death by suicide, according to a study in the Journal of the American Veterinary Medical Association, which found that from 1979 to 2015, veterinarians died by suicide two to 3.5 times more often than the general U.S. population.

"Talking about veterinarian suicide certainly gets people to pay attention, but it does not tell the whole, nuanced story about what may be contributing to poor well-being in this population," said Katherine Goldberg, DVM, LMSW, community consultation and intervention specialist at Cornell Health and Founder of Whole Animal Veterinary Geriatrics and Palliative Care Services, who also presented at the meeting. "More research is under way to help better understand why veterinarians might be at an increased risk, but a combination of personality traits, professional demands and the veterinary learning environment all likely contribute."

Goldberg noted that vets also face economic challenges: in 2016, the average veterinary school graduate reported more than $143,000 in school loan debt while earning an average starting salary of just over $73,000.

"Personal finance concerns are stressful for many veterinarians, especially recent graduates, and at the same time, many clients regularly question the cost of care for their animals and may be suspicious that their vet is trying to 'push' services that their pet doesn't need," said Goldberg.

Goldberg described a multi-center study that looked at rates of adverse childhood experiences (a term used to describe all types of abuse, neglect and other traumatic experiences) in veterinary students, in an effort to understand what may be causing poor mental health among vets. However, veterinarians were not, on entry to the profession, more predisposed to poor mental health than the general population as a result of adverse childhood experiences, she said.

"This indicates that something is happening over the course of veterinary student training or once veterinarians are working to cause poor well-being outcomes," said Goldberg. "Well-being education should be integrated into the veterinary curriculum, emphasizing resiliency behaviors and cultivating professional partnerships between veterinary medicine and mental health care."

Substance use among veterinarians is also an understudied area. Veterinary medicine is the only medical profession in the U.S. that does not have a national monitoring program for substance use and mental health issues, she said.

While veterinarians who are dealing with mental health issues may exhibit symptoms common to all populations, such as sadness that interferes with daily activities or changes in appetite, there are a few specific warning signs to watch for in a clinical veterinary setting, according to Goldberg.

"Increased medical errors, absenteeism, client complaints and spending too little or too much time at work" are factors to watch for, she said. "For potential substance use issues, warning signs could include missing drugs or missing prescription pads."

Goldberg believes there needs to be a paradigm shift in veterinary training to better prepare veterinarians not only for the animal-related aspects of their jobs, but the human elements as well.

"We need core curricular material that focuses on coping with the emotional demands of the profession. Mindfulness, moral stress, ethics literacy, grief and bereavement, mental health first aid and suicide awareness all have a role in veterinary education" she said. "Colleges of veterinary medicine that have embedded mental health professionals are a step ahead of those that do not, and I would like to see this become a requirement for all schools accredited by the Association of American Veterinary Medical Colleges."

Fournier's presentation looked at employees and volunteers in animal shelters or rescues, and animal welfare and animal rights activists, who are at risk for compassion fatigue and psychological distress.

"Animal welfare agents, as these people are often called, are exposed to animal abuse, neglect and oppression on a regular basis, as well as routine euthanasia that is common in these settings," said Fournier.

Over 2.4 million healthy cats and dogs are euthanized each year in the U.S., most often homeless animals in shelters, according to the Humane Society of the United States.

"Shelter workers are then caught in a dilemma because they are charged with caring for an animal and they may ultimately end that animal's life," she said. "Research suggests that this causes significant guilt, which can lead to depression, anxiety and insomnia, as well as greater family-work conflict and low job satisfaction."

Animal welfare agents may also hear gruesome stories of animal abuse or witness the consequences firsthand when they are rehabilitating the animals, which can cause a lot of distress and lead to compassion fatigue, said Fournier.

"Experts suggest that animal welfare agents carry an even heavier burden than those in other helping professions who are susceptible to compassion fatigue because of the issues unique to working with animals, such as euthanasia and caring for living beings who have experienced pain and suffering but cannot articulate their needs and experiences," said Fournier.

Fournier suggested that psychotherapists who work with animal welfare agents offer patients strategies to reframe negative experiences, identify ways in which they get fulfillment and gratification from the work they do, and establish healthy boundaries between their work and personal lives.

"There are certainly positive and negative aspects of the job and over time or during times of acute stress, it can be difficult to see the positive," she said. "It may be necessary to help someone focus on the big picture that overall they are making a difference and animals have been saved, rather than ruminating on individual stories of crisis and loss. Self-care is also critical to ensuring the best mental health outcomes for those who work and volunteer with animals."

Credit: 
American Psychological Association

Disrupted genetic clocks in schizophrenia-affected brains reveal clues to the disease

Rhythms in gene expression in the brain are highly disrupted in people with schizophrenia, according to a new University of Pittsburgh-led study.

The findings, published today by researchers from Pitt's School of Medicine in the journal Nature Communications, also suggest that researchers studying schizophrenia-linked genes in the brain could have missed important clues that would help understand the disease.

"Our study shows for the first time that there are significant disruptions in the daily timing of when some genes are turned on or off, which has implications for how we understand the disease at a molecular level," said senior author Colleen McClung, Ph.D., professor of psychiatry at Pitt's School of Medicine.

Many bodily functions run on a 24-hour cycle, called a circadian rhythm, which extends to how genes are expressed within cells. Some genes turn on or off at certain times of the day or night.

In this study, McClung and colleagues analyzed gene expression data from the dorsolateral prefrontal cortex -- a brain region responsible for cognition and memory -- from 46 people with schizophrenia and 46 sex- and age-matched healthy subjects. The data was obtained from the CommonMind Consortium, a public-private partnership that has curated a rich brain tissue and data bank for studying neuropsychiatric disorders.

By knowing the time of death, the researchers were able to use a statistical method to determine changes in the rhythmicity of different genes, which revealed some interesting patterns.
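The release does not specify the authors' exact statistical method, but rhythmicity detection of this kind is commonly done with cosinor-style regression: fit a 24-hour sinusoid to each gene's expression plotted against the donors' times of death, and call the gene rhythmic when the fitted oscillation explains much of the variance. Below is a minimal sketch of that idea in Python (NumPy only); the data are simulated for illustration and are not from the study.

```python
import numpy as np

def fit_cosinor(times_h, expression, period_h=24.0):
    """Fit expression = mesor + a*cos(wt) + b*sin(wt) by least squares.

    Returns (mesor, amplitude, r_squared). A large amplitude with a
    high R^2 suggests the gene follows a circadian rhythm.
    """
    w = 2 * np.pi / period_h
    # Design matrix: intercept (mesor) plus cosine and sine terms
    X = np.column_stack([np.ones_like(times_h),
                         np.cos(w * times_h),
                         np.sin(w * times_h)])
    coef, *_ = np.linalg.lstsq(X, expression, rcond=None)
    mesor, a, b = coef
    amplitude = np.hypot(a, b)
    fitted = X @ coef
    ss_res = np.sum((expression - fitted) ** 2)
    ss_tot = np.sum((expression - expression.mean()) ** 2)
    return mesor, amplitude, 1 - ss_res / ss_tot

# Simulated "time of death" for 46 donors, plus one rhythmic and one
# flat (arrhythmic) gene -- hypothetical data, not from the paper.
rng = np.random.default_rng(0)
tod = rng.uniform(0, 24, size=46)  # hours after midnight
rhythmic = 5 + 2 * np.cos(2 * np.pi * tod / 24) + rng.normal(0, 0.2, 46)
flat = 5 + rng.normal(0, 0.2, 46)

_, amp_r, r2_r = fit_cosinor(tod, rhythmic)
_, amp_f, r2_f = fit_cosinor(tod, flat)
print(round(amp_r, 2), round(r2_r, 2))  # amplitude near 2, high R^2
print(round(amp_f, 2), round(r2_f, 2))  # small amplitude, low R^2
```

Comparing such fits between the schizophrenia and control groups would reveal genes that gained or lost rhythmicity, in the spirit of the analysis described here.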

McClung explained the findings by drawing an analogy of gene expression to electrical appliances in a house.

"In a normal house -- like a healthy brain -- let's say the lights are turned on at night, but the refrigerator needs to be on all the time. What we saw was that in a schizophrenia-affected brain, the lights are on all day and the refrigerator shuts off at night."

This is problematic, McClung explained, because it can affect how cells function. In their samples, the genes that gained rhythmicity were involved in how mitochondria -- the cell's powerhouses -- function, and those that lost rhythmicity were linked to inflammation.

The results also have implications for other researchers studying the genetics of schizophrenia, according to Marianne Seney, Ph.D., assistant professor of psychiatry at Pitt's School of Medicine and the study's first author. By not considering circadian rhythms, they could be missing out on important findings.

When Seney and McClung compared gene expression in brains from people who died during the day, the control and schizophrenia subjects were not different, but in those who died at night, there were major differences, since genes that had gained a rhythm had hit their low point during the night.

Seney returned to the house analogy: "If we only looked to see whether the refrigerator was on during the day, we would see no difference, but at night, there would be one."

Credit: 
University of Pittsburgh

The brain inspires a new type of artificial intelligence

image: Processing an event with multiple objects. A synchronous input where all objects are presented simultaneously to a computer (left), versus an asynchronous input where objects are presented with temporal order to the brain (right).

Image: 
Prof. Ido Kanter

Machine learning, introduced 70 years ago, is based on evidence of the dynamics of learning in the brain. Using the speed of modern computers and large data sets, deep learning algorithms have recently produced results comparable to those of human experts in various fields of application, but with characteristics that are distant from current knowledge of learning in neuroscience.

Using advanced experiments on neuronal cultures and large-scale simulations, a group of scientists at Bar-Ilan University in Israel has demonstrated a new type of ultrafast artificial intelligence algorithm -- based on the very slow brain dynamics -- which outperforms learning rates achieved to date by state-of-the-art learning algorithms.

In an article published today in the journal Scientific Reports, the researchers rebuild the bridge between neuroscience and advanced artificial intelligence algorithms that has lain virtually dormant for almost 70 years.

"The current scientific and technological viewpoint is that neurobiology and machine learning are two distinct disciplines that advanced independently," said the study's lead author, Prof. Ido Kanter, of Bar-Ilan University's Department of Physics and Gonda (Goldschmied) Multidisciplinary Brain Research Center. "The absence of expectedly reciprocal influence is puzzling."

"The number of neurons in a brain is less than the number of bits in a typical disc size of modern personal computers, and the computational speed of the brain is like the second hand on a clock, even slower than the first computer invented over 70 years ago," he continued. "In addition, the brain's learning rules are very complicated and remote from the principles of learning steps in current artificial intelligence algorithms," added Prof. Kanter, whose research team includes Herut Uzan, Shira Sardi, Amir Goldental and Roni Vardi.

Brain dynamics do not comply with a well-defined clock synchronized for all nerve cells, since the biological scheme has to cope with asynchronous inputs as physical reality unfolds. "When looking ahead one immediately observes a frame with multiple objects. For instance, while driving one observes cars, pedestrian crossings, and road signs, and can easily identify their temporal ordering and relative positions," said Prof. Kanter. "Biological hardware (learning rules) is designed to deal with asynchronous inputs and refine their relative information." In contrast, traditional artificial intelligence algorithms are based on synchronous inputs, so the relative timing of different inputs constituting the same frame is typically ignored.
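The release does not detail the algorithm itself, but the synchronous/asynchronous distinction can be made concrete with a toy sketch: a synchronous representation presents all objects of a frame at once, discarding their temporal order, while an asynchronous one preserves it. This is purely illustrative and not the authors' method; the function and variable names are invented for the example.

```python
def synchronous_code(events):
    """All objects in a frame presented at once: order is lost."""
    return frozenset(events)

def asynchronous_code(events):
    """Objects arrive one by one: temporal order is preserved."""
    return tuple(events)

# Two driving "frames" containing the same objects in different order
frame_a = ["car", "pedestrian crossing", "road sign"]
frame_b = ["road sign", "car", "pedestrian crossing"]

print(synchronous_code(frame_a) == synchronous_code(frame_b))    # True
print(asynchronous_code(frame_a) != asynchronous_code(frame_b))  # True
```

A synchronous learner sees the two frames as identical; an asynchronous one can exploit the ordering information, which is the distinction the researchers highlight.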

The new study demonstrates that ultrafast learning rates are surprisingly identical for small and large networks. Hence, say the researchers, "the disadvantage of the complicated brain's learning scheme is actually an advantage". Another important finding is that learning can occur without discrete learning steps, through self-adaptation to asynchronous inputs. This type of learning-without-learning occurs in the dendrites -- the branched extensions of each neuron -- as was recently experimentally observed. In addition, network dynamics under dendritic learning are governed by weak weights that were previously deemed insignificant.

The idea of efficient deep learning algorithms based on the brain's very slow dynamics offers an opportunity to implement a new class of advanced artificial intelligence on fast computers. It calls for reinitiating the bridge from neurobiology to artificial intelligence and, as the research group concludes, "Insights of fundamental principles of our brain have to be once again at the center of future artificial intelligence".

Credit: 
Bar-Ilan University

Research on cholera adds to understanding of the social life of bacteria

image: Strains of Vibrio cholerae (red) make elongated cells that become entangled and help short-term survival.

Image: 
Ben Wucher and Carey Nadell

HANOVER, N.H. - August 9, 2019 - Certain strains of cholera can change their shape in response to environmental conditions to aid their short-term survival, according to new research from Dartmouth College.

In the research, some strains of the bacterium Vibrio cholerae transformed themselves from small, comma-shaped cells to long filaments in nutrient-poor environments.

This strategy of changing cell shape supports the growth of bacterial communities and allows the pathogen to compete in environments with a quick turnover of surfaces on which to grow.

According to the study, the formation of the elongated cell shapes allows the rapid formation of communities of bacteria that bind to surfaces - known as biofilms - that are essential in turbulent nutrient environments. These formations come at the expense of being able to compete in the long term with biofilms made from smaller cells that pack together more tightly.

The finding adds to the understanding of how bacteria adapt to their environment.

"Bacteria are normally thought of as solitary organisms, but they are actually highly-social organisms that like to live in groups," said Carey Nadell, an assistant professor of biology at Dartmouth. "This research shows that we can relate cell structure to group behavior in new ways when looking at realistic environments."

When not inside a human host, V. cholerae grows on nutritious pieces of debris in aquatic environments. This debris, known as chitin, comes from the shells of arthropods like plankton and shrimp. Cholera cell growth on the chitin typically takes place in the form of biofilms featuring clusters of organisms.

In the research, strains of cholera were grown in sea water and then observed using 3D microscopy with the aid of fluorescent markers to make the bacteria visible. The researchers found that the elongated filaments become entangled, providing an advantage that allows the bacteria to quickly colonize nutrient-rich particles in sea water.

The research notes that the formation of the filament-like structure comes at the expense of longer-term competitive ability enjoyed by shorter cells that adhere more strongly to each other and to surfaces.

"This has important consequences for how cells survive in the environment. It shows how bacterial cell shape can be coupled to environmental success during surface occupation, competition within biofilms, and dispersal to new resource patches," said Nadell.

There are many strains of the cholera bacterium. Because the bacteria in the study were grown in sea water, the research does not directly lead to a greater understanding of how cholera acts within the human body.

The discovery of a new way that bacteria form groups on surfaces can, however, help researchers understand more about how bacteria act and associate.

"This kind of behavior is perhaps more widespread than we currently understand in the wild, and that variability in cell shape, like variability in animal body plans, could be a fundamental part of why some bacteria live in certain places but not others."

Predicting microbial community composition is a major frontier of modern microbiology and medicine, given the importance of microbiomes for health, industry, agriculture, and other applications.

Credit: 
Dartmouth College

These sharks glow in the dark thanks to a newly identified kind of marine biofluorescence

image: David Gruber with shark eye camera.

Image: 
David Gruber

In the depths of the sea, certain shark species transform the ocean's blue light into a bright green color that only other sharks can see--but how they biofluoresce has previously been unclear. In a study publishing August 8 in the journal iScience, researchers have identified what's responsible for the sharks' bright green hue: a previously unknown family of small-molecule metabolites. Not only is this mechanism of biofluorescence different from how most marine creatures glow, but it may also play other useful roles for the sharks, including helping them identify each other in the ocean and fight against microbial infections.

"Studying biofluorescence in the ocean is like a constantly evolving mystery novel, with new clues being provided as we move the research forward," says David Gruber, a professor at The City University of New York and co-corresponding author of the study. "After we first reported that swell sharks were biofluorescent, my collaborators and I decided to dive deeper into this topic. We wanted to learn more about what their biofluorescence might mean to them."

Gruber, working with Jason Crawford, a professor at Yale University and the study's co-corresponding author, focused on two species of sharks--the swell shark and the chain catshark. They noticed that the sharks' skin had two tones--light and dark--and extracted chemicals from the two skin types. What they found was a type of fluorescent molecule that was only present in the light skin.

"The exciting part of this study is the description of an entirely new form of marine biofluorescence from sharks--one that is based on brominated tryptophan-kynurenine small-molecule metabolites," Gruber says.

These types of small-molecule metabolites are known to be fluorescent and activate pathways similar to those that, in other vertebrates, play a role in the central nervous system and immune system. But in the sharks, the novel small-molecule fluorescent variants account for the biophysical and spectral properties of their lighter skin. This mechanism is different from animals in the upper ocean, such as jellyfish, that commonly use green fluorescent proteins as mechanisms to transform blue light into other colors, Gruber says.

"It's a completely different system for them to see each other that other animals cannot necessarily tap into. They have a completely different view of the world that they're in because of these biofluorescent properties that their skin exhibits and that their eyes can detect," Crawford says. "Imagine if I were bright green, but only you could see me as being bright green, but others could not."

The molecules also serve multiple other purposes, including to help the sharks identify each other in the ocean and potentially provide protection against microbial infections, Crawford says.

"It is also interesting that these biofluorescent molecules display antimicrobial properties. These catsharks live on the ocean bottom, yet we don't see any biofouling or growth, so this could help explain yet another amazing feature of shark skin," Gruber says. "This study opens new questions related to potential function of biofluorescence in central nervous system signaling, resilience to microbial infections, and photoprotection."

While the study focused on two shark species, Gruber and Crawford hope to more broadly explore the luminescent properties of marine animals, which can ultimately lead to the development of new imaging techniques.

"If you can harness the abilities that marine animals have to make light, you can generate molecular systems for imaging in the lab or in medicine. Imaging is an incredibly important biomedical objective that these types of systems could help to propel into the future," Crawford says.

"Sharks are wonderful animals that have been around for over 400 million years. Sharks continually fascinate humans, and they hold so many mysteries and superpowers," Gruber says. "This study highlights yet another mystery of sharks, and it is my hope that this inspires us to learn more about their secrets and work to better protect them."

Credit: 
The City University of New York

Smuggling route for cells protects DNA from parasites

image: Microscopy image of an entire fruit fly (Drosophila melanogaster; body outline in green) with a protein central to the smuggling route (Nxf3) shown in red.

Image: 
Daniel Reumann, IMBA

Our cells have safety mechanisms that keep genetic parasites - such as viruses and transposons - in check while important genes of the host cell can remain active. Researchers have now shown that the molecular safety mechanisms of the host cell "smuggle" genetic information molecules around the cell, which are then used to recognize and shut down the parasites.

While information for the production of our cells' proteins constitutes less than two percent of our DNA, two-thirds of our DNA consists of selfish genetic elements such as retroviruses and transposons and residues thereof. In fact, transposon sequences have benefited the adaptation of different species to new environments by imparting new regulatory elements to the genome. But unrestrained transposon proliferation makes the genome unstable and results in low fertility in Drosophila, mice, and humans.

Assistant Professor Peter Refsing Andersen has just started building his own group at the Department of Molecular Biology and Genetics at Aarhus University, after having worked as a postdoc in Vienna for five years. In Vienna, Peter studied the molecular mechanisms that keep transposons in check and thus ensure that intact DNA can be passed on to the next generation. Together with colleagues in Senior Scientist Julius Brennecke's group at the Vienna BioCenter, Peter has now found the answer to one of the big open questions in understanding how the defence mechanisms against transposons work.

The defence, which can shut down transposons, is guided by small RNA molecules, the so-called piRNAs. piRNAs are made in the cell from long RNA molecules that, after being produced within the cell nucleus, have to travel into the cytoplasm to specific piRNA production regions. However, the inherent problem is that the long RNA molecules -- according to the textbook dogmas of RNA transport -- should be locked inside the nucleus, as they lack all the molecular quality stamps that normally allow RNA to exit the nucleus.

Revealing new transport route for RNA

Through their work, Peter Refsing Andersen and his colleagues have found that the transport of the long RNA molecules for piRNA production takes place via an until now unknown RNA transport route. This molecular route breaks with several of the traditional dogmas of RNA transport and thereby "smuggles" RNA that cannot pass the normal quality control in the cell into the cytoplasm and even delivers the long RNA molecules directly to the piRNA production regions.

The study thus not only uncovers new insight into how animal genomes defend themselves against DNA parasites; it also reveals a glimpse of how cells sort, spatially distribute, and organize genetic information. Peter and his colleagues studied this problem in Drosophila, an ideal model system for this biology, which is expected to function in similar ways in humans.

It is precisely this perspective that Peter Refsing Andersen finds most interesting: "Although this important biology cannot currently be investigated in humans due to technical obstacles, we can explore the biological principles in model systems such as Drosophila. The framework of understanding we build here can in the future be combined with the enormous wave of genetic information the scientific community is now receiving from patients worldwide. Therefore, our work can help translate the billions of sequences into meaningful biological information that can benefit people in the longer term."

Credit: 
Aarhus University

This designer clothing lets users turn on electronics while turning away bacteria

image: Purdue waterproof, breathable and antibacterial self-powered clothing is based on omniphobic triboelectric nanogenerators.

Image: 
Ramses Martinez/Purdue University

WEST LAFAYETTE, Ind. - A new addition to your wardrobe may soon help you turn on the lights and music - while also keeping you fresh, dry, fashionable, clean and safe from the latest virus that's going around.

Purdue University researchers have developed a new fabric innovation that allows wearers to control electronic devices through clothing.

"It is the first time there is a technique capable to transform any existing cloth item or textile into a self-powered e-textile containing sensors, music players or simple illumination displays using simple embroidery without the need for expensive fabrication processes requiring complex steps or expensive equipment," said Ramses Martinez, an assistant professor in the School of Industrial Engineering and in the Weldon School of Biomedical Engineering in Purdue's College of Engineering.

The technology is featured in the July 25 edition of Advanced Functional Materials.

"For the first time, it is possible to fabricate textiles that can protect you from rain, stains, and bacteria while they harvest the energy of the user to power textile-based electronics," Martinez said. "These self-powered e-textiles also constitute an important advancement in the development of wearable machine-human interfaces, which now can be washed many times in a conventional washing machine without apparent degradation.

Martinez said the Purdue waterproof, breathable and antibacterial self-powered clothing is based on omniphobic triboelectric nanogenerators (RF-TENGs), which use simple embroidery and fluorinated molecules to embed small electronic components and turn a piece of clothing into a mechanism for powering devices. The Purdue team says the RF-TENG technology is like having a wearable remote control that also keeps odors, rain, stains and bacteria away from the user.

"While fashion has evolved significantly during the last centuries and has easily adopted recently developed high-performance materials, there are very few examples of clothes on the market that interact with the user," Martinez said. "Having an interface with a machine that we are constantly wearing sounds like the most convenient approach for a seamless communication with machines and the Internet of Things."

The technology is being patented through the Purdue Research Foundation Office of Technology Commercialization. The researchers are looking for partners to test and commercialize their technology.

Their work aligns with Purdue's Giant Leaps celebration of the university's global advancements in artificial intelligence and health as part of Purdue's 150th anniversary. It is one of the four themes of the yearlong celebration's Ideas Festival, designed to showcase Purdue as an intellectual center solving real-world issues.

Credit: 
Purdue University

Fungi living in cattail roots could improve our picture of ancient ecosystems

image: This is lead author Az Klymiuk in waders collecting cattail roots.

Image: 
Courtesy of Az Klymiuk, Field Museum

Paleobotanist Az Klymiuk didn't set out to upend science's understanding of the fossil record of plant-fungal associations. She just wanted to figure out the environment that some fossil plants lived in. That question led her to look at modern cattail roots and the fungi that live inside of them. She found that fungi have a harder time growing in cattail roots that are underwater. And that discovery, which is being published in the journal Mycologia, could change how we interpret some parts of the fossil record, including an important community of early land plants.

"I was studying fossil plants that are 48 million years old. In these fossils, we can see the plant cells and everything in them, including fungi. In describing these fossil fungi, I realized we actually don't know much about fungi that are living in wetland plants today," says Klymiuk, the Field Museum's collections manager of paleobotany and lead author of the Mycologia study. "There's been very little work done, next to nothing in terms of their ecology and distribution."

Fungi, which include mushrooms, molds, and yeasts, aren't plants--they're their own separate kind of organism, more closely related to animals than anything green. But almost all land plants have tiny fungi living inside their roots. These microscopic fungi grow tendrils that snake out into the soil, where they can break down dirt and absorb the element phosphorus that's present in the soil and rocks. The plant host then uses that phosphorus to create energy (bonus points if you remember from high school biology that the energy comes in the form of a molecule called adenosine triphosphate--ATP). Without the fungi, the plants have a much harder time pulling phosphorus from the soil. All this is to say, plants need the fungi living in their roots so that they can create the energy they need to survive. In addition to these "helpful" fungi, plants also host an enigmatic group of fungi called dark septate endophytes - the jury is still out on whether these fungi are weak parasites, latent pathogens, or helpful partners. What we do know is that most plants host them, including the fossil plants Klymiuk worked with.

Klymiuk was curious about the possible ecological roles of these fungi, but was also intrigued by why some fossil plants had lots of root fungi and others had very few. Since these fossil plants were from areas that had been swampy wetlands millions of years ago when the plants were alive, she decided to look for clues in modern wetland plants, which have a lot in common with the ancient ones: cattails.

To do this, she collected cattail roots from several wetlands, sampling plants from the shoreline all the way to the deepest point at which they grew. "I was out there in hip waders with my undergraduate assistants, and we were digging up cattails. You get in there with a shovel, and you just yank up this giant plant that's as tall as you," says Klymiuk, who began work on the project at the University of Kansas with co-author Benjamin Sikes. "Memorably, in one reservoir, I had literally just told Abby [my undergraduate] to watch out because there was a sharp drop-off, and what'd I do? Ker-splash, right over. It was our last site that day, so thankfully we were close to the lab, which meant I could at least do a fast wardrobe change before we did sample prep."

Back in the lab, she examined the roots under a microscope and compared the fungi in cattail roots that had been growing at different depths underwater. She found that, at least in these cattails, the fungi don't tolerate inundation at all. It didn't matter if they were growing in two inches of water or four feet--cattails growing in water had very few fungi in their roots. Up on dry land, however, cattails had plenty of fungi living in their roots, comparable to grass roots Klymiuk collected alongside the cattails. "It turns out that any degree of flooding whatsoever massively suppresses the amount of fungi in plant roots," says Klymiuk.

Since the number of fungi living in plant roots appears to indicate whether the plants are growing fully in water or up on the shoreline, scientists looking at fungi in fossil plant roots can make a better guess about what environment those plants lived in. "This gives us a new way to understand what we're seeing in the fossils," says Klymiuk. "I have some roots where fungi are completely absent. I can look and look and look and there's nothing there. I have other roots that are just loaded, packed. To me, that indicates that we're dealing with different levels of inundation. I feel pretty confident saying that when we find a lot of fungus in a plant root, that root was probably not inundated during its life."

As well as giving scientists a new tool for figuring out what some prehistoric ecosystems were like, this study also suggests that some of our understanding of the fossil record of plant-fungal interactions might need recalibrating. "There's been this pervasive narrative, all across biology," Klymiuk says. "The basic idea is that plants needed fungi to get out of water, to get onto land. The oldest preserved community of land plants, the 407 million-year-old Rhynie Chert, is often cited as fossil evidence for this. This community of land plants has mostly been interpreted as a wetland assemblage associated with hot spring overflow or outwash, and many of these early land plants hosted mutualistic fungi, just like most living plants do today. What my research suggests, is that these plants were probably victims of intermittent flooding, as opposed to living right in the water."

So did plants need fungal helpers to get onto land? "I don't disagree that they found fungal partnerships extremely useful once they were on land, but I don't think proto-plants would have needed to bother with them if they were in water, because phosphorus is readily bioavailable in water," says Klymiuk. "My personal feeling is that the earliest land plants were probably never in water to begin with. There are lots of groups of green algae that are fully terrestrial. I think it's very possible that land plants evolved from fully terrestrial green algae, and that water was a secondarily-invaded habitat. Stay tuned, there's a lot of really cool research emerging on this question."

Klymiuk's work doesn't just speak to the past, but also sheds light on future life on Earth. "It's important for us to understand these relationships because so many of our crops and forests are under stress due to climate change," says Klymiuk. "Not only are we dealing with more flooding, and of longer duration (and more droughts, and more everything), but it's very probable that the way in which plant-fungal interactions work will change as we continue to move into a higher CO2 world -- we know that some groups of fungi are 'better roommates' under low CO2 than high; they pay their rent on time. Increase the CO2 and suddenly some of them are falling behind on phosphorus-as-rent, or ordering pizza on their host's credit card (using excess photosynthates at cost to the host plant). I can draw this analogy out in twelve different ways, and it will always end with 'we don't know enough about how these systems function in order to generalize or predict anything with confidence yet.' It's still seriously understudied."

Beyond the potential applications of this work, Klymiuk says she's excited about the project for the sake of discovery. "There's this mysterious, microscopic world that we don't usually think about. I'm genuinely fascinated by plants and their evolution, because so much of paleobotany is detective work. I just get excited by putting these puzzles back together and chipping away at some fundamental mysteries."

Credit: 
Field Museum

Too much coffee raises the odds of triggering a migraine headache

image: Association between servings of caffeinated beverages compared to none and occurrence of migraine on the same day among 98 participants with episodic migraines followed for 6 weeks.

Image: 
<em>The American Journal of Medicine</em>

Philadelphia, August 8, 2019 - Drinking three or more servings of caffeinated beverages a day is associated with the onset of a headache on that or the following day in patients with episodic migraine, according to a new study in The American Journal of Medicine, published by Elsevier. Results are consistent even after accounting for daily changes in alcohol intake, stress, sleep, physical activity, and menstruation, although there was some variation evident with oral contraception use.

"Based on our study, drinking one or two caffeinated beverages in a day does not appear to be linked to developing a migraine headache, however, three or more servings may be associated with a higher odds of developing a headache," noted lead investigator Elizabeth Mostofsky, ScD, Cardiovascular Epidemiology Research Unit, Beth Israel Deaconess Medical Center, and Department of Epidemiology, Harvard T.H. Chan School of Public Health, Boston, MA, USA.

Migraine is a disabling primary headache disorder affecting approximately 1.04 billion adults worldwide and representing the most common pain condition causing lost productivity and significant direct and indirect costs. Despite widespread anecdotal belief that caffeinated beverages may trigger migraine headaches and relieve headaches once they have begun, there is limited scientific evidence to assess the potential association between changes in daily intake and the onset of headaches after accounting for other changes in lifestyle such as physical activity and anxiety. Common anecdotal evidence also suggests that migraine can be immediately triggered by weather or lifestyle factors, such as sleep disturbance and skipping meals.

Approximately 87 percent of Americans consume caffeine daily, with an average intake of 193 mg per day. Whereas some behavioral and environmental factors may only have potential harmful effects on migraine risk, the role of caffeine is particularly complex because the impact depends on dose and frequency. It may trigger an attack, but also has an analgesic effect.

Investigators analyzed data from 98 adults who suffer from episodic migraines. Participants completed electronic diaries twice a day for six weeks reporting on their caffeinated beverage intake, other lifestyle factors, and the timing and characteristics of each migraine headache. The study compared each participant's incidence of migraines on days they consumed caffeinated beverages to the incidence of migraines on days they did not. Baseline data had indicated that participants typically experienced an average of five headaches per month; 66 percent of them usually consumed one to two servings of caffeinated beverages daily, and 12 percent consumed three or more. During the six-week study period in 2016-17, participants experienced an average of 8.4 headaches. All reported having caffeinated beverages on at least one day during the study, with an average of 7.9 servings per week.

"To date, there have been few prospective studies on the immediate risk of migraine headaches with daily changes in caffeinated beverage intake. Our study was unique in that we captured detailed daily information on caffeine, headache, and other factors of interest for six weeks," commented Suzanne M. Bertisch, MD, MPH, principal investigator of the study, of the Division of Sleep and Circadian Disorders, Brigham and Women's Hospital and Harvard Medical School, Boston, MA, USA.

These findings suggest that the impact of caffeinated beverages on headache risk was only apparent for three or more servings on that day, and that patients with episodic migraine did not experience a higher risk of migraine when consuming one to two caffeinated beverages per day. Additional research is needed to examine the potential effect of caffeine on symptom onset in the subsequent hours and the interplay of sleep, caffeine, anxiety, environmental factors, and migraine.

Credit: 
Elsevier

Fluoride may diminish kidney and liver function in adolescents, study suggests

(New York, NY - August 8, 2019) -- Fluoride exposure may lead to a reduction in kidney and liver function among adolescents, according to a study published by Mount Sinai researchers in Environment International in August.

The study examined the relationship between fluoride levels in drinking water and blood with kidney and liver health among adolescents participating in the National Health and Nutrition Examination Survey, a group of studies that assess health and nutritional well-being in the United States. The findings showed that exposure to fluoride may contribute to complex changes in kidney and liver function among youth in the United States, where 74 percent of public water systems add fluoride for dental health benefits. Fluoridated water is the main source of fluoride exposure in the U.S. The findings also suggest that adolescents with poorer kidney or liver function may absorb more fluoride in their bodies.

While fluoride exposure in animals and adults has been associated with kidney and liver toxicity, this study examined potential effects of chronic low-level exposure among youth. This is important to study because a child's body excretes only 45 percent of fluoride in urine via the kidneys, while an adult's body clears it at a rate of 60 percent, and the kidneys accumulate more fluoride than any other organ in the body.

"While the dental benefits of fluoride are widely established, recent concerns have been raised regarding the appropriateness of its widespread addition to drinking water or salt in North America," said the study's first author Ashley J. Malin, PhD, postdoctoral fellow in the Department of Environmental Medicine and Public Health at the Icahn School of Medicine at Mount Sinai. "This study's findings suggest that there may be potential kidney and liver health concerns to consider when evaluating fluoride use and appropriate levels in public health interventions. Prospective studies are needed to examine the impact of chronic low-level fluoride exposure on kidney and liver function in the U.S. population."

The study analyzed fluoride measured in blood samples of 1,983 adolescents and the fluoride content of the tap water in the homes of 1,742 adolescents. Although the tap water fluoride concentrations were generally low, there are several mechanisms by which even low levels of fluoride exposure may contribute to kidney or liver dysfunction.

This study's findings, combined with previous studies of childhood exposure to higher fluoride levels, show there is a dose-dependent relationship between fluoride and indicators of kidney and liver function. The findings, if confirmed in other studies, suggest it may be important to consider children's kidney and liver function in drafting public health guidelines and recommendations.

Potential health side effects include renal system damage, liver damage, thyroid dysfunction, bone and tooth disease, and impaired protein metabolism.

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

Blood clotting factors may help fight multi-drug resistant superbugs

Coagulation factors, which are involved in blood clotting after injury, may offer new strategies for fighting multidrug-resistant bacteria, according to a study published in Cell Research.

Infections caused by these bacteria pose an urgent public health risk, as effective drugs to combat them are lacking. A deficiency in blood coagulation factors - for example in patients with the blood clotting disorder haemophilia - has been associated with bacterial infectious diseases such as sepsis and pneumonia, leading to the suggestion that these coagulation factors may have a role in anti-infection mechanisms as well as blood clotting.

Now a group of researchers at Sichuan University, China, has shown that the factors VII, IX, and X - which are well known for their roles in blood coagulation - may act against Gram-negative bacteria, including extensively drug-resistant pathogens such as Pseudomonas aeruginosa and Acinetobacter baumannii. Both bacteria were recently listed by the World Health Organisation among 12 bacteria that pose the greatest threat to human health because of their antibiotic resistance. Gram-negative bacteria are characterized by their cell envelopes, which are composed of an inner cell membrane, a thin cell wall and an outer membrane that make them harder to kill.

Xu Song, the corresponding author said: "In our study, we report a class of human antimicrobial proteins effective against some drug-resistant 'superbugs'. Unlike many antibacterial agents that target the cell metabolism or cytoplasmic membrane, these proteins act by breaking down the lipopolysaccharides of the bacterial outer membrane through hydrolysis. Lipopolysaccharides are crucial for the survival of Gram-negative bacteria."

The ability of blood coagulation factors to hydrolyse essential lipopolysaccharides in the bacterial cell envelope suggests they may potentially be used to combat Gram-negative bacteria. Examining the mechanism further, the authors showed that the coagulation factors act on the bacteria via light chains - one of two domains of the proteins. The other domains (heavy chains) have no effect. In the laboratory, the authors showed that treatment of E. coli cells with light chains led to clearly observable damage to the bacterial cell envelope initially and almost complete destruction of the cell within four hours.

The authors found that the light chain of coagulation factor VII was effective against all Gram-negative bacterial cells tested. The light chains, as well as the coagulation factors as a whole, were also shown to be effective in combating Pseudomonas aeruginosa and Acinetobacter baumannii infections in mice. Heavy chains had no effect.

Xu Song said: "None of the known antibacterial agents has been reported to function by hydrolyzing lipopolysaccharides. Identification of the lipopolysaccharide hydrolysis-based antibacterial mechanism, combined with antibacterial features of blood coagulation factors, and the ability to manufacture them on a large scale at relatively low-cost, may offer new and cost-effective strategies for combating the urgent public-health crisis posed by drug-resistant Gram-negative pathogens."

Credit: 
Springer

The Lancet Global Health: Automatically chlorinating water at public taps cuts child diarrhoea by almost a quarter in urban Bangladesh

image: Infographic showing how water treatment device works.

Image: 
<em>The Lancet Global Health</em>

A novel water treatment device that delivers chlorine automatically via public taps without the need for electricity reduced child diarrhoea by 23% compared with controls (156 cases out of 2,073 child measurements [7.5%] vs 216/2,145 [10%]) over 14 months in two urban neighbourhoods of Bangladesh, according to a randomised trial following more than 1,000 children published in The Lancet Global Health journal.

Clean water is still a major problem in poor urban communities in low-income countries, where contamination by bacteria can lead to high rates of diarrhoeal diseases such as cholera and typhoid, harming children's health and growth. Worldwide, an estimated one billion people who have access to piped water are drinking water that does not meet international safety standards.

Most previous research has focused on household-level water treatment interventions that require people to calculate the correct dosage and add their own chlorine daily--but these have had low uptake and failed to reduce diarrhoea, partly because they deliver a chlorine dose that makes chlorinated water taste and smell unpleasant.

In this study, the device used a low chlorine dose which increased taste acceptability and achieved high uptake while still improving drinking water quality.

"Chlorination is one of the cheapest and most widely available methods to make drinking water safe, but poor taste and bad smell of chlorinated water are major barriers to adoption," explains co-author Dr Sonia Sultana from icddr,b (International Centre for Diarrhoeal Diseases Research, Bangladesh). "Our findings indicate that automated chlorine dosing below the taste detection threshold has the potential to be transformative by ensuring high adoption rates and will hopefully help progress towards the global target of universal access to safe and affordable drinking water." [1]

Although this was one of the first field trials of the new technology, the authors say that chlorinating water at the point of collection could be an effective, scalable strategy in low-income urban settings to reduce diarrhoeal diseases. More research will be needed to determine where this technology should be implemented to maximise health benefits, as the intervention was more effective in Bangladesh's capital, Dhaka, than in Tongi on the outskirts of the city.

"This novel, low-cost technology requires no behaviour change or effort by users--safe water comes straight out of the tap," says Dr Amy Pickering from Tufts University, USA who led the research. "This point-of-collection approach to water treatment could be a transformative strategy for reducing gastrointestinal disease burden in low-income urban communities. We are now expanding the project to roadside water stands in Kenya, and working on a business model that could work in other countries." [1]

In this study, researchers used a novel treatment device that automatically dispenses small amounts of chlorine to water from public taps and shared hand pumps. In the device, water flows past solid tablets of chlorine which dissolve into the water to treat it (see infographic).

Identical dispensers were installed at 100 shared water points in two low-income neighbourhoods in Bangladesh (Dhaka and Tongi) fed by piped water that is delivered intermittently, as is common in low-income settings. Water points were randomly assigned to have their drinking water automatically chlorinated (intervention) or to be treated with vitamin C (control group).

Between July 2015 and November 2015, 920 households with at least one child under the age of 5 years were assigned to the chlorine treatment (50 water points; 517 children) or control groups (50 water points; 519 children). Because of high migration, children could transfer into or out of the shared water points.

Every 2-3 months during the 14-month follow up period, caregiver-reported child diarrhoea (3 or more loose or watery stools in 24 hours) was measured alongside household and tap water quality (microbes, taste, smell), child weight, acute respiratory illness, and the presence of sufficient chlorine residual to prevent recontamination by dirty containers, utensils, or hands.

Before the trial began, the authors identified the concentration of chlorine that would be below the taste detection threshold for most residents, ensuring participants would not know which study group they were in, and would not be put off by the taste of chlorine [2]. Blinding was largely successful during the trial, with most participants unable to accurately guess which intervention they had received. Nevertheless, nine communal water points in the intervention group were uninstalled, primarily due to individual complaints about the smell and taste of chlorinated water.

Chlorine residual was detected at the point of collection from shared taps 83% of the time in the treatment group compared to 0% of the time in the control group (table 3). E. coli contamination was detected in 15% of tap samples in the treatment group compared with 64% in the control group.

Results showed that, over 14 months, children in the treatment group had substantially less diarrhoea than those in the control group (156 cases out of 2,073 child observations [7.5%] vs 216/2,145 [10%]).
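The headline reduction can be checked directly from the case counts reported above (a quick back-of-envelope using only the figures quoted in this article; the published 23% presumably reflects the trial's adjusted analysis, while the crude figure comes out slightly higher):

```python
# Diarrhoea prevalence in each study arm, from the reported counts
treatment = 156 / 2073   # ~7.5% of child observations
control = 216 / 2145     # ~10.1% of child observations

# Relative reduction: 1 - (treatment prevalence / control prevalence)
reduction = 1 - treatment / control
print(f"{reduction:.0%}")  # crude relative reduction, ~25%
```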

Importantly, the intervention had the largest health benefits among children in Dhaka, reducing diarrhoea by 34% compared to 7% in Tongi. The authors speculate that this variation in effect probably resulted from the poorer water quality in Dhaka at the start of the study (e.g., 87% of tap samples in Dhaka were contaminated with E. coli compared with 50% in Tongi) and because Dhaka receives water that spends much longer travelling through unpressurised pipes, enabling contamination and sewage to seep into the system.

Compared to the control group, caregivers in the treatment group were significantly less likely to report seeking treatment for gastrointestinal illness for their child (260 cases of treatment sought out of 3062 child observations [8.5%] vs 382/3142 [12.2%]); spent less on illness-related treatment; and reported lower consumption of antibiotics by their children (table 2). Respiratory illness and differences in child weight and growth were similar between the groups.

Despite these achievements, the study has some limitations, including that diarrhoea episodes were based on caregiver-reported data, which might not accurately represent children's illness; and that participants may have drunk water from other sources, although less than 4% of respondents reported doing so.

Discussing the implications of the findings in a linked Comment, Dr Jean Humphrey from Johns Hopkins Bloomberg School of Public Health, USA, writes that the intervention will not work in all situations: "The device is compatible with only a specific type of water system--in the study area this was one third of the water taps...[and] although 1.3 billion people have gained access to piped water since 2000, 2.9 billion (38%) of the global population still do not have any kind of access to piped water...There is certainly no one-size-fits-all strategy for providing access to clean water throughout the world. However, the intervention reported in this paper is both specifically and conceptually an important step forward: this study shows that by removing the requirement of user behaviour change and slightly compromising effectiveness to achieve high uptake, a simple technology can have substantial public health benefit."

Credit: 
The Lancet

Leaping larvae! How do they do that without legs?

image: A scanning electron microscope image shows the 1-micron projections on the adhesive patches of a leaping gall midge larva. Researchers aren't sure yet what makes them so sticky.

Image: 
Grace Farley, Duke University

DURHAM, NC -- Attaching its head to its tail to form a ring, a 3-millimeter larva of the goldenrod gall midge squeezes some internal fluids into its tail section, swelling it and raising the pressure like an inner tube.

When the adhesive bond between the head and tail can no longer hold, the tension is sprung, launching the worm into a high, tumbling flight that will carry it 20 to 30 body-lengths away in a tenth of a second at speeds comparable to a jumping insect with actual legs.
The direction of flight is somewhat random and the worm-like larva bounces a bit on landing, but it's apparently none the worse for wear. Still, as locomotion choices go, it seems a little reckless.

But this "hydrostatic legless jumping," as it's known by a team of Duke researchers who studied the launches with ultra-high-speed cameras, is about 28 times more energy efficient (and a heck of a lot faster) than crawling like a regular old caterpillar.
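The speeds involved can be sanity-checked with simple arithmetic (a rough back-of-envelope; the 3-millimeter body length, 20-30 body-length range, and tenth-of-a-second flight time are the figures quoted above):

```python
body_length_m = 0.003    # 3 mm larva
flight_time_s = 0.1      # a tenth of a second per jump

for n in (20, 30):       # 20-30 body lengths per jump
    distance_m = n * body_length_m
    speed = distance_m / flight_time_s  # ~0.6-0.9 m/s average
    print(f"{n} body lengths -> {distance_m * 100:.0f} cm "
          f"in 0.1 s, ~{speed:.1f} m/s average")
```

Several centimeters in a tenth of a second is indeed in the range of small jumping insects, despite the larva having no legs at all.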

Their analysis of the remarkable leaping larvae appears Aug. 8 in the Journal of Experimental Biology.

What's new isn't the realization that legless larvae can leap. The behavior has been identified many times in the literature for more than 50 years, said Duke professor of biology Sheila Patek, whose lab led the analysis. The wonder here is in the details, which were captured with a 20,000-frames-per-second video camera and scanning electron microscopes.

"Sometimes they fall over and don't go very far," said Patek lab manager Grace Farley, who spent countless hours trying to keep the restless worms in focus and in the frame before they launched. Almost every chaotic flight travelled far enough to leave the camera's field of view.

What Farley learned from all those jumps is that there is a hinge in the worm's body about a third of the way from the tail that makes that lower portion what they call a "transient leg" to deliver the thrust to the surface.

While other ring-forming worms seem to use stiff appendages called pegs and mouthparts to create a firm latch between head and tail, the gall midge larva just has some sticky patches of skin that do the trick.

On close examination under an electron microscope, the sticky bits turn out to be rows of little finger-like scales, just 1 micron across, that are quite similar to the sticky pads found on a gecko's feet.

Farley said it's not clear yet whether these scales interlock with each other somehow or whether they adhere merely from the van der Waals effect, the weak electromagnetic attraction between atoms put into close proximity, which is how the gecko walks on window panes.

The adhesive patches appear to be similar to the "head-arresting system" that helps damselflies and dragonflies lock their heads in place, Patek said. But it's also possible the gall-midge larvae secrete some sort of fluid on the pads. They don't know the details yet.

In a way, it's a wonder these animal mechanics researchers even found this worm. It is one of several dozen species of gall midges that feed within the tissues of a hundred different species of goldenrods. This bright orange worm, a member of the Asphondylia genus that hasn't even been formally named and described by science yet, is partial to the silverrod, Solidago bicolor, a white-flowered species of goldenrod.

"They're really small and inconspicuous, so not a lot of people study them," said Michael Wise, a Roanoke College biologist who is one of the people who does indeed study goldenrods and the midges that love them.

It was Wise, a former graduate school classmate of Patek at Duke, who somewhat inadvertently started the project.

Having carefully collected several specimens of goldenrod galls in the Virginia mountains two Augusts ago, Wise was cutting open the swollen portions of the plants under a microscope to extract the little orange worm within each capsule.

"After dissecting about a dozen galls, I looked in the petri dish and there were only two larvae in the dish," Wise said. "They were jumping all over the office!"

He knew Patek had this high speed camera for her science on jumping, snapping and punching creatures and suggested they ought to take a look.

"So, we just decided to film them for fun," Patek said. "Then we realized, this might actually be an interesting new field."

The latching mechanism formed by "adhesive microhairs" between each segment of the worm is apparently new, and the calculations about how much more efficient jumping is than crawling may be of interest to the field of soft robots, Patek said. This work also fits with her larger inquiries about the spectacular accelerations achieved by fleas, ants, mantis shrimp and other creatures that tend to use a spring and latch mechanism rather than muscle power to achieve amazing feats.

Closely related species of this gall midge worm are known to jump from their home plants to find places to burrow into the ground and pupate. But this particular worm never leaves the gall - it pupates right there and emerges as a fully formed flying midge. Why would it even need to leap?

Perhaps it's a leftover skill from some earlier evolution of the worm, Wise suggests. Or perhaps it's to avoid predators and curious biologists.

Credit: 
Duke University

Fighting child diarrhea

image: A woman pumps water from a shared community tap in Dhaka, Bangladesh.

Image: 
GMB Akash

It kills a child under 5 every minute on average. Diarrheal disease, the second leading cause of death for children globally, could become even more difficult to control as poor urban areas with limited clean water access expand. An international team of researchers led by a Stanford epidemiologist finds reason for hope in a low-cost water treatment device that reduces rates of diarrhea in children, provides good-tasting water and avoids the need for in-home treatment - improvements over other purification strategies that could significantly increase uptake. Their results were published Aug. 8 in The Lancet Global Health.

In developing countries, few cities are able to maintain fully pressurized water systems that consistently pump water around the clock. Even if it is safe at the source, water in these systems is at risk of becoming contaminated while sitting in pipes. About 1 billion people who access water via piped systems receive water that does not meet international standards for safety.

"Group level water treatment among people who share a water supply removes the individual burden on households to treat their own water," said study senior author Stephen Luby, a professor of medicine in the Division of Infectious Diseases and Geographic Medicine at the Stanford School of Medicine. "So, it offers the prospect for extending safe drinking water to vulnerable slum residents globally."

Bad taste

Chlorination is one of the cheapest and most widely available ways of disinfecting water, but the chemical's taste and odor are significant barriers for many people. Also, most available water treatment devices have been intended for use in the home, generally after collection at a community tap, and therefore require a change in behavior. These barriers have prevented many people from accessing safe water.

"The study demonstrated that this simple, electricity-independent technology could be transformative in scaling up water treatment in slums and reducing child diarrhea, without requiring people to do anything differently when they collect their drinking water," said study lead author Amy Pickering, an assistant professor of civil and environmental engineering at Tufts University who received her PhD at Stanford where she also worked as a postdoctoral scholar.

Working in two poor communities of Dhaka, Bangladesh, the researchers tested a way of treating water, called Aquatabs Flo, that works at community pumps rather than in the home. It requires no electricity and automatically doses a precise amount of chlorine into water as it flows through the device. The chlorine lasts long enough to protect water stored in containers against recontamination.

To avoid bad-tasting water, the researchers polled Dhaka residents to find out how much chlorine could remain in the water without being objectionable. Then, they set the chlorine dosers to deliver low levels of chlorine for the first few months so people would get used to the taste. Later, they raised the dose to a level that purified the water effectively but remained acceptable taste-wise. The treated water was more than four times less likely to contain E. coli, a bacterium that indicates sewage contamination.

The researchers tested the device by having it deliver chlorine in some communities and vitamin C in others. Out of 1,000 children, the ones who received the chlorinated water had 23 percent lower rates of diarrhea. While the result may seem obvious, previous studies had been ambiguous, either because people didn't consistently use the household chlorination systems being tested or because the studies lacked comparison communities without water treatment.

The device was particularly effective in children living in an urban setting, which the researchers suggest could be due to a few different causes. One is that water in urban settings often spends more time in unpressurized pipes. Also, before the chlorine treatment, nearly 90 percent of taps in that setting were contaminated with E. coli, almost twice the rate of the more rural study area. Finally, the two locations contained different pathogens, some of which could be resistant to chlorine. In any case, the results suggest that a device delivering a precise low dose of chlorine can purify water while leaving it tasting good enough to drink.

Looking toward a safe water future

The study was an outgrowth of the Lotus Water project - a joint effort led by Stanford's Program on Water, Health and Development - which received early funding from the Stanford Woods Institute for the Environment. The project aims to provide water disinfection services through a business model that relies on monthly payments from landlords, who typically own shared water points in Dhaka. The team's earlier research indicates that slum residents are willing to pay higher rents in exchange for higher-quality water. Linking the device's lease to service payments would hold landlords accountable to their tenants.

Although Aquatabs Flo is currently only compatible with water points connected to storage tanks, Tufts and Stanford are collaborating with an industry partner to commercialize a chlorine doser compatible with any tap.

Credit: 
Stanford University