Culture

The brain inspires a new type of artificial intelligence

image: Processing an event with multiple objects. A synchronous input where all objects are presented simultaneously to a computer (left), versus an asynchronous input where objects are presented with temporal order to the brain (right).

Image: 
Prof. Ido Kanter

Machine learning, introduced 70 years ago, is based on evidence of the dynamics of learning in our brain. Using the speed of modern computers and large data sets, deep learning algorithms have recently produced results comparable to those of human experts in various applied fields, but with characteristics that are distant from current knowledge of learning in neuroscience.

Using advanced experiments on neuronal cultures and large-scale simulations, a group of scientists at Bar-Ilan University in Israel has demonstrated a new type of ultrafast artificial intelligence algorithm -- based on the very slow dynamics of the brain -- that outperforms the learning rates achieved to date by state-of-the-art learning algorithms.

In an article published today in the journal Scientific Reports, the researchers rebuild the bridge between neuroscience and advanced artificial intelligence algorithms -- a bridge that has lain virtually unused for almost 70 years.

"The current scientific and technological viewpoint is that neurobiology and machine learning are two distinct disciplines that advanced independently," said the study's lead author, Prof. Ido Kanter, of Bar-Ilan University's Department of Physics and Gonda (Goldschmied) Multidisciplinary Brain Research Center. "The absence of expectedly reciprocal influence is puzzling."

"The number of neurons in a brain is smaller than the number of bits on a typical hard disk of a modern personal computer, and the computational speed of the brain is like the second hand on a clock, even slower than the first computer invented over 70 years ago," he continued. "In addition, the brain's learning rules are very complicated and remote from the principles of learning steps in current artificial intelligence algorithms," added Prof. Kanter, whose research team includes Herut Uzan, Shira Sardi, Amir Goldental and Roni Vardi.

Brain dynamics do not comply with a well-defined clock synchronized for all nerve cells, since the biological scheme has to cope with asynchronous inputs as physical reality unfolds. "When looking ahead one immediately observes a frame with multiple objects. For instance, while driving one observes cars, pedestrian crossings, and road signs, and can easily identify their temporal ordering and relative positions," said Prof. Kanter. "Biological hardware (learning rules) is designed to deal with asynchronous inputs and refine their relative information." In contrast, traditional artificial intelligence algorithms are based on synchronous inputs, so the relative timing of the different inputs constituting the same frame is typically ignored.
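The distinction drawn here can be sketched in a toy example (purely illustrative, not the authors' model): a synchronous input hands a learner all of a frame's objects at once, discarding their order, while an asynchronous input delivers the same objects as time-stamped events whose relative timing carries information.

```python
# Toy contrast between synchronous and asynchronous representations of one
# "frame" of objects. Names and timings are illustrative assumptions.

def synchronous_frame(objects):
    """All objects arrive simultaneously; their temporal order is discarded."""
    return {"t": 0.0, "objects": sorted(objects)}

def asynchronous_stream(objects, dt=0.01):
    """Objects arrive one by one; relative timing is part of the signal."""
    return [(round(i * dt, 4), obj) for i, obj in enumerate(objects)]

frame = ["car", "pedestrian_crossing", "road_sign"]
sync = synchronous_frame(frame)        # one simultaneous bundle
async_events = asynchronous_stream(frame)  # ordered (time, object) events
```

In the synchronous bundle any ordering information is gone, while the asynchronous stream preserves which object preceded which, which is the extra signal the brain-inspired scheme is said to exploit.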

The new study demonstrates that ultrafast learning rates are surprisingly identical for small and large networks. Hence, say the researchers, "the disadvantage of the complicated brain's learning scheme is actually an advantage". Another important finding is that learning can occur without learning steps, through self-adaptation to asynchronous inputs. This type of learning-without-learning occurs in the dendrites, the branching terminals of each neuron, as was recently observed experimentally. In addition, network dynamics under dendritic learning are governed by weak weights that were previously deemed insignificant.

The idea of efficient deep learning algorithms based on the brain's very slow dynamics offers an opportunity to implement a new class of advanced artificial intelligence on fast computers. It calls for the reinitiation of the bridge from neurobiology to artificial intelligence and, as the research group concludes, "Insights of fundamental principles of our brain have to be once again at the center of future artificial intelligence".

Credit: 
Bar-Ilan University

Research on cholera adds to understanding of the social life of bacteria

image: Strains of Vibrio cholerae (red) make elongated cells that become entangled and help short-term survival.

Image: 
Ben Wucher and Carey Nadell

HANOVER, N.H. - August 9, 2019 - Certain strains of cholera can change their shape in response to environmental conditions to aid their short-term survival, according to new research from Dartmouth College.

In the research, some strains of the bacterium Vibrio cholerae transformed themselves from small, comma-shaped cells to long filaments in nutrient-poor environments.

This strategy of changing cell shape supports the growth of bacterial communities and allows the pathogen to compete in environments with a quick turnover of surfaces on which to grow.

According to the study, the formation of the elongated cell shapes allows the rapid formation of communities of bacteria that bind to surfaces - known as biofilms - that are essential in turbulent nutrient environments. These formations come at the expense of being able to compete in the long term with biofilms made from smaller cells that pack together more tightly.

The finding adds to the understanding of how bacteria adapt to their environment.

"Bacteria are normally thought of as solitary organisms, but they are actually highly social organisms that like to live in groups," said Carey Nadell, an assistant professor of biology at Dartmouth. "This research shows that we can relate cell structure to group behavior in new ways when looking at realistic environments."

When not inside a human host, V. cholerae grows on nutritious pieces of debris in aquatic environments. This debris, known as chitin, comes from the shells of arthropods such as planktonic crustaceans and shrimp. Cholera cell growth on the chitin typically takes place in the form of biofilms featuring clusters of organisms.

In the research, strains of cholera were grown in sea water and then observed using 3D microscopy with the aid of fluorescent markers to make the bacteria visible. The researchers found that the elongated filaments become entangled, providing an advantage that allows the bacteria to quickly colonize nutrient-rich particles in sea water.

The research notes that the formation of the filament-like structure comes at the expense of longer-term competitive ability enjoyed by shorter cells that adhere more strongly to each other and to surfaces.

"This has important consequences for how cells survive in the environment. It shows how bacterial cell shape can be coupled to environmental success during surface occupation, competition within biofilms, and dispersal to new resource patches," said Nadell.

There are many strains of the cholera bacterium. Because the bacteria in the study were grown in sea water, the research does not directly lead to a greater understanding of how cholera acts within the human body.

The discovery of a new way that bacteria form groups on surfaces can, however, help researchers understand more about how bacteria act and associate.

"This kind of behavior is perhaps more widespread than we currently understand in the wild, and that variability in cell shape, like variability in animal body plans, could be a fundamental part of why some bacteria live in certain places but not others."

Predicting microbial community composition is a major frontier of modern microbiology and medicine, given the importance of microbiomes for health, industry, agriculture, and other applications.

Credit: 
Dartmouth College

These sharks glow in the dark thanks to a newly identified kind of marine biofluorescence

image: David Gruber with shark eye camera.

Image: 
David Gruber

In the depths of the sea, certain shark species transform the ocean's blue light into a bright green color that only other sharks can see--but how they biofluoresce has previously been unclear. In a study publishing August 8 in the journal iScience, researchers have identified what's responsible for the sharks' bright green hue: a previously unknown family of small-molecule metabolites. Not only is this mechanism of biofluorescence different from how most marine creatures glow, but it may also play other useful roles for the sharks, including helping them identify each other in the ocean and fight against microbial infections.

"Studying biofluorescence in the ocean is like a constantly evolving mystery novel, with new clues being provided as we move the research forward," says David Gruber, a professor at The City University of New York and co-corresponding author of the study. "After we first reported that swell sharks were biofluorescent, my collaborators and I decided to dive deeper into this topic. We wanted to learn more about what their biofluorescence might mean to them."

Gruber, working with Jason Crawford, a professor at Yale University and the study's co-corresponding author, focused on two species of sharks--the swell shark and the chain catshark. They noticed that the sharks' skin had two tones--light and dark--and extracted chemicals from the two skin types. What they found was a type of fluorescent molecule that was only present in the light skin.

"The exciting part of this study is the description of an entirely new form of marine biofluorescence from sharks--one that is based on brominated tryptophan-kynurenine small-molecule metabolites," Gruber says.

These types of small-molecule metabolites are known to be fluorescent and activate pathways similar to those that, in other vertebrates, play a role in the central nervous system and immune system. But in the sharks, the novel small-molecule fluorescent variants account for the biophysical and spectral properties of their lighter skin. This mechanism is different from animals in the upper ocean, such as jellyfish, that commonly use green fluorescent proteins as mechanisms to transform blue light into other colors, Gruber says.

"It's a completely different system for them to see each other that other animals cannot necessarily tap into. They have a completely different view of the world that they're in because of these biofluorescent properties that their skin exhibits and that their eyes can detect," Crawford says. "Imagine if I were bright green, but only you could see me as being bright green, but others could not."

The molecules also serve multiple other purposes, including to help the sharks identify each other in the ocean and potentially provide protection against microbial infections, Crawford says.

"It is also interesting that these biofluorescent molecules display antimicrobial properties. These catsharks live on the ocean bottom, yet we don't see any biofouling or growth, so this could help explain yet another amazing feature of shark skin," Gruber says. "This study opens new questions related to potential function of biofluorescence in central nervous system signaling, resilience to microbial infections, and photoprotection."

While the study focused on two shark species, Gruber and Crawford hope to more broadly explore the luminescent properties of marine animals, which can ultimately lead to the development of new imaging techniques.

"If you can harness the abilities that marine animals have to make light, you can generate molecular systems for imaging in the lab or in medicine. Imaging is an incredibly important biomedical objective that these types of systems could help to propel into the future," Crawford says.

"Sharks are wonderful animals that have been around for over 400 million years. Sharks continually fascinate humans, and they hold so many mysteries and superpowers," Gruber says. "This study highlights yet another mystery of sharks, and it is my hope that this inspires us to learn more about their secrets and work to better protect them."

Credit: 
The City University of New York

Smuggling route for cells protects DNA from parasites

image: Microscopy image of an entire fruit fly (Drosophila melanogaster; body outline in green) with a protein central to the smuggling route (Nxf3) shown in red.

Image: 
Daniel Reumann, IMBA

Our cells have safety mechanisms that keep genetic parasites - such as viruses and transposons - in check while important genes of the host cell can remain active. Researchers have now shown that the molecular safety mechanisms of the host cell "smuggle" genetic information molecules around the cell, which are then used to recognize and shut down the parasites.

While the information for producing our cells' proteins constitutes less than two percent of our DNA, two-thirds of our DNA consists of selfish genetic elements such as retroviruses and transposons and residues thereof. In fact, transposon sequences have benefited the adaptation of different species to new environments by imparting new regulatory elements to the genome. But unrestrained transposon proliferation makes the genome unstable and results in low fertility in Drosophila, mice, and humans alike.

Assistant Professor Peter Refsing Andersen has just started building his own group at the Department of Molecular Biology and Genetics at Aarhus University, after having worked as a postdoc in Vienna for five years. In Vienna, Peter studied the molecular mechanisms that keep transposons in check and thus ensure that intact DNA can be passed on to the next generation. Together with colleagues in Senior Scientist Julius Brennecke's group at the Vienna BioCenter, Peter has now found the answer to one of the big open questions in understanding how the defence mechanisms against transposons work.

The defence, which can shut down transposons, is guided by small RNA molecules, the so-called piRNAs. piRNAs are made in the cell from long RNA molecules that, after being produced within the cell nucleus, have to travel out into the cytoplasm to specific piRNA production regions. However, the inherent problem is that the long RNA molecules - according to gene expression textbook dogmas in the field of RNA transport - should be locked inside the nucleus, as they lack all the molecular quality stamps that normally allow RNA to exit the nucleus.

Revealing new transport route for RNA

Through their work, Peter Refsing Andersen and his colleagues have found that the transport of the long RNA molecules for piRNA production takes place via a previously unknown RNA transport route. This molecular route breaks with several of the traditional dogmas of RNA transport and thereby "smuggles" RNA that cannot pass the cell's normal quality control into the cytoplasm, even delivering the long RNA molecules directly to the piRNA production regions.

The study thus not only uncovers new insight into how animal genomes defend themselves against DNA parasites; it also reveals a glimpse of how cells sort, spatially distribute, and organize genetic information. Peter and his colleagues studied this problem in Drosophila, an ideal model system for this biology, which is expected to function in similar ways in humans.

Precisely this perspective, Peter Refsing Andersen finds very interesting: "Although this important biology cannot currently be investigated in humans due to technical obstacles, we can explore the biological principles in model systems such as Drosophila. The framework of understanding we build here can in the future be combined with the enormous wave of genetic information the scientific community is currently receiving from patients worldwide. Therefore, our work can help translate the billions of sequences into meaningful biological information that can benefit people in the longer term."

Credit: 
Aarhus University

This designer clothing lets users turn on electronics while turning away bacteria

image: Purdue waterproof, breathable and antibacterial self-powered clothing is based on omniphobic triboelectric nanogenerators.

Image: 
Ramses Martinez/Purdue University

WEST LAFAYETTE, Ind. - A new addition to your wardrobe may soon help you turn on the lights and music - while also keeping you fresh, dry, fashionable, clean and safe from the latest virus that's going around.

Purdue University researchers have developed a new fabric innovation that allows wearers to control electronic devices through clothing.

"It is the first time there is a technique capable of transforming any existing cloth item or textile into a self-powered e-textile containing sensors, music players or simple illumination displays, using simple embroidery without the need for expensive fabrication processes requiring complex steps or expensive equipment," said Ramses Martinez, an assistant professor in the School of Industrial Engineering and in the Weldon School of Biomedical Engineering in Purdue's College of Engineering.

The technology is featured in the July 25 edition of Advanced Functional Materials.

"For the first time, it is possible to fabricate textiles that can protect you from rain, stains, and bacteria while they harvest the energy of the user to power textile-based electronics," Martinez said. "These self-powered e-textiles also constitute an important advancement in the development of wearable machine-human interfaces, which now can be washed many times in a conventional washing machine without apparent degradation."

Martinez said the Purdue waterproof, breathable and antibacterial self-powered clothing is based on omniphobic triboelectric nanogenerators (RF-TENGs) - which use simple embroidery and fluorinated molecules to embed small electronic components and turn a piece of clothing into a mechanism for powering devices. The Purdue team says the RF-TENG technology is like having a wearable remote control that also keeps odors, rain, stains and bacteria away from the user.

"While fashion has evolved significantly during the last centuries and has easily adopted recently developed high-performance materials, there are very few examples of clothes on the market that interact with the user," Martinez said. "Having an interface with a machine that we are constantly wearing sounds like the most convenient approach for a seamless communication with machines and the Internet of Things."

The technology is being patented through the Purdue Research Foundation Office of Technology Commercialization. The researchers are looking for partners to test and commercialize their technology.

Their work aligns with Purdue's Giant Leaps celebration of the university's global advancements in artificial intelligence and health as part of Purdue's 150th anniversary. It is one of the four themes of the yearlong celebration's Ideas Festival, designed to showcase Purdue as an intellectual center solving real-world issues.

Credit: 
Purdue University

Fungi living in cattail roots could improve our picture of ancient ecosystems

image: This is lead author Az Klymiuk in waders collecting cattail roots.

Image: 
Courtesy of Az Klymiuk, Field Museum

Paleobotanist Az Klymiuk didn't set out to upend science's understanding of the fossil record of plant-fungal associations. She just wanted to figure out the environment that some fossil plants lived in. That question led her to look at modern cattail roots and the fungi that live inside of them. She found that fungi have a harder time growing in cattail roots that are underwater. And that discovery, which is being published in the journal Mycologia, could change how we interpret some parts of the fossil record, including an important community of early land plants.

"I was studying fossil plants that are 48 million years old. In these fossils, we can see the plant cells and everything in them, including fungi. In describing these fossil fungi, I realized we actually don't know much about fungi that are living in wetland plants today," says Klymiuk, the Field Museum's collections manager of paleobotany and lead author of the Mycologia study. "There's been very little work done, next to nothing in terms of their ecology and distribution."

Fungi, which include mushrooms, molds, and yeasts, aren't plants--they're their own separate kind of organism, more closely related to animals than anything green. But almost all land plants have tiny fungi living inside their roots. These microscopic fungi grow tendrils that snake out into the soil, where they can break down dirt and absorb the element phosphorus that's present in the soil and rocks. The plant host then uses that phosphorus to create energy (bonus points if you remember from high school biology that the energy comes in the form of a molecule called adenosine triphosphate--ATP). Without the fungi, the plants have a much harder time pulling phosphorus from the soil. All this is to say, plants need the fungi living in their roots so that they can create the energy they need to survive. In addition to these "helpful" fungi, plants also host an enigmatic group of fungi called dark septate endophytes - the verdict is still out on whether these fungi are weak parasites, latent pathogens, or helpful partners. What we do know is that most plants host them, including the fossil plants Klymiuk worked with.

Klymiuk was curious about the possible ecological roles of these fungi, but was also intrigued by why some fossil plants had lots of root fungi and others had very few. Since these fossil plants were from areas that had been swampy wetlands millions of years ago when the plants were alive, she decided to look for clues in modern wetland plants, which have a lot in common with the ancient ones: cattails.

To do this, she collected cattail roots from several wetlands, sampling plants from the shoreline all the way to the deepest point at which they grew. "I was out there in hip waders with my undergraduate assistants, and we were digging up cattails. You get in there with a shovel, and you just yank up this giant plant that's as tall as you," says Klymiuk, who began work on the project at the University of Kansas with co-author Benjamin Sikes. "Memorably, in one reservoir, I had literally just told Abby [my undergraduate] to watch out because there was a sharp drop-off, and what'd I do? Ker-splash, right over. It was our last site that day, so thankfully we were close to the lab, which meant I could at least do a fast wardrobe change before we did sample prep."

Back in the lab, she examined the roots under a microscope and compared the fungi in cattail roots that had been growing at different depths underwater. She found that, at least in these cattails, the fungi don't do well with inundation at all. It didn't matter if they were growing in two inches of water, or four feet--cattails growing in water had very little fungi in their roots. Up on dry land, however, cattails had plenty of fungi living in their roots, comparable to grass roots Klymiuk collected alongside the cattails. "It turns out that any degree of flooding whatsoever massively suppresses the amount of fungi in plant roots," says Klymiuk.

Since the number of fungi living in plant roots appears to indicate whether the plants are growing fully in water or up on the shoreline, scientists looking at fungi in fossil plant roots can make a better guess about what environment those plants lived in. "This gives us a new way to understand what we're seeing in the fossils," says Klymiuk. "I have some roots where fungi are completely absent. I can look and look and look and there's nothing there. I have other roots that are just loaded, packed. To me, that indicates that we're dealing with different levels of inundation. I feel pretty confident saying that when we find a lot of fungus in a plant root, that root was probably not inundated during its life."

As well as giving scientists a new tool for figuring out what some prehistoric ecosystems were like, this study also suggests that some of our understanding of the fossil record of plant-fungal interactions might need recalibrating. "There's been this pervasive narrative, all across biology," Klymiuk says. "The basic idea is that plants needed fungi to get out of water, to get onto land. The oldest preserved community of land plants, the 407 million-year-old Rhynie Chert, is often cited as fossil evidence for this. This community of land plants has mostly been interpreted as a wetland assemblage associated with hot spring overflow or outwash, and many of these early land plants hosted mutualistic fungi, just like most living plants do today. What my research suggests, is that these plants were probably victims of intermittent flooding, as opposed to living right in the water."

So did plants need fungal helpers to get onto land? "I don't disagree that they found fungal partnerships extremely useful once they were on land, but I don't think proto-plants would have needed to bother with them if they were in water, because phosphorus is readily bioavailable in water," says Klymiuk. "My personal feeling is that the earliest land plants were probably never in water to begin with. There are lots of groups of green algae that are fully terrestrial. I think it's very possible that land plants evolved from fully terrestrial green algae, and that water was a secondarily-invaded habitat. Stay tuned, there's a lot of really cool research emerging on this question."

Klymiuk's work doesn't just speak to the past, but also sheds light on future life on Earth. "It's important for us to understand these relationships because so many of our crops and forests are under stress due to climate change," says Klymiuk. "Not only are we dealing with more flooding, and of longer duration (and more droughts, and more everything), but it's very probable that the way in which plant-fungal interactions work will change as we continue to move into a higher CO2 world -- we know that some groups of fungi are 'better roommates' under low CO2 than high; they pay their rent on time. Increase the CO2 and suddenly some of them are falling behind on phosphorus-as-rent, or ordering pizza on their host's credit card (using excess photosynthates at cost to the host plant). I can draw this analogy out in twelve different ways, and it will always end with 'we don't know enough about how these systems function in order to generalize or predict anything with confidence yet.' It's still seriously understudied."

Beyond the potential applications of this work, Klymiuk says she's excited about the project for the sake of discovery. "There's this mysterious, microscopic world that we don't usually think about. I'm genuinely fascinated by plants and their evolution, because so much of paleobotany is detective work. I just get excited by putting these puzzles back together and chipping away at some fundamental mysteries."

Credit: 
Field Museum

Too much coffee raises the odds of triggering a migraine headache

image: Association between servings of caffeinated beverages compared to none and occurrence of migraine on the same day among 98 participants with episodic migraines followed for 6 weeks.

Image: 
The American Journal of Medicine

Philadelphia, August 8, 2019 - Drinking three or more servings of caffeinated beverages a day is associated with the onset of a headache on that or the following day in patients with episodic migraine, according to a new study in The American Journal of Medicine, published by Elsevier. Results are consistent even after accounting for daily changes in alcohol intake, stress, sleep, physical activity, and menstruation, although there was some variation evident with oral contraception use.

"Based on our study, drinking one or two caffeinated beverages in a day does not appear to be linked to developing a migraine headache; however, three or more servings may be associated with higher odds of developing a headache," noted lead investigator Elizabeth Mostofsky, ScD, Cardiovascular Epidemiology Research Unit, Beth Israel Deaconess Medical Center, and Department of Epidemiology, Harvard T.H. Chan School of Public Health, Boston, MA, USA.

Migraine is a disabling primary headache disorder affecting approximately 1.04 billion adults worldwide and representing the most common pain condition causing lost productivity and significant direct and indirect costs. Despite widespread anecdotal belief that caffeinated beverages may trigger migraine headaches and relieve headaches once they have begun, there is limited scientific evidence to assess the potential association between changes in daily intake and the onset of headaches after accounting for other changes in lifestyle such as physical activity and anxiety. Common anecdotal evidence also suggests that migraine can be immediately triggered by weather or lifestyle factors, such as sleep disturbance and skipping meals.

Approximately 87 percent of Americans consume caffeine daily, with an average intake of 193 mg per day. Whereas some behavioral and environmental factors may only have potential harmful effects on migraine risk, the role of caffeine is particularly complex because the impact depends on dose and frequency. It may trigger an attack, but also has an analgesic effect.

Investigators analyzed data from 98 adults who suffer from episodic migraines. Participants completed electronic diaries twice a day for six weeks, reporting on their caffeinated beverage intake, other lifestyle factors, and the timing and characteristics of each migraine headache. The study compared each participant's incidence of migraines on days they consumed caffeinated beverages with the incidence on days they did not. Baseline data indicated that participants typically experienced an average of five headaches per month; 66 percent of them usually consumed one to two servings of caffeinated beverages daily, and 12 percent consumed three or more. During the six-week study period in 2016-17, participants experienced an average of 8.4 headaches. All reported having caffeinated beverages on at least one day during the study, with an average of 7.9 servings per week.
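The day-level comparison the study describes can be sketched with made-up diary data (the study itself used 98 participants' diaries and formal statistical models; the field names and numbers below are purely illustrative):

```python
# Compare headache frequency on high-caffeine vs low-caffeine days for one
# hypothetical participant. All diary entries are invented for illustration.

def rate_by_intake(diary, threshold):
    """Headache frequency on days at/above vs below a serving threshold."""
    high = [d for d in diary if d["servings"] >= threshold]
    low = [d for d in diary if d["servings"] < threshold]
    frac = lambda days: sum(d["headache"] for d in days) / len(days) if days else 0.0
    return frac(high), frac(low)

# Two weeks of (servings, headache?) entries for one made-up participant.
diary = [{"servings": s, "headache": h} for s, h in
         [(0, 0), (1, 0), (2, 0), (3, 1), (1, 0), (4, 1), (0, 0),
          (2, 0), (3, 1), (1, 0), (0, 0), (3, 0), (2, 1), (1, 0)]]

high_rate, low_rate = rate_by_intake(diary, threshold=3)  # 0.75 vs 0.1
```

Comparing each person with themselves in this way is what lets the design control for stable between-person differences; the real analysis additionally adjusted for daily changes in alcohol, stress, sleep, activity, and menstruation.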

"To date, there have been few prospective studies on the immediate risk of migraine headaches with daily changes in caffeinated beverage intake. Our study was unique in that we captured detailed daily information on caffeine, headache, and other factors of interest for six weeks," commented Suzanne M. Bertisch, MD, MPH, principal investigator of the study, of the Division of Sleep and Circadian Disorders, Brigham and Women's Hospital and Harvard Medical School, Boston, MA, USA.

These findings suggest that the impact of caffeinated beverages on headache risk was only apparent for three or more servings on that day, and that patients with episodic migraine did not experience a higher risk of migraine when consuming one to two caffeinated beverages per day. Additional research is needed to examine the potential effect of caffeine on symptom onset in the subsequent hours and the interplay of sleep, caffeine, anxiety, environmental factors, and migraine.

Credit: 
Elsevier

Fluoride may diminish kidney and liver function in adolescents, study suggests

(New York, NY - August 8, 2019) -- Fluoride exposure may lead to a reduction in kidney and liver function among adolescents, according to a study published by Mount Sinai researchers in Environment International in August.

The study examined the relationship between fluoride levels in drinking water and blood and kidney and liver health among adolescents participating in the National Health and Nutrition Examination Survey, a group of studies that assess health and nutritional well-being in the United States. The findings showed that exposure to fluoride may contribute to complex changes in kidney and liver function among youth in the United States, where 74 percent of public water systems add fluoride for dental health benefits. Fluoridated water is the main source of fluoride exposure in the U.S. The findings also suggest that adolescents with poorer kidney or liver function may absorb more fluoride into their bodies.

While fluoride exposure in animals and adults has been associated with kidney and liver toxicity, this study examined potential effects of chronic low-level exposure among youth. This is important to study because a child's body excretes only 45 percent of fluoride in urine via the kidneys, while an adult's body clears it at a rate of 60 percent, and the kidneys accumulate more fluoride than any other organ in the body.

"While the dental benefits of fluoride are widely established, recent concerns have been raised regarding the appropriateness of its widespread addition to drinking water or salt in North America," said the study's first author Ashley J. Malin, PhD, postdoctoral fellow in the Department of Environmental Medicine and Public Health at the Icahn School of Medicine at Mount Sinai. "This study's findings suggest that there may be potential kidney and liver health concerns to consider when evaluating fluoride use and appropriate levels in public health interventions. Prospective studies are needed to examine the impact of chronic low-level fluoride exposure on kidney and liver function in the U.S. population."

The study analyzed fluoride measured in blood samples of 1,983 adolescents and the fluoride content of the tap water in the homes of 1,742 adolescents. Although the tap water fluoride concentrations were generally low, there are several mechanisms by which even low levels of fluoride exposure may contribute to kidney or liver dysfunction.

This study's findings, combined with previous studies of childhood exposure to higher fluoride levels, show there is a dose-dependent relationship between fluoride and indicators of kidney and liver function. The findings, if confirmed in other studies, suggest it may be important to consider children's kidney and liver function in drafting public health guidelines and recommendations.

Potential health side effects include renal system damage, liver damage, thyroid dysfunction, bone and tooth disease, and impaired protein metabolism.

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

Blood clotting factors may help fight multi-drug resistant superbugs

Coagulation factors, which are involved in blood clotting after injury, may offer new strategies for fighting multidrug-resistant bacteria, according to a study published in Cell Research.

Infections caused by these bacteria pose an urgent public health risk, as effective drugs to combat them are lacking. A deficiency in blood coagulation factors - for example in patients with the blood clotting disorder haemophilia - has been associated with bacterial infectious diseases such as sepsis and pneumonia, leading to the suggestion that these coagulation factors may have a role in anti-infection mechanisms as well as blood clotting.

Now a group of researchers at Sichuan University, China, has shown that the factors VII, IX, and X - which are well known for their roles in blood coagulation - may act against Gram-negative bacteria, including extensively drug-resistant pathogens such as Pseudomonas aeruginosa and Acinetobacter baumannii. Both bacteria were recently listed by the World Health Organisation among 12 bacteria that pose the greatest threat to human health because of their antibiotic resistance. Gram-negative bacteria are characterized by their cell envelopes, which are composed of an inner cell membrane, a thin cell wall and an outer membrane that make them harder to kill.

Xu Song, the corresponding author said: "In our study, we report a class of human antimicrobial proteins effective against some drug-resistant 'superbugs'. Unlike many antibacterial agents that target the cell metabolism or cytoplasmic membrane, these proteins act by breaking down the lipopolysaccharides of the bacterial outer membrane through hydrolysis. Lipopolysaccharides are crucial for the survival of Gram-negative bacteria."

The ability of blood coagulation factors to hydrolyse essential lipopolysaccharides in the bacterial cell envelope suggests they may potentially be used to combat Gram-negative bacteria. Examining the mechanism further, the authors showed that the coagulation factors act on the bacteria via light chains - one of two domains of the proteins. The other domains (heavy chains) have no effect. In laboratory experiments, the authors showed that treating E. coli cells with light chains led initially to clearly observable damage to the bacterial cell envelope, and to almost complete destruction of the cell within four hours.

The authors found that the light chain of coagulation factor VII was effective against all Gram-negative bacterial cells tested. The light chains, as well as the coagulation factors as a whole, were also shown to be effective in combating Pseudomonas aeruginosa and Acinetobacter baumannii infections in mice. Heavy chains had no effect.

Xu Song said: "None of the known antibacterial agents has been reported to function by hydrolyzing lipopolysaccharides. Identification of the lipopolysaccharide hydrolysis-based antibacterial mechanism, combined with antibacterial features of blood coagulation factors, and the ability to manufacture them on a large scale at relatively low-cost, may offer new and cost-effective strategies for combating the urgent public-health crisis posed by drug-resistant Gram-negative pathogens."

Credit: 
Springer

The Lancet Global Health: Automatically chlorinating water at public taps cuts child diarrhoea by almost a quarter in urban Bangladesh

image: Infographic showing how water treatment device works.

Image: 
<em>The Lancet Global Health</em>

A novel water treatment device that delivers chlorine automatically via public taps without the need for electricity, reduced child diarrhoea by 23% compared with controls (156 cases out of 2,073 child measurements [7.5%] vs 216/2,145 [10%]) over 14 months in two urban neighbourhoods of Bangladesh, according to a randomised trial following more than 1,000 children published in The Lancet Global Health journal.

Clean water is still a major problem in poor urban communities in low-income countries, where contamination by bacteria can lead to high rates of diarrhoeal diseases such as cholera and typhoid, harming children's health and growth. Worldwide, an estimated one billion people who have access to piped water are drinking water that does not meet international safety standards.

Most previous research has focused on household-level water treatment interventions that require people to calculate the correct dosage and add their own chlorine daily--but these have had low uptake and failed to reduce diarrhoea, partly because they deliver a chlorine dose that makes chlorinated water taste and smell unpleasant.

In this study, the device used a low chlorine dose which increased taste acceptability and achieved high uptake while still improving drinking water quality.

"Chlorination is one of the cheapest and most widely available methods to make drinking water safe, but poor taste and bad smell of chlorinated water are major barriers to adoption," explains co-author Dr Sonia Sultana from icddr,b (International Centre for Diarrhoeal Diseases Research, Bangladesh). "Our findings indicate that automated chlorine dosing below the taste detection threshold has the potential to be transformative by ensuring high adoption rates and will hopefully help progress towards the global target of universal access to safe and affordable drinking water." [1]

Although this was one of the first field trials of the new technology, the authors say that chlorinating water at the point of collection could be an effective, scalable strategy in low-income urban settings to reduce diarrhoeal diseases. More research will be needed to determine where this technology should be implemented to maximise health benefits, as the intervention was more effective in Bangladesh's capital, Dhaka, than in Tongi, on the outskirts of the city.

"This novel, low-cost technology requires no behaviour change or effort by users--safe water comes straight out of the tap," says Dr Amy Pickering from Tufts University, USA who led the research. "This point-of-collection approach to water treatment could be a transformative strategy for reducing gastrointestinal disease burden in low-income urban communities. We are now expanding the project to roadside water stands in Kenya, and working on a business model that could work in other countries." [1]

In this study, researchers used a novel treatment device that automatically dispenses small amounts of chlorine to water from public taps and shared hand pumps. In the device, water flows past solid tablets of chlorine which dissolve into the water to treat it (see infographic).

Identical dispensers were installed at 100 shared water points in two low-income neighbourhoods in Bangladesh (Dhaka and Tongi) fed by piped water that is delivered intermittently, as is common in low-income settings. Water points were randomly assigned to have their drinking water automatically chlorinated (intervention) or to be treated with vitamin C (control group).

Between July 2015 and November 2015, 920 households with at least one child under the age of 5 years were assigned to the chlorine treatment (50 water points; 517 children) or control groups (50 water points; 519 children). Because of high migration, children could transfer into or out of the shared water points.

Every 2-3 months during the 14-month follow up period, caregiver-reported child diarrhoea (3 or more loose or watery stools in 24 hours) was measured alongside household and tap water quality (microbes, taste, smell), child weight, acute respiratory illness, and the presence of sufficient chlorine residual to prevent recontamination by dirty containers, utensils, or hands.

Before the trial began, the authors identified the concentration of chlorine that would be below the taste detection threshold for most residents, ensuring participants would not know which study group they were in, and would not be put off by the taste of chlorine [2]. Blinding was largely successful during the trial, with most participants unable to accurately guess which intervention they had received. Nevertheless, nine communal water points in the intervention group were uninstalled, primarily due to individual complaints about the smell and taste of chlorinated water.

Chlorine residual was detected at the point of collection from shared taps 83% of the time in the treatment group compared to 0% of the time in the control group (table 3). E. coli contamination was detected in 15% of tap samples in the treatment group compared with 64% in the control group.

Results showed that, over 14 months, children in the treatment group had substantially less diarrhoea than those in the control group (156 cases out of 2,073 child observations [7.5%] vs 216/2,145 [10%]).
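The trial's headline 23% figure is a model-adjusted prevalence ratio; as a rough sanity check, the crude risk reduction can be computed directly from the raw counts reported above (a minimal Python sketch, not the paper's actual statistical model):

```python
# Crude diarrhoea risk per arm, from the raw trial counts above.
cases_treatment, obs_treatment = 156, 2073  # chlorinated-water arm
cases_control, obs_control = 216, 2145      # vitamin C (control) arm

risk_treatment = cases_treatment / obs_treatment
risk_control = cases_control / obs_control

# Crude relative risk and the corresponding percentage reduction.
relative_risk = risk_treatment / risk_control
reduction = 1 - relative_risk

print(f"treatment risk  {risk_treatment:.1%}")  # ~7.5%
print(f"control risk    {risk_control:.1%}")    # ~10.1%
print(f"crude reduction {reduction:.0%}")       # ~25%
```

The small gap between this crude ~25% and the reported 23% presumably reflects the paper's regression adjustment and clustering of children by shared water point.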

Importantly, the intervention had the largest health benefits among children in Dhaka, reducing diarrhoea by 34% compared to 7% in Tongi. The authors speculate that this variation in effect probably resulted from the poorer water quality in Dhaka at the start of the study (e.g., 87% of tap samples in Dhaka were contaminated with E. coli compared with 50% in Tongi) and because Dhaka receives water that spends much longer travelling through unpressurised pipes, enabling contamination and sewage to seep into the system.

Compared to the control group, caregivers in the treatment group were significantly less likely to report seeking treatment for gastrointestinal illness for their child (260 cases of treatment sought out of 3,062 child observations [8.5%] vs 382/3,142 [12.2%]); spent less on illness-related treatment; and reported lower consumption of antibiotics by their children (table 2). Respiratory illness and differences in child weight and growth were similar between the groups.

Despite these achievements, the study has some limitations, including that diarrhoea episodes were based on caregiver-reported data, which might not accurately represent children's illness; and that participants may have drunk water from other sources, although less than 4% of respondents reported doing so.

Discussing the implications of the findings in a linked Comment, Dr Jean Humphrey from Johns Hopkins Bloomberg School of Public Health, USA, writes that the intervention will not work in all situations: "The device is compatible with only a specific type of water system--in the study area this was one third of the water taps...[and] although 1.3 billion people have gained access to piped water since 2000, 2.9 billion (38%) of the global population still do not have any kind of access to piped water...There is certainly no one-size-fits-all strategy for providing access to clean water throughout the world. However, the intervention reported in this paper is both specifically and conceptually an important step forward: this study shows that by removing the requirement of user behaviour change and slightly compromising effectiveness to achieve high uptake, a simple technology can have substantial public health benefit."

Credit: 
The Lancet

Leaping larvae! How do they do that without legs?

image: A scanning electron microscope image shows the 1-micron projections on the adhesive patches of a leaping gall midge larva. Researchers aren't sure yet what makes them so sticky.

Image: 
Grace Farley, Duke University

DURHAM, NC -- Attaching its head to its tail to form a ring, a 3-millimeter larva of the goldenrod gall midge squeezes some internal fluids into its tail section, swelling it and raising the pressure like an inner tube.

When the adhesive bond between the head and tail can no longer hold, the tension is sprung, launching the worm into a high, tumbling flight that will carry it 20 to 30 body-lengths away in a tenth of a second at speeds comparable to a jumping insect with actual legs.
The direction of flight is somewhat random and the worm-like larva bounces a bit on landing, but it's apparently none the worse for wear. Still, as locomotion choices go, it seems a little reckless.

But this "hydrostatic legless jumping," as it's known by a team of Duke researchers who studied the launches with ultra-high-speed cameras, is about 28 times more energy efficient (and a heck of a lot faster) than crawling like a regular old caterpillar.

Their analysis of the remarkable leaping larvae appears Aug. 8 in the Journal of Experimental Biology.

What's new isn't the realization that legless larvae can leap. The behavior has been identified many times in the literature for more than 50 years, said Duke professor of biology Sheila Patek, whose lab led the analysis. The wonder here is in the details, which were captured with a 20,000-frames-per-second video camera and scanning electron microscopes.

"Sometimes they fall over and don't go very far," said Patek lab manager Grace Farley, who spent countless hours trying to keep the restless worms in focus and in the frame before they launched. Almost every chaotic flight travelled far enough to leave the camera's field of view.

What Farley learned from all those jumps is that there is a hinge in the worm's body about a third of the way from the tail that makes that lower portion what they call a "transient leg" to deliver the thrust to the surface.

While other ring-forming worms seem to use stiff appendages called pegs and mouthparts to create a firm latch between head and tail, the gall midge larva just has some sticky patches of skin that do the trick.

On close examination under an electron microscope, the sticky bits turn out to be rows of little finger-like scales, just 1 micron across, that are quite similar to the sticky pads found on a gecko's feet.

Farley said it's not clear yet whether these scales interlock with each other somehow or whether they adhere merely from the van der Waals effect, the weak electromagnetic attraction between atoms put into close proximity, which is how the gecko walks on window panes.

The adhesive patches appear to be similar to the "head-arresting system" that helps damselflies and dragonflies lock their heads in place, Patek said. But it's also possible the gall-midge larvae secrete some sort of fluid on the pads. They don't know the details yet.

In a way, it's a wonder these animal mechanics researchers even found this worm. It is one of several dozen species of gall midges that feed within the tissues of a hundred different species of goldenrods. This bright orange worm, a member of the Asphondylia genus that hasn't even been formally named and described by science yet, is partial to the silverrod, Solidago bicolor, a white-flowered species of goldenrod.

"They're really small and inconspicuous, so not a lot of people study them," said Michael Wise, a Roanoke College biologist who is one of the people who does indeed study goldenrods and the midges that love them.

It was Wise, a former graduate school classmate of Patek at Duke, who somewhat inadvertently started the project.

Having carefully collected several specimens of goldenrod galls in the Virginia mountains two Augusts ago, Wise was cutting open the swollen portions of the plants under a microscope to extract the little orange worm within each capsule.

"After dissecting about a dozen galls, I looked in the petri dish and there were only two larvae in the dish," Wise said. "They were jumping all over the office!"

He knew Patek had this high speed camera for her science on jumping, snapping and punching creatures and suggested they ought to take a look.

"So, we just decided to film them for fun," Patek said. "Then we realized, this might actually be an interesting new field."

The latching mechanism formed by "adhesive microhairs" between each segment of the worm is apparently new, and the calculations about how much more efficient jumping is than crawling may be of interest to the field of soft robots, Patek said. This work also fits with her larger inquiries about the spectacular accelerations achieved by fleas, ants, mantis shrimp and other creatures that tend to use a spring and latch mechanism rather than muscle power to achieve amazing feats.

Closely related species of this gall midge worm are known to jump from their home plants to find places to burrow into the ground and pupate. But this particular worm never leaves the gall - it pupates right there and emerges as a fully formed flying midge. Why would it even need to leap?

Perhaps it's a leftover skill from some earlier evolution of the worm, Wise suggests. Or perhaps it's to avoid predators and curious biologists.

Credit: 
Duke University

Fighting child diarrhea

image: A woman pumps water from a shared community tap in Dhaka, Bangladesh.

Image: 
GMB Akash

It kills a child under 5 every minute on average. Diarrheal disease, the second leading cause of death for children globally, could become even more difficult to control as poor urban areas with limited clean water access expand. An international team of researchers led by a Stanford epidemiologist finds reason for hope in a low-cost water treatment device that reduces rates of diarrhea in children, provides good-tasting water and avoids the need for in-home treatment - improvements over other purification strategies that could significantly increase uptake. Their results were published Aug. 8 in The Lancet Global Health.

In developing countries, few cities are able to maintain fully pressurized water systems that consistently pump water around the clock. Even if it is safe at the source, water in these systems is at risk of becoming contaminated while sitting in pipes. About 1 billion people who access water via piped systems receive water that does not meet international standards for safety.

"Group level water treatment among people who share a water supply removes the individual burden on households to treat their own water," said study senior author Stephen Luby, a professor of medicine in the Division of Infectious Diseases and Geographic Medicine at the Stanford School of Medicine. "So, it offers the prospect for extending safe drinking water to vulnerable slum residents globally."

Bad taste

Chlorination is one of the cheapest and most widely available ways of disinfecting water, but the chemical's taste and odor are significant barriers for many people. Also, most available water treatment devices have been intended for use in the home, generally after collection at a community tap, and therefore require a change in behavior. These barriers have prevented many people from accessing safe water.

"The study demonstrated that this simple, electricity-independent technology could be transformative in scaling up water treatment in slums and reducing child diarrhea, without requiring people to do anything differently when they collect their drinking water," said study lead author Amy Pickering, an assistant professor of civil and environmental engineering at Tufts University who received her PhD at Stanford where she also worked as a postdoctoral scholar.

Working in two poor communities of Dhaka, Bangladesh, the researchers tested a way of treating water, called Aquatabs Flo, that works at community pumps rather than in the home. It requires no electricity and automatically doses a precise amount of chlorine into water as it flows through the device. The chlorine lasts long enough to protect water stored in containers against recontamination.

To avoid bad-tasting water, the researchers polled Dhaka residents to find out how much chlorine could remain in the water without being objectionable. Then, they set the chlorine dosers to deliver low levels of chlorine the first few months so people would get used to the taste. Later, they upped that amount to a level that purified the water effectively, but remained acceptable taste-wise. The treated water was more than four times less likely to contain E. coli, a bacterium that indicates sewage contamination.

The researchers tested the device by having it deliver chlorine in some communities and Vitamin C in others. Of the 1,000 children in the study, those who received the chlorinated water had 23 percent lower rates of diarrhea. While the result may seem obvious, previous studies had been ambiguous either because people didn't consistently use the household chlorination systems being tested or because they weren't able to compare to communities without water treatment.

The device was particularly effective in children living in an urban setting, which the researchers suggest could be due to a few different causes. One is that water in urban settings often spends more time in unpressurized pipes. Also, before the chlorine treatment, nearly 90 percent of taps in that setting were contaminated with E coli, almost twice the rate of the more rural study area. Finally, the two locations contained different pathogens, some of which could be resistant to chlorine. Either way, the results suggest that a device that delivers a precise low-dose of chlorine can purify water while tasting good enough to drink.

Looking toward a safe water future

The study was an outgrowth of the Lotus Water project - a joint effort led by Stanford's Program on Water, Health and Development - which received early funding from the Stanford Woods Institute for the Environment. The project aims to provide water disinfection services through a business model that relies on monthly payments from landlords, who typically own shared water points in Dhaka. The team's earlier research indicates that slum residents are willing to pay higher rents in exchange for higher-quality water. Linking the device's lease to service payments would hold landlords accountable to their tenants (read related story).

Although Aquatabs Flo is currently only compatible with water points connected to storage tanks, Tufts and Stanford are collaborating with an industry partner to commercialize a chlorine doser compatible with any tap.

Credit: 
Stanford University

Forest fragments surprising havens for wildlife

image: Researchers found that forest fragments outside of Sumatra's Bukit Barisan National Park were surprisingly rich in wildlife -- including critically endangered species such as Sumatran tiger (Panthera tigris sumatrae).

Image: 
WCS

Destruction of tropical rainforests reduces many unprotected habitats to small fragments of remnant forests within agricultural lands, and to date, these remnant forest fragments have been largely disregarded as wildlife habitat.

Researchers conducted camera trap surveys within Sumatra's Bukit Barisan Selatan National Park and five surrounding remnant forest fragments, finding 28 mammal species in the protected forest and 21 in the fragments--including critically endangered species such as Sunda pangolin (Manis javanica) and Sumatran tiger (Panthera tigris sumatrae), along with species of conservation concern such as marbled cat (Pardofelis marmorata) and Asiatic golden cat (Pardofelis temminckii).

The biodiversity found within the fragments suggests that these small patches of remnant forest may have conservation value to certain mammal species and indicates the importance of further research into the role these habitats may play in landscape-level, multispecies conservation planning.

Credit: 
Wildlife Conservation Society

Great Scots! 'it's' a unique linguistic phenomenon

A new study reveals that in a number of varieties of English spoken in Scotland, the rules of contraction (it's for it is) seem to differ unexpectedly, and asserts that such differences may shed new light on our understanding of language. The study, 'Syntactic variation and auxiliary contraction: the surprising case of Scots', by Gary Thoms (New York University), David Adger (Queen Mary University of London), Caroline Heycock (University of Edinburgh) and Jennifer Smith (University of Glasgow) will be published in the September 2019 issue of the scholarly journal Language. A pre-print version of the article may be found at https://www.linguisticsociety.org/sites/default/files/LSA95302.pdf.

Contractions are widespread in English. However, there are certain rules about what can be contracted where--rules that speakers follow without ever having been taught them, and without being consciously aware of them. For example, speakers happily say It's in the box but not I don't know where it's. Such rules seem to apply to every variety of English, whether it be spoken in Philadelphia, London or the Caribbean.

The starting point for the article is the rule that forbids contraction in examples like I don't know where it's, which is one of the most exceptionless rules of contraction in English varieties. Previous work showed that the problem is the presence of a 'gap' directly after the contraction (I don't know where it's__), the idea being that the sentence starts as I don't know it is where, but we move the where back before the it is when we actually utter the sentence. Many modern theories of syntax involve the existence of these two "layers" of structure--the word order we speak and hear may come from an "underlying" order that is quite different.

In the article, the authors investigate what looks like a curiously specific exemption from this restriction found in some dialects of Scots: speakers readily allow contraction in examples like Here it's! or There it's!, which are used in the context of discoveries or sudden realizations (Where's my book??? Ah, there it's!). The authors seek to explain why contraction is possible just in these types of sentences, which they call locative discovery expressions, and only in one specific subpart of the English dialect continuum.

To investigate this, the authors analyzed data from the Scots Syntax Atlas, a new online digital resource for the study of Scots. The atlas provides original data on hundreds of grammatical phenomena from more than 140 locations across Scotland, gathered in face-to-face interviews by community-insider fieldworkers. The authors found that many varieties of Scots also allow a kind of locative discovery expression where speakers repeat the word there (or here), so they say things like There it's there!. And it turns out that all speakers who can say There it's! can also say There it's there!, but not vice versa.

But if, in There it's there!, the word conveying the location is that second there - the one that gets the accent - then what is the purpose of the first there? In Scots, the initial there has become simply a kind of particle, serving to introduce this kind of discovery expression without conveying any actual meaning itself - it's a butler of sorts.

And what about the speakers who say not only There it's there! but also There it's!? The authors argue that in this group of speakers' minds, there is an unpronounced there after the verb. So for them There it's! doesn't violate the rule that it is can't contract to it's next to a gap left by moving something, because nothing did move. There is a silent there after the it's -- we could write it as There it's (there)!, with the parenthesized there left unpronounced.

This article shows that the general rules on contraction in English really are general. But more importantly, it demonstrates that these rules make reference to very abstract differences in grammatical structure: there is different from a gap __, even though they are both silent. What looked like a peculiar feature of Scots dialects turns out to provide evidence for speakers' unconscious knowledge of differences in structure between sentences, differences which are not directly perceived.

Credit: 
Linguistic Society of America

Rethinking seizures associated with cardiac disease

image: As suggested by its name, mutations in the gene seizure (sei for short) cause flies to become highly sensitive to heat stress. When ambient temperature goes up rapidly, wild type flies are able to escape these unfavorable conditions. In contrast, mutant flies are hypersensitive to heat and start seizing almost immediately. Hill et al. now show that the protective effect of sei comes from its activity in specific populations of neurons and glial cells in the fly brain. Shown are the neurons in the brain (top panel) and the ventral ganglion (bottom panel) (a structure homologous to the spinal cord), which express the sei protein (green). All other neurons are shown in magenta. The nuclei of all cells in the nerve cord are in blue.

Image: 
Yehuda Ben-Shahar, Washington University in St. Louis

Most people with a medical condition called long QT syndrome have a mutation in a gene that causes bouts of fast, chaotic heartbeats. They also experience fainting spells and seizures. The clinical approach has largely assumed that when the heart beats erratically, the brain eventually does not get enough oxygen -- which in turn causes the seizures.

Research from Washington University in St. Louis finds that mutations of a gene implicated in long QT syndrome in humans may trigger seizures because of their direct effects on certain classes of neurons in the brain -- independent from what the genetic mutations do to heart function. The new work from Arts & Sciences was conducted with fruit flies and is published August 8 in PLOS Genetics.

"This gene seems to be a key factor in the physiological process that protects neurons from starting to fire uncontrollably in response to a rapid increase in temperature, which could lead to paralysis and death," said Yehuda Ben-Shahar, associate professor of biology in Arts & Sciences.

Alexis Hill, recently a postdoctoral fellow in the Ben-Shahar laboratory, discovered this unexpected relationship as she probed the nervous system response to acute environmental stress.

Heat in general causes neurons to start firing faster, so the brain is particularly sensitive to overheating. Mammals and other large animals have ways to maintain their internal temperature and protect their brains from heat. But not the fruit fly. With no extra bulk in its tiny body, the only thing a fly can do to regulate temperature is to move from an uncomfortable spot to a comfortable one.

Ben-Shahar had previously published work showing flies that lack a gene called sei could not act to save themselves at temperatures above 25 degrees Celsius (77 Fahrenheit). They had no ability to buffer heat stress, and started having seizures as temperatures increased.
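The 25-degree threshold is reported in both scales; the Fahrenheit figure follows from the standard conversion formula, sketched here as a quick sanity check:

```python
def celsius_to_fahrenheit(celsius):
    """Convert a temperature from degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

# The heat-stress threshold reported for sei-deficient flies:
print(celsius_to_fahrenheit(25))  # 77.0
```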

This gene sei -- named by other researchers who had previously discovered its role in seizure activity -- shows up in lots of places in fruit flies: in the neurons responsible for primary communication of both excitatory and inhibitory signals, in the glial cells of the nervous system that support neurons in various ways, and in the heart.

In their new work, Hill and Ben-Shahar were able to show that sei protects against heat-induced hyperexcitability only when it is expressed in a few particular classes of neurons and glia. Knocking down the gene in the heart had no effect on seizure activity.

"The ability of flies to resist the heat is in neurons that release neurotransmitters that make other neurons fire faster -- the excitatory ones," Ben-Shahar said.

Surprisingly, the study also uncovered a protective role for sei in glia, the other primary cell type of the nervous system. Glia have traditionally been overshadowed by neurons, but in recent years they have been emerging as equally important in maintaining healthy brain function. The fact that this work identifies a protective role for an ion channel in glia further supports the idea that glia have much broader physiological functions in the nervous system, including in how it responds to environmental challenges, the researchers said.

A careful look through the scientific literature reveals many references to seizures associated with long QT syndrome, which afflicts people who carry a mutation in hERG, the human gene comparable to sei.

But most clinical practitioners assume that these seizures are a secondary outcome of cardiovascular disease. Ben-Shahar hopes this soon will change.

"If you look at population statistics, there is a much higher incidence of seizures in long QT patients than in the general population," he said. "Because cardiovascular dysfunction can cause all kinds of problems, in the literature right now it is assumed that the seizures are secondary -- that because the people have a sick heart they end up developing seizures and other things.

"It's possible, based on our data, that it's two independent effects. Because if the mutation is affecting the function of the gene in the heart, it will affect the function in the neurons.

"And in flies, it's not going to kill neurons," Ben-Shahar said. "We know that we can completely eliminate this gene from the fly genome -- and flies will develop normally, mostly. Yet they become extremely sensitive to environmental (conditions). It's possible that that's exactly what's happening in people -- that it's completely independent."

Credit: 
Washington University in St. Louis