
Untapped resource, or greenhouse gas threat, found below rifting axis off Okinawa coast

image: Researchers at Kyushu University have located a large gas reservoir below an axis of rifting based on an automated method for deriving seismic pressure wave velocity from seismic reflection data. The reservoir can be seen in this two-dimensional seismic velocity mapping, which spans a depth of about 3.5 km below sea level and a distance of about 6.5 km, as a dark-blue area of low velocity within green areas of higher velocity. Identification of the reservoir was possible because of the greatly enhanced resolution provided by the automated technique compared to the manual analysis methods used to date. Depending on the nature of the gas, which is likely mainly carbon dioxide, methane, or a mixture of the two, this reservoir found in the Okinawa Trough could be a potential natural resource or an environmental concern.

Image: 
Takeshi Tsuji, Kyushu University

Analyzing how subseafloor geology off southwestern Japan reflects seismic pressure waves, researchers at Kyushu University have found the first evidence of a massive gas reservoir where the Earth's crust is being pulled apart. Depending on its nature, the trapped gas could be a potential untapped natural resource or a source of greenhouse gases waiting to escape, raising the need for awareness of similar reservoirs around the world.

While the ocean can seem calm on the surface, the ocean depths can experience intense thermal activity as hot magma seeps from locations where the Earth's upper layers are being pulled apart--a process called rifting. In such areas, elevated levels of carbon dioxide and methane gas can be present in the water, possibly escaping from magma or being produced by microbial organisms or the interaction of organic-rich sediment with hot water.

In a new study published in Geophysical Research Letters, researchers from Kyushu University's International Institute for Carbon-Neutral Energy Research (I2CNER) now report that some of these gases may actually get trapped underground, leading to the existence of a massive gas reservoir beneath the axis along which rifting is occurring in the Okinawa Trough.

To find the reservoir, the researchers analyzed measurements of how geological structures reflect seismic pressure waves generated by an acoustic source carried by a boat to the study area. Applying an automated calculation technique to this seismic data, they were able to create a two-dimensional map of the velocities at which the pressure waves travel through the ground with a much higher resolution than previous manual techniques.

"Seismic pressure waves generally travel more slowly through gases than through solids," explains study co-author Andri Hendriyana. "Thus, by estimating the velocity of seismic pressure waves through the ground, we can identify underground gas reservoirs and even get information on how saturated they are. In this case, we found low-velocity pockets along the rifting axis near Iheya North Knoll in the middle of the Okinawa Trough, indicating areas filled with gas."

At this stage, the researchers are still not sure if the reservoirs are mainly filled with carbon dioxide or methane. If methane, the gas could be a potential natural resource. However, both carbon dioxide and methane contribute to the greenhouse effect, so the rapid, uncontrolled release of either gas from such a large reservoir could have significant environmental implications.

"While many people focus on greenhouse gases made by humans, a huge variety of natural sources also exist," says corresponding author Takeshi Tsuji. "Large-scale gas reservoirs along a rifting axis may represent another source of greenhouse gases that we need to keep our eyes on. Or, they could turn out to be a significant natural resource."

As for how the gas is trapped, one possibility is that layers of impermeable sediment such as clay could prevent the gas from escaping porous underlying layers of materials such as pumice. Based on the flow of heat around the study area, the researchers think another possibility is that a low-permeability cap of methane hydrate--a methane-containing ice--acts as the lid.

"Zones like the one we investigated are not uncommon along rifts, so I expect that similar reservoirs may exist elsewhere in the Okinawa Trough as well as other sediment-covered continental back-arc basins around the world," explains Tsuji.

Credit: 
Kyushu University

Why is the brain disturbed by harsh sounds?

image: Smooth and rough sounds activate different brain networks. While smooth sounds induce responses mainly in the 'classical' auditory system, rough sounds activate a wider brain network involved in processing aversion and salience.

Image: 
© UNIGE

Why do the harsh sounds emitted by alarms or human shrieks grab our attention? What is going on in the brain when it detects these frequencies? Neuroscientists from the University of Geneva (UNIGE) and Geneva University Hospitals (HUG), Switzerland, have been analysing how people react when they listen to a range of different sounds, the aim being to establish the extent to which repetitive sound frequencies are considered unpleasant. The scientists also studied the areas inside the brain that were stimulated when listening to these frequencies. Surprisingly, their results - which are published in Nature Communications - showed not only that the conventional sound-processing circuit is activated but also that the cortical and sub-cortical areas involved in the processing of salience and aversion are also solicited. This is a first, and it explains why the brain goes into a state of alert on hearing this type of sound.

Alarm sounds, whether artificial (such as a car horn) or natural (human screams), are characterised by repetitive sound fluctuations, which are usually situated in frequencies of between 40 and 80 Hz. But why were these frequencies selected to signal danger? And what happens in the brain to hold our attention to such an extent? Researchers from UNIGE and HUG played 16 participants repetitive sounds with repetition rates between 0 and 250 Hz, spacing the repetitions closer and closer together, in order to define the frequencies that the brain finds unbearable. "We then asked participants when they perceived the sounds as being rough (distinct from each other) and when they perceived them as smooth (forming one continuous and single sound)," explains Luc Arnal, a researcher in the Department of Basic Neurosciences in UNIGE's Faculty of Medicine.

Based on the responses of participants, the scientists were able to establish that the upper limit of sound roughness is around 130 Hz. "Above this limit," continues Arnal, "the frequencies are heard as forming only one continuous sound." But why does the brain judge rough sounds to be unpleasant? In an attempt to answer this question, the neuroscientists asked participants to listen to different frequencies, which they had to classify on a scale of 1 to 5, 1 being bearable and 5 unbearable. "The sounds considered intolerable were mainly between 40 and 80 Hz, i.e. in the range of frequencies used by alarms and human screams, including those of a baby," says Arnal. Since these sounds are perceptible from a distance, unlike a visual stimulus, it is crucial from a survival perspective that they can capture attention. "That's why alarms use these rapid repetitive frequencies to maximise the chances that they are detected and gain our attention," says the researcher. In fact, when the repetitions are spaced less than about 25 milliseconds apart, the brain cannot anticipate them and therefore cannot suppress them. It is constantly on alert and attentive to the stimulus.
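For readers following the arithmetic, the 25-millisecond figure is simply the period of the repetition rate: consecutive repetitions at rate f arrive T = 1/f apart, so

    T = 1/f,   f = 40 Hz  =>  T = 1/40 s = 0.025 s = 25 ms

and any rate above 40 Hz spaces the repetitions even more tightly.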

Harsh sounds fall outside the conventional auditory system

The researchers then attempted to find out what actually happens in the brain: why are these harsh sounds so unbearable? "We used an intracranial EEG, which records brain activity inside the brain itself in response to sounds," explains Pierre Mégevand, a neurologist and researcher in the Department of Basic Neurosciences in the UNIGE Faculty of Medicine and at HUG.

When the sound is perceived as continuous (above 130 Hz), the auditory cortex in the upper temporal lobe is activated. "This is the conventional circuit for hearing," says Mégevand. But when sounds are perceived as harsh (especially between 40 and 80 Hz), they induce a persistent response that additionally recruits a large number of cortical and sub-cortical regions that are not part of the conventional auditory system. "These sounds solicit the amygdala, hippocampus and insula in particular, all areas related to salience, aversion and pain. This explains why participants experienced them as being unbearable," says Arnal, who was surprised to learn that these regions were involved in processing sounds.

This is the first time that sounds between 40 and 80 Hz have been shown to mobilise these neural networks, although the frequencies have been used for a long time in alarm systems. "We now understand at last why the brain can't ignore these sounds," says Arnal. "Something particular happens at these frequencies, and there are also many illnesses that show atypical brain responses to sounds at 40 Hz. These include Alzheimer's, autism and schizophrenia." The neuroscientists will now investigate the networks stimulated by these frequencies to see whether it could be possible to detect these illnesses early by soliciting the circuit activated by the sounds.

Credit: 
Université de Genève

Sponge-like action of circular RNA aids heart attack recovery, Temple-led team discovers

(Philadelphia, PA) - The human genetic blueprint is like a string of code. To follow it, the code, or DNA, is transcribed into shorter strings of RNA. While some of these shorter strings carry instructions for making proteins - the functional units of cells - most RNA is not involved in protein production. Among these noncoding RNAs are the recently discovered circular RNAs, so-named because of their unusual ring shape (most other RNAs are linear).

Circular RNAs, like other noncoding RNAs, were thought to be nonfunctional, but recent evidence suggests otherwise. Circular RNAs may in fact act like sponges to "soak up," or bind, other molecules, including microRNAs and proteins, and now, new work by researchers at the Lewis Katz School of Medicine at Temple University (LKSOM) and colleagues supports this idea. They describe, for the first time, a circular RNA that fills a critical role in tissue repair after heart attack, thanks to its ability to soak up harmful molecules.

The study was published online September 20 in the journal Nature Communications.

"We discovered that a circular RNA known as circFndc3b, when added therapeutically to the injured heart after surgically induced heart attack in mice, enhances cardiac repair and helps restore heart function," explained Raj Kishore, PhD, Professor of Pharmacology and Medicine and Director of the Stem Cell Therapy Program in the Center for Translational Medicine at LKSOM and senior investigator on the new report. "We attributed these effects of circFndc3b to its ability to function like a 'sponge,' binding a protein called FUS that mediates cell death and reduces vascular growth, which hinders heart tissue repair."

Dr. Kishore and colleagues focused their investigation on circFndc3b after finding that this particular circular RNA was significantly decreased in the heart in mice that had experienced a heart attack. "This observation led us to wonder whether the change in circFndc3b expression meant that it was important functionally in the heart," Dr. Kishore said.

To investigate this possibility, the researchers injected a gene product inducing circFndc3b overexpression into the hearts of mice after heart attack. Subsequent examination showed that within eight weeks of injection, treated mice experienced gains in heart function and in survival compared to their untreated counterparts. There was also evidence within heart tissue that new blood vessels had started to form, greatly aiding the tissue repair process.

The findings offer exciting insight into circular RNAs and the significance of their potential role as molecular sponges that limit the activity of damaging molecules. "CircFndc3b specifically soaked up an RNA binding protein that suppresses blood vessel formation," Dr. Kishore explained. "In doing so, it made way for new vessels to grow."

Dr. Kishore and colleagues are now in the process of developing a large animal model to further investigate the therapeutic potential of circFndc3b. The team also wants to begin analyzing plasma samples from patients just after heart attack to investigate whether specific circulating RNAs could serve as biomarkers for heart disease or injury and to get a better sense of their clinical significance.

Credit: 
Temple University Health System

Why the lettuce mitochondrial genome is like a chopped salad

image: Mitochondrial genomes are often depicted as rings. But new research from UC Davis confirms that at least for plant mitochondria, one ring does not rule them all. The lettuce mitochondrial genome is more like a chopped salad, with pieces that can be shuffled around, and often forms branching structures.

Image: 
Alex Kozik and Richard Michelmore, UC Davis

The mitochondrion, "the powerhouse of the cell." Somewhere back in the very distant past, something like a bacterium moved into another cell and never left, retaining some of its own DNA. For billions of years, mitochondria have passed from mother to offspring of most eukaryotic organisms, generating energy for the cell and playing roles in metabolism and programmed cell death.

All eukaryotes have them, but that does not mean that all mitochondria are the same. The mitochondria of plants are in fact quite different from those of animals. A new paper published Aug. 30 in PLOS Genetics by Alex Kozik, Beth Rowan and Richard Michelmore at the UC Davis Genome Center and Alan Christensen (who was on sabbatical at UC Davis from the University of Nebraska-Lincoln) shows just how different.

Plant mitochondrial genomes are larger and more complex than those in animal cells. They can vary considerably in size, sequence and arrangement, although the actual protein-coding sequences are conserved.

Plant mitochondrial genomes are almost universally believed to be single circular chromosomes (master circles) and scientific papers and textbooks typically show them as such, write Kozik and colleagues. But in recent years it has become evident that they are much more complex.

Chopped salad DNA

The researchers used long-read DNA sequencing and microscopy to work out the structure of mitochondrial genomes in domestic lettuce (Lactuca sativa) and two wild relatives, L. saligna and L. serriola.

They convincingly show that plant mitochondrial DNA is a dynamic mixture of branching structures, ribbons, and rings. In fact, mitochondrial DNA was more often found as branched or linear structures than as circles and was never found as a large master circle.

While the genes themselves were conserved, blocks of genes were shuffled around. It's rather like a chopped salad: the basic ingredients can be tossed around in different combinations.

"Our data suggest that plant mitochondrial genomes should be presented as multiple sequence units showing their variable and dynamic connections, rather than as circles," the authors wrote. This correction of the widely-held notion of mitochondrial genome structure provides the foundation for future work on mitochondrial genome dynamics and evolution in diverse plants.

Credit: 
University of California - Davis

New Penn-developed vaccine prevents herpes in mice, guinea pigs

PHILADELPHIA - Researchers from the Perelman School of Medicine at the University of Pennsylvania have developed a vaccine to protect against genital herpes. Tested on both mice and guinea pigs, the immunization led to "mostly sterilizing immunity" from the virus--the strongest type of immunity. The results of the study are published today in Science Immunology.

In the study, researchers delivered the Penn-developed vaccine to 64 mice and then exposed them to genital herpes. After 28 days, 63 of the mice were found to have sterilizing immunity, meaning there was no trace of herpes infection or disease after the exposure. The one remaining mouse developed a dormant infection without any prior genital disease. Similarly, 10 guinea pigs, which have responses to herpes infections that more closely resemble those of humans, were also given the vaccine and exposed to the virus. No animal developed genital lesions and only two showed any evidence that they became infected, and in neither case was the infection in a form that would allow the animals to transmit the virus.

"We're extremely encouraged by the substantial immunizing effect our vaccine had in these animal models," said the study's principal investigator Harvey Friedman, MD, a professor of Infectious Diseases. "Based on these results, it is our hope that this vaccine could be translated into human studies to test both the safety and efficacy of our approach."

Building on the approaches of many cutting-edge cancer and immunotherapy researchers, the Penn team filled their vaccine with specific messenger RNA (mRNA), which can create proteins necessary for a strong immune response. This vaccine stimulates three types of antibodies: one that blocks the herpes virus from entering cells, and two others that ensure the virus doesn't "turn off" innate immune system protective functions. This approach differs from other herpes vaccines, which often rely only on blocking the virus's entry into cells as the mode of attack.

This vaccine was developed with funding from NIH and as part of a collaboration between Penn Medicine and BioNTech, entered into in October 2018, to research and develop mRNA vaccines for various infectious diseases.

Genital herpes, caused by herpes simplex virus type 2 (HSV-2), is among the most common sexually transmitted diseases. Approximately 14 percent of Americans ages 14 to 59, and 11 percent of people in the same age range across the world, are infected. HSV-2 may lead to painful sores, which can spread to other areas of the body. The virus increases one's risk of contracting HIV, and infected pregnant women may pass herpes on to their fetus or, more commonly, to their baby during delivery.

"Along with physical symptoms, HSV-2 takes an emotional toll," said Friedman. "People worry over transmission of the disease, and it can certainly have a negative effect on intimate relationships."

Since herpes is so widespread but also often undetected, as it is only visible during an outbreak, researchers say a successful vaccine would be invaluable to many adults across the world.

Credit: 
University of Pennsylvania School of Medicine

Oral health effects of tobacco products: Science and regulatory policy proceedings

Alexandria, VA, USA - On September 14, 2018, AADR held the "Oral Health Effects of Tobacco Products: Science and Regulatory Policy" meeting in Bethesda, Maryland, USA. The papers resulting from this conference are published in the latest issue of Advances in Dental Research, an e-Supplement to the Journal of Dental Research (JDR).

As the primary route of delivery, the oral cavity is particularly sensitive to harmful exposure from tobacco products. Tobacco product use has been linked to oral cancer, periodontal disease and tooth loss. During the conference, researchers also presented data on the effect of tobacco use on immunity and the oral microbiome. This conference was especially timely given the rapidly evolving landscape of tobacco use in the United States, which is simultaneously seeing the lowest level of adult cigarette use since 1965 and the emergence of novel nicotine delivery systems, such as e-cigarettes, for which little is currently known about the long-term health effects.

The goal of the conference was to bring the oral health effects of tobacco products to the attention of regulators, public health professionals, healthcare providers, researchers and ultimately, the public with the hope that the information presented would promote cessation or deter initiation among current or potential tobacco users, respectively.

The Oral Health Effects of Tobacco Products: Science and Regulatory Policy conference reviewed the effects tobacco products have on oral health, providing a robust scientific base that included the importance of oral health in overall health. The conference, summarized in these proceedings, was organized into five sessions focused on tobacco products regulated by the FDA -- Perspectives on Tobacco Regulatory Policy, Combusted Tobacco (Inhaled and non-inhaled) Products, Non-combusted Tobacco (Smokeless Tobacco), Novel Nicotine Delivery Systems and In Vitro Models, Standards and Experimental Methods -- and concluded with a discussion of the role of dentistry in tobacco use cessation.

"Although the adverse effects of conventional tobacco products on various oral health outcomes are well established, much remains unknown about the oral health implication of novel tobacco products such as electronic nicotine delivery systems," said guest editor Scott Tomar, University of Florida, Gainesville, USA. "There is a great need for research on the clinical and public health effects of these products and their underlying mechanisms, and an urgent need for behavioral and regulatory science research around conventional and novel tobacco products."

Credit: 
International Association for Dental, Oral, and Craniofacial Research

Anthropologist contributes to major study of large animal extinction

image: This is Amelia Villaseñor, University of Arkansas.

Image: 
University of Arkansas / University Relations

As part of an international research group based at the Smithsonian Museum of Natural History, anthropology assistant professor Amelia Villaseñor contributed to a large, multi-institutional study explaining how the human-influenced mass extinction of giant carnivores and herbivores of North America fundamentally changed the biodiversity and landscape of the continent.

In their study published today in Science, researchers from Australia, the United States, Canada and Finland showed that humans shaped the processes underlying how species co-existed for the last several thousand years. Smaller, surviving animals such as deer changed their ecological interactions, the researchers found, causing ecological upheaval across the continent.

The researchers' work has implications for conservation of today's remaining large animals, now threatened by another human-led mass extinction.

The study's primary author is Anikó Tóth at Macquarie University in Sydney, Australia. Tóth collaborated with Villaseñor and several other researchers at the Smithsonian's Evolution of Terrestrial Ecosystems Program, as well as researchers at other institutions.

Tóth and the co-authors focused on how large mammals were distributed across the continent in the Pleistocene and Holocene geological epochs. (The Pleistocene Epoch occurred from about 2.5 million to 11,700 years ago. Starting at the end of the Pleistocene, the Holocene is the current geological epoch.) To do this, the researchers analyzed how often pairs of species were found living in the same community or in different communities.

To rule out community changes that were the result of reduced diversity or lost associations involving extinct species, the researchers analyzed only those pairs in which both species survived. Prior to the extinction, co-occurrence was more common. After extinction, segregations were more common.
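The core computation here, counting joint occurrences for each pair of surviving species, is simple to sketch. The toy Python example below assumes a 0/1 site-by-species presence matrix; the real analysis also compares these counts against what chance would predict, which the sketch omits.

    import numpy as np

    def cooccurrence_counts(sites):
        # sites: 0/1 matrix, rows = fossil sites, columns = species.
        # Entry (i, j) of the result = number of sites where species i and j
        # are found together; the diagonal holds per-species site counts.
        return sites.T @ sites

    # Toy data: 5 sites x 3 surviving species (1 = present).
    sites = np.array([[1, 1, 0],
                      [1, 1, 1],
                      [0, 1, 0],
                      [1, 0, 1],
                      [0, 0, 1]])
    print(cooccurrence_counts(sites))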

Villaseñor's research focuses on human fossil remains as a way to understand how human ancestors interacted with mammal communities for the last 3.5 million years. Her more recent research explores how modern humans have shaped today's ecosystems.

"Rather than thinking of humans as separate from 'natural' environments, our research has illuminated the major impacts that humans have had on the ecosystem for many thousands of years," Villaseñor said. "The results of this paper and others from our group illuminate the outsized impacts that human-mediated extinction has had in North America."

By the end of the Late Pleistocene in North America, roughly 11,000 years ago, humans contributed to the extinction of large mammals, including mammoths and sabre-toothed cats. Recent work, driven by today's crisis in biodiversity, has looked at understanding the ecological and evolutionary legacies of this event. There was ecological transformation across the continent - the mammoth steppe disappeared, vegetation and fire regimes changed and large carnivores were lost.

Credit: 
University of Arkansas

World's first gene therapy for glycogen storage disease produces remarkable results

image: Jerrod Watts, a patient enrolled in the first clinical trial for Glycogen Storage Disease, chats with lead investigator Dr. David Weinstein. Photos taken in the dedicated Glycogen Storage Disease Unit at UConn Health.

Image: 
Tina Encarnacion/UConn Health

At the Association for Glycogen Storage Disease's 41st Annual Conference, Dr. David Weinstein of UConn School of Medicine and Connecticut Children's presented his groundbreaking, one-year clinical trial results for the novel gene therapy treatment for glycogen storage disease (GSD).

The rare and deadly genetic liver disorder, GSD type Ia, affects children from infancy through adulthood, causing dangerously low blood sugar levels and constant dependence on glucose consumption in the form of cornstarch every few hours for survival. If a cornstarch dose is missed, the disease can lead to seizures and even death.

Weinstein, whose team first administered the investigational gene therapy at UConn John Dempsey Hospital in Farmington, Connecticut, on July 24, 2018, calls the results "remarkable."

One year after patient Jerrod Watts first received the gene therapy during a 30-minute infusion, he is completely off cornstarch. In addition to totally stopping daily cornstarch consumption, Watts has experienced normal regulation of his blood glucose levels, weight loss, increased muscle strength, and marked improvement in his energy.

"The clinical trial is going better than expected. The therapy is transforming patient lives," says Weinstein. "We have seen all of the patients wean their therapy with some already discontinuing treatment. Missed cornstarch doses no longer are resulting in hypoglycemia, which previously could have been life threatening."

Weinstein, the clinical trial's lead investigator, is a pediatric endocrinologist and scientist who cares for more than 700 GSD patients from 51 countries as director of the Glycogen Storage Disease Program at Connecticut Children's and UConn Health -- the largest center in the world for the care and treatment of this condition.

The clinical trial, conducted in conjunction with the biopharmaceutical company Ultragenyx, originally set out to simply test the safety and dosage of the gene therapy for three patients with GSD Type Ia.

The gene therapy works by delivering a new copy of a gene to the liver via a naturally occurring virus. Administered through the patient's bloodstream, the new copy replaces deficient sugar enzymes caused by the disease and jump starts the body's glucose control.

Both Weinstein and Watts were surprised by Watts' response to the gene therapy.

"We were just making sure it was safe for humans to take, that was our initial goal. The results are way beyond my imagination," says Weinstein. "The patients who are finishing the first year got what we all thought was a test dose -- one-third strength -- yet the response has been dramatic. They can now go through the night without any treatment and they wake up clinically well."

Prior to the treatment, Watts was consuming more than 400 grams of cornstarch per day. The GSD Program's multidisciplinary team at Connecticut Children's provides comprehensive clinical care to patients, while the program's research laboratories and clinical trial are located at UConn School of Medicine at UConn Health.

"The main thing I want to do is inspire hope. One of the biggest reliefs from this gene therapy is I can now sleep through the night without worrying about dying in the middle of the night. I wake up 6 to 7 hours later with normal blood sugar."

"I'm living, breathing proof that there is a light at the end of the tunnel with GSD. I'm a completely different person now that I was a year ago. I feel like I can live a normal life and I can do anything I want to do now," says Watts.

His message to other patients: "Please don't give up hope."

In addition to Watts, two other patients in the clinical trial cohort are seeing promising results on lower daily cornstarch regimens. All three will participate in the next phase -- a 4-year follow-up clinical trial study. In addition, three patients are enrolled in a clinical trial testing a higher gene therapy dose.

"What's exciting is if it works this well with the low dose, what does the future hold?," says Weinstein." If the patients are already coming off cornstarch and the labs are getting better, we just hope it will be even faster and more dramatic with the higher dose."

"This gene therapy treatment is something we've worked on for 21 years. This means so much to the glycogen storage disease community," says Weinstein. "To see patients off cornstarch and doing so well really is a culmination of an incredible journey ... We feel like we're living history."

Credit: 
University of Connecticut

Evolution of learning is key to better artificial intelligence

image: This is Anselmo Pontes, lead author and computer science researcher.

Image: 
Michigan State University

Since "2001: A Space Odyssey," people have wondered: could machines like HAL 9000 eventually exist that can process information with human-like intelligence?

Researchers at Michigan State University say that true, human-level intelligence remains a long way off, but their new paper published in The American Naturalist explores how computers could begin to evolve learning in the same way as natural organisms did - with implications for many fields, including artificial intelligence.

"We know that all organisms are capable of some form of learning, we just weren't sure how those abilities first evolved. Now we can watch these major evolutionary events unfold before us in a virtual world," said Anselmo Pontes, MSU computer science researcher and lead author. "Understanding how learning behavior evolved helps us figure out how it works and provides insights to other fields such as neuroscience, education, psychology, animal behavior, and even AI. It also supplies clues to how our brains work and could even lead to robots that learn from experiences as effectively as humans do."

According to Fred Dyer, MSU integrative biology professor and co-author, these findings could have huge implications.

"We're untangling the story of how our own cognition came to be and how that can shape the future," Dyer said. "Understanding our own origins can lead us to developing robots that can watch and learn rather than being programmed for each individual task."

The results are the first demonstration of the evolution of associative learning in an artificial organism without a brain.

"Our inspiration was the way animals learn landmarks and use them to navigate their environments," Pontes said. "For example, in laboratory experiments, honeybees learn to associate certain colors or shapes with directions and navigate complex mazes."

Since the evolution of learning cannot be observed through fossils - and would take more than a lifetime to watch in nature - the MSU interdisciplinary team composed of biologists and computer scientists used a digital evolution program that allowed them to observe tens of thousands of generations of evolution in just a few hours, a feat unachievable with living systems.

In this case, organisms evolved to learn and use environmental signals to help them navigate the environment and find food.

"Learning is crucial to most behaviors, but we couldn't directly observe how learning got started in the first place from our purely instinctual ancestors," Dyer said. "We built in various selection pressures that we thought might play a role and watched what happened in the computer."

While the environment was simulated, the evolution was real. The programs that controlled the digital organisms were subject to genetic variation from mutation, inheritance and competitive selection. Organisms were tasked with following a trail, alongside signals that - if interpreted correctly - indicated where the path went next.

In the beginning of the simulation, organisms were "blank slates," incapable of sensing, moving or learning. Every time an organism reproduced, its descendants could suffer mutations that changed their behavior. Most mutations were lethal. Some did nothing. But the rare traits that allowed an organism to better follow the trail resulted in the organism collecting more resources, reproducing more often and, thus, gaining share in the population.
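The loop described in this paragraph - inherit, mutate, compete - is the whole algorithm. The Python sketch below is a deliberately tiny stand-in for the study's digital evolution system: the bit-string genomes and the count-the-ones fitness function are placeholder assumptions, not the trail-following task itself.

    import random

    def evolve(pop_size=100, genome_len=16, generations=500, mut_rate=0.1):
        # Start from "blank slate" genomes, as in the simulation.
        pop = [[0] * genome_len for _ in range(pop_size)]
        for _ in range(generations):
            # Fitness-proportional (competitive) selection of parents.
            weights = [sum(g) + 1 for g in pop]
            parents = random.choices(pop, weights=weights, k=pop_size)
            # Inheritance with occasional mutation; flips toward 0 are the
            # "harmful" mutations, flips toward 1 the rare beneficial ones.
            pop = []
            for g in parents:
                child = g[:]
                if random.random() < mut_rate:
                    i = random.randrange(genome_len)
                    child[i] = 1 - child[i]
                pop.append(child)
        return max(sum(g) for g in pop)

    print("best fitness after evolution:", evolve())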

Over the generations, organisms evolved more and more complex behaviors. First came simple movements allowing them to stumble into food. Next was the ability to sense and distinguish different types of signals, followed by the reflexive ability to correct errors, such as trying an incorrect path, backing up and trying another.

A few organisms evolved the ability to learn by association. If one of these organisms made a wrong turn it would correct the error, but it would also learn from that mistake and associate the specific signal it saw with the direction it now knew it should have gone. From then on, it would navigate the entire trail without any further mistakes. Some organisms could even relearn when tricked by switching signals mid-trail.

"Evolution in nature might take too long to study," Pontes said, "but evolution is just an algorithm, so it can be replicated in a computer. We were not just able to see how certain environments fostered the evolution of learning, but we saw populations evolve through the same behavioral phases that previous scientists speculated should happen but didn't have the technology to see."

Credit: 
Michigan State University

NASA finds a tiny tropical storm Kiko

image: On Sept. 19, the MODIS instrument that flies aboard NASA's Terra satellite provided this image of Tropical Storm Kiko moving toward the Central Pacific Ocean.

Image: 
NASA Worldview

NASA's Terra satellite is one in a fleet of NASA satellites that provide data for research. Terra captured an image of Tropical Storm Kiko in the Eastern Pacific Ocean, which showed the extent of the small storm.

On Sept. 19, the Moderate Resolution Imaging Spectroradiometer, or MODIS, instrument that flies aboard Terra provided a visible image of Kiko. The image showed that the storm is compact. Tropical-storm-force winds extend outward only up to 45 miles (75 km) from the center, making the storm about 90 miles (150 km) in diameter.

Since the MODIS image, a pair of microwave satellite images taken between 5 a.m. and 7 a.m. EDT (0900 and 1100 UTC) on Sept. 20 revealed that Kiko has redeveloped a well-defined low-level inner circulation. NOAA's National Hurricane Center (NHC) said, "However, most of the deep convection (strongest thunderstorms) associated with the tropical storm is located northeast of the center, a result of moderate southwesterly [wind] shear."

At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Kiko was located near latitude 17.7 degrees north and longitude 130.2 degrees west. That puts the center about 1,360 miles (2,190 km) west-southwest of the southern tip of Baja California, Mexico. Kiko is moving toward the north-northwest at near 6 mph (9 km/h). A turn toward the west is expected tonight, followed by a turn toward the west-southwest over the weekend. Maximum sustained winds have increased to near 60 mph (95 kph) with higher gusts. Slight additional strengthening is possible today, but only small changes in intensity are expected during the next several days. The estimated minimum central pressure is 999 millibars.

Hurricanes are the most powerful weather event on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

Wearable brain-machine interface could control a wheelchair, vehicle or computer

image: Test subject who has flexible wireless electronics conformed to the back of the neck, with dry hair electrodes under a fabric headband and a membrane electrode on the mastoid, connected with thin-film cables.

Image: 
Courtesy Woon-Hong Yeo

Combining new classes of nanomembrane electrodes with flexible electronics and a deep learning algorithm could help disabled people wirelessly control an electric wheelchair, interact with a computer or operate a small robotic vehicle without donning a bulky hair-electrode cap or contending with wires.

By providing a fully portable, wireless brain-machine interface (BMI), the wearable system could offer an improvement over conventional electroencephalography (EEG) for measuring signals from visually evoked potentials in the human brain. The system's ability to measure EEG signals for BMI has been evaluated with six human subjects, but has not been studied with disabled individuals.

The project, conducted by researchers from the Georgia Institute of Technology, University of Kent and Wichita State University, was reported on September 11 in the journal Nature Machine Intelligence.

"This work reports fundamental strategies to design an ergonomic, portable EEG system for a broad range of assistive devices, smart home systems and neuro-gaming interfaces," said Woon-Hong Yeo, an assistant professor in Georgia Tech's George W. Woodruff School of Mechanical Engineering and Wallace H. Coulter Department of Biomedical Engineering. "The primary innovation is in the development of a fully integrated package of high-resolution EEG monitoring systems and circuits within a miniaturized skin-conformal system."

BMI is an essential part of rehabilitation technology that allows those with amyotrophic lateral sclerosis (ALS), chronic stroke or other severe motor disabilities to control prosthetic systems. Gathering brain signals known as steady-state visually evoked potentials (SSVEP) now requires use of an electrode-studded hair cap that uses wet electrodes, adhesives and wires to connect with computer equipment that interprets the signals.

Yeo and his collaborators are taking advantage of a new class of flexible, wireless sensors and electronics that can be easily applied to the skin. The system includes three primary components: highly flexible, hair-mounted electrodes that make direct contact with the scalp through hair; an ultrathin nanomembrane electrode; and soft, flexible circuitry with a Bluetooth telemetry unit. The recorded EEG data from the brain is processed in the flexible circuitry, then wirelessly delivered to a tablet computer via Bluetooth from up to 15 meters away.

Beyond the sensing requirements, detecting and analyzing SSVEP signals have been challenging because of the low signal amplitude, which is in the range of tens of microvolts, similar to electrical noise in the body. Researchers also must deal with variation in human brains. Yet accurately measuring the signals is essential to determining what the user wants the system to do.

To address those challenges, the research team turned to deep learning neural network algorithms running on the flexible circuitry.

"Deep learning methods, commonly used to classify pictures of everyday things such as cats and dogs, are used to analyze the EEG signals," said Chee Siang (Jim) Ang, senior lecturer in Multimedia/Digital Systems at the University of Kent. "Like pictures of a dog which can have a lot of variations, EEG signals have the same challenge of high variability. Deep learning methods have proven to work well with pictures, and we show that they work very well with EEG signals as well."

In addition, the researchers used deep learning models to identify which electrodes are the most useful for gathering information to classify EEG signals. "We found that the model is able to identify the relevant locations in the brain for BMI, which is in agreement with human experts," Ang added. "This reduces the number of sensors we need, cutting cost and improving portability."
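As a rough illustration of what such a classifier looks like, here is a minimal PyTorch sketch: a small convolutional network that maps a multi-channel EEG window to one of several flicker-frequency targets. The channel count, window length, and architecture are assumptions for illustration, not the network reported in the paper.

    import torch
    import torch.nn as nn

    class SSVEPNet(nn.Module):
        # Tiny 1-D CNN: EEG window in, one of n_targets SSVEP classes out.
        def __init__(self, n_channels=4, n_targets=4):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(n_channels, 16, kernel_size=7, padding=3),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(32),
            )
            self.classifier = nn.Linear(16 * 32, n_targets)

        def forward(self, x):            # x: (batch, channels, samples)
            return self.classifier(self.features(x).flatten(1))

    model = SSVEPNet()
    window = torch.randn(1, 4, 250)      # one synthetic 1-second window at 250 Hz
    print(model(window).shape)           # torch.Size([1, 4])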

The system uses three elastomeric scalp electrodes held onto the head with a fabric band, ultrathin wireless electronics conformed to the neck, and a skin-like printed electrode placed on the skin below an ear. The dry soft electrodes adhere to the skin and do not use adhesive or gel. Along with ease of use, the system could reduce noise and interference and provide higher data transmission rates compared to existing systems.

The system was evaluated with six human subjects. The deep learning algorithm with real-time data classification could control an electric wheelchair and a small robotic vehicle. The signals could also be used to control a display system without using a keyboard, joystick or other controller, Yeo said.

"Typical EEG systems must cover the majority of the scalp to get signals, but potential users may be sensitive about wearing them," Yeo added. "This miniaturized, wearable soft device is fully integrated and designed to be comfortable for long-term use."

Next steps will include improving the electrodes and making the system more useful for motor-impaired individuals.

"Future study would focus on investigation of fully elastomeric, wireless self-adhesive electrodes that can be mounted on the hairy scalp without any support from headgear, along with further miniaturization of the electronics to incorporate more electrodes for use with other studies," Yeo said. "The EEG system can also be reconfigured to monitor motor-evoked potentials or motor imagination for motor-impaired subjects, which will be further studied as a future work on therapeutic applications."

Long-term, the system may have potential for other applications where simpler EEG monitoring would be helpful, such as in sleep studies done by Audrey Duarte, an associate professor in Georgia Tech's School of Psychology.

"This EEG monitoring system has the potential to finally allow scientists to monitor human neural activity in a relatively unobtrusive way as subjects go about their lives," she said. "For example, Dr. Yeo and I are currently using a similar system to monitor neural activity while people sleep in the comfort of their own homes, rather than the lab with bulky, rigid, uncomfortable equipment, as is customarily done. Measuring sleep-related neural activity with an imperceptible system may allow us to identify new, non-invasive biomarkers of Alzheimer's-related neural pathology predictive of dementia."

In addition to those already mentioned, the research team included Musa Mahmood, Yun-Soung Kim, Saswat Mishra, and Robert Herbert from Georgia Tech; Deogratias Mzurikwao from the University of Kent; and Yongkuk Lee from Wichita State University.

Credit: 
Georgia Institute of Technology

Temple researchers identify new target regulating mitochondria during stress

(Philadelphia, PA) - Like an emergency response team that is called into action to save lives, stress response proteins in the heart are activated during a heart attack to help prevent cell death. As part of this process, Lewis Katz School of Medicine at Temple University researchers show for the first time that one of these specialized emergency responder proteins, known as MCUB, temporarily decreases harmful levels of calcium transport into mitochondria, the energy-generating batteries of cells.

The new research, published online September 19 in the journal Circulation, identifies MCUB as a promising new target for the investigation and treatment of conditions that feature calcium overload and cell death - conditions that include heart failure, heart attack, stroke, and neurodegeneration.

"MCUB fine-tunes calcium uptake by mitochondria in injured heart tissue, in an attempt to limit calcium overload, which is a major contributor to cell death, particularly following a heart attack," explained John W. Elrod, PhD, Associate Professor in the Center for Translational Medicine at the Temple University Lewis Katz School of Medicine and senior investigator on the new study.

Calcium homeostasis is vital to a number of day-to-day cellular activities and is regulated primarily by mitochondria. For calcium to enter mitochondria, it passes through a channel known as the mitochondrial calcium uniporter (MCU), which resides in the inner mitochondrial membrane; there, incoming calcium stimulates the production of ATP, the energy currency of the cell. The amount of calcium that mitochondria take up is regulated by various components of this channel. While MCUB closely resembles the pore-forming subunit, MCU, its precise role in calcium regulation is largely unknown, particularly in the context of disease.

Dr. Elrod's team found that deletion of the MCUB gene in cells results in a change in the proteins that make up the calcium channel and that are essential for controlling whether the channel is on or off. Since these alterations are induced by stress, such as heart cell injury, the researchers next investigated the role of MCUB after heart attack in mice. In mice that suffered heart attack, the research team observed significant elevations in MCUB gene expression and decreases in MCU and the gatekeeper of the channel, MICU1. When genetically expressed prior to inducing a heart attack in mice, MCUB altered the channel to reduce calcium overload in the injured heart, ultimately curtailing tissue injury.

Dr. Elrod's team also found that, while it can improve cell survival after heart injury, increased MCUB activity comes at the expense of mitochondrial energy production. "MCUB induction is a compensatory change," explained Dr. Elrod. Just like an emergency responder, MCUB moves in and tries to reduce cell death and aid cell survival - however, the reduction in mitochondrial calcium uptake is also maladaptive and limits the cell's ability to increase energy during stress.

"MCUB presents us with a new molecular target for investigation," Dr. Elrod said. "It's unique in that it alters the stoichiometry of the channel and thereby presents a new mechanism which may be amendable to therapeutic manipulation. We think that modulating MCUB may allow us to tune down mitochondrial calcium uptake without completely inhibiting all energetic function."

It is hoped that follow-up studies defining the exact sites of molecular interaction will provide additional insight into how to target mitochondrial calcium overload in heart disease.

Credit: 
Temple University Health System

Did a common childhood illness take down the Neanderthals?

image: This illustration shows the structure of the Eustachian tube in Neanderthals and its similarity to that of the human infant.

Image: 
SUNY Downstate Health Sciences University

BROOKLYN, NY - It is one of the great unsolved mysteries of anthropology. What killed off the Neanderthals, and why did Homo sapiens thrive even as Neanderthals withered to extinction? Was it some sort of plague specific only to Neanderthals? Was there some sort of cataclysmic event in their homelands of Eurasia that led to their disappearance?

A new study from a team of physical anthropologists and head & neck anatomists suggests a less dramatic but equally deadly cause.

Published online in the journal The Anatomical Record, the study, "Reconstructing the Neanderthal Eustachian Tube: New Insights on Disease Susceptibility, Fitness Cost, and Extinction," suggests that the real culprit in the demise of the Neanderthals was not some exotic pathogen.

Instead, the authors believe the path to extinction may well have been the most common and innocuous of childhood illnesses - and the bane of every parent of young children - chronic ear infections.

"It may sound far-fetched, but when we, for the first time, reconstructed the Eustachian tubes of Neanderthals, we discovered that they are remarkably similar to those of human infants," said coinvestigator and Downstate Health Sciences University Associate Professor Samuel Márquez, PhD, "Middle ear infections are nearly ubiquitous among infants because the flat angle of an infant's Eustachian tubes is prone to retain the otitis media bacteria that cause these infections - the same flat angle we found in Neanderthals."

In this age of antibiotics, these infections are easy to treat and relatively benign for human babies. Additionally, around age 5, the Eustachian tubes in human children lengthen and the angle becomes more acute, allowing the ear to drain, all but eliminating these recurring infections beyond early childhood.

But unlike in modern humans, the structure of the Eustachian tubes in Neanderthals does not change with age -- which means these ear infections and their complications, including respiratory infections, hearing loss, pneumonia, and worse, would not only become chronic but pose a lifelong threat to overall health and survival.

"It's not just the threat of dying of an infection," said Dr. Márquez. "If you are constantly ill, you would not be as fit and effective in competing with your Homo sapien cousins for food and other resources. "In a world of survival of the fittest, it is no wonder that modern man, not Neanderthal, prevailed."

"The strength of the study lies in reconstructing the cartilaginous Eustachian tube," said Richard Rosenfeld, MD, MPH, MBA, Distinguished Professor and Chairman of Otolaryngology at SUNY Downstate and a world-renowned authority on children's health. "This new and previously unknown understanding of middle ear function in Neanderthal is what allows us to make new inferences regarding the impact on their health and fitness."

"Here is yet another intriguing twist on the ever-evolving Neanderthal story, this time involving a part of the body that researchers had almost entirely neglected," said Ian Tattersall, Ph.D., paleoanthropologist and Curator Emeritus of the American Museum of National History. "It adds to our gradually emerging picture of the Neanderthals as very close relatives who nonetheless differed in crucial respects from modern man."

Credit: 
SUNY Downstate Health Sciences University

For people with pre-existing liver disease, toxic algae may be more dangerous

image: Dr. David Kennedy (left) and Dr. Steven Haller recently published research that found microcystin, a toxin produced by blue-green algae, may be more dangerous than previously known for those with preexisting liver disease. Kennedy and Haller are assistant professors of medicine at The University of Toledo.

Image: 
Daniel Miller, The University of Toledo

Toxins produced during harmful algal blooms may be more harmful to people than previously known.

Researchers at The University of Toledo College of Medicine and Life Sciences set out to examine how microcystin might affect individuals with non-alcoholic fatty liver disease, a widespread condition that is frequently asymptomatic. They found the toxin can significantly amplify the disease at exposure levels below what would harm a healthy liver.

The study, published last month in the journal Toxins, follows earlier research from UToledo that found clear evidence that microcystin exposure worsens the severity of pre-existing colitis. Microcystin is a by-product of the cyanobacteria found in what is commonly known as blue-green algae.

"The take home message from our research is there are certain groups of people who need to pay extra attention and may be more susceptible to microcystin toxins. We may need to explore special preventative guidelines for those people in terms of how much microcystin they are exposed to through drinking water or other means," said Dr. David Kennedy, an assistant professor of medicine at UToledo and one of the study's lead authors.

Aided by nutrient runoff and warming waters, seasonal blooms of blue-green algae are flourishing across much of the United States. Not all algal blooms produce toxins, but many do.

Potentially dangerous concentrations of microcystin have been found this year in ponds in New York City's Central Park, along the Mississippi Gulf Coast, in reservoirs in California, and along a portion of Lake Erie's coastline near Toledo.

While no human deaths have been linked to microcystin in the United States, deaths have been reported elsewhere -- most notably among a group of kidney dialysis patients in Brazil. There also have been reports this year of pet dogs dying after exposure to blue-green algae in Texas, North Carolina and Georgia.

With annual blooms becoming more frequent and intense, researchers in the UToledo College of Medicine and Life Sciences wanted to better understand how the toxins might affect people already suffering from conditions that affect organ systems microcystin is known to attack, such as the liver.

"It's a gray area in terms of what microcystin is really doing to you if you have a pre-existing disease state. Are you more susceptible? Are we going to have to go back and revaluate what we consider safe in a person with a pre-existing disease state? It's important we start providing answers to these questions," said Dr. Steven Haller, UToledo assistant professor of medicine.

In the liver study, researchers examined how chronic, low-level exposure of microcystin affected mice with non-alcoholic fatty liver disease compared to mice with healthy livers.

At microcystin ingestion levels below the No Observed Adverse Effect Level for healthy mice, analysis showed significant exacerbation of liver damage in mice with fatty liver disease. Researchers observed no liver damage in mice that started the experiment with healthy livers.

"Current exposure limits from the World Health Organization and the U.S. Environmental Protection Agency for humans are based off studies done in healthy animals," Haller said. "The results of this study suggest there may be a need to review those guidelines for people with pre-existing conditions."

They also noted major differences in how microcystin was handled by the kidneys in the two test groups.

In mice with non-alcoholic fatty liver disease, elevated levels of microcystin were found in the blood plasma, but were not detectable in the plasma of healthy mice. Mice with non-alcoholic fatty liver disease also excreted far less microcystin in their urine.

The differences seen in how microcystin was processed between the two test groups suggest that kidney function may play an important role in the increased susceptibility of the mice with pre-existing liver disease.

"This may be highly relevant to help us understand the deaths that occurred in kidney dialysis patients, and point to the need to pay particular attention to at-risk patient populations as we design preventative, diagnostic and therapeutic strategies," Kennedy said.

The results from the liver study build on prior work from Kennedy and Haller looking at how microcystin exposure might affect individuals with inflammatory bowel disease, another common condition that impacts an estimated 1 million Americans.

In that study, published in June, the researchers demonstrated that exposure to MC-LR, a common microcystin variant, prolongs and worsens the severity of pre-existing colitis, contributing to significant weight loss, bleeding, and higher levels of the signaling molecules that cause inflammation.

"Based on this data we're coming up with insights into how we can potentially treat exposures if they do occur," Kennedy said. "This is giving us a number of insights into how we might help patients, especially patients who are vulnerable or susceptible if there was an exposure."

Credit: 
University of Toledo

New study reveals a strong link between vitamin D deficiency and increased mortality, especially diabetes-related deaths

New research presented at this year's Annual Meeting of the European Association for the Study of Diabetes (EASD) in Barcelona, Spain (16-20 Sept) reveals that vitamin D deficiency is strongly linked to increased mortality, especially in younger and middle-aged people, and is particularly associated with diabetes-related deaths.

The research was conducted by Dr Rodrig Marculescu and colleagues at the Medical University of Vienna, Austria. It analysed the effects of low blood levels of 25-hydroxyvitamin D (25D, referred to here as vitamin D) on overall and cause-specific mortality in a large study cohort covering all age groups, taken from a population with minimal vitamin D supplementation in old age.

Vitamin D deficiency is a widely prevalent and easily correctable risk factor for early death, and evidence for its link to mortality comes from numerous studies and clinical trials. The majority of this research to date has however come from looking at older populations, and the authors believe that many of the largest scale studies may have been affected by increased rates of vitamin D supplementation in old age. They also note: "Cause-specific mortalities and the impact of age on the association of vitamin D with the risk of death have not yet been reported in detail."

The researchers took their data from the records of all 78,581 patients (mean age 51.0 years, 31.5% male) who had a vitamin D (25D) measurement taken at the Department of Laboratory Medicine, General Hospital of Vienna between 1991 and 2011, which was then matched with the Austrian national register of deaths. The first 3 years of mortality following the vitamin D measurement were excluded from the analysis, and patients were followed for up to 20 years where possible, with a median follow-up of 10.5 years.

The authors used a blood vitamin D level of 50 nmol/L, a commonly used cut-off value for vitamin D deficiency, as the reference to which other levels were compared, and set the low and high levels for which risk would be calculated at 10 nmol/L and 90 nmol/L respectively.

The study found that vitamin D levels of 10 nmol/L or less were associated with a 2- to 3-fold increase in risk of death, with the largest effect observed in patients aged 45 to 60 years (a 2.9 times increased risk). Levels of 90 nmol/L or greater were associated with a reduction in all-cause mortality of 30-40%, again with the largest effect found in the 45- to 60-year-old age group (a 40% reduction in risk). No statistically significant associations between vitamin D levels and mortality were observed in patients over the age of 75.

With regard to cause-specific mortality, the authors were surprised to find that the strongest associations of vitamin D were with causes of death other than cardiovascular disease and cancer. Differences between the age groups were even more pronounced for these causes of death and, again, the largest effect was found in patients aged 45 to 60 years. Further subdivision of these non-cardiovascular, non-cancer causes of death revealed the largest effect of vitamin D for diabetes, with a 4.4 times higher risk of death from the disease in the vitamin D deficient group (50 nmol/L or less) than for study participants whose serum vitamin D was above 50 nmol/L.

Plotting the risk of death according to vitamin D level in the various subgroups did not support a resurgence of risk at vitamin D levels above 100 nmol/L. The authors say this further diminishes concerns about a possible negative effect of vitamin D in the higher concentration range, as suggested by some previous studies reporting an "inverse J-shaped" risk association (meaning risk decreased down to a certain vitamin D level and then started increasing again at higher levels).

The team conclude: "Our survival data from a large cohort, covering all age groups, from a population with minimal vitamin D supplementation at old age, confirm a strong association of vitamin D deficiency (under 50 nmol/L) with increased mortality. This association is most pronounced in the younger and middle-aged groups and for causes of deaths other than cancer and cardiovascular disease, especially diabetes."

The researchers go on to suggest that: "Our findings strengthen the rationale for widespread vitamin D supplementation to prevent premature mortality, emphasize the need for it early in life and mitigate concerns about a possible negative effect at higher levels."

Credit: 
Diabetologia