Why the lettuce mitochondrial genome is like a chopped salad

image: Mitochondrial genomes are often depicted as rings. But new research from UC Davis confirms that at least for plant mitochondria, one ring does not rule them all. The lettuce mitochondrial genome is more like a chopped salad, with pieces that can be shuffled around, and often forms branching structures.

Image: 
Alex Kozik and Richard Michelmore, UC Davis

The mitochondrion is famously "the powerhouse of the cell." Somewhere back in the very distant past, something like a bacterium moved into another cell and never left, retaining some of its own DNA. For billions of years, mitochondria have passed from mother to offspring of most eukaryotic organisms, generating energy for the cell and playing roles in metabolism and programmed cell death.

All eukaryotes have them, but that does not mean that all mitochondria are the same. The mitochondria of plants are in fact quite different from those of animals. A new paper published Aug. 30 in PLOS Genetics by Alex Kozik, Beth Rowan and Richard Michelmore at the UC Davis Genome Center and Alan Christensen (who was on sabbatical at UC Davis from the University of Nebraska-Lincoln) shows just how different.

Plant mitochondrial genomes are larger and more complex than those in animal cells. They can vary considerably in size, sequence and arrangement, although the actual protein-coding sequences are conserved.

Plant mitochondrial genomes are almost universally believed to be single circular chromosomes (master circles), and scientific papers and textbooks typically show them as such, write Kozik and colleagues. But in recent years it has become evident that they are much more complex.

Chopped salad DNA

The researchers used long-read DNA sequencing and microscopy to work out the structure of mitochondrial genomes in domestic lettuce (Lactuca sativa) and two wild relatives, L. saligna and L. serriola.

They convincingly show that plant mitochondrial DNA is a dynamic mixture of branching structures, ribbons, and rings. In fact, mitochondrial DNA was more often found as branched or linear structures than as circles and was never found as a large master circle.

While the genes themselves were conserved, blocks of genes were shuffled around. It's rather like a chopped salad: the basic ingredients can be tossed around in different combinations.

"Our data suggest that plant mitochondrial genomes should be presented as multiple sequence units showing their variable and dynamic connections, rather than as circles," the authors wrote. This correction of the widely-held notion of mitochondrial genome structure provides the foundation for future work on mitochondrial genome dynamics and evolution in diverse plants.

Credit: 
University of California - Davis

New Penn-developed vaccine prevents herpes in mice, guinea pigs

PHILADELPHIA - Researchers from the Perelman School of Medicine at the University of Pennsylvania have developed a vaccine to protect against genital herpes. Tested on both mice and guinea pigs, the immunization led to "mostly sterilizing immunity" from the virus--the strongest type of immunity. The results of the study are published today in Science Immunology.

In the study, researchers delivered the Penn-developed vaccine to 64 mice and then exposed them to genital herpes. After 28 days, 63 of the mice were found to have sterilizing immunity, meaning there was no trace of herpes infection or disease after the exposure. The one remaining mouse developed a dormant infection without any prior genital disease. Similarly, 10 guinea pigs, whose responses to herpes infections more closely resemble those of humans, were also given the vaccine and exposed to the virus. No animal developed genital lesions, and only two showed any evidence of infection, but not in a form that would allow the animals to transmit the virus.

"We're extremely encouraged by the substantial immunizing effect our vaccine had in these animal models," said the study's principal investigator Harvey Friedman, MD, a professor of Infectious Diseases. "Based on these results, it is our hope that this vaccine could be translated into human studies to test both the safety and efficacy of our approach."

Building on the approaches of many cutting-edge cancer and immunotherapy researchers, the Penn team filled their vaccine with specific messenger RNA (mRNA), which can create proteins necessary for a strong immune response. This vaccine stimulates three types of antibodies: one that blocks the herpes virus from entering cells, and two others that ensure the virus doesn't "turn off" innate immune system protective functions. This approach differs from other herpes vaccines, which often rely only on blocking the virus's entry into cells.

This vaccine was developed with funding from NIH and as part of a collaboration between Penn Medicine and BioNTech, entered into in October 2018, to research and develop mRNA vaccines for various infectious diseases.

Genital herpes, caused by herpes simplex virus type 2 (HSV-2), is among the most common sexually transmitted diseases. Approximately 14 percent of Americans ages 14 to 59, and 11 percent of people in the same age range across the world, are infected. HSV-2 may lead to painful sores, which can spread to other areas of the body. The virus increases one's risk of contracting HIV, and infected pregnant women may pass herpes on to their fetus or, more commonly, to their baby during delivery.

"Along with physical symptoms, HSV-2 takes an emotional toll," said Friedman. "People worry over transmission of the disease, and it can certainly have a negative effect on intimate relationships."

Since herpes is so widespread but also often undetected, as it is only visible during an outbreak, researchers say a successful vaccine would be invaluable to many adults across the world.

Credit: 
University of Pennsylvania School of Medicine

Oral health effects of tobacco products: Science and regulatory policy proceedings

Alexandria, VA, USA - On September 14, 2018, AADR held the "Oral Health Effects of Tobacco Products: Science and Regulatory Policy" meeting in Bethesda, Maryland, USA. The papers resulting from this conference are published in the latest issue of Advances in Dental Research, an e-Supplement to the Journal of Dental Research (JDR).

As the primary route of delivery, the oral cavity is particularly sensitive to harmful exposure from tobacco products. Tobacco product use has been linked to oral cancer, periodontal disease and tooth loss. During the conference, researchers also presented data on the effect of tobacco use on immunity and the oral microbiome. This conference was especially timely given the rapidly evolving landscape of tobacco use in the United States, which is simultaneously seeing the lowest level of adult cigarette use since 1965 and the emergence of novel nicotine delivery systems, such as e-cigarettes, for which little is currently known about the long-term health effects.

The goal of the conference was to bring the oral health effects of tobacco products to the attention of regulators, public health professionals, healthcare providers, researchers and ultimately, the public with the hope that the information presented would promote cessation or deter initiation among current or potential tobacco users, respectively.

The Oral Health Effects of Tobacco Products: Science and Regulatory Policy conference reviewed the effects tobacco products have on oral health, providing a robust scientific base that included the importance of oral health in overall health. The conference, summarized in these proceedings, was organized into five sessions focused on tobacco products regulated by the FDA -- Perspectives on Tobacco Regulatory Policy, Combusted Tobacco (Inhaled and non-inhaled) Products, Non-combusted Tobacco (Smokeless Tobacco), Novel Nicotine Delivery Systems and In Vitro Models, Standards and Experimental Methods -- and concluded with a discussion of the role of dentistry in tobacco use cessation.

"Although the adverse effects of conventional tobacco products on various oral health outcomes are well established, much remains unknown about the oral health implication of novel tobacco products such as electronic nicotine delivery systems," said guest editor Scott Tomar, University of Florida, Gainesville, USA. "There is a great need for research on the clinical and public health effects of these products and their underlying mechanisms, and an urgent need for behavioral and regulatory science research around conventional and novel tobacco products."

Credit: 
International Association for Dental, Oral, and Craniofacial Research

Anthropologist contributes to major study of large animal extinction

image: This is Amelia Villasenor, University of Arkansas.

Image: 
University of Arkansas / University Relations

As part of an international research group based at the Smithsonian Museum of Natural History, anthropology assistant professor Amelia Villaseñor contributed to a large, multi-institutional study explaining how the human-influenced mass extinction of giant carnivores and herbivores of North America fundamentally changed the biodiversity and landscape of the continent.

In their study published today in Science, researchers from Australia, the United States, Canada and Finland showed that humans shaped the processes underlying how species co-existed for the last several thousand years. Smaller, surviving animals such as deer changed their ecological interactions, the researchers found, causing ecological upheaval across the continent.

The researchers' work has implications for conservation of today's remaining large animals, now threatened by another human-led mass extinction.

The study's primary author is Anikó Tóth at Macquarie University in Sydney, Australia. Tóth collaborated with Villaseñor and several other researchers at the Smithsonian's Evolution of Terrestrial Ecosystems Program, as well as researchers at other institutions.

Tóth and the co-authors focused on how large mammals were distributed across the continent in the Pleistocene and Holocene geological epochs. (The Pleistocene Epoch occurred from about 2.5 million to 11,700 years ago. Starting at the end of the Pleistocene, the Holocene is the current geological epoch.) To do this, the researchers analyzed how often pairs of species were found living in the same community or in different communities.

To rule out community changes that were simply the result of reduced diversity or lost associations involving extinct species, the researchers analyzed only those pairs in which both species survived. Prior to the extinction, co-occurrence was more common; after the extinction, segregations were more common.
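The pair-by-pair analysis described above can be sketched with a toy presence/absence matrix. This is an illustrative reconstruction, not the authors' code: the species names and site data are invented, and the real study used formal statistical null models rather than the simple tally shown here.

```python
# Hypothetical sketch of a pairwise co-occurrence tally (not the study's method).
# Rows are sites (communities), columns are species; 1 = present, 0 = absent.
from itertools import combinations

def classify_pairs(matrix, species):
    """For each species pair, compare sites where both occur (co-occurrence)
    with sites where exactly one occurs (segregation)."""
    results = {}
    for a, b in combinations(range(len(species)), 2):
        together = sum(1 for site in matrix if site[a] and site[b])
        apart = sum(1 for site in matrix if site[a] != site[b])
        results[(species[a], species[b])] = (
            "co-occurring" if together > apart else "segregated"
        )
    return results

# Toy presence/absence data for three surviving species at four sites
sites = [
    [1, 1, 0],
    [1, 1, 0],
    [0, 0, 1],
    [1, 0, 1],
]
print(classify_pairs(sites, ["deer", "elk", "fox"]))
```

Restricting the matrix to species that survived the extinction, as the authors did, ensures that a shift from "co-occurring" to "segregated" reflects changed interactions rather than missing species.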

Villaseñor's research focuses on human fossil remains as a way to understand how human ancestors interacted with mammal communities for the last 3.5 million years. Her more recent research explores how modern humans have shaped today's ecosystems.

"Rather than thinking of humans as separate from 'natural' environments, our research has illuminated the major impacts that humans have had on the ecosystem for many thousands of years," Villaseñor said. "The results of this paper and others from our group illuminate the outsized impacts that human-mediated extinction has had in North America."

By the end of the Late Pleistocene in North America, roughly 11,000 years ago, humans contributed to the extinction of large mammals, including mammoths and sabre-toothed cats. Recent work, driven by today's crisis in biodiversity, has looked at understanding the ecological and evolutionary legacies of this event. There was ecological transformation across the continent - the mammoth steppe disappeared, vegetation and fire regimes changed and large carnivores were lost.

Credit: 
University of Arkansas

World's first gene therapy for glycogen storage disease produces remarkable results

image: Jerrod Watts, a patient enrolled in the first clinical trial for glycogen storage disease, chats with lead investigator Dr. David Weinstein. Photos taken in the dedicated Glycogen Storage Disease Unit at UConn Health.

Image: 
Tina Encarnacion/UConn Health

At the Association for Glycogen Storage Disease's 41st Annual Conference, Dr. David Weinstein of UConn School of Medicine and Connecticut Children's presented his groundbreaking, one-year clinical trial results for the novel gene therapy treatment for glycogen storage disease (GSD).

The rare and deadly genetic liver disorder, GSD type Ia, affects children from infancy through adulthood, causing dangerously low blood sugar levels and constant dependence on glucose consumption in the form of cornstarch every few hours for survival. If a cornstarch dose is missed, the disease can lead to seizures and even death.

Weinstein, whose team first administered the investigational gene therapy at UConn John Dempsey Hospital in Farmington, Connecticut, on July 24, 2018, calls the results "remarkable."

One year after patient Jerrod Watts first received the GSD gene therapy during a 30-minute infusion, he is completely off of cornstarch. In addition to totally stopping daily cornstarch consumption, Watts has experienced normal regulation of his blood glucose levels, weight loss, increased muscle strength, and marked improvement in his energy.

"The clinical trial is going better than expected. The therapy is transforming patient lives," says Weinstein. "We have seen all of the patients wean their therapy with some already discontinuing treatment. Missed cornstarch doses no longer are resulting in hypoglycemia, which previously could have been life threatening."

Weinstein, the clinical trial's lead investigator, is a pediatric endocrinologist and scientist who cares for more than 700 GSD patients from 51 countries as director of the Glycogen Storage Disease Program at Connecticut Children's and UConn Health -- the largest center in the world for the care and treatment of this condition.

The clinical trial, conducted in conjunction with the biopharmaceutical company Ultragenyx, originally set out to simply test the safety and dosage of the gene therapy for three patients with GSD Type Ia.

The gene therapy works by delivering a new copy of a gene to the liver via a naturally occurring virus. Administered through the patient's bloodstream, the new copy replaces deficient sugar enzymes caused by the disease and jump starts the body's glucose control.

Both Weinstein and Watts were surprised by Watts' response to the gene therapy.

"We were just making sure it was safe for humans to take, that was our initial goal. The results are way beyond my imagination," says Weinstein. "The patients who are finishing the first year got what we all thought was a test dose -- one-third strength -- yet the response has been dramatic. They can now go through the night without any treatment and they wake up clinically well."

Prior to the treatment, Watts was consuming more than 400 grams of cornstarch per day.

The GSD Program's multidisciplinary team at Connecticut Children's provides comprehensive clinical care to patients, while the program's research laboratories and clinical trial are located at UConn School of Medicine at UConn Health.

"The main thing I want to do is inspire hope. One of the biggest reliefs from this gene therapy is I can now sleep through the night without worrying about dying in the middle of the night. I wake up 6 to 7 hours later with normal blood sugar," says Watts.

"I'm living, breathing proof that there is a light at the end of the tunnel with GSD. I'm a completely different person now that I was a year ago. I feel like I can live a normal life and I can do anything I want to do now," says Watts.

His message to other patients: "Please don't give up hope."

In addition to Watts, two other clinical trial cohort patients are seeing promising results on lower daily cornstarch regimens. All three will participate in the next phase -- a 4-year follow-up clinical trial study. In addition, three patients are enrolled in a clinical trial testing a higher gene therapy dose.

"What's exciting is, if it works this well with the low dose, what does the future hold?" says Weinstein. "If the patients are already coming off cornstarch and the labs are getting better, we just hope it will be even faster and more dramatic with the higher dose."

"This gene therapy treatment is something we've worked on for 21 years. This means so much to the glycogen storage disease community," says Weinstein. "To see patients off cornstarch and doing so well really is a culmination of an incredible journey ... We feel like we're living history."

Credit: 
University of Connecticut

Evolution of learning is key to better artificial intelligence

image: This is Anselmo Pontes, lead author and computer science researcher.

Image: 
Michigan State University

Since "2001: A Space Odyssey," people have wondered: could a machine like HAL 9000, able to process information with human-like intelligence, eventually exist?

Researchers at Michigan State University say that true, human-level intelligence remains a long way off, but their new paper published in The American Naturalist explores how computers could begin to evolve learning in the same way as natural organisms did - with implications for many fields, including artificial intelligence.

"We know that all organisms are capable of some form of learning; we just weren't sure how those abilities first evolved. Now we can watch these major evolutionary events unfold before us in a virtual world," said Anselmo Pontes, MSU computer science researcher and lead author. "Understanding how learning behavior evolved helps us figure out how it works and provides insights to other fields such as neuroscience, education, psychology, animal behavior, and even AI. It also supplies clues to how our brains work and could even lead to robots that learn from experiences as effectively as humans do."

According to Fred Dyer, MSU integrative biology professor and co-author, these findings have the potential for huge implications.

"We're untangling the story of how our own cognition came to be and how that can shape the future," Dyer said. "Understanding our own origins can lead us to developing robots that can watch and learn rather than being programmed for each individual task."

The results are the first demonstration of the evolution of associative learning in an artificial organism without a brain.

"Our inspiration was the way animals learn landmarks and use them to navigate their environments," Pontes said. "For example, in laboratory experiments, honeybees learn to associate certain colors or shapes with directions and navigate complex mazes."

Since the evolution of learning cannot be observed through fossils - and would take more than a lifetime to watch in nature - the MSU interdisciplinary team composed of biologists and computer scientists used a digital evolution program that allowed them to observe tens of thousands of generations of evolution in just a few hours, a feat unachievable with living systems.

In this case, organisms evolved to learn and use environmental signals to help them navigate the environment and find food.

"Learning is crucial to most behaviors, but we couldn't directly observe how learning got started in the first place from our purely instinctual ancestors," Dyer said. "We built in various selection pressures that we thought might play a role and watched what happened in the computer."

While the environment was simulated, the evolution was real. The programs that controlled the digital organism were subject to genetic variation from mutation, inheritance and competitive selection. Organisms were tasked to follow a trail alongside signals that - if interpreted correctly - pointed where the path went next.

In the beginning of the simulation, organisms were "blank slates," incapable of sensing, moving or learning. Every time an organism reproduced, its descendants could suffer mutations that changed their behavior. Most mutations were lethal. Some did nothing. But the rare traits that allowed an organism to better follow the trail resulted in the organism collecting more resources, reproducing more often and, thus, gaining share in the population.
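The mutation-inheritance-selection loop described above can be sketched in a few lines. This is a deliberately minimal illustration, not the digital-evolution platform the MSU team used: here the "trail" is reduced to a fixed bit pattern, and fitness simply counts matching bits, standing in for resources collected.

```python
# Minimal mutation-selection sketch (illustrative only; the study used a far
# richer digital-evolution system with self-replicating programs).
import random

random.seed(1)

TARGET = [1] * 20  # stands in for "perfect trail following"

def fitness(genome):
    # More matching bits -> more "food" collected along the trail
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    # Each bit has a small chance of flipping when inherited
    return [1 - g if random.random() < rate else g for g in genome]

# Start from "blank slate" organisms, as in the simulation
population = [[0] * 20 for _ in range(100)]

for generation in range(200):
    # Competitive selection: fitter organisms reproduce more often
    parents = sorted(population, key=fitness, reverse=True)[:20]
    population = [mutate(random.choice(parents)) for _ in range(100)]

best = max(population, key=fitness)
print(fitness(best))  # near-optimal genomes evolve within a few dozen generations
```

Even this stripped-down version shows the core dynamic in the article: most mutations hurt or do nothing, but the rare beneficial ones spread because their carriers reproduce more.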

Over the generations, organisms evolved more and more complex behaviors. First came simple movements allowing them to stumble into food. Next was the ability to sense and distinguish different types of signals, followed by the reflexive ability to correct errors, such as trying an incorrect path, backing up and trying another.

A few organisms evolved the ability to learn by association. If one of these organisms made a wrong turn it would correct the error, but it would also learn from that mistake and associate the specific signal it saw with the direction it now knew it should have gone. From then on, it would navigate the entire trail without any further mistakes. Some organisms could even relearn when tricked by switching signals mid-trail.

"Evolution in nature might take too long to study," Pontes said, "but evolution is just an algorithm, so it can be replicated in a computer. We were not just able to see how certain environments fostered the evolution of learning, but we saw populations evolve through the same behavioral phases that previous scientists speculated should happen but didn't have the technology to see."

Credit: 
Michigan State University

NASA finds a tiny tropical storm Kiko

image: On Sept. 19, the MODIS instrument that flies aboard NASA's Terra satellite provided this image of Tropical Storm Kiko moving toward the Central Pacific Ocean.

Image: 
NASA Worldview

NASA's Terra satellite is one in a fleet of NASA satellites that provide data for research. Terra captured an image of Tropical Storm Kiko in the Eastern Pacific Ocean which showed the extent of the small storm.

On Sept. 19, the Moderate Resolution Imaging Spectroradiometer, or MODIS, instrument that flies aboard Terra provided a visible image of Kiko. The image showed that the storm is compact. Tropical-storm-force winds only extend outward up to 45 miles (75 km) from the center, making the storm about 90 miles (150 km) in diameter.

Since the MODIS image, a pair of microwave satellite images taken between 5 a.m. and 7 a.m. EDT (0900 and 1100 UTC) on Sept. 20 revealed that Kiko has redeveloped a well-defined low-level inner circulation. NOAA's National Hurricane Center (NHC) said, "However, most of the deep convection (strongest thunderstorms) associated with the tropical storm is located northeast of the center, a result of moderate southwesterly [wind] shear."

At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Kiko was located near latitude 17.7 degrees north and longitude 130.2 degrees west. That puts the center about 1,360 miles (2,190 km) west-southwest of the southern tip of Baja California, Mexico. Kiko is moving toward the north-northwest at near 6 mph (9 km/h). A turn toward the west is expected tonight, followed by a turn toward the west-southwest over the weekend. Maximum sustained winds have increased to near 60 mph (95 km/h) with higher gusts. Slight additional strengthening is possible today, but only small changes in intensity are expected during the next several days. The estimated minimum central pressure is 999 millibars.

Hurricanes are the most powerful weather event on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

Wearable brain-machine interface could control a wheelchair, vehicle or computer

image: Test subject who has flexible wireless electronics conformed to the back of the neck, with dry hair electrodes under a fabric headband and a membrane electrode on the mastoid, connected with thin-film cables.

Image: 
Courtesy Woon-Hong Yeo

Combining new classes of nanomembrane electrodes with flexible electronics and a deep learning algorithm could help disabled people wirelessly control an electric wheelchair, interact with a computer or operate a small robotic vehicle without donning a bulky hair-electrode cap or contending with wires.

By providing a fully portable, wireless brain-machine interface (BMI), the wearable system could offer an improvement over conventional electroencephalography (EEG) for measuring signals from visually evoked potentials in the human brain. The system's ability to measure EEG signals for BMI has been evaluated with six human subjects, but has not been studied with disabled individuals.

The project, conducted by researchers from the Georgia Institute of Technology, University of Kent and Wichita State University, was reported on September 11 in the journal Nature Machine Intelligence.

"This work reports fundamental strategies to design an ergonomic, portable EEG system for a broad range of assistive devices, smart home systems and neuro-gaming interfaces," said Woon-Hong Yeo, an assistant professor in Georgia Tech's George W. Woodruff School of Mechanical Engineering and Wallace H. Coulter Department of Biomedical Engineering. "The primary innovation is in the development of a fully integrated package of high-resolution EEG monitoring systems and circuits within a miniaturized skin-conformal system."

BMI is an essential part of rehabilitation technology that allows those with amyotrophic lateral sclerosis (ALS), chronic stroke or other severe motor disabilities to control prosthetic systems. Gathering brain signals known as steady-state visually evoked potentials (SSVEP) now requires use of an electrode-studded hair cap that uses wet electrodes, adhesives and wires to connect with computer equipment that interprets the signals.

Yeo and his collaborators are taking advantage of a new class of flexible, wireless sensors and electronics that can be easily applied to the skin. The system includes three primary components: highly flexible, hair-mounted electrodes that make direct contact with the scalp through hair; an ultrathin nanomembrane electrode; and soft, flexible circuitry with a Bluetooth telemetry unit. The recorded EEG data from the brain is processed in the flexible circuitry, then wirelessly delivered to a tablet computer via Bluetooth from up to 15 meters away.

Beyond the sensing requirements, detecting and analyzing SSVEP signals have been challenging because of the low signal amplitude, which is in the range of tens of microvolts, similar to electrical noise in the body. Researchers also must deal with variation in human brains. Yet accurately measuring the signals is essential to determining what the user wants the system to do.

To address those challenges, the research team turned to deep learning neural network algorithms running on the flexible circuitry.

"Deep learning methods, commonly used to classify pictures of everyday things such as cats and dogs, are used to analyze the EEG signals," said Chee Siang (Jim) Ang, senior lecturer in Multimedia/Digital Systems at the University of Kent. "Like pictures of a dog which can have a lot of variations, EEG signals have the same challenge of high variability. Deep learning methods have proven to work well with pictures, and we show that they work very well with EEG signals as well."

In addition, the researchers used deep learning models to identify which electrodes are the most useful for gathering information to classify EEG signals. "We found that the model is able to identify the relevant locations in the brain for BMI, which is in agreement with human experts," Ang added. "This reduces the number of sensors we need, cutting cost and improving portability."
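To make the signal structure concrete, a classic SSVEP decoding baseline can be sketched: each visual target flickers at a distinct rate, and the decoder picks the stimulus frequency that dominates the EEG spectrum. This is a point of contrast, not the paper's deep-learning method; the sampling rate, flicker frequencies, and synthetic data below are invented for illustration.

```python
# Illustrative frequency-detection baseline for SSVEP decoding
# (the study itself used a deep neural network on the raw signals).
import numpy as np

fs = 250                             # sampling rate in Hz (assumed)
t = np.arange(0, 2, 1 / fs)          # a 2-second analysis window
stimulus_freqs = [8.0, 10.0, 12.0]   # hypothetical flicker targets

def decode(eeg):
    """Return the index of the stimulus frequency with the most spectral power."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), 1 / fs)
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in stimulus_freqs]
    return int(np.argmax(powers))

# Synthetic "EEG": a weak 10 Hz evoked response buried in much larger noise,
# mimicking the tens-of-microvolts amplitude problem described above
rng = np.random.default_rng(0)
eeg = 0.5 * np.sin(2 * np.pi * 10.0 * t) + rng.normal(0, 1, t.size)
print(decode(eeg))  # should select the 10 Hz target (index 1)
```

A fixed-frequency peak detector like this breaks down under the inter-subject variability the article mentions, which is the motivation for the learned classifier.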

The system uses three elastomeric scalp electrodes held onto the head with a fabric band, ultrathin wireless electronics conformed to the neck, and a skin-like printed electrode placed on the skin below an ear. The dry soft electrodes adhere to the skin and do not use adhesive or gel. Along with ease of use, the system could reduce noise and interference and provide higher data transmission rates compared to existing systems.

The system was evaluated with six human subjects. The deep learning algorithm with real-time data classification could control an electric wheelchair and a small robotic vehicle. The signals could also be used to control a display system without using a keyboard, joystick or other controller, Yeo said.

"Typical EEG systems must cover the majority of the scalp to get signals, but potential users may be sensitive about wearing them," Yeo added. "This miniaturized, wearable soft device is fully integrated and designed to be comfortable for long-term use."

Next steps will include improving the electrodes and making the system more useful for motor-impaired individuals.

"Future study would focus on investigation of fully elastomeric, wireless self-adhesive electrodes that can be mounted on the hairy scalp without any support from headgear, along with further miniaturization of the electronics to incorporate more electrodes for use with other studies," Yeo said. "The EEG system can also be reconfigured to monitor motor-evoked potentials or motor imagination for motor-impaired subjects, which will be further studied as a future work on therapeutic applications."

Long-term, the system may have potential for other applications where simpler EEG monitoring would be helpful, such as in sleep studies done by Audrey Duarte, an associate professor in Georgia Tech's School of Psychology.

"This EEG monitoring system has the potential to finally allow scientists to monitor human neural activity in a relatively unobtrusive way as subjects go about their lives," she said. "For example, Dr. Yeo and I are currently using a similar system to monitor neural activity while people sleep in the comfort of their own homes, rather than the lab with bulky, rigid, uncomfortable equipment, as is customarily done. Measuring sleep-related neural activity with an imperceptible system may allow us to identify new, non-invasive biomarkers of Alzheimer's-related neural pathology predictive of dementia."

In addition to those already mentioned, the research team included Musa Mahmood, Yun-Soung Kim, Saswat Mishra, and Robert Herbert from Georgia Tech; Deogratias Mzurikwao from the University of Kent; and Yongkuk Lee from Wichita State University.

Credit: 
Georgia Institute of Technology

Temple researchers identify new target regulating mitochondria during stress

(Philadelphia, PA) - Like an emergency response team that is called into action to save lives, stress response proteins in the heart are activated during a heart attack to help prevent cell death. As part of this process, Lewis Katz School of Medicine at Temple University researchers show for the first time that one of these specialized emergency responder proteins, known as MCUB, temporarily decreases harmful levels of calcium transport into mitochondria, the energy-generating batteries of cells.

The new research, published online September 19 in the journal Circulation, identifies MCUB as a promising new target for the investigation and treatment of conditions that feature calcium overload and cell death - conditions that include heart failure, heart attack, stroke, and neurodegeneration.

"MCUB fine-tunes calcium uptake by mitochondria in injured heart tissue, in an attempt to limit calcium overload, which is a major contributor to cell death, particularly following a heart attack," explained John W. Elrod, PhD, Associate Professor in the Center for Translational Medicine at the Temple University Lewis Katz School of Medicine and senior investigator on the new study.

Calcium homeostasis is vital to a number of day-to-day cellular activities and is regulated primarily by mitochondria. For calcium to enter mitochondria, it passes through a channel known as the mitochondrial calcium uniporter (MCU), which resides in the inner mitochondrial membrane; once inside, calcium stimulates the production of ATP, the energy currency of the cell. The amount of calcium that mitochondria take up is regulated by various components of this channel. While MCUB closely resembles the pore-forming subunit, MCU, its precise role in calcium regulation is largely unknown, particularly in the context of disease.

Dr. Elrod's team found that deletion of the MCUB gene in cells results in a change in the proteins that make up the calcium channel and that are essential for controlling whether the channel is on or off. Since these alterations are induced by stress, such as heart cell injury, the researchers next investigated the role of MCUB after heart attack in mice. In mice that suffered heart attack, the research team observed significant elevations in MCUB gene expression and decreases in MCU and the gatekeeper of the channel, MICU1. When the researchers genetically expressed MCUB prior to inducing a heart attack in mice, it altered the channel to reduce calcium overload in the injured heart, ultimately curtailing tissue injury.

Dr. Elrod's team also found that, while it can improve cell survival after heart injury, increased MCUB activity comes at the expense of mitochondrial energy production. "MCUB induction is a compensatory change," explained Dr. Elrod. Just like an emergency responder, MCUB moves in and tries to reduce cell death and aid cell survival - however, the reduction in mitochondrial calcium uptake is also maladaptive and limits the cell's ability to increase energy during stress.

"MCUB presents us with a new molecular target for investigation," Dr. Elrod said. "It's unique in that it alters the stoichiometry of the channel and thereby presents a new mechanism which may be amendable to therapeutic manipulation. We think that modulating MCUB may allow us to tune down mitochondrial calcium uptake without completely inhibiting all energetic function."

It is hoped that follow-up studies defining the exact sites of molecular interaction will provide additional insight into how to target mitochondrial calcium overload in heart disease.

Credit: 
Temple University Health System

Did a common childhood illness take down the Neanderthals?

image: This illustration shows the structure of the Eustachian tube in Neanderthal man and its similarity to that of the human infant.

Image: 
SUNY Downstate Health Sciences University

BROOKLYN, NY - It is one of the great unsolved mysteries of anthropology. What killed off the Neanderthals, and why did Homo sapiens thrive even as Neanderthals withered to extinction? Was it some sort of plague specific only to Neanderthals? Was there some sort of cataclysmic event in their homelands of Eurasia that led to their disappearance?

A new study from a team of physical anthropologists and head & neck anatomists suggests a less dramatic but equally deadly cause.

Published online in the journal The Anatomical Record, the study, "Reconstructing the Neanderthal Eustachian Tube: New Insights on Disease Susceptibility, Fitness Cost, and Extinction," suggests that the real culprit in the demise of the Neanderthals was not some exotic pathogen.

Instead, the authors believe the path to extinction may well have been the most common and innocuous of childhood illnesses - and the bane of every parent of young children - chronic ear infections.

"It may sound far-fetched, but when we, for the first time, reconstructed the Eustachian tubes of Neanderthals, we discovered that they are remarkably similar to those of human infants," said coinvestigator and Downstate Health Sciences University Associate Professor Samuel Márquez, PhD, "Middle ear infections are nearly ubiquitous among infants because the flat angle of an infant's Eustachian tubes is prone to retain the otitis media bacteria that cause these infections - the same flat angle we found in Neanderthals."

In this age of antibiotics, these infections are easy to treat and relatively benign for human babies. Additionally, around age 5, the Eustachian tubes in human children lengthen and the angle becomes more acute, allowing the ear to drain, all but eliminating these recurring infections beyond early childhood.

But unlike modern humans, the structure of the Eustachian tubes in Neanderthals does not change with age - which means these ear infections and their complications, including respiratory infections, hearing loss, pneumonia, and worse, would not only become chronic but would remain a lifelong threat to overall health and survival.

"It's not just the threat of dying of an infection," said Dr. Márquez. "If you are constantly ill, you would not be as fit and effective in competing with your Homo sapien cousins for food and other resources. "In a world of survival of the fittest, it is no wonder that modern man, not Neanderthal, prevailed."

"The strength of the study lies in reconstructing the cartilaginous Eustachian tube," said Richard Rosenfeld, MD, MPH, MBA, Distinguished Professor and Chairman of Otolaryngology at SUNY Downstate and a world-renowned authority on children's health. "This new and previously unknown understanding of middle ear function in Neanderthal is what allows us to make new inferences regarding the impact on their health and fitness."

"Here is yet another intriguing twist on the ever-evolving Neanderthal story, this time involving a part of the body that researchers had almost entirely neglected," said Ian Tattersall, Ph.D., paleoanthropologist and Curator Emeritus of the American Museum of National History. "It adds to our gradually emerging picture of the Neanderthals as very close relatives who nonetheless differed in crucial respects from modern man."

Credit: 
SUNY Downstate Health Sciences University

For people with pre-existing liver disease, toxic algae may be more dangerous

image: Dr. David Kennedy (left) and Dr. Steven Haller recently published research that found microcystin, a toxin produced by blue-green algae, may be more dangerous than previously known for those with preexisting liver disease. Kennedy and Haller are assistant professors of medicine at The University of Toledo.

Image: 
Daniel Miller, The University of Toledo

Toxins produced during harmful algal blooms may be more harmful to people than previously known.

Researchers at The University of Toledo College of Medicine and Life Sciences set out to examine how microcystin might affect individuals with non-alcoholic fatty liver disease, a widespread condition that is frequently asymptomatic. They found the toxin can significantly amplify the disease at levels below what would harm a healthy liver.

The study, published last month in the journal Toxins, follows earlier research from UToledo that found clear evidence that microcystin exposure worsens the severity of pre-existing colitis. Microcystin is a by-product of the cyanobacteria found in what is commonly known as blue-green algae.

"The take home message from our research is there are certain groups of people who need to pay extra attention and may be more susceptible to microcystin toxins. We may need to explore special preventative guidelines for those people in terms of how much microcystin they are exposed to through drinking water or other means," said Dr. David Kennedy, an assistant professor of medicine at UToledo and one of the study's lead authors.

Aided by nutrient runoff and warming waters, seasonal blooms of blue-green algae are flourishing across much of the United States. Not all algal blooms produce toxins, but many do.

Potentially dangerous concentrations of microcystin have been found this year in ponds in New York City's Central Park, along the Mississippi Gulf Coast, in reservoirs in California, and along a portion of Lake Erie's coastline near Toledo.

While no human deaths have been linked to microcystin in the United States, deaths have been reported elsewhere -- most notably among a group of kidney dialysis patients in Brazil. There also have been reports this year of pet dogs dying after exposure to blue-green algae in Texas, North Carolina and Georgia.

With annual blooms becoming more frequent and intense, researchers in the UToledo College of Medicine and Life Sciences wanted to better understand how the toxins might affect people already suffering from conditions that affect organ systems microcystin is known to attack, such as the liver.

"It's a gray area in terms of what microcystin is really doing to you if you have a pre-existing disease state. Are you more susceptible? Are we going to have to go back and revaluate what we consider safe in a person with a pre-existing disease state? It's important we start providing answers to these questions," said Dr. Steven Haller, UToledo assistant professor of medicine.

In the liver study, researchers examined how chronic, low-level exposure of microcystin affected mice with non-alcoholic fatty liver disease compared to mice with healthy livers.

At microcystin ingestion levels below the No Observed Adverse Effect Level for healthy mice, analysis showed significant exacerbation of liver damage in mice with fatty liver disease. Researchers observed no liver damage in mice who started the experiment with healthy livers.

"Current exposure limits from the World Health Organization and the U.S. Environmental Protection Agency for humans are based off studies done in healthy animals," Haller said. "The results of this study suggest there may be a need to review those guidelines for people with pre-existing conditions."

They also noted major differences in how microcystin was handled by the kidneys in the two test groups.

In mice with non-alcoholic fatty liver disease, elevated levels of microcystin were found in the blood plasma, but were not detectable in the plasma of healthy mice. Mice with non-alcoholic fatty liver disease also excreted far less microcystin in their urine.

The differences in how microcystin was processed between the two test groups suggest that kidney function may play an important role in the increased susceptibility of the mice with pre-existing liver disease.

"This may be highly relevant to help us understand the deaths that occurred in kidney dialysis patients, and point to the need to pay particular attention to at-risk patient populations as we design preventative, diagnostic and therapeutic strategies," Kennedy said.

The results from the liver study build on prior work from Kennedy and Haller looking at how microcystin exposure might affect individuals with inflammatory bowel disease, another common condition that impacts an estimated 1 million Americans.

In that study, published in June, the researchers demonstrated that exposure to microcystin-LR (MC-LR), a common microcystin variant, prolongs and worsens the severity of pre-existing colitis, contributing to significant weight loss, bleeding, and higher numbers of signaling molecules that cause inflammation.

"Based on this data we're coming up with insights into how we can potentially treat exposures if they do occur," Kennedy said. "This is giving us a number of insights into how we might help patients, especially patients who are vulnerable or susceptible if there was an exposure."

Credit: 
University of Toledo

New study reveals a strong link between vitamin D deficiency and increased mortality, especially diabetes-related deaths

New research presented at this year's Annual Meeting of the European Association for the Study of Diabetes (EASD) in Barcelona, Spain (16-20 Sept) reveals that vitamin D deficiency is strongly linked to increased mortality, especially in younger and middle-aged people, and is particularly associated with diabetes-related deaths.

The research was conducted by Dr Rodrig Marculescu and colleagues at the Medical University of Vienna, Austria. It analysed the effects of low 25-hydroxyvitamin D (25D; hereafter referred to as vitamin D) levels in the blood on overall and cause-specific mortality in a large study cohort covering all age groups, taken from a population with minimal vitamin D supplementation in old age.

Vitamin D deficiency is a widely prevalent and easily correctable risk factor for early death, and evidence for its link to mortality comes from numerous studies and clinical trials. The majority of this research to date has however come from looking at older populations, and the authors believe that many of the largest scale studies may have been affected by increased rates of vitamin D supplementation in old age. They also note: "Cause-specific mortalities and the impact of age on the association of vitamin D with the risk of death have not yet been reported in detail."

The researchers took their data from the records of all 78,581 patients (mean age 51.0 years, 31.5% male) who had a vitamin D (25D) measurement taken at the Department of Laboratory Medicine, General Hospital of Vienna between 1991 and 2011, which was then matched with the Austrian national register of deaths. The first 3 years of mortality following the vitamin D measurement were excluded from the analysis, and patients were followed for up to 20 years where possible, with a median follow-up of 10.5 years.

The authors used a blood vitamin D level of 50 nmol/L, a commonly used cut-off value for vitamin D deficiency, as the reference to which other levels would be compared, and set the low and high levels for which risk would be calculated at 10 nmol/L and 90 nmol/L respectively.

The study found that vitamin D levels of 10 nmol/L or less were associated with a 2-3-fold increase in risk of death, with the largest effect being observed in patients aged 45 to 60 years (2.9 times increased risk). Levels of 90 nmol/L or greater were associated with a reduction in all-cause mortality of 30-40%, again with the largest effect being found in the 45 to 60-years-old age group (a 40% reduction in risk). No statistically significant associations between vitamin D levels and mortality were observed in patients over the age of 75.

With regard to cause-specific mortality, the authors were surprised to find that the strongest associations of vitamin D were with causes of death other than cardiovascular disease and cancer. Differences between the age groups were even more pronounced for these causes of death and, again, the largest effect was found in patients aged 45 to 60 years. Further subdivision of these non-cardiovascular, non-cancer causes of death revealed that the largest effect of vitamin D was for diabetes, with a 4.4 times higher risk of death from the disease in the vitamin D-deficient group (50 nmol/L or less) than in study participants whose serum vitamin D was above 50 nmol/L.

Plotting the risk of death according to vitamin D level in the various subgroups did not support a risk resurgence at vitamin D levels above 100 nmol/L. The authors say this further diminishes concerns about a possible negative effect of vitamin D in the higher concentration range, as had been suggested by some previous studies reporting an "inverse J-shaped" risk association (meaning risk decreased to a certain level of vitamin D and then started increasing again at higher levels).

The team conclude: "Our survival data from a large cohort, covering all age groups, from a population with minimal vitamin D supplementation at old age, confirm a strong association of vitamin D deficiency (under 50 nmol/L) with increased mortality. This association is most pronounced in the younger and middle-aged groups and for causes of deaths other than cancer and cardiovascular disease, especially diabetes."

The researchers go on to suggest that: "Our findings strengthen the rationale for widespread vitamin D supplementation to prevent premature mortality, emphasize the need for it early in life and mitigate concerns about a possible negative effect at higher levels."

Credit: 
Diabetologia

A single dose of yellow fever vaccine does not offer lasting protection to all children

Yellow fever is a viral infection spread by various species of mosquito and is rife in 34 countries in Africa and 13 in Latin America. Infection may be asymptomatic and go unnoticed or, on the contrary, it may progress rapidly to severe illness with fever, headache, muscle pain, nausea, vomiting and fatigue. The virus attacks the liver cells, often causing jaundice from which the disease gets its name. Severe bleeding occurs in 25 to 50% of cases, with high levels of mortality observed 7 to 10 days after the onset of symptoms.

Since 2013, WHO has recommended a single dose of the vaccine for life-long protection. This recommendation is based on proof of long-term efficacy, in vitro and in vivo, established in adults and children over 2 years of age. But data on the long-term efficacy of primary vaccination in infants are absent, despite 9-12-month-olds being the main targets of routine vaccination in countries in which yellow fever is endemic. In this respect, WHO recommended research into the long-term persistence of the immunity conferred by vaccination in this age group. This research was performed by José Enrique Mejía from Unit 1043 Center for Pathophysiology of Toulouse Purpan in partnership with Cristina Domingo from Robert Koch Institute in Berlin, and researchers from the USA, Ghana and Mali, with support from the Wellcome Trust.

Their study verified whether children to whom the vaccine was administered at around 9 months of age were still protected several years later. The team studied two cohorts, one from Mali (587 children) and the other from Ghana (436 children), in whom the levels of specific antibodies to the yellow fever virus had been measured 4 weeks after vaccination. They then repeated the measurement several years later, with findings from previous studies enabling them to estimate that levels above 0.5 IU/ml should protect children from infection.

In the Malian cohort, 4.5 years after vaccination, only half of the children continued to present levels of antibodies above 0.5 IU/ml, and 19.3% presented detectable antibodies but at levels below this protective threshold. In the Ghanaian cohort, 2.5 years after vaccination, only around 30% of children continued to be protected against infection, and 11.7% continued to present specific antibodies but at concentrations below the threshold.

Irrespective of the differences in vaccine efficacy between these two groups, which could be explained by ethnic and environmental factors (urban/rural population, seasonality of vaccination, diet, exposure to other infectious agents, etc.), the results in both cases show a substantial fall - practically by half - in the levels of protective antibodies in the years following vaccination, which predicts an absence of protection against infection for large numbers of children.

"Our data suggest that a booster may be necessary when the 1st vaccination is performed in 9-12-month-olds, but we will need more precise knowledge of the decrease in antibodies over time. Maintaining immunity to the virus during childhood and in adulthood is fundamental for obtaining vaccine coverage beyond the threshold of 80 % of the population in order to prevent the risk of epidemic", concludes Mejía.

Credit: 
INSERM (Institut national de la santé et de la recherche médicale)

Smoking abstinence has little impact on the motivation for food

BUFFALO, N.Y. - It's sometimes thought that smokers who can't light up are likely to reach for food in lieu of cigarettes. But new research from the University at Buffalo suggests that smoking abstinence doesn't greatly affect the motivation for food.

The study, published in the journal Drug and Alcohol Dependence, used cues and actual money to learn how much smokers might spend for cigarettes, food and water during abstinence. The results provide new insights into how different systems control motivation and reward.

Do smokers who can't smoke, for whatever reason, reallocate their resources toward food and water when cigarettes are not an option?

"We found with this sample in this study that the motivations for cigarettes, food and water do not interact very much," says Stephen Tiffany, Empire Innovation Professor in UB's Department of Psychology in the university's College of Arts and Sciences. "The results suggest that smoking abstinence does not affect the motivation for food and water."

The participants in this study were not trying to quit smoking, and the findings don't speak to how trying to quit would influence these motivations. However, the results indicate that food does not become more appealing during those times when a smoker is in a smoke-free environment, or otherwise can't smoke.

"If you're on an airplane and can't smoke you're not likely to be spending more money than usual on snacks," says Tiffany, an expert in the assessment, diagnosis and treatment of addiction who is also affiliated with UB's Clinical and Research Institute on Addictions

Most of Tiffany's addiction research has dealt with cigarette smokers and over the years he's tried to bring elements of smoking motivation into the laboratory. Craving, he says, is easy to introduce in a research setting by presenting people with cues or reminders of cigarettes. These studies look at behaviors in response to cues, but they don't allow participants an opportunity to use the cue. It's look, but don't touch.

"In those cases we're studying verbal behavior, but not overt behavior," he says. "We don't look at the choice people may actually make."

For the current study, 50 participants, all smokers who had abstained for 12 hours, had money to spend on their choices.

Tiffany and Jennifer Betts, the study's co-author and a graduate student in UB's psychology department, sat those participants in front of a box with a sliding door. Inside the box was one of three items: their favorite brand of cigarette, a candy bar they previously acknowledged as liking, or a cup of water.

Giving participants "house money" to spend on their decisions, including the option of keeping the cash, has research advantages, according to Betts.

"Unlike many previous studies, people in this study were spending real money, getting real food and cigarettes, and they had real, immediate chances to sample these items," says Betts.

Each participant would see the box's contents following a tone. They rated their craving for that item from 1 to 7 and then determined how much of the $9 they were given they'd be willing to spend to sample one of the cues. The amount they spent, from a penny to 25 cents, determined whether the door was unlocked or not. The more they spent, the greater the chance of the door being unlocked, up to a probability of 95 percent.

"There's a cost," notes Tiffany. "Which is true in life."

In previous research, Tiffany has consistently found that people will spend more for a cigarette than water, but for this study he was interested in food.

"There are interesting relationships between food and smoking, and smoking and weight," says Tiffany. "Smokers overall weigh less than non-smokers and smokers tend to gain weight when they quit."

There are also contrasting notions about food's appeal during smoking abstinence. Some theories suggest food becomes more appealing, while other theories say that the value of food decreases as the desire for cigarettes increases.

In this study, non-abstinent smokers spent more money for cigarettes than for food, and more for food than for water. Abstinent smokers spent even more for cigarettes, but their spending for food and water did not increase.

"When people are abstinent from cigarettes their craving tends to go up, but they don't become hypersensitive to the cue," says Tiffany.

There's still work to be done, but Tiffany says the findings speak generally to how critical cues are for smokers and how strongly events that remind them of smoking drive craving and determine choices.

"People don't relapse randomly," says Tiffany. "They relapse in the presence of opportunities to use which can be triggered by cues."

Credit: 
University at Buffalo

Introducing 'Mesh,' memory-saving plug-in to boost phone and computer performance

image: A research group co-led by Emery Berger, a professor of computer science at UMass Amherst, has developed a system they call Mesh that can automatically reduce such memory demands.

Image: 
UMass Amherst

AMHERST, Mass. - Applications like web browsers or smartphone apps often use a lot of memory. To address this, a research group co-led by Emery Berger, a professor of computer science at the University of Massachusetts Amherst, has developed a system they call Mesh that can automatically reduce such memory demands. Berger is presenting this work today at CppCon, the C++ conference in Aurora, Colorado.

Berger and colleagues in the College of Information and Computer Sciences (CICS) expect Mesh to have a substantial impact on the computing world, from mobile applications to desktops to data centers, because no one has previously been able to compact memory in applications written in or running on top of widely-used languages like C, C++, or Objective C, the language used for iOS apps.

As the authors explain, programs written in C-like languages can suffer from serious memory fragmentation, where memory is broken up, much like a bad Tetris board, Berger says, so there are many empty gaps in between. "This is how memory gets wasted," he points out. "Imagine a Tetris board where you could stop and reorganize it at any time - this would make the game a lot easier, because you could always squeeze out the empty space. But you can't do this in C, just as you can't do it in Tetris." 

Mesh effectively squeezes out these gaps by taking advantage of a hardware feature called "virtual memory" that is supported by almost all modern computers. "The trick is to find chunks of memory that can be interleaved, sort of like when interlocking gears mesh," Berger explains. When Mesh finds these chunks, it can reclaim the memory from one of the chunks by combining the two chunks into just one. "This meshing process works because we only change things in 'physical' memory. From the perspective of the program, which can only see 'virtual' memory, nothing has changed. This is powerful because we can do this for any application automatically."
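The gear-meshing idea can be shown with a toy model. This sketch is not the actual Mesh implementation, which operates on real virtual-memory pages; here a page is simply represented as the set of slot offsets holding live objects, and two pages can be meshed exactly when those sets are disjoint.

```python
# Toy model of meshing: a page is a set of occupied slot offsets.
# Two pages "mesh" when no offset is live in both, so the objects
# from both fit into a single physical page - without moving anything
# from the program's (virtual-memory) point of view.

def can_mesh(page_a, page_b):
    """Pages mesh only if their occupied slots never overlap."""
    return not (page_a & page_b)

def mesh(page_a, page_b):
    """Combine two mesh-able pages, reclaiming one physical page."""
    assert can_mesh(page_a, page_b), "pages have overlapping live slots"
    return page_a | page_b

# Two fragmented pages: each is mostly empty, yet neither can be
# freed on its own because both still hold live objects.
a = {0, 2, 5}
b = {1, 3, 6, 7}
merged = mesh(a, b)    # one physical page now backs both virtual pages
print(sorted(merged))  # [0, 1, 2, 3, 5, 6, 7]
```

The hard part in the real system is finding mesh-able pairs efficiently among many pages, the algorithmic challenge McGregor alludes to below, which this toy model deliberately omits.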

The team reports that the results to date have been extremely promising; for example, using Mesh automatically reduces the memory demands of the Firefox web browser by 16%. For Redis, a popular open source data structure server, Mesh reduces memory demands by almost 40%.

The CICS Mesh team includes professor Emery Berger, an expert in memory management who designed the algorithm that the Mac OS X memory manager is based on, professor Andrew McGregor, a specialist in algorithm design and analysis, and doctoral candidates Bobby Powers and David Tench. Powers is a fourth-year doctoral candidate who also is an infrastructure engineer at Stripe, and Tench is a fifth-year doctoral candidate specializing in randomized algorithms.

In a field where "catastrophic fragmentation" was long accepted as inevitable, their software is a major step forward, the authors point out. "This is something that everyone thought to be impossible," notes McGregor. "After Emery had his key insight, we were able to analyze it theoretically and design an efficient algorithm to implement the idea. Against almost 50 years of conventional wisdom, it's great that we now have a solution to this important problem that not only works in theory, but is practical."

Earlier this year, Berger presented technical details at the ACM SIGPLAN Programming Language Design and Implementation conference (PLDI '19) in Phoenix. In response to the paper, Microsoft programmer and distinguished engineer Miguel de Icaza tweeted that Mesh is a "truly inspiring work, with deep impact. A beautiful idea fully developed. What an amazing contribution to the industry."

Credit: 
University of Massachusetts Amherst