
Broken shuttle may interfere with learning in major brain disorders

Unable to carry signals based on sights and sounds to the genes that record memories, a broken shuttle protein may hinder learning in patients with intellectual disability, schizophrenia, and autism.

This is the implication of a study led by researchers at NYU School of Medicine and published online June 22 in Nature Communications.

Specifically, the research team found that mice genetically engineered to lack the gene for the gamma-CaMKII shuttle protein took twice as long as normal mice to form a memory needed to complete a simple task.

"Our study shows for the first time that gamma-CaMKII plays a critical role in learning and memory in live animals," says Richard Tsien, PhD, chair of the Department of Neuroscience and Physiology and director of the Neuroscience Institute at NYU Langone Health.

"Adding more weight to our results, we showed that making the same change in the shuttle's structure seen in a human child with severe intellectual disability also took away the ability of mice to learn," says Dr. Tsien, also the Druckenmiller Professor of Neuroscience. He says this result suggests that the shuttle works similarly in the two species.

The research team then restored the learning ability by re-inserting the human version of the shuttle protein into mice.

The current study revolves around the nerve cells that coordinate thought and memory. Each cell in a nerve pathway sends an electric pulse down its branches until it reaches a synapse, a gap between itself and the next cell in line. Signals that form memories start at synapses where sights and sounds trigger responses, and end when genes are turned on in the nuclei of nerve cells to make permanent, physical changes in their connections.

When sensory information triggers known mechanisms near synapses, calcium is released into nerve cells, building up until it triggers chain reactions fine-tuned by partner proteins such as calmodulin (CaM), say the study authors. When calcium and CaM link up and arrive in a nerve cell's nucleus, the compartment where genes operate, they set off reactions known to activate the protein CREB, which dials up the action of genes previously linked to memory formation.

Missing Link

Going into the study, a "missing link" in the field was an understanding of how synapses "talk to" nerve cell nuclei as memories form. In the current study, researchers determined for the first time that this communication occurs when gamma-CaMKII shuttles the calcium/calmodulin complexes that form just inside of nerve cells to their nuclei.

Comparing spatial memory in mice without gamma-CaMKII to normal mice, the study authors found that gamma-CaMKII "knockout" mice were much less able to locate a platform hidden beneath the surface of murky water in a maze. During this exercise, normal mice quickly identify the platform's location.

The team also found that, an hour after maze training, normal mice displayed a significant increase in expression of three genes--BDNF, c-Fos, and Arc--known from past studies to help form long-term, spatial memories based on experiences. In contrast, training-induced increases in the expression of these genes did not occur in mice engineered to lack gamma-CaMKII.

In addition to the mice lacking the entire gene encoding the gamma-CaMKII protein, a separate group of mice was engineered to carry a version of the protein with a small change found by a 2012 study in a boy with severe intellectual disability. In the boy's nerve cells, the protein building block at position 292 in the amino acid backbone of gamma-CaMKII, typically arginine, was occupied instead by a proline residue (R292P). The change rendered the protein a thousand times less able to trap the calcium-calmodulin complex, so it often arrived in nerve cell nuclei without its cargo.

Next steps for the team include determining how gamma-CaMKII fits into a larger "feedback machine" of nerve cell circuitry published by Dr. Tsien and colleagues in the journal Neuron in 2016.

"This learning machine, controlled by a key set of genes, senses nerve signaling levels and shapes sensory input into memories," says Tsien. Experiments are planned to reveal more details about how the machine "copes with small flaws, including those in the gamma-CaMKII shuttle, but fails when too many problems build up in one or more of its components."

Credit: 
NYU Langone Health / NYU Grossman School of Medicine

Normalisation of 'plus-size' risks hidden danger of obesity, study finds

New research warns that the normalisation of 'plus-size' body shapes may be leading to an increasing number of people underestimating their weight - undermining efforts to tackle England's ever-growing obesity problem.

While attempts to reduce stigmatisation of larger body sizes - for example with the launch of plus-size clothing ranges - help promote body positivity, the study highlights an unintentional negative consequence that may prevent recognition of the health risks of being overweight.

The study by Dr Raya Muttarak, from the University of East Anglia (UEA) and the International Institute for Applied Systems Analysis (IIASA), in Austria, examined the demographic and socioeconomic characteristics associated with underestimation of weight status to reveal social inequalities in patterns of weight misperception.

Analysis of data from 23,460 people who are overweight or obese revealed that weight misperception has increased in England. Men and individuals with lower levels of education and income are more likely to underestimate their weight status and consequently less likely to try to lose weight.

Members of minority ethnic groups are also more likely to underestimate their weight than the white population, however they are more likely to try to lose weight. Overall, those underestimating their weight are 85% less likely to try to lose weight compared with people who accurately identified their weight status.

The results, published today in the journal Obesity, show that the number of overweight individuals who are misperceiving their weight has increased over time, from 48.4% to 57.9% in men and 24.5% to 30.6% in women between 1997 and 2015. Similarly, among individuals classified as obese, the proportion of men misperceiving their weight in 2015 was almost double that of 1997 (12% vs 6.6%).

The study comes amid growing global concern about rising obesity rates and follows a 2017 report by the Organisation for Economic Co-operation and Development (OECD) that showed 63% of adults in the UK are overweight or obese.

Dr Muttarak, a senior lecturer in UEA's School of International Development, says her findings have important implications for public health policies.

"Seeing the huge potential of the fuller-sized fashion market, retailers may have contributed to the normalisation of being overweight and obese," said Dr Muttarak. "While this type of body positive movement helps reduce stigmatisation of larger-sized bodies, it can potentially undermine the recognition of being overweight and its health consequences. The increase in weight misperception in England is alarming and possibly a result of this normalisation.

"Likewise, the higher prevalence of being overweight and obesity among individuals with lower levels of education and income may contribute to visual normalisation, that is, more regular visual exposure to people with excess weight than their counterparts with higher socioeconomic status have.

"To achieve effective public health intervention programmes, it is therefore vital to prioritise inequalities in overweight- and obesity-related risks. Identifying those prone to misperceiving their weight can help in designing obesity-prevention strategies targeting the specific needs of different groups."

Dr Muttarak added: "The causes of socioeconomic inequalities in obesity are complex. Not only does access to health care services matter, but socioeconomic determinants related to living and working conditions and health literacy also substantially influence health and health behaviours.

"Given that the prices of healthier foods such as fresh fruit and vegetables are higher than those of processed and energy-dense foods in this country, as a sociologist, I feel these inequalities should be addressed. The continuing problem of people underestimating their weight reflects the limited success of health professionals' interventions against overweight and obesity."

The study used data from the annual Health Survey for England, which contains a question on weight perception.

Focusing on respondents with a BMI of 25 or over, about two-thirds were classified as being overweight and one-third as obese. In order to assess trends in self-perception of weight status, the analysis was based on pooled data from five years - 1997, 1998, 2002, 2014, 2015 - of the survey.

The proportion underestimating their weight status was higher among overweight individuals compared with those with obesity (40.8% vs 8.4%). Correspondingly, only about half of overweight individuals were trying to lose weight compared with more than two-thirds of people with obesity.
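The analysis described above, classifying respondents by BMI and then tabulating who underestimates their weight status, can be sketched in a few lines. This is an illustrative mock-up, not the study's actual code: the BMI cut-offs are the standard WHO thresholds (overweight at 25, obese at 30), and the sample records and the self-perception categories are invented for demonstration.

```python
# Sketch of the survey analysis: classify respondents by BMI, then
# compute the share in each category who underestimate their weight
# (i.e. do not self-report as "too heavy"). Sample data are invented.

def bmi_category(bmi):
    """Standard WHO adult cut-offs: >=25 overweight, >=30 obese."""
    if bmi >= 30:
        return "obese"
    if bmi >= 25:
        return "overweight"
    return "normal_or_under"

def underestimation_rates(records):
    """Per-category share of respondents with BMI >= 25 whose
    self-perceived status is anything other than 'too heavy'."""
    counts = {"overweight": [0, 0], "obese": [0, 0]}  # [underestimating, total]
    for bmi, perceived in records:
        cat = bmi_category(bmi)
        if cat == "normal_or_under":
            continue  # study focused on respondents with BMI of 25 or over
        counts[cat][1] += 1
        if perceived != "too heavy":
            counts[cat][0] += 1
    return {cat: under / total
            for cat, (under, total) in counts.items() if total}

# Invented mini-sample: (BMI, self-perceived weight status)
sample = [
    (26.1, "about right"), (27.4, "too heavy"), (28.0, "about right"),
    (31.5, "too heavy"), (33.2, "too heavy"), (30.4, "about right"),
]
rates = underestimation_rates(sample)
```

On this toy sample the overweight group underestimates more often than the obese group, the same qualitative pattern (40.8% vs 8.4%) the study reports for the real survey data.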

Credit: 
University of East Anglia

USGS estimates 8.5 billion barrels of oil in Texas' Eagle Ford Group

video: USGS researchers drill a research well located on the south side of U.S. 90, 7.1 miles east of Brackettville, Texas. This core was drilled by USGS during field work for an oil and gas assessment for the Eagle Ford of the Gulf Coast Basins. Cores like these provide information on the various rock layers, such as their make-up, their age, etc. The borehole, core, and logging suite are the third in a series of wells in the Austin -- Eagle Ford for use in our USGS Energy Program regional oil and gas assessments.

Image: 
Stan Paxton, USGS

The Eagle Ford Group of Texas contains estimated means of 8.5 billion barrels of oil, 66 trillion cubic feet of natural gas, and 1.9 billion barrels of natural gas liquids, according to a new assessment by the U.S. Geological Survey. This estimate consists of undiscovered, technically recoverable resources in continuous accumulations.

The Eagle Ford Group stretches from the Texas-Mexico border in the west, across portions of southern and eastern Texas, to the Texas-Louisiana border in the east. It is one of the most prolific continuous accumulations in the United States and is composed of mudstone with varying amounts of carbonate.

"Texas is so well-known for its oil and gas endowment that it's almost synonymous with petroleum," said USGS Director Dr. Jim Reilly. "In fact, the shale boom that has reinvigorated the Nation's energy industry began in Texas. That development is why it's so important that we regularly reassess potential resources around the country, because what's technically recoverable changes as new techniques are pioneered."

The Eagle Ford Group has long been known to contain oil and gas, but it was not until 2008 that production of the continuous resources really got underway in East Texas.

"This assessment is a bit different than previous ones, because it ranks in the top 5 of assessments we've done of continuous resources for both oil and gas," said USGS scientist Kate Whidden, lead author for the assessment. "Usually, formations produce primarily oil or gas, but the Eagle Ford is rich in both."

Continuous oil and gas is dispersed throughout a geologic formation rather than existing as discrete, localized occurrences, such as those in conventional accumulations. Because of that, continuous resources commonly require special technical drilling and recovery methods, such as hydraulic fracturing.

Undiscovered resources are those that are estimated to exist based on geologic knowledge and statistical analysis of known resources, while technically recoverable resources are those that can be produced using currently available technology and industry practices. Whether or not it is profitable to produce these resources has not been evaluated.

The USGS is the only provider of publicly available estimates of undiscovered technically recoverable oil and gas resources of onshore lands and offshore state waters. The USGS assessment of the Eagle Ford Group was undertaken as part of a nationwide project assessing domestic petroleum basins using standardized methodology and protocol.

Credit: 
U.S. Geological Survey

When low batteries are a good thing

image: In circulating white blood cells (left side) the "batteries," or mitochondria (in green), give a stronger signal than in intestinal white blood cells, which are in a 'low energy mode' (right side). Blue indicates the cell's nucleus.

Image: 
Špela Konjar, iMM.

Every day the human gut maintains a fine-tuned balance that ensures the retention of essential nutrients while preventing the entry of potentially harmful microbes. Contributing to this surveillance system is a specialised group of immune cells that, for reasons previously unknown, are held back even though they have many characteristics of activated cells. Now, a new study led by Marc Veldhoen, group leader at Instituto de Medicina Molecular João Lobo Antunes (iMM; Portugal), shows how these cells are kept under control. The work, published now in Science Immunology, reveals that the "batteries" of these cells have a different composition that reduces their capacity to produce energy, keeping them in a controlled activated mode. This knowledge could give rise to new diagnostics and treatments for conditions affecting the digestive tract, such as gut inflammation or infection.

The outer layer of our bodies, the skin and intestine, contains a special population of white blood cells, called intraepithelial lymphocytes. It is largely unknown how the activity of these cells is controlled, not fully activated nor at rest. Using imaging and biochemical experiments, the research group led by Marc Veldhoen has now shown this is, at least in part, due to differences in the cells' "batteries" - the mitochondria. These energy-producing structures inside our cells regulate the cell's power. "We hypothesised that these gut-resident white blood cells may use energy in a different way. It was surprising to see that the detection of mitochondria gave a very different picture than seen in other white blood cells, forming the basis of a new hypothesis that the mitochondria themselves are different in these cells", explains Marc Veldhoen.

Using high magnification electron microscopy, the researchers observed that the mitochondria were present in abundance but seemed to be different upon staining for light microscopy. Next, they studied the functionality of these batteries. "When we analysed these structures in detail, we found changes in the lipids that form a layer separating the mitochondria from the rest of the cell", says Špela Konjar, joint first author of the study, adding that "these changes make the 'batteries' work differently, as if they are in a 'low energy mode'".

When the lipid landscape was purposely altered, the researchers confirmed a change in the activation potential of the cells. "Our results showed that lipids in the mitochondria of these cells could alter their metabolic state and change their activity. When the mitochondrial lipids could not be arranged similar to those found in other white blood cells, the cells could not be properly activated when needed", explains Marc Veldhoen. The researcher further explains: "This knowledge allows us to investigate how we can inhibit these cells when they are too active and cause damage, such as in gut inflammations, or how we can activate them more in cases of gut infections. Furthermore, the detection of mitochondria could be a diagnostic marker for the activation state of intestinal white blood cells".

Credit: 
Instituto de Medicina Molecular

Low-cost plastic sensors could monitor a range of health conditions

image: A low-cost sensor made from semiconducting plastic could be used to diagnose or monitor a wide range of health conditions, such as surgical complications or neurodegenerative diseases.

Image: 
KAUST

An international team of researchers have developed a low-cost sensor made from semiconducting plastic that can be used to diagnose or monitor a wide range of health conditions, such as surgical complications or neurodegenerative diseases.

The sensor can measure the amount of critical metabolites, such as lactate or glucose, that are present in sweat, tears, saliva or blood, and, when incorporated into a diagnostic device, could allow health conditions to be monitored quickly, cheaply and accurately. The new device has a far simpler design than existing sensors, and opens up a wide range of new possibilities for health monitoring down to the cellular level. The results are reported in the journal Science Advances.

The device was developed by a team led by the University of Cambridge and King Abdullah University of Science and Technology (KAUST) in Saudi Arabia. Semiconducting plastics such as those used in the current work are being developed for use in solar cells and flexible electronics, but have not yet seen widespread use in biological applications.

"In our work, we've overcome many of the limitations of conventional electrochemical biosensors that incorporate enzymes as the sensing material," said lead author Dr Anna-Maria Pappa, a postdoctoral researcher in Cambridge's Department of Chemical Engineering and Biotechnology. "In conventional biosensors, the communication between the sensor's electrode and the sensing material is not very efficient, so it's been necessary to add molecular wires to facilitate and 'boost' the signal."

To build their sensor, Pappa and her colleagues used a newly-synthesised polymer developed at Imperial College that acts as a molecular wire, directly accepting the electrons produced during electrochemical reactions. When the material comes into contact with a liquid such as sweat, tears or blood, it absorbs ions and swells, becoming merged with the liquid. This leads to significantly higher sensitivity compared to traditional sensors made of metal electrodes.

Additionally, when the sensors are incorporated into more complex circuits, such as transistors, the signal can be amplified and respond to tiny fluctuations in metabolite concentration, despite the tiny size of the devices.

Initial tests of the sensors were used to measure levels of lactate, which is useful in fitness applications or to monitor patients following surgery. However, according to the researchers, the sensor can be easily modified to detect other metabolites, such as glucose or cholesterol by incorporating the appropriate enzyme, and the concentration range that the sensor can detect can be adjusted by changing the device's geometry.

"This is the first time that it's been possible to use an electron accepting polymer that can be tailored to improve communication with the enzymes, which allows for the direct detection of a metabolite: this hasn't been straightforward until now," said Pappa. "It opens up new directions in biosensing, where materials can be designed to interact with a specific metabolite, resulting in far more sensitive and selective sensors."

Since the sensor contains no metals such as gold or platinum, it can be manufactured at lower cost and easily incorporated into flexible and stretchable substrates, enabling its use in wearable or implantable sensing applications.

"An implantable device could allow us to monitor the metabolic activity of the brain in real time under stress conditions, such as during or immediately before a seizure and could be used to predict seizures or to assess treatment," said Pappa.

The researchers now plan to develop the sensor to monitor metabolic activity of human cells in real time outside the body. The Bioelectronic Systems and Technologies group where Pappa is based is focused on developing models that can closely mimic our organs, along with technologies that can accurately assess them in real-time. The developed sensor technology can be used with these models to test the potency or toxicity of drugs.

Credit: 
University of Cambridge

What causes the sound of a dripping tap -- and how do you stop it?

video: This is a water droplet hitting a liquid surface. The oscillation of the trapped air bubble, visible in this video, causes the water surface to oscillate and drives the airborne sound.

Image: 
University of Cambridge

Scientists have solved the riddle behind one of the most recognisable, and annoying, household sounds: the dripping tap. And crucially, they have also identified a simple solution to stop it, which most of us already have in our kitchens.

Using ultra-high-speed cameras and modern audio capture techniques, the researchers, from the University of Cambridge, found that the 'plink, plink' sound produced by a water droplet hitting a liquid surface is caused not by the droplet itself, but by the oscillation of a small bubble of air trapped beneath the water's surface. The bubble forces the water surface itself to vibrate, acting like a piston to drive the airborne sound.

In addition, the researchers found that changing the surface tension of the surface, for example by adding dish soap, can stop the sound. The results are published in the journal Scientific Reports.

Despite the fact that humans have been kept awake by the sound of dripping water from a leaky tap or roof for generations, the exact source of the sound has not been known until now.

"A lot of work has been done on the physical mechanics of a dripping tap, but not very much has been done on the sound," said Dr Anurag Agarwal of Cambridge's Department of Engineering, who led the research. "But thanks to modern video and audio technology, we can finally find out exactly where the sound is coming from, which may help us to stop it."

Agarwal, who leads the Acoustics Lab and is a Fellow of Emmanuel College, first decided to investigate this problem while visiting a friend who had a small leak in the roof of his house. Agarwal's research investigates acoustics and aerodynamics of aerospace, domestic appliances and biomedical applications. "While I was being kept awake by the sound of water falling into a bucket placed underneath the leak, I started thinking about this problem," he said. "The next day I discussed it with my friend and another visiting academic, and we were all surprised that no one had actually answered the question of what causes the sound."

Working with Dr Peter Jordan from the University of Poitiers, who spent a term in Cambridge through a Fellowship from Emmanuel College, and final-year undergraduate Sam Phillips, Agarwal set up an experiment to investigate the problem. Their setup used an ultra-high-speed camera, a microphone and a hydrophone to record droplets falling into a tank of water.

Water droplets have been a source of scientific curiosity for more than a century: the earliest photographs of drop impacts were published in 1908, and scientists have been trying to figure out the source of the sound ever since.

The fluid mechanics of a water droplet hitting a liquid surface are well-known: when the droplet hits the surface, it causes the formation of a cavity, which quickly recoils due to the surface tension of the liquid, resulting in a rising column of liquid. Since the cavity recoils so fast after the droplet's impact, it causes a small air bubble to get trapped underwater.

Previous studies have posited that the 'plink' sound is caused by the impact itself, the resonance of the cavity, or the underwater sound field propagating through the water surface, but have not been able to confirm this experimentally.

In their experiment, the Cambridge researchers found that somewhat counter-intuitively, the initial splash, the formation of the cavity, and the jet of liquid are all effectively silent. The source of the sound is the trapped air bubble.

"Using high-speed cameras and high-sensitivity microphones, we were able to directly observe the oscillation of the air bubble for the first time, showing that the air bubble is the key driver for both the underwater sound, and the distinctive airborne 'plink' sound," said Phillips, who is now a PhD student in the Department of Engineering. "However, the airborne sound is not simply the underwater sound field spreading to the surface, as had been previously thought."

In order for the 'plink' to be significant, the trapped air bubble needs to be close to the bottom of the cavity caused by the drop impact. The bubble then drives oscillations of the water surface at the bottom of the cavity, acting like a piston driving sound waves into the air. This is a more efficient mechanism by which the underwater bubble drives the airborne sound field than had previously been suggested.

According to the researchers, while the study was purely curiosity-driven, the results could be used to develop more efficient ways to measure rainfall or to develop a convincing synthesised sound for water droplets in gaming or movies, which has not yet been achieved.

Credit: 
University of Cambridge

Tiny jumping roundworm undergoes unusual sexual development

image: A newly hatched Steinernema carpocapsae juvenile (shown here, scale bar 50 μm) is only about 0.25 mm (less than 1/100 of an inch) long with a gonad only 0.013 mm in length. During development, the worm will increase over 10 times in length (in comparison, the average human only increases about 4 fold in height from birth to adulthood) and its gonad increases almost 500 times in length.

Image: 
Nathan Schroeder, University of Illinois.

URBANA, Ill. - Nematodes may be among the simplest animals, but scientists can't get enough of the microscopic roundworms. They have mapped the entire genome of C. elegans, the "lab rat" of nematodes, and have characterized nearly every aspect of its biology, with a particular focus on neurons. For years, it was assumed other nematodes' neurons were similar to those of C. elegans, until researchers at the University of Illinois demonstrated the vast diversity in neuronal anatomy present across species.

Now Nathan Schroeder, assistant professor in the Department of Crop Sciences at U of I and leader of the previous study, has shown that gonad development also varies in other nematodes relative to C. elegans. Specifically, he and graduate student Hung Xuan Bui focused on Steinernema carpocapsae, a nematode used in insect biocontrol applications in lawns and gardens.

The gonads in all nematodes develop within a structure called the gonad arm, a tube through which multiple reproductive organs migrate into place throughout the animal's postembryonic development. This happens in a highly predictable manner in C. elegans, with very low variability among individuals. Not so with Steinernema.

Schroeder says finding and understanding examples of variability within and among species can help scientists understand how diversity arises, an open question with relevance to evolution and genetic processes.

But it also has practical applications, especially in this species.

"One of the issues in terms of commercialization of Steinernema biocontrol products is being able to produce a lot of them," he says. "Can we somehow increase the overall reproductive output of these animals? Understanding more about the gonad development, where babies are actually being made, might move us in that direction."

Aside from showing that Steinernema development differs from C. elegans, the study also represents an advancement in terms of studying organisms whose development occurs almost entirely inside another organism.

These tiny roundworms, less than a millimeter long, stand upright on their tails and jump up to 10 times their body length with the goal of landing on and infecting an insect. Once it finds a bug, Steinernema expels symbiotic bacteria from its gut, which is what kills the insect.

That's when the nematode starts feeding on the insect and the bacteria that, by this point, has spread throughout the insect's body. Being exposed to this external bacterial stew is what triggers the nematode to begin its postembryonic sexual development and then to reproduce with other nematodes nestled inside the same insect. As one can imagine, it could be rather difficult to replicate that environment in the lab.

"Bui was able to trick them. He put them in a high density of this bacteria, and essentially tricked them into coming out of this juvenile stage to undergo normal reproductive development without being inside the insect," Schroeder says.

The technique should allow further study of the anatomy and behavior of this and other so-called entomopathogenic, or insect-killing, nematodes.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Using fragment-based approaches to discover new antibiotics

image: Using fragment-based approaches to discover new antibiotics.

Image: 
Bas Lamoree and Roderick E. Hubbard

In the July 2018 issue of SLAS Discovery, a review article summarizes new methods of fragment-based lead discovery (FBLD) to identify new compounds as potential antibiotics.

Authors Bas Lamoree and Roderick E. Hubbard of the University of York (UK) explain how FBLD works and illustrate its advantages over conventional high-throughput screening (HTS): how FBLD increases the chances of finding hit compounds, how its methods can deliver hits without the massive investment HTS requires, and how, by starting small, FBLD gives medicinal chemists more opportunities to build more drug-like compounds. These principles are illustrated in the review and supported with recent examples of discovery projects against a range of potential antibiotic targets.

The rise in antibiotic resistance is now recognized as a real threat to human health, yet no new classes of antibiotics have reached the clinic in decades. FBLD begins by identifying low-molecular-weight compounds (fragments) that bind to protein targets. Information on how the fragments bind to their protein targets is then used to grow the compounds into potent drug candidates. Because the fragments are small, they are more likely to fit into a binding site, and each fragment represents a huge number of potential compounds.

Credit: 
SLAS (Society for Laboratory Automation and Screening)

Not junk: 'Jumping gene' is critical for early embryo

image: These are single two-cell mouse embryos with nuclear LINE1 RNA labeled magenta.

Image: 
Ramalho-Santos Lab

A so-called "jumping gene" that researchers long considered either genetic junk or a pernicious parasite is actually a critical regulator of the first stages of embryonic development, according to a new study in mice led by UC San Francisco scientists and published June 21, 2018 in Cell.

Only about 1 percent of the human genome encodes proteins, and researchers have long debated what the other 99 percent is good for. Many of these non-protein coding regions are known to contain important regulatory elements that orchestrate gene activity, but others are thought to be evolutionary garbage that is just too much trouble for the genome to clean up.

For example, fully half of our DNA is made up of "transposable elements," or "transposons," virus-like genetic material with the special ability to duplicate and reinsert itself at different locations in the genome, which has led researchers to dub them genetic parasites. Over the course of evolution, some transposons have left hundreds or thousands of copies of themselves scattered across the genome. While most of these stowaways are thought to be inert and inactive, others create havoc by altering or disrupting cells' normal genetic programming and have been associated with diseases such as certain forms of cancer.

Now UCSF scientists have revealed that, far from being a freeloader or parasite, the most common transposon, called LINE1, which accounts for 20 percent or more of the human genome, is actually necessary for embryos to develop past the two-cell stage.

'Playing with Fire'

Study senior author Miguel Ramalho-Santos, PhD, an associate professor of obstetrics/gynecology and reproductive sciences and a member of the Eli and Edythe Broad Center for Regeneration Medicine and Stem Cell Research at UCSF, has been interested in transposons since he first set up his lab as an independent UCSF Fellow in 2003.

He and others had observed that embryonic stem cells and early embryos expressed high levels of LINE1, which seemed paradoxical for a gene that was considered to be a dangerous, disease-causing parasite. "Given the standard view of transposons, these early embryos were really playing with fire," he remembers thinking. "It just didn't make any sense, and I wondered if something else was going on."

The project took off when Michelle Percharde, PhD, joined the lab as a post-doctoral researcher in 2013 and was quickly caught up in Ramalho-Santos's enthusiasm for the LINE1 paradox. "When I saw all the LINE1 RNA that's present in the nucleus of developing cells, I agreed there must be some role it's playing," Percharde said. "Why let your cells make so much of this RNA if it's either dangerous or doing nothing?"

To determine whether the high levels of LINE1 RNA expression in mouse embryos were actually important to the animals' development, Percharde performed experiments to eliminate LINE1 RNA from mouse embryonic stem cells. To her surprise, she found that the pattern of gene expression in these cells changed, reverting to a pattern seen in two-cell embryos after the fertilized egg's first division. The team then tried eliminating LINE1 from fertilized eggs and found that the embryos completely lost the ability to progress past the two-cell stage.

"When we saw that cells were changing identity when we removed LINE1 RNA, that was our real 'Aha!' moment that told us we were on to something," Percharde said.

Further experiments revealed that although the LINE1 gene is expressed in the early embryo and stem cells, its role is not to insert itself elsewhere in the genome. Instead, its RNA is trapped within the cell nucleus, where it forms a complex with gene regulatory proteins Nucleolin and Kap1. This complex is necessary to turn off the dominant genetic program that orchestrates embryos' two-cell state -- controlled by a gene called Dux -- and to turn on genes that are needed for the embryo to move on with further cell divisions and development.

Duplication May Add Robustness

The research took five years to come to fruition, and required Percharde to invent several new techniques for studying transposons along the way, but has resulted in a finding that the researchers hope will convince other scientists to finally pay attention to functional roles of jumping genes.

"These genes have been with us for billions of years and have been the majority of our genomes for hundreds of millions of years," Ramalho-Santos said. "I think it's fair to ask if the 1.5 percent of protein-coding genes are the free-riders, and not the other way around."

Ramalho-Santos speculates that transposons like LINE1 may make the delicate early stages of development more robust precisely because they are so ubiquitous. Because LINE1 is repeated thousands of times in the genome, it's virtually impossible for a mutation to disrupt its function: if one copy is bad, there are thousands more to take its place.

"We now think these early embryos are playing with fire but in a very calculated way," Ramalho-Santos said. "This could be a very robust mechanism for regulating development."

"Scientists have done so much work on protein-coding genes, and they're less than 2 percent of the genome, whereas transposons make up nearly 50 percent," Percharde added. "I'm personally excited to continue exploring novel functions of these elements in development and disease."

Credit: 
University of California - San Francisco

Watch: Insects also migrate using the Earth's magnetic field

image: The Australian Bogong moth Agrotis infusa (Lepidoptera: Noctuidae).

Image: 
Ajay Narendra

A major international study led by researchers from Lund University in Sweden has proven for the first time that certain nocturnally migrating insects can explore and navigate using the Earth's magnetic field. Until now, the ability to steer flight using an internal magnetic compass was only known in nocturnally migrating birds.

WATCH: The incredible journey of the bogong moth https://www.youtube.com/watch?v=Z7w44Ka0xXI

"Our findings are the first reliable proof that nocturnally active insects can use the Earth's magnetic field to guide their flight when migrating over one thousand kilometres. We show that insects probably use the Earth's magnetic field in a similar way to birds", says Eric Warrant, professor at Lund University.

Eric Warrant and David Dreyer at Lund University, together with colleagues from Australia, Canada, Germany, and the USA, studied the moth species Agrotis infusa, also known as the Bogong moth, in Australia.

The findings indicate that the insects use both visual landmarks in their flight path and the Earth's magnetic field, probably making their navigation more reliable.

The researchers believe that moths in northern Europe may use the Earth's magnetic field in an equivalent manner when flying over the Alps to the Mediterranean.

The moths migrate over a great distance every year, from a large area in southeastern Australia to a specific area of small, cool caves high up in the mountains more than one thousand kilometres away. After a few months in a dormant state, they make the same journey back when summer is over. Besides the Bogong moth, only the North American Monarch butterfly migrates with equivalent precision.

The researchers focused on investigating how the Bogong moth knows in which direction to fly. They found answers by capturing the moths in flight and placing them in a flight simulator where the insects were free to fly in any direction they chose. The flight simulator - invented by team members Barrie Frost and Henrik Mouritsen for studying navigation in Monarch butterflies - was in turn placed in a system of magnetic coils which allowed the researchers to turn the magnetic field in any direction. In addition, they were able to show visual landmarks to the moths.

"By turning the magnetic field and the landmarks either together or in conflict with each other, we were able to investigate how the Bogong moths use magnetic and visual information to direct their flight", says David Dreyer, adding:

"When the magnetic field and the landmarks were turned together, the moths changed their flight path in an equivalent manner. However, if the magnetic field and the landmarks were turned in conflict with each other, the moths lost their sense of direction and became confused."

Eric Warrant has many years of experience of researching animal night vision and how animals navigate in the dark. Nevertheless, the findings surprised him.

"I believed the studies would show that Bogong moths only use visual cues such as stars, the moon and landmarks to navigate. But that is not the case. They perceive the Earth's magnetic field in exactly the same way as birds do - and probably for the same reason."

The next step will be to find out how the moths, despite never having been to the caves before, know that they have arrived at their destination. The researchers also want to locate and characterise the insects' elusive magnetic sensor.

Credit: 
Lund University

Cells stop dividing when this gene kicks into high gear, study finds

image: These are senescent cells under a microscope. The cells -- human lung fibroblasts -- became senescent after they or nearby cells were genetically engineered to increase activity of the CD36 gene. Areas stained in blue are regions where an enzyme associated with senescence is active.

Image: 
Darleny Lizardo/Alan Siegel/University at Buffalo North Campus Confocal Imaging Facility

BUFFALO, N.Y. -- Scientists seeking to unlock the secrets of cellular aging have identified a gene that triggers senescence, a phenomenon in which cells stop dividing.

Senescence is a natural occurrence in the life of a cell, and researchers have sought to learn about it for a couple of reasons. First, it's connected to old age: Senescent cells are thought to contribute to heart disease, arthritis, cataracts and a bevy of other age-linked conditions. Second, a lack of senescence is a hallmark of cancer cells, which bypass this process to replicate in an uncontrolled manner.

The new study -- published online on June 20 in Molecular Omics, a journal of the Royal Society of Chemistry -- illuminates genes involved in cellular senescence, and highlights one in particular that seems tightly associated with this crucial biological process.

In experiments, University at Buffalo researchers discovered that a gene called CD36 is unusually active in older, senescent cells.

What's more, scientists were able to cause young, healthy cells to stop dividing by heightening CD36 activity within those cells. The effect spread to nearby cells, with almost all of the cells in a petri dish showing signs of senescence when only a small fraction of those cells -- about 10 to 15 percent -- were overexpressing CD36. New cells placed in the growth medium (a soupy substance) that previously housed the senescent cells also stopped replicating.

"What we found was very surprising," says Ekin Atilla-Gokcumen, PhD, an assistant professor of chemistry in the UB College of Arts and Sciences. "Senescence is a very complex process, and we didn't expect that altering expression of one gene could spark it, or cause the same effect in surrounding cells."

The results point to CD36 as an exciting topic of future research. The gene's exact role in senescence remains a mystery: Scientists know that the gene guides the body in building a protein of the same name that sits on the surface of cells, but this protein's functions are still being studied. Proposed activities include helping cells import lipids, and influencing how these lipids are used within cells.

"Our research identifies CD36 as a candidate for further study. Senescence is a fundamental aspect of being a cell, but there is still a lot that we don't know about it," says Omer Gokcumen, PhD, an assistant professor of biological sciences in the UB College of Arts and Sciences. "Senescence seems to have implications for old age and cancer, so understanding it is very important."

Atilla-Gokcumen and Gokcumen led the study. The first authors were UB chemistry PhD student Darleny Lizardo and UB biological sciences postdoctoral researcher Marie Saitou, who won an award for a talk on this work during the UB Postdoctoral Research Symposium in June.

Zeroing in on an important gene

The scientists did not set out to investigate CD36.

Instead, they began with a pair of broad goals: They wanted to catalogue all genes related to senescence, and they wanted to gain a better understanding, in particular, of lipid-related genes involved in this process. (Past studies have shown that lipids play a role in cellular aging.)

CD36 emerged as a gene of interest in experiments designed to address these questions.

First, through a technique called transcriptomics, scientists identified CD36 as one of the two lipid-related genes that ramp up their activity the most in senescent cells. (This part of the study was done on two kinds of cells -- human skin and lung fibroblasts -- and the findings held true for both cell types.)

CD36 popped up again in a second test, this one a genetic analysis of all lipid-related genes that kicked into high gear during senescence. Within this group of genes, CD36 stood out as one of the most variable in humans, meaning that the gene's DNA sequence is highly likely to vary from person to person. Such diversity may be an indicator of functional variation, in which different environmental and evolutionary pressures give rise to a range of useful mutations in a highly expressed gene that serves an important purpose, Gokcumen says.

"We did not set out to look for CD36," Gokcumen says. "We took a broad approach to our study, using transcriptomics and an evolutionary framework to identify genes and proteins that are fundamental to the senescence process. In the end, CD36 stood out as an outlier in both cases. That's kind of beautiful -- a compelling way to do biological research."

Credit: 
University at Buffalo

Army study quantifies changes in stress after meditation

image: A Soldier practices meditation to relieve stress.

Image: 
US Army photo by Dr. Valerie Rice

ADELPHI, MD. - For a thousand years, people have reported feeling better after meditating, but until now there has never been a systematic study that quantified stress and how much it changes as a direct result of meditation.

U.S. Army Research Laboratory researchers spent a year collaborating with a team of scientists from the University of North Texas to develop a new data processing technique that uses heart rate variability as a sensor to monitor the state of the brain. Their findings are reported in a paper published in the June edition of Frontiers in Physiology.

Healthy heartbeats have irregularities built into them with a slight random variation occurring in the time interval between successive beats. The sinus node, or the heart's natural pacemaker, receives signals from the autonomic or involuntary portion of the nervous system, which has two major branches: the parasympathetic, whose stimulation decreases the firing rate of the sinus node, and the sympathetic, whose stimulation increases the firing rate. These two branches produce a continual tug-of-war that generates fluctuations in the heart rate of healthy individuals.

Heart rate variability provides a window through which we can observe the heart's ability to respond to external disturbances, such as stress, said Dr. Bruce West, the Army's senior research scientist for mathematics.

He said it turns out that the HRV time series is very sensitive to changes in the physiological state of the brain, and that the new data processing system, called the dynamic subordination technique, can quantify the changes in HRV and relate them directly to brain activity, such as that produced by meditation. Thus, the DST has quantified the level of stress reduction produced by meditation and offers the potential to quantify such things as the inability to concentrate and sustain focus, impatience, impulsiveness and other dysfunctional properties that severely limit a soldier's ability to do his job.

Stress modulates the autonomic nervous system's signals, which in turn disrupts normal HRV; the stress level can therefore be detected by processing the HRV time series.

Through a new method of processing HRV time series data, the researchers developed a way to measure the change in the level of stress provided by meditation. This measure assigns a number to the level of variability of heartbeat interval time series before and during meditation. This number indicates precisely how much stress is alleviated by control of the heart-brain coupling through meditation.
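The idea of assigning a single number to the variability of a heartbeat interval series can be illustrated with a standard time-domain HRV measure. This is only a sketch of the general concept, not the paper's dynamic subordination technique, and the interval data below are hypothetical; RMSSD (root mean square of successive differences) is one conventional way to score beat-to-beat variability, with higher values generally reflecting greater parasympathetic influence.

```python
# Illustrative only: scores the variability of R-R interval series
# (milliseconds between successive heartbeats) before and during meditation
# using RMSSD, a standard time-domain HRV measure -- not the DST of the paper.
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between beat intervals."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical interval series: under stress the beat is more metronome-like
# (low variability); during meditation beat-to-beat variability increases.
baseline = [800, 805, 798, 806, 801, 804, 799, 803]
meditating = [820, 870, 805, 860, 800, 865, 815, 855]

change = rmssd(meditating) - rmssd(baseline)
print(f"baseline RMSSD:   {rmssd(baseline):.1f} ms")
print(f"meditation RMSSD: {rmssd(meditating):.1f} ms")
print(f"change:           {change:+.1f} ms")
```

The real analysis is far more sophisticated, but the comparison of one number computed before and during meditation captures the basic measurement strategy the paragraph describes.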

In the article, Rohisha Tuladhar, Gyanendra Bohara, and Paolo Grigolini, all from the University of North Texas, and Bruce J. West, Army Research Office, propose and successfully test a new model for the coupling between the heart and brain, along with a measure of the influence of meditation on this network. Traditional models of biofeedback focus on coherent behavior, assuming a kind of resonance; the new approach, however, includes both periodicity and complexity.

The research team compared two schools of meditation and determined that Yoga meditation is more effective than Chi meditation at reducing stress, and their method can show by how much. They also found that long-term practice of meditation makes the meditation-induced physiologic changes permanent. Moreover, meditators show stronger executive control, that is, the ability to carry out goal-oriented behavior using complex mental processes and cognitive abilities.

Many military historians believe battles, even wars, have been won or lost in the warrior's mind, long before any physical conflict is initiated. Learning how to circumvent the debilitating psychological influence of stress requires that we have in hand a way to quantify that influence, in order to gauge the effectiveness of any given procedure to counteract its effects, explained West, who is a Fellow of the American Physical Society, the American Association for the Advancement of Science and ARL.

Historically, one purpose of meditation has been to reduce stress; the Army's long-term goal, however, is to use it to mitigate the effects of post-traumatic stress disorder, or PTSD. West said the potential for this to succeed has been dramatically increased with the new ability to quantify the degree of effectiveness in stress reduction using different meditation techniques.

From a physiological perspective, meditation constitutes a coupling of the functionalities of the heart and brain. We are only now beginning to understand how to take advantage of the coupling of the two to measure stress reduction by applying the methods of science and data analysis to HRV time series.

Our research focus is on changes in physiologic processes, such as stress level. It is the most direct measure of the effectiveness of meditation in reducing stress to date and complements an existing ARL program on determining the efficacy of mindfulness meditation stress reduction, which quantifies the influence on different task performance measures, such as changes in PTSD symptoms, West said.

This early research could guide the design and testing of new interventions for improving warrior readiness and resilience, as well as reducing symptoms of PTSD.

Credit: 
U.S. Army Research Laboratory

New technology helps to improve treatment for NHS patients with depression

A new web-based "feedback" technology that allows therapists to accurately monitor how patients with depression are coping can reduce the probability of deterioration during psychological treatment by 74%, a new study has found.

The study, which is the largest controlled trial of its kind, involved data from more than 2,000 mental health patients treated across multiple NHS Trusts in England.

Psychological therapy is offered to many people with depression and anxiety who seek treatment in the NHS, but while roughly half of these patients respond well to the treatment, up to 10% actually get worse.

Known as "Outcome Feedback", the new technology was developed by an international team of researchers from UK, German and Dutch universities, in partnership with PCMIS, a patient case management information system used widely by mental health services.

"Outcome Feedback" uses patient feedback to rapidly identify patients at risk of deterioration by tracking their symptoms and monitoring their response to treatment.

The team behind the technology have now received a grant from the Wellcome Trust to implement the software across the NHS with the aim of improving the quality of psychological care. The grant will also enable the researchers to develop an e-learning programme to train NHS therapists to use the technology effectively.

Lead author of the study, Dr Jaime Delgadillo, who conducted the research while at the Department of Health Sciences at the University of York, and is now based at the University of Sheffield, said: "There are many complex reasons why some patients get worse during treatment, including difficult life circumstances and sometimes unresolved difficulties in their relationship with their therapist.

"Patients who don't respond well to therapy usually drop out of treatment after only a few sessions. The outcome feedback technology we developed accurately identifies problems early on and allows therapists to be more in tune with their patients' difficulties and obstacles to improvement."

The technology uses data from weekly patient questionnaires which measure the frequency and intensity of typical depression and anxiety symptoms, such as lethargy, low mood, a disrupted sleep cycle, loss of appetite, restlessness, constant worry and difficulty relaxing.

The therapist enters this information into the patient case management system, which plots a graph showing changes in the patient's level of symptoms from week to week.

The system searches for other patients with similar symptoms to assess if the current patient is responding to treatment in a typical way, flagging up "atypical" cases that are not on track. Therapists can then intervene and adapt their plans for treatment, while continuing to track the patient's progress using the feedback technology.
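The flagging step described above can be sketched as a simple comparison of a patient's weekly scores against the range observed, week by week, in previously treated patients. All names, data, and thresholds here are hypothetical illustrations; the actual PCMIS implementation is not described in the article.

```python
# Hypothetical sketch of "not on track" flagging: build per-week expected
# bands from past patients' weekly symptom scores, then flag weeks where a
# new patient scores worse than the typical range (higher score = worse).
def expected_bands(historical_courses):
    """Per-week (low, high) bands covering the middle ~80% of past patients."""
    bands = []
    for week_scores in zip(*historical_courses):
        ordered = sorted(week_scores)
        trim = len(ordered) // 10  # drop the extreme ~10% at each end
        bands.append((ordered[trim], ordered[-1 - trim]))
    return bands

def flag_off_track(patient_scores, bands):
    """Return the week numbers where the patient scores above the band."""
    return [week for week, (score, (lo, hi))
            in enumerate(zip(patient_scores, bands), start=1)
            if score > hi]

# Ten made-up past patients, weeks 1-4, each improving by 2 points per week.
history = [[20 - w * 2 + offset for w in range(4)] for offset in range(10)]
bands = expected_bands(history)
print(flag_off_track([22, 23, 24, 25], bands))  # a deteriorating patient
print(flag_off_track([20, 17, 15, 14], bands))  # a typically improving one
```

A real system would match on starting severity and use properly estimated percentile trajectories, but the essence is the same: a patient drifting outside the typical trajectory is surfaced to the therapist early.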

Credit: 
University of York

NIH-funded study finds new evidence that viruses may play a role in Alzheimer’s disease

Analysis of large data sets from post-mortem brain samples of people with and without Alzheimer’s disease has revealed new evidence that viral species, particularly herpesviruses, may have a role in Alzheimer’s disease biology. Researchers funded by the National Institute on Aging (NIA), part of the National Institutes of Health, made the discovery by harnessing data from brain banks and cohort studies participating in the Accelerating Medicines Partnership - Alzheimer's Disease (AMP-AD) consortium.

Reporting in the June 21 issue of the journal Neuron, the authors emphasize that their findings do not prove that the viruses cause the onset or progression of Alzheimer’s. Rather, the findings show viral DNA sequences and activation of biological networks—the interrelated systems of DNA, RNA, proteins and metabolites—may interact with molecular, genetic and clinical aspects of Alzheimer’s.

“The hypothesis that viruses play a part in brain disease is not new, but this is the first study to provide strong evidence based on unbiased approaches and large data sets that lends support to this line of inquiry,” said NIA Director Richard J. Hodes, M.D. “This research reinforces the complexity of Alzheimer’s disease, creates opportunities to explore Alzheimer’s more thoroughly, and highlights the importance of sharing data freely and widely with the research community.”

Alzheimer’s disease is an irreversible, progressive brain disorder that slowly destroys memory and thinking skills and, eventually, the ability to carry out simple tasks. More evidence is accumulating to indicate that this loss of cognitive functioning is a mix of many different disease processes in the brain, rather than just one, such as buildup of amyloid or tau proteins. Identifying links to viruses may help researchers learn more about the complicated biological interactions involved in Alzheimer’s, and potentially lead to new treatment strategies.

The research group, which included experts from Icahn School of Medicine at Mount Sinai, New York City, and Arizona State University, Phoenix, originally set out to find whether drugs used to treat other diseases can be repurposed for treating Alzheimer’s. They designed their study to map and compare biological networks underlying Alzheimer’s disease. What they found is that Alzheimer’s biology is likely impacted by a complex constellation of viral and host genetic factors, adding that they identified specific testable pathways and biological networks.

“The robust findings by the Mount Sinai team would not have been possible without the open science data resources created by the AMP-AD program–particularly the availability of raw genomic data,” said NIA Program Officer Suzana Petanceska, Ph.D., who leads the AMP-AD Target Discovery and Preclinical Validation Project. “This is a great example of the power of open science to accelerate discovery and replication research.”

The researchers used multiple layers of genomic and proteomic data from several NIA-supported brain banks and cohort studies. They began their direct investigation of viral sequences using data from the Mount Sinai Brain Bank and were able to verify their initial observations using datasets from the Religious Orders Study, the Memory and Aging Project and the Mayo Clinic Brain Bank. They were then able to incorporate additional data from the Emory Alzheimer's Disease Research Center to understand viral impacts on protein abundance. Through the application of sophisticated computational modeling the researchers made several key findings, including:

Human herpesvirus 6A and 7 were more abundant in Alzheimer’s disease samples than non-Alzheimer’s.

There are multiple points of overlap between virus-host interactions and genes associated with Alzheimer’s risk.

Multiple viruses impact the biology of Alzheimer’s disease across domains such as DNA, RNA and proteins.

Important roles for microbes and viruses in Alzheimer’s disease have been suggested and studied for decades, the authors noted. Since the 1980s, hundreds of reports have associated Alzheimer’s with bacteria and viruses. These studies combined suggest a viral contribution but have not explained how the connection works.

While the current findings are more specific, they do not provide evidence to change how risk and susceptibility are assessed, nor the diagnosis and treatment of Alzheimer’s, the authors said. Rather, the research gives scientists reason to revisit the old pathogen hypothesis and will be the basis for further work that will test whether herpes virus activity is one of the causes of Alzheimer’s.

More on this research is available in announcements from the Icahn School of Medicine at Mount Sinai, Arizona State University and Cell Press, the publisher of Neuron.

Credit: 
NIH/National Institute on Aging

Enhanced detection of nuclear events, thanks to deep learning

video: Scientists at Pacific Northwest National Laboratory are exploring deep learning to interpret highly technical data related to national security, the environment, the cosmos, and breast cancer. In one project a deep neural network running on an ordinary desktop computer is interpreting data about events known as radioactive decays as well as -- and sometimes better than -- today's best automated methods or even human experts.

Image: 
Video courtesy of PNNL

A deep neural network running on an ordinary desktop computer is interpreting highly technical data related to national security as well as - and sometimes better than - today's best automated methods or even human experts.

The progress tackling some of the most complex problems of the environment, the cosmos and national security comes from scientists at the Department of Energy's Pacific Northwest National Laboratory who presented their work at the 11th MARC conference - Methods and Applications of Radioanalytical Chemistry - in April in Hawaii. Their work employs deep learning, in which machines are enabled to learn and make decisions without being explicitly programmed for all conditions.

The research probes incredibly complex data sets from the laboratory's shallow underground lab, where scientists detect the faintest of signals from a planet abuzz with activity. In the laboratory, buried 81 feet beneath concrete, rock and earth, thick shielding dampens signals from cosmic rays, electronics and other sources. That allows PNNL scientists to isolate and decipher signals of interest collected from anywhere on the planet.

Those signals signify events called radioactive decays, when a particle such as an electron is emitted from an atom. The process is happening constantly, through both natural and human activity. Scientists can monitor changes in levels of argon-37, which could indicate prior nuclear test activity, and argon-39, whose levels help scientists determine the age of groundwater and learn more about the planet.

The lab has accumulated data on millions of radioactive decay events since it opened in 2010. But it's a noisy world out there, especially for scientists listening for very rare signals that are easily confused with signals of a different and frequently routine origin - for instance, a person flipping on a light switch or receiving a call on a cell phone.

PNNL scientist Emily Mace, who presented at MARC, is an expert in interpreting the features of such signals - when an event might indicate underground nuclear testing, for example, or a rapidly depleting aquifer. Much like physicians peruse X-rays for hints of disease, Mace and her colleagues pore over radioactive decay event data regularly to interpret the signals - their energy, timing, peaks, slopes, duration, and other features.

"Some pulse shapes are difficult to interpret," said Mace. "It can be challenging to differentiate between good and bad data."

Recently Mace and colleagues turned for input to their colleagues who are experts in deep learning, an exciting and active subfield of artificial intelligence. Jesse Ward is one of dozens of deep learning experts at the lab who are exploring several applications through PNNL's Deep Learning for Scientific Discovery Agile Investment. Mace sent Ward information on nearly 2 million energy pulses detected in the Shallow Underground Laboratory since 2010.

Ward used a clean sample set of 32,000 pulses to train the network, inputting many features of each pulse and showing the network how the data was interpreted. Then he fed the network thousands more signals as it taught itself to differentiate between "good" signals that showed something of interest and "bad" signals that amounted to unwanted noise. Finally, he tested the network, feeding it increasingly complex sets of data that are difficult even for experts to interpret.
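The train-then-classify workflow Ward describes can be shown in miniature. The sketch below is not PNNL's deep network: it trains a single logistic unit on two made-up pulse features (rise time and peak energy, arbitrary units) with synthetic "good" and "bad" clusters, purely to illustrate learning to separate signal from noise from labeled examples.

```python
# Toy good-vs-bad pulse classifier: one logistic unit trained by stochastic
# gradient descent on synthetic two-feature pulse data. Illustrative only;
# the PNNL work used a deep neural network on real pulse-shape features.
import math
import random

def predict(weights, bias, features):
    """Probability that a pulse is 'good' (logistic output)."""
    z = sum(w * x for w, x in zip(weights, features)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, epochs=2000, lr=0.5):
    """Fit logistic weights by per-sample gradient descent."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            err = predict(weights, bias, x) - y
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
            bias -= lr * err
    return weights, bias

random.seed(0)
# Synthetic training data: 'good' pulses cluster at short rise time and high
# peak energy; 'bad' (noise) pulses at long rise time and low peak energy.
good = [(random.gauss(0.2, 0.05), random.gauss(0.9, 0.05)) for _ in range(50)]
bad = [(random.gauss(0.8, 0.05), random.gauss(0.3, 0.05)) for _ in range(50)]
w, b = train(good + bad, [1] * 50 + [0] * 50)

print(predict(w, b, (0.25, 0.85)))  # a good-looking pulse: probability near 1
print(predict(w, b, (0.85, 0.25)))  # a noise-like pulse: probability near 0
```

Real pulse data are far messier and higher-dimensional, which is where the deep network's capacity, and its appetite for the "bad" examples discussed below, pays off.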

The network he created interprets pulse shape events with an accuracy that equals and sometimes surpasses the know-how of experts like Mace. With straightforward data, the program sorted more than 99.9 percent of the pulses correctly.

Results are even more impressive when the data is noisy and includes an avalanche of spurious signals:

In an analysis involving 50,000 pulses, the neural network agreed 100 percent of the time with the human expert, besting the best conventional computerized techniques, which agreed with the expert 99.8 percent of the time.

In another analysis of 10,000 pulses, the neural net correctly identified 99.9 percent of pulses compared to 96.1 percent with the conventional technique. Included in this analysis were the toughest pulses to interpret; with that subset, the neural network did more than 25 times better, correctly classifying 386 out of 400 pulses compared to 14 of 400 for the conventional technique.

"This is a relatively simple neural network but the results are impressive," said Ward. "You can do productive work on important scientific problems with a fairly primitive machine. It's exciting to consider what else is possible."

The project posed an unexpected challenge, however: The shallow underground lab is so pristine, with most spurious noise signals mitigated before they enter the data stream, that Ward found himself asking Mace for more bad data.

"Signals can be well behaved or they can be poorly behaved," said Ward. "For the network to learn about the good signals, it needs a decent amount of bad signals for comparison."

The problem of culling through vast amounts of data looking for meaningful signals has a raft of implications and extends to many areas of science. At PNNL, one area is the search for signals that would result from dark matter, the vast portion of matter in our universe whose origin and whereabouts is unknown. Another is the automatic detection of breast cancers and other tissue anomalies.

"Deep learning is making it easier for us to filter out a small number of good events that are indicative of the activity of interest," said Craig Aalseth, nuclear physicist and PNNL laboratory fellow. "It's great to see deep-learning techniques actually doing a better job than our previous best detection techniques."

Credit: 
DOE/Pacific Northwest National Laboratory