Heavens

Visualizing strong magnetic fields with neutrons

image: The device for determining the orientation of the magnetic field functions like a compass. If you hold it against a magnet (here, silver-grey), the blue-red pin rotates so that its red end points in the direction of the north pole.

Image: 
Paul Scherrer Institute/Mahir Dzambegovic

Researchers at the Paul Scherrer Institute PSI have developed a new method with which strong magnetic fields can be precisely measured. They use neutrons obtained from the SINQ spallation source. In the future, it will therefore be possible to measure the fields of magnets that are already installed in devices and are thus inaccessible to other probing techniques. The researchers have now published their results in the journal Nature Communications.

Neutrons are, as their name suggests, electrically neutral and are the building blocks of almost all atomic nuclei. Neutrons interact with magnetic fields due to their so-called spin. Researchers at the Paul Scherrer Institute PSI have now shown that this property can be used to visualise magnetic fields. They used polarised neutrons, which means that all of the neutrons have the same spin orientation.

When a beam of polarised neutrons passes through a magnetic field, a refraction of the beam can be detected downstream of the field. From the refraction pattern, the magnetic field, and in particular the differences in field strength, can be reconstructed. For the first time, this method, known as polarised neutron grating interferometry (pnGI), has been used to measure such strong magnetic fields.

One million times stronger than Earth's magnetic field

pnGI can be used to measure very strong magnetic fields with a so-called gradient strength on the order of 1 tesla per centimeter. "This allows us to work at magnitudes roughly one million times stronger than Earth's magnetic field", says Christian Grünzweig, a neutron researcher at the Paul Scherrer Institute PSI. Until now, neutrons could only be used to measure significantly weaker magnetic fields.
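
To get a feel for how strongly a neutron's spin responds to fields of this order, here is a back-of-envelope sketch using standard textbook constants (an illustration only, not a calculation from the paper; the 1 cm path length is an assumed value):

import math

# Back-of-envelope sketch (not from the PSI paper): spin precession of a
# thermal neutron crossing roughly 1 cm of a roughly 1 T field.
GAMMA_N = 1.832e8      # neutron gyromagnetic ratio, rad s^-1 T^-1 (magnitude)
V_THERMAL = 2200.0     # typical thermal-neutron speed, m/s
B_FIELD = 1.0          # field of order 1 T, as in the article
PATH_LENGTH = 0.01     # assumed path through the field, m (about 1 cm)

time_in_field = PATH_LENGTH / V_THERMAL            # seconds spent in the field
precession = GAMMA_N * B_FIELD * time_in_field     # Larmor precession angle, rad
print(f"time in field: {time_in_field:.1e} s")
print(f"spin precession: {precession:.0f} rad (~{precession / (2 * math.pi):.0f} full turns)")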

From alternators to MRI systems

Numerous applications are conceivable for the new method, above all because neutrons penetrate most materials non-destructively. "We can also probe magnetic fields that are difficult to access because they are already built into an apparatus", explains Jacopo Valsecchi, first author of the study and a doctoral candidate working at PSI. "Applications range from alternators in car engines to many components of the energy supply system to magnetic fields from magnetic resonance tomography systems used in medicine."

The researchers proved that their method works by using computer models to simulate the expected results of the measurement. They then checked whether comparable results could actually be achieved with a real measurement. "The results from the simulations and the actual measurement results agree very well", says Grünzweig.

With the new method, fluctuations in the magnetic field can also be detected. For example, even permanent magnets, such as those familiar from magnetic stickers for refrigerator doors, do not have a homogeneous magnetic field. "We can now detect possible gradients, even if the magnetic field is very strong", says physicist Valsecchi.

Credit: 
Paul Scherrer Institute

Origin of massive methane reservoir identified

image: The manipulator arm of the remotely operated vehicle Jason samples a stream of fluid from a hydrothermal vent. The fluid contains gases that are in liquid form because of the high pressure of the deep ocean.

Image: 
Photo by Chris German/WHOI/NSF, NASA/ROV Jason 2012, © Woods Hole Oceanographic Institution

New research from Woods Hole Oceanographic Institution (WHOI), published Aug. 19, 2019, in the Proceedings of the National Academy of Sciences, provides evidence of the formation and abundance of abiotic methane -- methane formed by chemical reactions that don't involve organic matter -- on Earth and shows how the gas could have a similar origin on other planets and moons, even those no longer home to liquid water. Researchers had long noticed methane released from deep-sea vents. But while the gas is plentiful in the atmosphere, where it's produced by living things, the source of methane at the seafloor was a mystery.

"Identifying an abiotic source of deep-sea methane has been a problem that we've been wrestling with for many years," says Jeffrey Seewald a senior scientist at WHOI who studies geochemistry in hydrothermal systems and is one of the study's authors.

Of 160 rock samples analyzed from across the world's oceans, almost all contained pockets of methane. These oceanic deposits make up a reservoir exceeding the amount of methane in Earth's atmosphere before industrialization, estimates Frieder Klein, a marine geologist at WHOI and lead author of the study.
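
For scale, the pre-industrial atmospheric benchmark can be estimated with a rough back-of-envelope calculation (widely cited approximate values, not numbers from the study; the 720 ppb figure and atmospheric mass are assumptions used purely for illustration):

# Rough, illustrative benchmark (not from the WHOI study): the approximate
# mass of methane in Earth's pre-industrial atmosphere, which the seafloor
# reservoir is estimated to exceed.
ATMOSPHERE_MASS_KG = 5.15e18   # total mass of Earth's atmosphere, kg
PREINDUSTRIAL_CH4 = 720e-9     # ~720 ppb CH4 by volume (ice-core estimates)
M_CH4, M_AIR = 16.04, 28.97    # molar masses, g/mol

ch4_mass_kg = ATMOSPHERE_MASS_KG * PREINDUSTRIAL_CH4 * (M_CH4 / M_AIR)
print(f"pre-industrial atmospheric CH4: ~{ch4_mass_kg / 1e12:.1f} billion tonnes")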

"We were totally surprised to find this massive pool of abiotic methane in the oceanic crust and mantle," Klein says.

The scientists analyzed rocks using Raman spectroscopy, a laser-based technique that allows them to identify fluids and minerals in a thin slice of rock. Nearly every sample contained an assemblage of minerals and gases that form when seawater, moving through the deep oceanic crust, is trapped in magma-hot olivine. As the mineral cools, the water trapped inside undergoes a chemical reaction, a process called serpentinization, that forms hydrogen and methane. The authors demonstrate that in otherwise inhospitable environments, just two ingredients -- water and olivine -- can form methane.

"Here's a source of chemical energy that's being created by geology," says Seewald.

On Earth, deep-sea methane might have played a critical role for the evolution of primitive organisms living at hydrothermal vents on the seafloor, Seewald explains. And elsewhere in the solar system, on places like Jupiter's moon Europa and Saturn's moon Enceladus, methane produced through the same process could provide an energy source for basic life forms.

Credit: 
Woods Hole Oceanographic Institution

James Webb Space Telescope could begin learning about TRAPPIST-1 atmospheres in a year

New research from astronomers at the University of Washington uses the intriguing TRAPPIST-1 planetary system as a kind of laboratory to model not the planets themselves, but how the coming James Webb Space Telescope might detect and study their atmospheres, on the path toward looking for life beyond Earth.

The study, led by Jacob Lustig-Yaeger, a UW doctoral student in astronomy, finds that the James Webb telescope, set to launch in 2021, might be able to learn key information about the atmospheres of the TRAPPIST-1 worlds even in its first year of operation, unless -- as an old song goes -- clouds get in the way.

"The Webb telescope has been built, and we have an idea how it will operate," said Lustig-Yaeger. "We used computer modeling to determine the most efficient way to use the telescope to answer the most basic question we'll want to ask, which is: Are there even atmospheres on these planets, or not?"

His paper, "The Detectability and Characterization of the TRAPPIST-1 Exoplanet Atmospheres with JWST," was published online in June in the Astronomical Journal.

The TRAPPIST-1 system, 39 light-years -- or about 235 trillion miles -- away in the constellation of Aquarius, interests astronomers because of its seven orbiting rocky, or Earth-like, planets. Three of these worlds are in the star's habitable zone -- that swath of space around a star that is just right to allow liquid water on the surface of a rocky planet, thus giving life a chance.

The star, TRAPPIST-1, was much hotter when it formed than it is now, which would have subjected all seven planets to ocean, ice and atmospheric loss in the past.

"There is a big question in the field right now whether these planets even have atmospheres, especially the innermost planets," Lustig-Yaeger said. "Once we have confirmed that there are atmospheres, then what can we learn about each planet's atmosphere -- the molecules that make it up?"

Given the search strategy he suggests for the James Webb Space Telescope, the paper finds it could learn a lot in a fairly short time.

Astronomers detect exoplanets when they pass in front of or "transit" their host star, resulting in a measurable dimming of starlight. Planets closer to their star transit more frequently and so are somewhat easier to study. When a planet transits its star, a bit of the star's light passes through the planet's atmosphere, from which astronomers can learn about the molecular composition of the atmosphere.

Lustig-Yaeger said astronomers can see tiny differences in the planet's size when they look in different colors, or wavelengths, of light.

"This happens because the gases in the planet's atmosphere absorb light only at very specific colors. Since each gas has a unique 'spectral fingerprint,' we can identify them and begin to piece together the composition of the exoplanet's atmosphere."

Lustig-Yaeger said the team's modeling indicates that the James Webb telescope, using a versatile onboard tool called the Near-Infrared Spectrograph, could detect the atmospheres of all seven TRAPPIST-1 planets in 10 or fewer transits -- if they have cloud-free atmospheres. And of course we don't know whether or not they have clouds.

If the TRAPPIST-1 planets have thick, globally enshrouding clouds like Venus does, detecting atmospheres might take up to 30 transits.

"But that is still an achievable goal," he said. "It means that even in the case of realistic high-altitude clouds, the James Webb telescope will still be capable of detecting the presence of atmospheres -- which before our paper was not known."

Many rocky exoplanets have been discovered in recent years, but astronomers have not yet detected their atmospheres. The modeling in this study, Lustig-Yaeger said, "demonstrates that, for this TRAPPIST-1 system, detecting terrestrial exoplanet atmospheres is on the horizon with the James Webb Space Telescope -- perhaps well within its primary five-year mission."

The team found that the Webb telescope may be able to detect signs that the TRAPPIST-1 planets lost large amounts of water in the past, when the star was much hotter. This could leave instances where abiotically produced oxygen -- not representative of life -- fills an exoplanet atmosphere, which could give a sort of "false positive" for life. If this is the case with TRAPPIST-1 planets, the Webb telescope may be able to detect those as well.

Lustig-Yaeger's co-authors, both with the UW, are astronomy professor Victoria Meadows, who is also principal investigator for the UW-based Virtual Planetary Laboratory; and astronomy doctoral student Andrew Lincowski. The work follows, in part, on previous work by Lincowski modeling possible climates for the seven TRAPPIST-1 worlds.

"By doing this study, we have looked at: What are the best-case scenarios for the James Webb Space Telescope? What is it going to be capable of doing? Because there are definitely going to be more Earth-sized planets found before it launches in 2021."

The research was funded by a grant from the NASA Astrobiology Program's Virtual Planetary Laboratory team, as part of the Nexus for Exoplanet System Science (NExSS) research coordination network.

Lustig-Yaeger added: "It's hard to conceive in theory of a planetary system better suited for James Webb than TRAPPIST-1."

Credit: 
University of Washington

Methane not released by wind on Mars, experts find

Wind erosion has been ruled out as the primary cause of methane gas release on Mars, Newcastle University academics have shown.

Methane can be produced over time through both geological and biological routes and since its first detection in the Martian atmosphere in 2003, there has been intense speculation about the source of the gas and the possibility that it could signal life on the planet.

Previous studies have suggested the methane may not be evenly distributed in the atmosphere around Mars, but may instead appear in localised and very temporary pockets on the planet's surface. And the earlier discovery of methane 'spikes' in the Martian atmosphere has further fuelled the debate.

Now research led by Newcastle University, UK, and published in Scientific Reports, has ruled out the possibility that the levels of methane detected could be produced by the wind erosion of rocks, releasing trapped methane from fluid inclusions and fractures on the planet's surface.

Principal Investigator Dr Jon Telling, a geochemist based in the School of Natural and Environmental Sciences at Newcastle University, said:

"The questions are - where is this methane coming from, and is the source biological? That's a massive question and to get to the answer we need to rule out lots of other factors first.

"We realised one potential source of the methane that people hadn't really looked at in any detail before was wind erosion, releasing gases trapped within rocks. High resolution imagery from orbit over the last decade have shown that winds on Mars can drive much higher local rates of sand movement, and hence potential rates of sand erosion, than previously recognised.

"In fact, in a few cases, the rate of erosion is estimated to be comparable to those of cold and arid sand dune fields on Earth.

"Using the data available, we estimated rates of erosion on the surface of Mars and how important it could be in releasing methane.

"And taking all that into account we found it was very unlikely to be the source.

"What's important about this is that it strengthens the argument that the methane must be coming from a different source. Whether or not that's biological, we still don't know."

Pinpointing the source

Funded by the UK Space Agency, the research uses new data alongside previously published data to consider the likely methane contents of different rock types and whether they have the capacity to produce measurable levels of methane when worn away.

The team found that for wind erosion to be a viable mechanism to produce detectable methane in the Martian atmosphere, the methane content of any gases trapped within rocks would have to rival those of some of the richest hydrocarbon-containing shales on Earth -- a highly unlikely scenario.
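
The structure of that budget argument can be sketched in a couple of lines (the numbers below are placeholder assumptions purely to show the form of the estimate, not values from the study):

# Illustrative structure of the erosion budget (placeholder values, not the
# study's data): methane released per year is roughly the mass of rock worn
# away annually times the methane trapped per kilogram of that rock.
EROSION_RATE_KG_PER_YEAR = 1.0e9   # assumed mass of rock abraded each year
CH4_PER_KG_ROCK = 1.0e-9           # assumed trapped methane per kg of rock

ch4_release_kg_per_year = EROSION_RATE_KG_PER_YEAR * CH4_PER_KG_ROCK
print(f"implied CH4 release: {ch4_release_kg_per_year:.1e} kg per year")
# The study concludes that, for this route to explain detectable atmospheric
# methane, the trapped-methane term would have to rival the richest
# hydrocarbon-bearing shales on Earth -- a highly unlikely scenario.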

Lead author Dr Emmal Safi, a postdoctoral researcher in the School of Natural and Environmental Sciences at Newcastle University, concludes that the cause of methane spikes on Mars is still unknown.

"It's still an open question. Our paper is just a little part of a much bigger story," she says.

"Ultimately, what we're trying to discover is if there's the possibility of life existing on planets other than our own, either living now or maybe life in the past that is now preserved as fossils or chemical signatures."

Credit: 
Newcastle University

Researchers discover oldest fossil forest in Asia

image: Reconstructions of lycopsid trees (Guangdedendron micrum). Left: juvenile plant. Right: adult plant.

Image: 
Zhenzhen Deng

The Devonian period, which lasted from 419 million to 359 million years ago, is best known for Tiktaalik, the lobe-finned fish that is often portrayed pulling itself onto land. However, the "age of the fishes," as the period is called, also saw evolutionary progress in plants. Researchers reporting August 8 in the journal Current Biology describe the largest example of a Devonian forest, made up of 250,000 square meters of fossilized lycopsid trees, which was recently discovered near Xinhang in China's Anhui province. The fossil forest, which is larger than Grand Central Station, is the earliest example of a forest in Asia.

Lycopsids found in the Xinhang forest resembled palm trees, with branchless trunks and leafy crowns, and grew in a coastal environment prone to flooding. These lycopsid trees were normally less than 3.2 meters tall, but the tallest was estimated at 7.7 meters, taller than the average giraffe. Giant lycopsids would later define the Carboniferous period, which followed the Devonian, and become much of the coal that is mined today. The Xinhang forest preserves the early root systems that made such heights possible. Two other Devonian fossil forests have been found: one in the United States, and one in Norway.

"The large density as well as the small size of the trees could make Xinhang forest very similar to a sugarcane field, although the plants in Xinhang forest are distributed in patches," says Deming Wang, a professor in the School of Earth and Space Sciences at Peking University, co-first author on the paper along with Min Qin of Linyi University. "It might also be that the Xinhang lycopsid forest was much like the mangroves along the coast, since they occur in a similar environment and play comparable ecologic roles."

The fossilized trees are visible in the walls of the Jianchuan and Yongchuan clay quarries, below and above a four-meter thick sandstone bed. Some fossils included pinecone-like structures with megaspores, and the diameters of fossilized trunks were used to estimate the trees' heights. The authors remarked that it was difficult to mark and count all the trees without missing anything.

"Jianchuan quarry has been mined for several years and there were always some excavators working at the section. The excavations in quarries benefit our finding and research. When the excavators stop or left, we come close to the highwalls and look for exposed erect lycopsid trunks," says Wang, who, with Qin, found the first collection of fossil trunks in the mine in 2016. "The continuous finding of new in-situ tree fossils is fantastic. As an old saying goes: the best one is always the next one."

Credit: 
Cell Press

Gold glue really does bond nanocages 'contradicting' logic

image: The 'impossible' sphere, i.e. a molecular nanocage of 24 protein rings, each of which has an 11-sided structure. The rings are connected by bonds with the participation of gold atoms, here marked in yellow. Depending on their position in the structure, not all gold atoms have to be used to attach adjacent proteins (an unused gold atom is marked in red). (Source: UJ, IFJ PAN)

Image: 
Source: UJ, IFJ PAN

It has long been known that gold can be used to do things that philosophers have never even dreamed of. The Institute of Nuclear Physics of the Polish Academy of Sciences in Cracow has confirmed the existence of 'gold glue': bonds involving gold atoms, capable of permanently bonding protein rings. Skilfully used by an international team of scientists, the bonds have made it possible to construct molecular nanocages with a structure so far unparalleled in nature or even in mathematics.

The world of science has been interested in molecular cages for years. Not without reason. Chemical molecules, including those that would under normal conditions enter into chemical reactions, can be enclosed within their empty interiors. The particles of the enclosed compound, separated by the walls of the cage from the environment, have nothing to bond with. These cages can therefore be used, for example, to transport drugs safely into a cancer cell, only releasing the drug when they are inside it.

Molecular cages are polyhedra made up of smaller 'bricks', usually protein molecules. The bricks can't be just any shape. For example, if we wanted to build a molecular polyhedron using only objects with the outline of an equilateral triangle, geometry would limit us to only three solid figures: a tetrahedron, an octahedron or an icosahedron. So far, there have been no other structural possibilities.

"Fortunately, Platonic idealism is not a dogma of the physical world. If you accept certain inaccuracies in the solid figure being constructed, you can create structures with shapes that are not found in nature, what's more, with very interesting properties," says Dr. Tomasz Wrobel from the Cracow Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN).

Dr. Wrobel is one of the members of an international team of researchers who have recently carried out the 'impossible': they built a cage similar in shape to a sphere out of eleven-sided protein rings. The main authors of this spectacular success are scientists from the group of Prof. Jonathan Heddle from the Malopolska Biotechnology Centre of the Jagiellonian University in Cracow and the Japanese RIKEN Institute in Wako. The work, described in the journal Nature, took place with the participation of researchers from universities in Osaka and Tsukuba (Japan), Durham (Great Britain), Waterloo (Canada) and other research centres.

Each of the walls of the new nanocages was formed by a protein ring from which eleven cysteine molecules stuck out at regular intervals. The 'glue', i.e. a gold atom, was to be attached to the sulphur atom found in each cysteine molecule. Under the appropriate conditions, it could bind with one more sulphur atom, in a cysteine of a neighbouring ring. In this way a permanent chemical bond would be formed between the two rings. But would the gold atom really be able to form a bond between the rings under these conditions?

"In the Spectroscopic Imaging Laboratory of IFJ PAS we used Raman spectroscopy and X-ray photoelectron spectroscopy to show that in the samples provided to us with the test nanocages, the gold really did form bonds with sulphur atoms in cysteines. In other words, in a difficult, direct measurement, we proved that gold 'glue' for bonding protein rings in cages really does exist," explains Dr. Wrobel.

Each gold atom can be treated as a stand-alone clip that makes it possible to attach another ring. The road to the 'impossible' begins when we realize that we don't always have to use all of the clips! So, although all the rings of the new nanocages are physically the same, depending on their place in the structure they connect with their neighbours through a different number of gold atoms, and thus function as polygons with different numbers of vertices. The 24 walls of the nanocage presented by the researchers were held together by 120 gold atoms. The outer diameter of the cages was 22 nanometres and the inner diameter was 16 nm.
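
The connectivity implied by those numbers can be checked with a little arithmetic (a sketch based on the figures in this article, assuming each bridging gold atom links one cysteine on each of two neighbouring rings, as described above):

RINGS = 24                 # protein rings ("walls") in the cage
CYSTEINES_PER_RING = 11    # attachment sites sticking out of each ring
BRIDGING_GOLD_ATOMS = 120  # gold atoms holding the cage together

total_sites = RINGS * CYSTEINES_PER_RING           # 264 possible attachment points
sites_in_bridges = 2 * BRIDGING_GOLD_ATOMS         # 240 cysteines engaged in bridges
avg_bridges_per_ring = sites_in_bridges / RINGS    # average connections per ring

print(f"attachment sites: {total_sites}, engaged in gold bridges: {sites_in_bridges}")
print(f"average bridges per 11-sided ring: {avg_bridges_per_ring:.0f} (so not every clip is used)")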

Using gold atoms as a binder for nanocages is also important because of its possible applications. In earlier molecular structures, proteins were glued together using many weak chemical bonds. The complexity of the bonds and their similarity to the bonds responsible for the existence of the protein rings themselves did not allow for precise control over the decomposition of the cages. This is not the case in the new structures. On the one hand, gold-bonded nanocages are chemically and thermally stable (for example, they withstand hours of boiling in water). On the other hand, however, gold bonds are sensitive to an increase in acidity. As the acidity increases, the nanocage can be decomposed in a controlled way and the contents can be released into the environment. Since the acidity within cells is greater than outside them, gold-bonded nanocages are ideal for biomedical applications.

The 'impossible' nanocage represents a qualitatively new approach to the construction of molecular cages, with gold atoms in the role of loose clips. The demonstrated flexibility of the gold bonds will make it possible in the future to create nanocages with sizes and features precisely tailored to specific needs.

Credit: 
The Henryk Niewodniczanski Institute of Nuclear Physics Polish Academy of Sciences

A rocky relationship: A history of Earth's continents breaking up and getting back together

A new study of rocks that formed billions of years ago lends fresh insight into how Earth's plate tectonics, or the movement of large pieces of Earth's outer shell, evolved over the planet's 4.56-billion-year history.

A report of the findings, published August 7 in Nature, reveals that, contrary to previous studies that say plate tectonics has operated throughout Earth's history or that it emerged only 0.7 billion years ago, plate tectonics actually evolved over the last 2.5 billion years. This new timeline impacts researchers' models for understanding how Earth has changed.

"One of the key ways to understand how Earth has evolved to become the planet that we know is plate tectonics," says Robert Holder, a Postdoctoral Fellow in Earth and Planetary Sciences at Johns Hopkins University and the paper's first author.

Plate tectonics dictates how continents drift apart and come back together, helps explain where volcanoes and earthquakes occur, predicts cycles of erosion and ocean circulation, and helps explain how life on Earth has evolved.

In a bid to resolve the mystery of how and when plate tectonics emerged on Earth, Holder and the research team examined a global compilation of metamorphic rocks that formed over the past 3 billion years at 564 sites. Metamorphic rocks are rocks that, through the process of being buried and heated deep in the Earth's crust, have transformed into a new type of rock. Scientists can measure the depth and temperatures at which metamorphic rocks form, and thereby constrain heat flow at different places in Earth's crust. Because plate tectonics strongly influences heat flow, ancient metamorphic rocks can be used to study plate tectonics in Earth's past.
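
As a simple illustration of how such records constrain heat flow (assumed sample values, not data from the study), the apparent thermal gradient is just the formation temperature divided by the formation depth:

# Illustrative sketch (assumed values, not the study's data): a metamorphic
# rock's formation temperature and depth give the apparent thermal gradient
# of the crust it formed in, which plate tectonics strongly shapes.
samples = [
    # (description, formation temperature in degrees C, formation depth in km)
    ("hot and shallow (high thermal gradient)", 800, 20),
    ("cool and deep (low gradient, subduction-like)", 500, 50),
]

for description, temp_c, depth_km in samples:
    gradient = temp_c / depth_km   # apparent thermal gradient, degrees C per km
    print(f"{description}: ~{gradient:.0f} C/km")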

The research team compiled data on the temperatures and depths at which the metamorphic rocks formed and then evaluated how these conditions have changed systematically through geological time. From this, the team found that plate tectonics, as we see it today, developed gradually over the last 2.5 billion years.

"The framework for much of our understanding of the world and its geological processes relies on plate tectonics," says Holder. "Knowing when plate tectonics began and how it changed impacts that framework."

Clarity on when plate tectonics began and whether it was different in Earth's past can help scientists better understand why we find certain rocks and minerals where we do and how they formed, says Holder.

Credit: 
Johns Hopkins University

Whole body vibration shakes up microbiome, reduces inflammation in diabetes

image: Drs. Babak Baban and Jack Yu

Image: 
Phil Jones, Senior Photographer, Augusta University

AUGUSTA, Ga. (Aug. 5, 2019) - In the face of diabetes, a common condition in which glucose levels and destructive inflammation soar, whole body vibration appears to improve how well our body uses glucose as an energy source and to adjust our microbiome and immune cells to deter inflammation, investigators report.

For the first time, they have described how regular use of whole body vibration can create this healthier mix by yielding a greater percentage of macrophages -- cells that can either promote or prevent inflammation -- that suppress rather than promote it.

In their mouse model, investigators at the Medical College of Georgia and Dental College of Georgia at Augusta University have also shown that whole body vibration alters the microbiome, a collection of microorganisms in and on our body, which help protect us from invaders and, in the gut, help us digest food.

Changes they saw included increasing levels of a bacterium that makes short chain fatty acids, which can help the body better utilize glucose, they report in the International Journal of Molecular Sciences. Glucose is used by the body for fuel but at high levels promotes inflammation, insulin insensitivity and ultimately can cause diabetes.

While there were other changes, the most dramatic they documented was a 17-fold increase in a gut bacterium called Alistipes, which is not typically in high supply there but is known to be proficient at making short-chain fatty acids which, in turn, are "very good" at decreasing inflammation in the gut, says Dr. Jack Yu, chief of pediatric plastic surgery at MCG. Alistipes, which helps ferment our food without producing alcohol, generally improves the metabolic status of our gut and makes us more proficient at using the glucose we consume for energy.

When they saw this, co-corresponding authors Yu and Dr. Babak Baban, immunologist and interim associate dean for research at DCG, immediately thought that giving a dose of the bacterium, like you would a medication, with a smaller dose of whole body vibration -- in this case 10 minutes versus 20 minutes five times weekly -- might work just as well, and it did, they report.

In what appears to be a beneficial chain reaction, when Alistipes went up, glucose use and the macrophage mix also improved, Baban says. "The sequencing is not yet completely clear," Yu says, "but it appears to be a closed loop, feed-forward, self-magnifying cycle."

Our microbiome, like a casserole, is arranged in layers, and one way whole body vibration may work is by rearranging those layers, Baban says, but the researchers reiterate that no one is certain just how whole body vibration works in this or other scenarios, such as acting as an exercise mimic without all the active movement.

But it appears to help address a key concern in diabetes and many common diseases: inflammation. While acute inflammation helps us fight disease, chronic inflammation helps start and sustain a variety of diseases from cardiovascular problems to cancer as well as diabetes.

With rates of inflammation-producing obesity and related type 2 diabetes increasing -- even in children -- new therapies that can directly help avoid the health consequences are needed, they write. They add that while more work is needed, whole body vibration might be one widely applicable and generally safe approach to use.

Macrophages that promote inflammation, called M1, and those that suppress inflammation, called M2, play an important role in regulating the inflammatory response. The inflammatory status of macrophages also influences the gut microbiome and vice versa.

In Baban and Yu's mouse model of type 2 diabetes, where circulating glucose levels are high, they wanted to know how whole-body vibration affected the inflammatory status of macrophages and the diversity of the microbiome. They theorized their diabetes model would have more M1s, and that whole body vibration would result in more M2s and yield changes in the microbiome as well.

They found both a significant increase in the number of M2s as well as increases in levels of other anti-inflammatory molecules like the cytokine IL-10, in both normal mice and their diabetes mouse model after vibration. In fact, whole body vibration restored M2 levels to that of normal controls.

In the microbiome, they saw numerous shifts but by far the most significant was the increase in Alistipes and a general decrease in the diversity of the microbiome.

They note that while more diversity is generally considered a good thing, in this case the shift likely resulted from an increase in species like Alistipes that produce short-chain fatty acids such as butyrate. These fatty acids, which result from the fermentation of dietary fiber in our gut and feed inhabitants of the microbiome, are highly anti-inflammatory and can help reverse the ill effects of high-fat diets, they write.

Theirs is the first study to document crosstalk between the microbiome and innate immunity by altering the macrophage mix with whole body vibration. Innate immunity is a sort of basic defense that immediately responds to invaders in the body and includes physical barriers like the skin as well as immune cells like macrophages, which are key to this response. In this scenario, macrophages, for example, release signaling proteins called cytokines that help trigger inflammation. Adaptive immunity is when the body mounts a targeted response, such as producing an antibody, against a specific invader.

They note that it's still impossible to know whether the microbiome or macrophage shift comes first but theorize that making more glucose available to macrophages fosters inflammation and insulin resistance.

Another experiment they want to do to better define the order is to delete the macrophages and see if they still see the other effects, Baban says.

But either way, the investigators say the clear interaction provides more evidence that whole body vibration can turn down inflammation.

The microbiome lives in the mouth, gut, vagina and skin -- mostly in the gut -- at points where our body comes in contact with foreign items to help protect us from invaders. In the gut it helps us digest and use our food.

Scientists have found more than 8 million genes represented in the bacteria, fungi and viruses that comprise a healthy human microbiome, while humans themselves have more like 20,000 to 25,000 genes. Obesity has been associated with a less-diverse microbiome, which is actually more efficient at digesting food.

In diabetes, whole body vibration is known to reduce ill effects like excessive urine production and excessive thirst, Yu reported in 2012 to the Third World Congress of Plastic Surgeons of Chinese Descent. That work was in a mouse model, which mimicked overeating adolescents. Vibration also reduced inflammation levels, including shifts in some immune cell levels. Vibration also was better than drugs at reducing A1C levels, which provide a better idea of your average blood sugar levels than a fasting glucose by showing what percentage of your oxygen-carrying hemoglobin is routinely coated with sugar. High glucose, or blood sugar levels, may result in sugar binding to cells and other places inside the body where it can alter function.

"Hyperglycemia is not good," says Yu. "When it happens you perturb the normal."

While Alistipes, which does not survive well outside the body, is not currently part of probiotics or even yogurt cultures, for these studies the investigators used the levels of other bacteria, such as Lactobacillus, found in yogurt to determine how much to give when they tried Alistipes as a medicinal adjunct to whole body vibration.

Alistipes is found in plants, and levels have been shown to be decreased in individuals with inflammatory bowel disease and Crohn's. Higher levels have been associated with depression, and high levels can be found in the gut of hibernating bears.

A 2017 study published in Endocrinology by Drs. Alexis Stranahan and Meghan E. McGee-Lawrence at MCG, see eurekalert.org/multimedia/pub/135777.php, provided evidence that in their animal model of obesity and diabetes, whole body vibration was essentially the same as walking on a treadmill at reducing body fat and improving muscle and bone tone, including reducing seriously unhealthy fat around the liver, where it produces damage similar to excessive drinking.

Credit: 
Medical College of Georgia at Augusta University

When plant roots learned to follow gravity

image: Left: slow gravitropism of the fern C. richardii after 0, 6, 12, 24, and 36h; right: fast gravitropism of the gymnosperm P. taeda after 0, 1, 3, 6, and 12h

Image: 
© IST Austria -- Yuzhou Zhang/Friml Group

One of the most important events in evolutionary history occurred around 500 million years ago with the spread of plant life from water to land. For plants to thrive in this new environment, root systems had to evolve to grow downwards, following gravity with two primary purposes: anchoring in the soil and providing a source of water and nutrients for growth of the parts of the plant above the ground. This mechanism -- called gravitropism -- has been extensively studied in flowering plants such as Arabidopsis thaliana. However, it has never been systematically compared throughout the plant kingdom, and its evolutionary origin remains a mystery.

Down, down, down -- but at different speeds

Now, Yuzhou Zhang, a postdoc in the group of Professor Jiří Friml, and his team have gained a broader view of how and when root gravitropism evolved. The researchers selected multiple plant species representing the lineages of mosses, lycophytes (clubmosses and firmosses), ferns, gymnosperms (conifers), and flowering plants and let their roots grow horizontally to observe if and when they started to bend downwards to follow gravity. The result: Gravity-driven root growth turned out to be very rudimentary and slow in the most primitive land plants (mosses) as well as in the basal vascular plants (lycophytes and ferns). Only seed plants (gymnosperms and flowering plants), which first appeared around 350 million years ago, showed a faster and thus more efficient form of gravitropism.

The power of starch

But which evolutionary step enabled this fast and efficient root gravitropism in seed plants? Through analyzing the distinct phases of gravitropism--gravity perception, the transmission of the gravitropic signal, and ultimately the growth response itself--the researchers found two crucial components, which evolved hand in hand. The first turned out to be an anatomical feature: Plant organelles called amyloplasts--densely filled with starch granules--sediment in response to gravity and this way function as gravity sensors. However, this sedimentation process was only observed in gymnosperms and flowering plants with the amyloplasts ending up highly concentrated in the very bottom of the root tip. In earlier plants, by contrast, the amyloplasts remained randomly distributed within and above the root tip, thus not functioning as gravity sensors as was the case in the seed plants.

A special PIN code for auxin

After perception through the amyloplasts, the gravity signal is further transmitted from cell to cell by the growth hormone auxin. In genetic experiments, the researchers identified a specific transporter molecule in the model plant Arabidopsis thaliana, PIN2, which directs auxin flow and thus root growth. While almost all green plants carry PIN proteins, only the specific PIN2 molecule in seed plants gathers at the shoot-ward side of the root epidermal cells. This specific localization -- unique to seed plants -- leads to the polarization of the transporter cells, which, in turn, enables the root to transport auxin towards the shoot and thus allows the auxin-based signal to travel from the place of gravity perception to the zone of growth regulation.

Plants as teachers for mankind

With these two anatomical and functional components identified, the authors have gained valuable insights into the evolution of root gravitropism, which is one of the crucial adaptations of seed plants to land. But even practical implications of these findings are conceivable: "Now that we have started to understand what plants need to grow stable anchorage in order to reach nutrients and water in deep layers of the soil, we may eventually be able to figure out ways to improve the growth of crop and other plants in very arid areas," says Zhang, who joined the IST Austria in 2016. He adds: "Nature is much smarter than we are; there is so much we can learn from plants that can eventually be of benefit to us."

Credit: 
Institute of Science and Technology Austria

At the edge of chaos: New method for exoplanet stability analysis

image: Gaining a full understanding of systems of exoplanets and distant stars is difficult, because the initial positions and velocities of the exoplanets are unknown. Determining whether the system dynamics are quasi-periodic or chaotic is cumbersome, expensive and computationally demanding. In this week's Chaos, Tamás Kovács delivers an alternative method for stability analysis of exoplanetary bodies using only the observed time series data to deduce dynamical measurements and quantify the unpredictability of exoplanet systems. This image shows a stability map of Saturn obtained by chaos indicator MEGNO (a) and recurrence network measures average path length (b) and transitivity (c). The latter two panels are based on transit timing variation of Jupiter and the radial velocity of the sun, respectively.

Image: 
Tamás Kovács

WASHINGTON, D.C., July 30, 2019 -- Exoplanets revolving around distant stars are coming quickly into focus with advanced technology like the Kepler space telescope. Gaining a full understanding of those systems is difficult, because the initial positions and velocities of the exoplanets are unknown. Determining whether the system dynamics are quasi-periodic or chaotic is cumbersome, expensive and computationally demanding.

In this week's Chaos, from AIP Publishing, Tamás Kovács delivers an alternative method for stability analysis of exoplanetary bodies using only the observed time series data to deduce dynamical measurements and quantify the unpredictability of exoplanet systems.

"If we don't know the governing equations of the motion of a system, and we only have the time series -- what we measure with the telescope -- then we want to transform that time series into a complex network. In this case, it is called a recurrence network," Kovács said. "This network holds all of the dynamical features of the underlying system we want to analyze."

The paper draws on the work of physicist Floris Takens, who proposed in 1981 that the dynamics of a system could be reconstructed using a series of observations about the state of the system. With Takens' embedding theorem as a starting point, Kovács uses time delay embedding to reconstruct a high-dimensional trajectory and then identify recurrence points, where bodies in the phase space are close to each other.

"Those special points will be the vertices and the edges of the complex network," Kovács said. "Once you have the network, you can reprogram this network to be able to apply measures like transitivity, average path length or others unique to that network."

Kovács tests the reliability of the method using a known system as a model, the three-body system of Saturn, Jupiter and the sun, and then applies it to the Kepler 36b and 36c system. His Kepler system results agree with what is known.

"Earlier studies pointed out that Kepler 36b and 36c is a very special system, because from the direct simulation and the numerical integrations, we see the system is at the edge of the chaos," Kovács said. "Sometimes, it shows regular dynamics, and at other times, it seems to be chaotic."

The author plans to next apply his methods to systems with more than three bodies, testing its scalability and exploring its ability to handle longer time series and sharper datasets.

Credit: 
American Institute of Physics

Recovering color images from scattered light

image: Researchers have created a method that takes light from colored numerals (top left) that has been scattered by a mostly opaque surface (top center) and uses its 'speckle' patterns and a coded aperture to reconstruct the image in five different frequencies (bottom row) before combining them into a final image (top right).

Image: 
Michael Gehm, Duke University

DURHAM, NC -- Engineers at Duke University have developed a method for extracting a color image from a single exposure of light scattered through a mostly opaque material. The technique has applications in a wide range of fields from healthcare to astronomy.

The study appeared online on July 9 in the journal Optica.

"Others have been able to reconstruct color images from scattered light, but those methods had to sacrifice spatial resolution or required prior characterization of the scatterer in advance, which frequently isn't possible," said Michael Gehm, associate professor of electrical and computer engineering at Duke. "But our approach avoids all those issues."

When light is scattered as it passes through a translucent material, the emerging pattern of "speckle" looks as random as static on a television screen with no signal. But it isn't random. Because the light coming from one point of an object travels a path very similar to that of the light coming from an adjacent point, the speckle pattern from each looks very much the same, just shifted slightly.

With enough images, astronomers used to use this "memory effect" phenomenon to create clearer images of the heavens through a turbulent atmosphere, as long as the objects being imaged were sufficiently compact.

While the technique fell out of favor with the development of adaptive optics, which do the same job by using adjustable mirrors to compensate for the scattering, it has recently become popular once again. Because modern cameras can record hundreds of millions of pixels at a time, only a single exposure is needed to make the statistics work.

While this approach can reconstruct a scattered image, it has limitations in the realm of color. The speckle patterns created by different wavelengths are typically impossible to disentangle from one another.

The new memory-effect imaging approach, developed by the authors -- Xiaohan Li, a PhD student in Gehm's lab; Joel Greenberg, associate research professor of electrical and computer engineering; and Gehm -- breaks through this limitation.

The trick is to use a coded aperture followed by a prism. A coded aperture is basically a filter that allows light to pass through some areas but not others in a specific pattern. After the speckle is "stamped" by the coded aperture, it passes through a prism that causes different frequencies of light to spread out from each other.

This causes the pattern from the coded aperture to shift slightly in relation to the image being captured by the detector. And the amount it shifts is directly related to the color of light passing through.

"This shift is small compared to the overall size of what's being imaged, and because our detector is not sensitive to color, it creates a messy combination," said Li. "But the shift is enough to give our algorithm a toehold to tease the individual speckle patterns apart from each color, and from that we can figure out what the object looks like for each color."

The researchers show that, by focusing on five spectral channels corresponding to violet, green and three shades of red, the technique can reconstruct a letter "H" full of nuanced pinks, yellows and blues. Outside of this difficult proof-of-principle, the researchers believe their approach could find applications in fields such as astronomy and healthcare.

In astronomy, the color content of the light coming from astronomical phenomena contains valuable information about their chemical composition, and speckle is often created as light is distorted by the atmosphere. Similarly, in healthcare, color can tell researchers something about the molecular composition of what's being imaged, or it can be used to identify biomolecules that have been tagged with fluorescent markers.

"There are a lot of applications where people really want to know how much energy there is in specific spectral bands emitted from objects located behind opaque occlusions," said Greenberg. "We've shown that this approach can accomplish this goal across the visible spectrum. Knowing the aperture pattern and how much it shifts as a function of wavelength provides the key we need to disentangle the messy sum into separate channels."

Credit: 
Duke University

TESS discovers three new planets nearby, including temperate 'sub-Neptune'

image: NASA's Transiting Exoplanet Survey Satellite, or TESS, has discovered three new worlds that are among the smallest, nearest exoplanets known to date. The planets orbit a star just 73 light years away and include a small, rocky super-Earth and two sub-Neptunes -- planets about half the size of our own icy giant.

Image: 
NASA's Goddard Space Flight Center/Scott Wiessinger

NASA's Transiting Exoplanet Survey Satellite, or TESS, has discovered three new worlds that are among the smallest, nearest exoplanets known to date. The planets orbit a star just 73 light years away and include a small, rocky super-Earth and two sub-Neptunes -- planets about half the size of our own icy giant.

The sub-Neptune furthest out from the star appears to be within a "temperate" zone, meaning that the very top of the planet's atmosphere is within a temperature range that could support some forms of life. However, scientists say the planet's atmosphere is likely a thick, ultradense heat trap that renders the planet's surface too hot to host water or life.

Nevertheless, this new planetary system, which astronomers have dubbed TOI-270, is proving to have other curious qualities. For instance, all three planets appear to be relatively close in size. In contrast, our own solar system is populated with planetary extremes, from the small, rocky worlds of Mercury, Venus, Earth, and Mars, to the much more massive Jupiter and Saturn, and the more remote ice giants of Neptune and Uranus.

There's nothing in our solar system that resembles an intermediate planet, with a size and composition somewhere between those of Earth and Neptune. But TOI-270 appears to host two such planets: both sub-Neptunes are smaller than our own Neptune and not much larger than the rocky planet in the system.

Astronomers believe TOI-270's sub-Neptunes may be a "missing link" in planetary formation, as they are of an intermediate size and could help researchers determine whether small, rocky planets like Earth and more massive, icy worlds like Neptune follow the same formation path or evolve separately.

TOI-270 is an ideal system for answering such questions, because the star itself is nearby and therefore bright, and also unusually quiet. The star is an M-dwarf, a type of star that is normally extremely active, with frequent flares and solar storms. TOI-270 appears to be an older M-dwarf that has since quieted down, giving off a steady brightness, against which scientists can measure many properties of the orbiting planets, such as their mass and atmospheric composition.

"There are a lot of little pieces of the puzzle that we can solve with this system," says Maximilian Günther, a postdoc in MIT's Kavli Institute for Astrophysics and Space Research and lead author of a study published in Nature Astronomy that details the discovery. "You can really do all the things you want to do in exoplanet science, with this system."

A planetary pattern

Günther and his colleagues detected the three new planets after looking through measurements of stellar brightness taken by TESS. The MIT-developed satellite stares at patches of the sky for 27 days at a time, monitoring thousands of stars for possible transits -- characteristic dips in brightness that could signal a planet temporarily blocking the star's light as it passes in front of it.

The team isolated several such signals from a nearby star, located 73 light years away in the southern sky. They named the star TOI-270, for the 270th "TESS Object of Interest" identified to date. The researchers used ground-based instruments to follow up on the star's activity, and confirmed that the signals are the result of three orbiting exoplanets: planet b, a rocky super-Earth with a roughly three-day orbit; planet c, a sub-Neptune with a five-day orbit; and planet d, another sub-Neptune slightly further out, with an 11-day orbit.

Günther notes that the planets seem to line up in what astronomers refer to as a "resonant chain," meaning that the ratios of their orbital periods are close to ratios of whole numbers -- in this case, 3:5 for the inner pair, and 2:1 for the outer pair -- and that the planets are therefore in "resonance" with each other. Astronomers have discovered other small stars with similarly resonant planetary formations. And in our own solar system, some of the moons of Jupiter also happen to line up in resonance with each other.

"For TOI-270, these planets line up like pearls on a string," Günther says. "That's a very interesting thing, because it lets us study their dynamical behavior. And you can almost expect, if there are more planets, the next one would be somewhere further out, at another integer ratio."

"An exceptional laboratory"

TOI-270's discovery initially caused a stir of excitement within the TESS science team, as it seemed, in the first analysis, that planet d might lie in the star's habitable zone, a region that would be cool enough for the planet's surface to support water, and possibly life. But the researchers soon realized that the planet's atmosphere was probably extremely thick, and would therefore generate an intense greenhouse effect, causing the planet's surface to be too hot to be habitable.

But Günther says there is a good possibility that the system hosts other planets, further out from planet d, that might well lie within the habitable zone. Planet d, with an 11-day orbit, is about 10 million kilometers out from the star. Günther says that, given that the star is small and relatively cool -- about half as hot as the sun -- its habitable zone could potentially begin at around 15 million kilometers. But whether a planet exists within this zone, and whether it is habitable, depends on a host of other parameters, such as its size, mass, and atmospheric conditions.

Fortunately, the team writes in their paper that "the host star, TOI-270, is remarkably well-suited for future habitability searches, as it is particularly quiet." The researchers plan to focus other instruments, including the upcoming James Webb Space Telescope, on TOI-270, to pin down various properties of the three planets, as well as search for additional planets in the star's habitable zone.

"TOI-270 is a true Disneyland for exoplanet science, and one of the prime systems TESS was set out to discover," Günther says. "It is an exceptional laboratory for not one, but many reasons -- it really ticks all the boxes."

Credit: 
Massachusetts Institute of Technology

Researchers recreate the sun's solar wind and plasma 'burps' on Earth

image: The Big Red Plasma Ball is pictured in Sterling Hall at the University of Wisconsin-Madison on Oct. 2, 2017. The Big Red Plasma Ball, part of the new Wisconsin Plasma Physics Laboratory (WiPPL) being led by Physics Professor Cary Forest, is one of several pieces of scientific equipment being used to study the fundamental properties of plasma in order to better understand the universe, where the hot gas is abundant.

Image: 
Jeff Miller/UW-Madison

MADISON - The sun's solar wind affects nearly everything in the solar system. It can disrupt the function of Earth's satellites, and it creates the lights of the auroras.

A new study by University of Wisconsin-Madison physicists mimicked solar winds in the lab, confirming how they develop and providing an Earth-bound model for the future study of solar physics.

Our sun is essentially a big ball of hot plasma -- an energetic state of matter made up of ionized gas. As the sun spins, the plasma spins along, too. This plasma movement in the core of the sun produces a magnetic field that fills the solar atmosphere. At some distance from the sun's surface, known as the Alfvén surface, this magnetic field weakens and plasma breaks away from the sun, creating the solar wind.

"The solar wind is highly variable, but there are essentially two types: fast and slow," explains Ethan Peterson, a graduate student in the department of physics at UW-Madison and lead author of the study published online July 29 in Nature Physics. "Satellite missions have documented pretty well where the fast wind comes from, so we were trying to study specifically how the slow solar wind is generated and how it evolves as it travels toward Earth."

Peterson and his colleagues, including physics professor Cary Forest, may not have direct access to the big plasma ball of the sun, but they do have access to the next best thing: the Big Red Ball.

The Big Red Ball is a three-meter-wide hollow sphere, with a strong magnet at its center and various probes inside. The researchers pump helium gas in, ionize it to create a plasma, and then apply an electric current that, along with the magnetic field, stirs the plasma, creating a near-perfect mimic of the spinning plasma and electromagnetic fields of the sun.

With their mini-sun in place, the researchers can take measurements at many points inside the ball, allowing them to study solar phenomena in three dimensions.

First, they were able to recreate the Parker Spiral, the spiral-shaped magnetic field that fills the entire solar system, named for the scientist who first described the solar wind. Below the Alfvén surface, the magnetic field radiates straight out from the Sun. But at that surface, solar wind dynamics take over, dragging the magnetic field into a spiral.

"Satellite measurements are pretty consistent with the Parker Spiral model, but only at one point at a time, so you'd never be able to make a simultaneous, large-scale map of it like we can in the lab." Peterson says. "Our experimental measurements confirm Parker's theory of how it is created by these plasma flows."

The researchers were also able to identify the source of the Sun's plasma "burps," small, periodic ejections of plasma that fuel the slow solar wind. With the plasma spinning, they probed the magnetic field and the speed of the plasma. Their data mapped a region where the plasma was moving fast enough and the magnetic field was weak enough that the plasma could break off and eject radially.

"These ejections are observed by satellites, but no one knows what drives them," Peterson says. "We ended up seeing very similar burps in our experiment, and identified how they develop."

The researchers stress that their Earth-bound experiments complement, but don't replace, satellite missions. For example, the Parker Solar Probe, launched in August 2018, is expected to reach and even dip below the Alfvén surface. It will provide direct measurements of solar wind never obtained before.

"Our work shows that laboratory experiments can also get at the fundamental physics of these processes," Peterson says. "And because the Big Red Ball is now funded as a National User Facility, it says to the science community: If you want to study the physics of solar wind, you can do that here."

Credit: 
University of Wisconsin-Madison

Army project may advance quantum materials, efficient communication networks

image: A U.S. Army project at Princeton University results in an electronic array on a microchip that simulates particle interactions in a hyperbolic plane, a geometric surface in which space curves away from itself at every point.

Image: 
Princeton University

RESEARCH TRIANGLE PARK, N.C. (July 24, 2019) -- A U.S. Army project exploring novel applications of superconducting resonators has discovered these systems may be used to simulate quantum materials impossible to otherwise fabricate. Additionally, they may provide insights into open and fundamental questions in quantum mechanics and gravity.

Scientists at Princeton University, led by electrical engineering Professor Andrew Houck, built an electronic array on a microchip that simulates particle interactions in a hyperbolic plane, a geometric surface in which space curves away from itself at every point.

"This research may advance quantum simulation in a way that enables us to not only develop a better understanding of materials relevant to Army goals, but also help us explore questions at the forefront of other fields of Army relevance," said Dr. Sara Gamble, a program manager with the Army Research Office, an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory. "In addition to the potential materials applications, the fantastic results obtained by the research team can provide insight into communication networks and ultimately enable the DOD to develop more efficient networking capabilities."

The research, published in Nature, used superconducting circuits to create a lattice that functions as a hyperbolic space. When the researchers introduce photons into the lattice, they can answer a wide range of difficult questions by observing the photons' interactions in simulated hyperbolic space.

"The problem is that if you want to study a very complicated quantum mechanical material, then computer modeling is very difficult," said Dr. Alicia Kollár, a postdoctoral research associate at the Princeton Center for Complex Materials. "We're trying to implement a model at the hardware level so that nature does the hard part of the computation for you."

The centimeter-sized chip is etched with a circuit of superconducting resonators that provide paths for microwave photons to move and interact. The resonators on the chip are arranged in a lattice pattern of heptagons, or seven-sided polygons. The structure exists on a flat plane, but simulates the unusual geometry of a hyperbolic plane.
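
Whether a regular tiling of p-sided polygons with q meeting at each vertex fits on a flat, spherical, or hyperbolic surface follows from a simple angle count: the tiling is Euclidean when (p - 2)(q - 2) = 4, spherical when the product is smaller, and hyperbolic when it is larger. The sketch below applies that standard criterion to the heptagon lattice described here; the other tilings checked are just familiar examples.

    # Classify regular {p, q} tilings (p-gons, q meeting at each vertex).
    # The heptagonal lattice on the chip corresponds to a {7, 3} tiling.
    def tiling_geometry(p, q):
        """Return the curvature class of the regular {p, q} tiling."""
        s = (p - 2) * (q - 2)
        if s == 4:
            return "Euclidean (flat)"
        return "spherical" if s < 4 else "hyperbolic"

    for p, q in [(4, 4), (6, 3), (5, 3), (7, 3)]:
        print(f"{{{p},{q}}} tiling: {tiling_geometry(p, q)}")
    # {4,4} squares and {6,3} hexagons tile the flat plane, {5,3} closes
    # into a sphere (the dodecahedron), while {7,3} heptagons only fit on
    # a hyperbolic plane.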

"In normal 3-D space, a hyperbolic surface doesn't exist," said Princeton electrical engineering Prof. Andrew Houck. "This material allows us to start to think about mixing quantum mechanics and curved space in a lab setting."

Trying to force a three-dimensional sphere onto a two-dimensional plane reveals that space on a spherical surface is smaller than on a flat plane. This is why the shapes of countries appear stretched out when drawn on a flat map of the spherical Earth. In contrast, a hyperbolic plane would need to be compressed in order to fit onto a flat plane.

To simulate the effect of compressing hyperbolic space onto a flat surface, the researchers used a special type of resonator called a coplanar waveguide resonator. When microwave photons pass through this resonator, they behave in the same way whether their path is straight or meandering. The meandering structure of the resonators offers flexibility to "squish and scrunch" the sides of the heptagons to create a flat tiling pattern, said Kollár, who is starting a faculty position at the University of Maryland and Joint Quantum Institute.
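
The key point is that a transmission-line resonator's frequency is set by its electrical length, not by the shape of its path, so a meandered resonator of a given length resonates at the same frequency as a straight one. The sketch below uses the standard half-wave relation f0 = c / (2 L sqrt(eps_eff)); the length and effective permittivity are assumed example values, not the chip's actual parameters.

    # Half-wave transmission-line resonator: f0 = c / (2 * L * sqrt(eps_eff)).
    # Only the length L and effective permittivity matter, not whether the
    # line runs straight or meanders across the chip.
    import math

    C = 299_792_458.0        # speed of light, m/s

    def half_wave_frequency(length_m, eps_eff):
        """Fundamental resonance of an ideal half-wave line."""
        return C / (2.0 * length_m * math.sqrt(eps_eff))

    # Example values (assumed): a 10 mm resonator with an effective
    # permittivity of ~6.5, typical of coplanar waveguides on silicon.
    f0 = half_wave_frequency(10e-3, 6.5)
    print(f"Fundamental mode ~ {f0 / 1e9:.1f} GHz")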

Looking at the chip's central heptagon is akin to looking through a fisheye camera lens, in which objects at the edge of the field of view appear smaller than in the center -- the heptagons look smaller the farther they are from the center. This arrangement allows microwave photons that move through the resonator circuit to behave like particles in a hyperbolic space.

The chip's ability to simulate curved space could enable new investigations in quantum mechanics, including properties of energy and matter in the warped space-time around black holes. The material could also be useful for understanding complex webs of relationships in mathematical graph theory and communication networks. Kollár noted that this research could eventually aid the design of new materials.

But first, she and her colleagues will need to further develop the photonic material, both by continuing to examine its mathematical basis and by introducing elements that enable photons in the circuit to interact.

"By themselves, microwave photons don't interact with each other -- they pass right through," Kollár said. Most applications of the material would require "doing something to make it so that they can tell there's another photon there."

"The research team is forming connections with researchers in other disciplines because of these results, and the addition of photon interactions into the systems will increase the application space for advancing Army capabilities even further," Gamble said.

Credit: 
U.S. Army Research Laboratory

To understand a childhood brain tumor, researchers turn to single-cell analysis

image: From left to right: Paul Northcott, Ph.D., Kyle Smith, Ph.D. and Laure Bihannic, Ph.D., all of the Northcott lab in the St. Jude Department of Developmental Neurobiology, are researching the cells of origin for key medulloblastoma subtypes.

Image: 
St. Jude Children's Research Hospital

Investigators at St. Jude Children's Research Hospital and Massachusetts General Hospital, alongside others, have revealed the cells of origin for specific subtypes of medulloblastoma, the most common malignant pediatric brain tumor. The work also has implications for how medulloblastoma is classified, which may eventually shape clinical care. The work appears as an advance online publication today in Nature.

This study is the most in-depth analysis to date of medulloblastoma using single-cell RNAseq technology. The findings shed light on the relationship between the four known subtypes of medulloblastoma. Additionally, the researchers suggest a previously unknown cell of origin for the understudied Group 4 subgroup.

"The ability to look at individual cells has propelled us 10 steps forward in our understanding of how the subtypes of medulloblastoma arise, what drives them and how we can make treatments more effective for patients," said co-senior author Paul Northcott, Ph.D., of the St. Jude Department of Developmental Neurobiology.

Previous research by the Northcott laboratory and others has classified medulloblastoma into four distinct subtypes: WNT and SHH (which are driven by their namesake genetic mutations), Group 3 and Group 4. Group 3 and Group 4 constitute 60% of all medulloblastoma but are the least understood. This is in part because there are currently no genetically accurate laboratory models that mimic Group 4.

Using tumor samples from patients with each of the four subtypes, the researchers conducted single-cell RNA sequencing. This technique evaluates each cell, providing the investigators with a more detailed portrait of how the subtypes arise and relate to each other. Going beyond traditional genetic sequencing, RNAseq drills down into how genes are controlled, highlighting important mechanisms that can dictate a cell's behavior.
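
For readers unfamiliar with the technique, the outline below shows what a generic single-cell RNA-seq clustering workflow looks like using the open-source scanpy library. It is a minimal sketch of the standard steps (filtering, normalization, dimensionality reduction, clustering), not the pipeline used in this study, and the input file name is hypothetical.

    # Minimal, generic single-cell RNA-seq workflow with scanpy.
    # NOT the authors' pipeline; file name and parameters are placeholders.
    import scanpy as sc

    adata = sc.read_h5ad("tumor_cells.h5ad")        # hypothetical input file

    # Basic quality filtering and normalization.
    sc.pp.filter_cells(adata, min_genes=200)
    sc.pp.filter_genes(adata, min_cells=3)
    sc.pp.normalize_total(adata, target_sum=1e4)
    sc.pp.log1p(adata)

    # Keep the most informative genes, then embed and cluster.
    sc.pp.highly_variable_genes(adata, n_top_genes=2000)
    adata = adata[:, adata.var.highly_variable]
    sc.pp.pca(adata, n_comps=50)
    sc.pp.neighbors(adata, n_neighbors=15)
    sc.tl.leiden(adata)        # groups cells by expression profile
    sc.tl.umap(adata)          # 2-D map for visualization
    sc.pl.umap(adata, color="leiden")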

These results provide for the first time an indication that Group 4 medulloblastoma tumors may arise from cells in the cerebellum called glutamatergic cerebellar nuclei and unipolar brush cells. This is a critical finding for future research into Group 4 medulloblastoma, because understanding the cell of origin may make it possible to develop accurate laboratory models.

The study did not confirm a cell of origin for Group 3 medulloblastoma. However, the results did suggest several possible explanations. For example, Group 3 may arise from tissues outside the cerebellum which were not studied. It is also possible that the oncogene MYC, which characterizes Group 3, alters the cells to make them unrecognizable when compared to their origins. Future studies will focus on understanding more about the origins of Group 3.

"Group 3 and Group 4 medulloblastomas have confounded the field, with some very clearly belonging to one subtype or the other but another group exhibiting the hallmarks of both subtypes," said co-senior author Mario Suva, M.D., Ph.D., of Massachusetts General Hospital and the Broad Institute of MIT and Harvard. "Now, we can more clearly see that there is an intermediate group, straddling Group 3 and Group 4 classification."

By digging into the biology of intermediate Group 3/4 medulloblastoma, the investigators showed these tumors consist of a mixture of immature and mature tumor cells, explaining current challenges associated with their classification. This unique biology may prompt investigators to think differently about how these tumors are stratified and may have implications for the design of future clinical trials.

The study also confirmed that SHH tumors likely arise from cerebellar granule neuron progenitors. The findings advanced understanding of the ways that SHH tumors can differ based on the patient's age. The investigators found that in infants, SHH tumors are largely differentiated, meaning the cells have already matured into their final cell type; whereas in adults, SHH tumors are more primitive and less differentiated. This difference may be tied to the increased number of genetic mutations present in adult tumors. The research also provided insights into WNT tumors, showing that WNT tumors can be subdivided into four distinct tumor cell populations based on gene expression profiles.

Credit: 
St. Jude Children's Research Hospital