Heavens

Three elder sisters of the Sun with planets

image: Prof. Niedzielski's team have been working on this subject for years. Thanks to precise observations of the sky, they have managed to discover 26 stars around which planets revolve

Image: 
Andrzej Romanski/NCU

An international team led by Prof. dr habil. Andrzej Niedzielski, an astronomer from the Nicolaus Copernicus University in Torun (Poland), has discovered three more extrasolar planets. They revolve around stars that can be called elder sisters of our Sun.

You can read about the astronomers' success in Astronomy and Astrophysics. The prestigious European journal will publish the paper: Tracking Advanced Planetary Systems (TAPAS) with HARPS-N. VII. Elder suns with low-mass companions. Apart from Prof. Andrzej Niedzielski from the NCU Institute of Astronomy, the team which worked on the discovery includes Prof. dr habil. Gracjan Maciejewski, also from the NCU Faculty of Physics, Astronomy and Informatics, Prof. Aleksander Wolszczan (Pennsylvania State University), dr Eva Villaver (University of Madrid) as well as dr Monika Adamów and dr Kacper Kowalik (both from the University of Illinois).

Discoverers of planets

Prof. Niedzielski's team have been working on this subject for years. Thanks to precise observations of the sky, they have managed to discover 26 stars around which planets revolve. These are usually planetary systems much older than ours. Their suns are mostly red giants. An exception is the Solaris and Pirx system - a star similar to the Sun (although slightly less massive and cooler) and its planet, discovered in 2009.

- A red giant is a star that has burnt out the hydrogen in its interior through nuclear reactions and is rebuilding its internal structure to ignite helium-burning reactions - explains Prof. Niedzielski. - Such a star shrinks in its central part, where the temperature starts to rise, while its outer areas expand significantly and cool down. A star that was initially yellow, like the Sun, becomes red and huge - hence the name of this type of star. These stars can reach a size comparable to that of Earth's orbit.

Sisters of the Sun

The astronomers looked at 122 stars. They carried out their observations using the Hobby-Eberly Telescope (HET) at the McDonald Observatory, near Fort Davis, Texas, and the Italian National Galileo Telescope, which is located on the island of La Palma (Canary Islands) in Spain. They succeeded in discovering other extrasolar planets orbiting the stars which could be called the big sisters of our Sun.

- These stars are red giants. They have masses much the same as our star, but they are a few billion years older, much bigger and cooler - explains Prof. Niedzielski. - The planets that we have discovered are gas giants - without surfaces, similar to our Jupiter. They orbit far too close to their stars for conditions favourable to the origin of life to occur on them or in their vicinity.

Eldest sister: HD 4760

The star HD 4760 is an eighth-magnitude object in the constellation Pisces. It is 40 times larger than the Sun and emits 850 times more light, but because of its distance (about 1,781 light years from us) it is invisible to the naked eye. It is, however, within reach of even small amateur telescopes.

- A planet about 14 times more massive than Jupiter revolves around it. It is in an orbit similar in size to that of Earth around the Sun, at a distance of about 1.1 astronomical units. A year on this planet lasts 434 days - says Prof. Niedzielski.

The observations of the star that led to the discovery of the planet took 9 years. They were conducted first with the Hobby-Eberly telescope and the HRS spectrograph, then with the Galileo telescope and the HARPS-N spectrograph.

- The observations took so long because, in the case of the search for planets around red giants, it is necessary to study several rotation periods of the star, which can reach hundreds of days - explains the astronomer from Toruń. - The researchers must make sure that a planet is actually being observed, and not a spot on the star's surface pretending to be a planet.

Younger sisters: TYC 0434-04538-1 and HD 96992

The astronomers have recently discovered a planet orbiting TYC 0434-04538-1, a star about 2,032 light-years away from us in the constellation Serpens. Although it shines almost 50 times more strongly than the Sun, it is also invisible to the naked eye. The reason is again the great distance - to see this object of tenth apparent magnitude, you already need a small telescope. This star is ten times bigger than the Sun, and it is surrounded by a planet six times more massive than Jupiter.

- Interestingly, this planet orbits quite close to its star, at a distance of 0.66 astronomical units. In our Solar System it would be located between the orbits of Venus and Earth - explains Prof. Niedzielski. - A year on this gas planet lasts only 193 days.

Observations of this star with both telescopes lasted 10 years.

The third of the Sun's elder sisters, HD 96992, is the closest to us - "only" 1,305 light years away. It is a ninth-magnitude star in the Great Bear.

- This star, seven times bigger and almost 30 times more energetic than the Sun, has a planet with a mass only slightly bigger than that of Jupiter, in an orbit of 1.24 astronomical units. A year on this planet lasts 514 days - says Prof. Niedzielski.

Of the three, this star has been observed the longest - astronomers tracked it with the two telescopes for 14 years.
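As a quick consistency check (not part of the published analysis), the quoted orbital distances and periods follow Kepler's third law for roughly solar-mass host stars. A minimal Python sketch, assuming circular orbits, negligible planet mass and one-solar-mass hosts (the planet labels are only illustrative):

    # Kepler's third law: P[yr]^2 = a[AU]^3 / M[Msun]
    def period_days(a_au, m_star_msun=1.0):
        return 365.25 * (a_au**3 / m_star_msun) ** 0.5

    for label, a_au, quoted_days in [("HD 4760 planet", 1.10, 434),
                                     ("TYC 0434-04538-1 planet", 0.66, 193),
                                     ("HD 96992 planet", 1.24, 514)]:
        print(f"{label}: ~{period_days(a_au):.0f} d predicted vs {quoted_days} d quoted")

The predicted values (about 421, 196 and 504 days) land within a few percent of the quoted periods; the small differences reflect the actual stellar masses, which are close to, but not exactly, one solar mass.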

Credit: 
Nicolaus Copernicus University in Torun

How a plant regulates its growth

image: If the growth processes of plants are disturbed, the roots no longer grow to the center of the earth and flower and seed formation is massively disrupted.

Image: 
U. Hammes / TUM

Plants grow towards the light. This phenomenon, which already fascinated Charles Darwin, has been observed by everyone who owns houseplants. Thus, the plant ensures that it can make the best use of light to photosynthesize and synthesize sugars. Similarly, the roots grow into the soil to ensure that the plant is supplied with water and nutrients.

These growth processes are controlled by a hormone called "auxin", which plays a key role in the formation of polarity in plants. To do this, auxin is transported in a polar manner through the plant body, from the shoot down into the roots. In this process, a family of polar transport proteins distributes the auxin throughout the plant. To better understand this process, the research team investigated it in more detail with the help of a chemical.

How the herbicide naptalam works

Scientists around the world are studying transporter proteins in more detail due to their central role in plant development processes. Naptalam (NPA) is an important tool to elucidate the structure of the transporters.

Naptalam is the registered name of N-1-naphthylphthalamic acid. It inhibits the directional flow of auxin, thus severely inhibiting plant growth. It was used in the European Union until 2002, and the sodium salt of naptalam is still used in the USA as a pre-emergence herbicide to control broadleaf weeds in cucurbits and nursery stock.

"We wanted to know how naptalam exerts its effects," says PD Dr. Ulrich Hammes, the study's principal investigator. "Our studies show that the activity of the auxin transporters is really completely shut down by the inhibitor." When NPA binds to the transporter proteins, auxin can no longer get out of the cell, and thus the plant is no longer able to grow polarly. The roots no longer grow to the center of the earth, and flowers and seed formations are massively disrupted.

An effect of the inhibitor NPA on the activators of the transporters, known as kinases, could be ruled out through collaboration with Claus Schwechheimer, Professor of Plant Systems Biology at TUM, where the work was carried out. He explains, "This makes it clear that the inhibitor NPA acts directly on the transport proteins."

How transport proteins contribute to plant development

"We can now clearly explain the molecular mechanism by which polar plant growth can be disrupted pharmacologically," says Ulrich Hammes.

The research groups in Vienna were able to show that naptalam not only binds the transporters, but also prevents the transporters from binding to each other. "This mechanism of binding to each other seems to apply universally in the family of auxin transporters, as we observed the effect in all transporters studied," says Martina Kolb, first author of the study.

Better understanding of molecular relationships

Overall, the study provides a significant step forward in understanding the mechanism of the molecular machinery of plant polarity. The new findings make it possible to study polar growth more precisely and to understand the molecular mechanism of auxin transport.

Credit: 
Technical University of Munich (TUM)

Meteorites remember conditions of stellar explosions

image: Artist illustration of the formation of the solar system, capturing the moment where radioactive nuclei got incorporated into solids that would become meteorites.

Image: 
Bill Saxton / NSF / AUI / NRAO

A team of international researchers went back to the formation of the solar system 4.6 billion years ago to gain new insights into the cosmic origin of the heaviest elements on the periodic table.

Led by scientists who collaborate as part of the International Research Network for Nuclear Astrophysics (IReNA) (irenaweb.org) and the Joint Institute for Nuclear Astrophysics - Center for the Evolution of the Elements (JINA-CEE) (jinaweb.org), the study is published in the latest issue of the journal Science.

Heavy elements we encounter in our everyday life, like iron and silver, did not exist at the beginning of the universe, 13.7 billion years ago. They were created over time through nuclear reactions, called nucleosynthesis, that combined lighter atoms into heavier ones. In particular, iodine, gold, platinum, uranium, plutonium, and curium, some of the heaviest elements, were created by a specific type of nucleosynthesis called the rapid neutron capture process, or r process.

The question of which astronomical events can produce the heaviest elements has been a mystery for decades. Today, it is thought that the r process can occur during violent collisions between two neutron stars, between a neutron star and a black hole, or during rare explosions following the death of massive stars. Such highly energetic events occur very rarely in the universe. When they do, neutrons are incorporated in the nucleus of atoms, then converted into protons. Since elements in the periodic table are defined by the number of protons in their nucleus, the r process builds up heavier nuclei as more neutrons are captured.

Some of the nuclei produced by the r process are radioactive and take millions of years to decay into stable nuclei. Iodine-129 and curium-247 are two such nuclei that were produced before the formation of the sun. They were incorporated into solids that eventually fell on the earth's surface as meteorites. Inside these meteorites, the radioactive decay generated an excess of stable nuclei. Today, this excess can be measured in laboratories in order to figure out the amount of iodine-129 and curium-247 that were present in the solar system just before its formation.

Why are these two r-process nuclei so special? They have a peculiar property in common: they decay at almost exactly the same rate. In other words, the ratio between iodine-129 and curium-247 has not changed since their creation, billions of years ago.
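To see why nearly identical decay rates freeze the ratio, here is a small illustration. The half-lives below are approximate literature values (about 15.7 million years for iodine-129 and 15.6 million years for curium-247) and the elapsed time is an arbitrary example, not a figure from the study:

    import math

    T_I129, T_Cm247 = 15.7, 15.6          # half-lives in millions of years (assumed)
    lam_I = math.log(2) / T_I129
    lam_Cm = math.log(2) / T_Cm247

    t = 100.0                              # Myr of decay, purely illustrative
    frac_I = math.exp(-lam_I * t)          # surviving fraction of iodine-129 (~1.2%)
    frac_Cm = math.exp(-lam_Cm * t)        # surviving fraction of curium-247 (~1.2%)
    print(f"I-129 left: {frac_I:.4f}, Cm-247 left: {frac_Cm:.4f}")
    print(f"Ratio drift: factor {frac_I / frac_Cm:.3f}")

Each isotope individually decays away almost completely, yet their ratio drifts by only a few percent over the same interval - which is what makes the ratio such a robust fossil of the last r-process event.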

"This is an amazing coincidence, particularly given that these nuclei are two of only five radioactive r-process nuclei that can be measured in meteorites," says Benoit Côté from the Konkoly Observatory, the leader of the study. "With the iodine-129 to curium-247 ratio being frozen in time, like a prehistoric fossil, we can have a direct look into the last wave of heavy element production that built up the composition of the solar system, and everything within it."

Iodine, with its 53 protons, is more easily created than curium with its 96 protons. This is because it takes more neutron capture reactions to reach curium's higher number of protons. As a consequence, the iodine-129 to curium-247 ratio highly depends on the amount of neutrons that were available during their creation.

The team calculated the iodine-129 to curium-247 ratios synthesized by collisions between neutron stars and black holes to find the right set of conditions that reproduce the composition of meteorites. They concluded that the amount of neutrons available during the last r-process event before the birth of the solar system could not be too high. Otherwise, too much curium would have been created relative to iodine. This implies that very neutron-rich sources, such as the matter ripped off the surface of a neutron star during a collision, likely did not play an important role.

So what created these r-process nuclei? While the researchers could provide new and insightful information regarding how they were made, they could not pin down the nature of the astronomical object that created them. This is because nucleosynthesis models are based on uncertain nuclear properties, and it is still unclear how to link neutron availability to specific astronomical objects such as massive star explosions and colliding neutron stars.

"But the ability of the iodine-129 to curium-247 ratio to peer more directly into the fundamental nature of heavy element nucleosynthesis is an exciting prospect for the future," said Nicole Vassh from the University of Notre Dame, coauthor of the study.

With this new diagnostic tool, advances in the fidelity of astrophysical simulations and in the understanding of nuclear properties could reveal which astronomical objects created the heaviest elements of the solar system.

"Studies like this are only possible when you bring together a multidisciplinary team, where each collaborator contributes to a distinct piece of the puzzle. The JINA-CEE 2019 Frontiers meeting provided the ideal environment to formalize the collaboration that led to the current result," Côté said.

Credit: 
Michigan State University Facility for Rare Isotope Beams

Twin atoms: A source for entangled particles

image: The atom chip used for the quantum experiments

Image: 
TU Wien

Heads or tails? If we toss two coins into the air, the result of one coin toss has nothing to do with the result of the other. Coins are independent objects. In the world of quantum physics, things are different: quantum particles can be entangled, in which case they can no longer be regarded as independent individual objects, they can only be described as one joint system.

For years, it has been possible to produce entangled photons - pairs of light particles that move in completely different directions but still belong together. Spectacular results have been achieved, for example in the field of quantum teleportation or quantum cryptography. Now, a new method has been developed at TU Wien (Vienna) to produce entangled atom pairs - and not just atoms which are emitted in all directions, but well-defined beams. This was achieved with the help of ultracold atom clouds in electromagnetic traps.

Entangled particles

"Quantum entanglement is one of the essential elements of quantum physics," says Prof. Jörg Schmiedmayer from the Institute of Atomic and Subatomic Physics at TU Wien. "If particles are entangled with each other, then even if you know everything there is to know about the total system, you still cannot say anything at all about one specific particle. Asking about the state of one particular particle makes no sense, only the overall state of the total system is defined."

There are different methods of creating quantum entanglement. For example, special crystals can be used to create pairs of entangled photons: a photon with high energy is converted by the crystal into two photons of lower energy - this is called "down conversion". This allows large numbers of entangled photon pairs to be produced quickly and easily.

Entangling atoms, however, is much more difficult. Individual atoms can be entangled using complicated laser operations - but then you only get a single pair of atoms. Random processes can also be used to create quantum entanglement: if two particles interact with each other in a suitable way, they can turn out to be entangled afterwards. Molecules can be broken up, creating entangled fragments. But these methods cannot be controlled. "In this case, the particles move in random directions. But when you do experiments, you want to be able to determine exactly where the atoms are moving," says Jörg Schmiedmayer.

The twin pair

Controlled twin pairs could now be produced at TU Wien with a novel trick: a cloud of ultracold atoms is created and held in place by electromagnetic forces on a tiny chip. "We manipulate these atoms so that they do not end up in the state with the lowest possible energy, but in a state of higher energy," says Schmiedmayer. From this excited state, the atoms then spontaneously return to the ground state with the lowest energy.

However, the electromagnetic trap is constructed in such a way that this return to the ground state is physically impossible for a single atom - this would violate the conservation of momentum. The atoms can therefore only be transferred to the ground state as pairs, flying away in opposite directions so that their total momentum remains zero. This creates twin atoms that move exactly in the direction specified by the geometry of the electromagnetic trap on the chip.
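The pair emission can be pictured with elementary momentum and energy bookkeeping: two identical atoms starting at rest must fly apart with equal and opposite momenta, each carrying half of the released excitation energy. A toy calculation (the atomic species and energy scale below are assumptions for illustration, not the experiment's parameters):

    import math

    kB = 1.38e-23                     # J/K
    m = 87 * 1.66e-27                 # kg, assuming rubidium-87 atoms for illustration
    E = kB * 2e-6                     # J, an assumed excitation energy of ~2 microkelvin
    p = math.sqrt(m * E)              # per atom: E/2 = p**2 / (2*m), equal and opposite momenta
    v = p / m
    print(f"Each atom recoils at roughly {v * 1e3:.1f} mm/s, in exactly opposite directions")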

The double-slit experiment

The trap consists of two elongated, parallel waveguides. The pair of twin atoms may have been created in the left or in the right waveguide - or, as quantum physics allows, in both simultaneously. "It's like the well-known double-slit experiment, where you shoot a particle at a wall with two slits," says Jörg Schmiedmayer. "The particle can pass through both the left and the right slit at the same time, behind which it interferes with itself, and this creates wave patterns that can be measured."

The same principle can be used to prove that the twin atoms are indeed entangled particles: only if you measure the entire system - i.e. both atoms at the same time - can you detect the wave-like superpositions typical of quantum phenomena. If, on the other hand, you restrict yourself to a single particle, the wave superposition disappears completely.

"This shows us that in this case it makes no sense to look at the particles individually," explains Jörg Schmiedmayer. "In the double-slit experiment, the superpositions disappear as soon as you measure whether the particle goes through the left or the right slit. As soon as this information is available, the quantum superposition is destroyed. It is very similar here: if the atoms are entangled and you only measure one of them, you could theoretically still use the other atom to measure whether they both originated in the left or the right part of the trap. Therefore, the quantum superpositions are destroyed."

Now that it has been proven that ultracold atom clouds can indeed be used to reliably produce entangled twin atoms in this way, further quantum experiments are to be carried out with these atom pairs - similar to those that have already been possible with photon pairs.

Credit: 
Vienna University of Technology

Scientists propose a new heavy particle similar to the Higgs boson

image: Simulation of a collision in the Large Hadron Collider, producing the Higgs boson. © 1997-2021 CERN (License: CC-BY-SA-4.0)

Image: 
university of granada

Unlike the Higgs boson, discovered at CERN's Large Hadron Collider in 2012 after a 40-year quest, the new particle proposed by these researchers is so heavy that it could not be produced directly even in this collider

The University of Granada is among the participants in this major scientific advancement in Theoretical Physics, which could help unravel the mysteries of dark matter

Scientists from the University of Granada (UGR) and the Johannes Gutenberg University Mainz (Germany) have recently published a study in which they endeavour to extend the Standard Model of particle physics (the equivalent of 'the periodic table' for particle physics) and answer some of the questions that this model is unable to answer. Such puzzles include: What is dark matter made of? Why do the various constituents of fermionic dark matter have such different masses? Or, why is the force of gravity much weaker than electromagnetic interaction?

This work, published in the European Physical Journal C, is based on the existence of a dimension in spacetime that is "so small that we can only detect evidence of it through its indirect effects," explains one of the authors of the article, Adrián Carmona, Athenea3i Fellow at the UGR and a member of the Department of Theoretical Physics and the Cosmos.

As early as the 1920s, in an attempt to unify the forces of gravity and electromagnetism, Theodor Kaluza and Oskar Klein speculated on the existence of an extra dimension beyond the familiar three space dimensions and time (which, in physics, are combined into a 4-dimensional spacetime).

Such models became popular in the 1990s, when theoretical physicists realized that theories with curved extra dimensions could explain some of the major mysteries in this field. However, despite their many strengths, such models generally lacked a viable dark-matter candidate.

Now, more than 20 years later, Adrián Carmona and collaborators from the University of Mainz, Professor Matthias Neubert and doctoral student Javier Castellano, have predicted the existence of a new heavy particle in these models with properties similar to those of the famous Higgs boson.

A new dimension

"This particle could play a fundamental role in the generation of masses of all the particles sensitive to this extra dimension, and at the same time be the only relevant window to a possible dark sector responsible for the existence of dark matter, which would simultaneously solve two of the biggest problems of these theories that, a priori, appear disconnected," explains the UGR researcher.

However, unlike the Higgs boson, which was discovered at CERN's Large Hadron Collider in 2012 after a 40-year quest, the new particle proposed by these researchers is so heavy that it could not be produced directly even in this, the highest-energy particle collider in the world.

In the article, the researchers explore other possible ways of discovering this particle by looking for clues about the physics surrounding a very early stage in the history of our universe, when dark matter was produced.

Credit: 
University of Granada

The CLASP2 space experiment achieves an unprecedented map of the Sun's magnetic field

image: Artistic visualization of the Sun's magnetic field in the active region observed by CLASP2

Image: 
Gabriel Pérez Díaz, SMM (IAC).

Every day, space telescopes provide spectacular images of solar activity. However, their instruments are blind to its main driver: the magnetic field in the outer layers of the solar atmosphere, where the explosive events that occasionally affect the Earth occur. The extraordinary observations of the polarization of the Sun's ultraviolet light achieved by the CLASP2 mission have made it possible to map the magnetic field throughout the entire solar atmosphere, from the photosphere to the base of the extremely hot corona. This investigation, published today in the journal Science Advances, has been carried out by the international team responsible for this suborbital experiment, which includes several scientists of the POLMAG group of the Instituto de Astrofísica de Canarias (IAC).

The chromosphere is a very important region of the solar atmosphere spanning a few thousand kilometers between the relatively thin and cool photosphere (with temperatures of a few thousand degrees) and the hot and extended corona (with temperatures above a million degrees). Although the temperature of the chromosphere is about one hundred times lower than that of the corona, the chromosphere has a far higher density, and thus much more energy is required to sustain it. Moreover, the mechanical energy necessary to heat the corona needs to traverse the chromosphere, making it a crucial interface region for the solution of many of the key problems in solar and stellar physics. One of the current scientific challenges is to understand the origin of the violent activity of the solar atmosphere, which on some occasions perturbs the Earth's magnetosphere with serious consequences for our present technological world.

"It is impossible to understand the solar atmosphere if we cannot determine the magnetic fields of the chromosphere, especially in its outer layers where the plasma temperature is of the order of ten thousand degrees and the magnetic forces dominate the structure and dynamics of the plasma", says Javier Trujillo Bueno, CSIC Professor at the IAC and lead scientist of the POLMAG group of the IAC (see http://research.iac.es/proyecto/polmag/). The theoretical investigations carried out by this group, funded by an Advanced Grant of the European Research Council, indicated that this goal could be reached by observing the polarization that various physical mechanisms produce in the radiation emitted by neutral hydrogen and ionized magnesium atoms in the solar chromosphere.

Because the Earth's atmosphere strongly absorbs solar ultraviolet radiation, it must be observed from altitudes above 100 kilometers. An international consortium was established with this goal, led by the NASA Marshall Space Flight Center (NASA/MSFC), the National Astronomical Observatory of Japan (NAOJ), the French Institute of Space Astrophysics (IAS) and the Spanish Instituto de Astrofísica de Canarias (IAC). This international team designed a series of space experiments that were selected through competitive calls within the Sounding Rocket Program of NASA. These space experiments are known as CLASP, the "Chromospheric Lyman-Alpha Spectro-Polarimeter" (CLASP1, launched on the 3rd of September 2015) and the "Chromospheric LAyer Spectro-Polarimeter" (CLASP2, launched on the 11th of April 2019). Both experiments were a great success (see the POLMAG project webpage http://research.iac.es/proyecto/polmag/pages/news-and-press-releases/clasp.php), which NASA has acknowledged by granting the "Group Achievement Honor Award" to the international team.

The research paper recently published in the prestigious journal "Science Advances" is based on a small part of the unprecedented data acquired by CLASP2. The team analyzed the intensity and the circular polarization of the ultraviolet radiation emitted by an active region of the solar atmosphere in the spectral range containing the h & k lines of Mg II (ionized magnesium) around 2800 Å (see figure 1). Within this spectral region there are also two spectral lines produced by Mn I (neutral manganese) atoms.

The circular polarization observed by CLASP2 arises from a physical mechanism known as the Zeeman effect, through which the radiation emitted by atoms in the presence of a magnetic field is polarized. "The circular polarization signals of the magnesium (Mg II) lines are sensitive to the magnetic fields in the middle and outer regions of the solar chromosphere, whereas the circular polarization of the manganese (Mn I) lines responds to the magnetic fields in the deepest regions of the chromosphere", explains Tanausú del Pino Alemán, one of the scientists of the POLMAG group and of the international team.
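For context on how the Zeeman effect turns field strength into a measurable polarization signal, the standard wavelength-splitting formula can be evaluated for the Mg II lines near 2800 Å. The field strength and effective Landé factor below are arbitrary illustrative values, not CLASP2 results:

    # Zeeman splitting (standard solar-physics form, Gaussian units):
    # delta_lambda [Angstrom] = 4.67e-13 * g_eff * (lambda [Angstrom])**2 * B [Gauss]
    def zeeman_split_angstrom(lambda_angstrom, b_gauss, g_eff=1.0):
        return 4.67e-13 * g_eff * lambda_angstrom**2 * b_gauss

    # Mg II k line near 2800 Angstrom with an assumed 300 Gauss chromospheric field
    print(f"{zeeman_split_angstrom(2800.0, 300.0):.1e} Angstrom")   # ~1.1e-3 Angstrom

Because this splitting is tiny compared with the width of the line, it is detected through the circular polarization it induces rather than as a visible separation of the line itself.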

While CLASP2 was carrying out its observations, the Hinode space telescope was simultaneously pointing at the same active region on the solar disk. "This made it possible to obtain information about the magnetic field in the photosphere through the polarization observed in neutral iron (Fe I) spectral lines of the visible range of the spectrum", notes Andrés Asensio Ramos, another IAC researcher who participated in the project. The team also made simultaneous observations with the IRIS space telescope, measuring the intensity of the ultraviolet radiation with higher spatial resolution (IRIS was not designed to measure polarization).

The team's investigation, led by Dr. Ryohko Ishikawa (NAOJ) and Dr. Javier Trujillo Bueno (IAC), made it possible to map, for the first time, the magnetic field in the active region observed by CLASP2 throughout its entire atmosphere, from the photosphere to the base of the corona (see figure 2). "This mapping of the magnetic field at various heights in the solar atmosphere is of great scientific interest, as it will help us decipher the magnetic coupling between the different regions of the solar atmosphere", comments Ernest Alsina Ballester, a researcher of the international team who just joined the IAC after his first postdoc in Switzerland.

These results confirm that, in these regions of the solar atmosphere, the lines of force of the magnetic field expand and fill the whole chromosphere before reaching the base of the corona. Another important result of this investigation is that the magnetic field strength in the outer layers of the chromosphere is strongly correlated with the radiation intensity at the center of the magnesium lines and with the electron pressure in the same layers, revealing the magnetic origin of the heating in the outer regions of the solar atmosphere.

The CLASP1 and CLASP2 space experiments represent a milestone in astrophysics, providing the first observations of the relatively weak polarization signals produced by various physical mechanisms in spectral lines of the solar ultraviolet spectrum. Such observations have spectacularly confirmed previous theoretical predictions, thus validating the quantum theory of the generation and transfer of polarized radiation that these scientists apply in their investigations about the magnetic field in the solar chromosphere.

The international team has just received the good news that NASA has selected their recent proposal to carry out a new space experiment next year, which will allow them to map the magnetic field over larger regions of the solar disk. "Of course, systematic observations of the intensity and polarization of the solar ultraviolet radiation will require a space telescope equipped with instruments like the ones on CLASP, because the few minutes of observation time allowed by a suborbital flight experiment are not sufficient", clarifies Javier Trujillo Bueno. The team is convinced that, thanks to what CLASP1 and CLASP2 have achieved, such space telescopes will soon become a reality and the physical interpretation of their spectropolarimetric observations will allow for a better understanding of the magnetic activity in the outer layers of the Sun and other stars.

Credit: 
Instituto de Astrofísica de Canarias (IAC)

Ultrafast electron dynamics in space and time

video: PTCDA molecules on a copper oxide surface are used as a sample. A molecular electron is excited by a laser pulse into another orbital, changing its spatial distribution. The electron in the excited orbital has a finite lifetime, which can be measured by ultrafast orbital tomography.

Image: 
Philipps-Universität Marburg / Till Schürmann

"For decades, chemistry has been governed by two ambitions goals," says Professor Stefan Tautz, head of the Quantum Nanoscience subinstitute at Forschungszentrum Jülich. "One of these is understanding chemical reactions directly from the spatial distribution of electrons in molecules, while the other is tracing electron dynamics over time during a chemical reaction." Both of these goals have been achieved in separate ground-breaking discoveries in chemistry: frontier molecular orbital theory explained the role of the electron distribution in molecules during chemical reactions, while femtosecond spectroscopy made it possible to observe transition states in reactions. "It has long been a dream of physical chemistry to combine these two developments and to then trace electrons in a chemical reaction in time and space."

The scientists have now come a huge step closer to achieving this goal: they observed electron transfer processes at a metal-molecule interface in space and time. Such interfaces are the focus of research in the German Research Foundation's Collaborative Research Centre 1083 at Philipps-Universität Marburg, and it was experiments conducted here that led to today's publication. "Interfaces initially appear to be no more than two layers side by side, whereas they are in fact the place where the functions of materials come into being. They therefore play a decisive role in technological applications," says Ulrich Höfer, professor of experimental physics at Philipps-Universität Marburg and collaborative research centre spokesman. In organic solar cells, for example, combining different materials at an interface improves the splitting of the states excited by incident light, thus allowing electricity to flow. Interfaces also play a key role in organic light-emitting diode (OLED) displays used in smartphones, for example.

The experimental approach used by the scientists is based on a breakthrough made a few years ago in molecular spectroscopy: photoemission orbital tomography, which itself is based on the well-known photoelectric effect. "Here, a layer of molecules on a metal surface is bombarded with photons, or particles of light, which excites the electrons and causes them to be released," says Professor Peter Puschnig from the University of Graz. "These released electrons do not simply fly around in space, but - and this is the decisive point - based on their angular distribution and energy distribution, they provide a good indication of the spatial distribution of electrons in molecular orbitals."

"The key result of our work is that we can image the orbital tomograms with ultrahigh resolution over time," says Dr. Robert Wallauer, group leader and research assistant at Philipps-Universität Marburg. To do so, the scientists not only used special lasers with ultrashort pulses in the femtosecond range to excite the electrons in the molecules; they also used a novel impulse microscope which simultaneously measured the direction and energy of the electrons released with very high sensitivity. One femtosecond is 10-15 seconds - a millionth of a billionth of a second. In relation to a second, this is as little as a second in relation to 32 million years. Such short pulses are like a kind of strobe light and can be used to break down fast processes into individual images. This enabled the researchers to trace the electron transfer as if in slow motion. "This allowed us to spatially trace the electron excitation pathways almost in real time," says Tautz. "In our experiment, an electron was first excited from its initial state into an unoccupied molecular orbital by a first laser pulse before a second laser pulse enabled it to finally reach the detector. Not only could we observe this process in detail over time, but the tomograms also allowed us to clearly trace where the electrons came from."

"We believe that our findings represent a crucial breakthrough towards the goal of tracing electrons through chemical reactions in space and time," says Ulrich Höfer. "In addition to the fundamental insights into chemical reactions and electron transfer processes, these findings will also have very practical implications. They open up countless possibilities for the optimization of interfaces and nanostructures and the resulting processors, sensors, displays, organic solar cells, catalysts, and potentially even applications and technologies we haven't even thought of yet."

Credit: 
Forschungszentrum Juelich

Investigating the wave properties of matter with vibrating molecules

image: HD+ molecular ions (yellow and red dot pairs) in an ion trap (grey) are irradiated by a laser wave (red). This causes quantum jumps, whereby the vibrational state of the molecular ions changes. (HHU / Soroosh Alighanbari)

Image: 
HHU / Soroosh Alighanbari

Almost 100 years ago, a revolutionary discovery was made in the field of physics: microscopic matter exhibits wave properties. Over the decades, ever more precise experiments have been used to measure the wave properties of electrons in particular. These experiments were mostly based on spectroscopic analysis of the hydrogen atom, and they made it possible to verify the accuracy of the quantum theory of the electron.

For heavy elementary particles - for example protons - and nuclides (atomic nuclei), it is difficult to measure their wave properties accurately. In principle, however, these properties can be seen everywhere. In molecules, the wave properties of atomic nuclei are obvious and can be observed in the internal vibrations of the nuclei against each other. Such vibrations are made possible by the electrons in molecules, which create a bond between the nuclei that is 'soft' rather than rigid. For example, nuclear vibrations occur in every molecular gas under normal conditions, such as in air.

The wave properties of the nuclei are demonstrated by the fact that the vibration cannot have an arbitrary strength - i.e. energy - as would be the case with a pendulum for example. Instead, only precise, discrete values known as 'quantized' values are possible for the energy.

A quantum jump from the lowest vibrational energy state to a higher energy state can be achieved by radiating light onto the molecule, whose wavelength is precisely set so that it corresponds exactly to the energy difference between the two states.

To investigate the wave properties of nuclides very accurately, one needs both a very precise measuring method and a very precise knowledge of the binding forces in the specific molecule, because these determine the details of the wave motion of the nuclides. This then makes it possible to test fundamental laws of nature by comparing their specific statements for the nuclide investigated with the measurement results.

Unfortunately, it is not yet possible to make precise theoretical predictions regarding the binding forces of molecules in general - the quantum theory to be applied is mathematically too complex to handle. Consequently, it is not possible to investigate the wave properties in any given molecule accurately. This can only be achieved with particularly simple molecules.

Together with its long-standing cooperation partner V. I. Korobov from the Bogoliubov Laboratory of Theoretical Physics at the Joint Institute for Nuclear Research in Dubna, Russia, Prof. Schiller's research team is dedicated to precisely one such molecule, namely the hydrogen molecular ion HD+. HD+ consists of a proton (p) and the nuclide deuteron (d). The two are linked together by a single electron. The relative simplicity of this molecule means that extremely accurate theoretical calculations can now be performed. It was V.I. Korobov who achieved this, after refining his calculations continuously for over twenty years.

For charged molecules such as the hydrogen molecular ion, an accessible yet highly precise measuring technique did not exist until recently. Last year, however, the team led by Prof. Schiller developed a novel spectroscopy technique for investigating the rotation of molecular ions. The radiation used in that case is referred to as 'terahertz radiation', with a wavelength of about 0.2 mm.

The team has now been able to show that the same approach also works for excitation of molecular vibrations using radiation with a wavelength that is 50 times shorter. To do this, they had to develop a particularly frequency-sharp laser that is one of a kind worldwide.
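For orientation, the frequencies and photon energies implied by the wavelengths quoted above (about 0.2 mm for rotational excitation and roughly 50 times shorter, about 4 micrometres, for vibrational excitation) can be worked out directly; these are derived numbers, not values taken from the paper:

    h, c = 6.626e-34, 3.0e8                      # Planck constant [J*s], speed of light [m/s]
    for label, lam in [("rotation, ~0.2 mm", 0.2e-3),
                       ("vibration, ~4 um", 0.2e-3 / 50)]:
        f = c / lam                              # photon frequency
        E_meV = h * f / 1.602e-19 * 1000         # photon energy in milli-electronvolts
        print(f"{label}: {f / 1e12:.1f} THz, {E_meV:.0f} meV")

The vibrational transition thus sits near 75 THz, in the mid-infrared part of the spectrum.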

They demonstrated that this extended spectroscopy technique resolves the radiation wavelength for vibrational excitation 10,000 times better than previous techniques used for molecular ions. Systematic disturbances of the vibrational states of the molecular ions, for example through interfering electrical and magnetic fields, could also be suppressed by a factor of 400.

Ultimately, it emerged that the prediction of quantum theory regarding the behaviour of the atomic nuclei - proton and deuteron - was consistent with the experiment to within a relative uncertainty of less than 3 parts in 100 billion.

If it is assumed that V.I. Korobov's prediction based on quantum theory is complete, the result of the experiment can also be interpreted differently - namely as the determination of the ratio of electron mass to proton mass. The value derived corresponds very well with the values determined by experiments by other working groups using completely different measuring techniques.

Prof. Schiller emphasises: "We were surprised at how well the experiment worked. And we believe that the technology we developed is applicable not only to our 'special' molecule but also in a much wider context. It will be exciting to see how quickly the technology is adopted by other working groups."

Credit: 
Heinrich-Heine University Duesseldorf

Lakes isolated beneath Antarctic ice could be more amenable to life than thought

image: Ellsworth Mountains, on transit to Subglacial Lake Ellsworth, December 2012

Image: 
Peter Bucktrout, British Antarctic Survey

Lakes underneath the Antarctic ice sheet could be more hospitable than previously thought, allowing them to host more microbial life.

This is the finding of a new study that could help researchers determine the best spots to search for microbes that could be unique to the region, having been isolated and evolving alone for millions of years. The work could even provide insights into similar lakes beneath the surfaces of icy moons orbiting Jupiter and Saturn, and the southern ice cap on Mars.

Lakes can form beneath the thick ice sheet of Antarctica where the weight of ice causes immense pressure at the base, lowering the melting point of ice. This, coupled with gentle heating from rocks below and the insulation provided by the ice from the cold air above, allows pools of liquid water to accumulate.
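The size of this pressure effect can be estimated from the Clausius-Clapeyron relation. The sketch below uses textbook properties of ice and water and an assumed ice thickness; it is an order-of-magnitude illustration, not a number from the study:

    rho_ice, g, h = 917.0, 9.81, 3500.0     # ice density [kg/m^3], gravity [m/s^2], assumed ice thickness [m]
    T, L = 273.15, 334e3                    # melting point [K], latent heat of fusion [J/kg]
    v_water, v_ice = 1.000e-3, 1.091e-3     # specific volumes [m^3/kg]

    pressure = rho_ice * g * h              # ~31 MPa at the base of the ice sheet
    dT_dP = T * (v_water - v_ice) / L       # Clausius-Clapeyron slope, ~ -7.4e-8 K/Pa
    print(f"~{pressure / 1e6:.0f} MPa of overburden lowers the melting point by ~{abs(dT_dP) * pressure:.1f} K")

A few kilometres of ice therefore lower the melting point by roughly 2 to 3 degrees, which, together with geothermal heating from below and insulation from above, is enough to keep liquid water at the bed.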

More than 400 of these 'subglacial' lakes have been discovered beneath the Antarctic ice sheet, many of which have been isolated from each other and the atmosphere for millions of years.

This means that any life in these lakes could be just as ancient, providing insights into how life might adapt and evolve under persistent extreme cold conditions, which have occurred previously in Earth's history.

Expeditions have successfully drilled into two small subglacial lakes at the edge of the ice sheet, where water can rapidly flow in or out. These investigations revealed microbial life beneath the ice, but whether larger lakes isolated beneath the central ice sheet contain and sustain life remains an open question.

Now, in a study published today in Science Advances, researchers from Imperial College London, the University of Lyon and the British Antarctic Survey have shown subglacial lakes may be more hospitable than they first appear.

As they have no access to sunlight, microbes in these environments do not gain energy through photosynthesis, but by processing chemicals. These are concentrated in sediments on the lake beds, where life is thought to be most likely.

However, for life to be more widespread, and therefore easier to sample and detect, the water in the lake must be mixed - must move around - so that sediments, nutrients and oxygen can be more evenly distributed.

In lakes on the surface of the Earth, this mixing is caused by the wind and by heating from the sun, causing convection currents. As neither of these can act on subglacial lakes, it could be assumed there is no mixing.

However, the team behind the new study found that another source of heat is sufficient to cause convection currents in most subglacial lakes. The heat is geothermal: rising from the interior of the Earth and generated by the combination of heat left over from the formation of the planet and the decay of radioactive elements.

The researchers calculated that this heat can stimulate convection currents in subglacial lakes that suspend small particles of sediment and move oxygen around, allowing more of the water body to be hospitable to life.

Lead researcher Dr Louis Couston, from the University of Lyon and the British Antarctic Survey said: "The water in lakes isolated under the Antarctic ice sheet for millions of years is not still and motionless; the flow of water is actually quite dynamic, enough to cause fine sediment to be suspended in the water. With dynamic flow of water, the entire body of water may be habitable, even if more life remains focused on the floors.
"This changes our appreciation of how these habitats work, and how in future we might plan to sample them when their exploration takes place."

The researchers' predictions may soon be tested, as a team from the UK and Chile plan to sample a lake called Lake CECs in the next few years. Samples taken throughout the depth of the lake water will show just where microbial life is found.

The predictions could also be used to generate theories about life elsewhere in the Solar System, as co-author Professor Martin Siegert, Co-Director of the Grantham Institute - Climate Change and Environment at Imperial, explains: "Our eyes now turn to predicting the physical conditions in liquid water reservoirs on icy moons and planets. The physics of subglacial water pockets is similar on Earth and icy moons, but the geophysical setting is quite different, which means that we're working on new models and theories.

"With new missions targeting icy moons and increasing computing capabilities, it's a great time for astrobiology and the search for life beyond the Earth."

Credit: 
Imperial College London

Comet or asteroid: What killed the dinosaurs and where did it come from?

image: Artist's rendering of a comet headed towards Earth.

Image: 
Credit not required. This is a public domain image.

It forever changed history when it crashed into Earth about 66 million years ago.

The Chicxulub impactor, as it's known, left behind a crater off the coast of Mexico that spans 93 miles and runs 12 miles deep. Its devastating impact brought the reign of the dinosaurs to an abrupt and calamitous end by triggering their sudden mass extinction, along with the end of almost three-quarters of the plant and animal species living on Earth.

The enduring puzzle: Where did the asteroid or comet originate, and how did it come to strike Earth? Now, a pair of researchers at the Center for Astrophysics | Harvard & Smithsonian believe they have the answer.

In a study published today in Nature's Scientific Reports, Harvard University astrophysics undergraduate student Amir Siraj and astronomer Avi Loeb put forth a new theory that could explain the origin and journey of this catastrophic object.

Using statistical analysis and gravitational simulations, Siraj and Loeb calculate that a significant fraction of long-period comets originating from the Oort cloud, an icy sphere of debris at the edge of the solar system, can be bumped off-course by Jupiter's gravitational field during orbit.

"The solar system acts as a kind of pinball machine," explains Siraj, who is pursuing bachelor's and master's degrees in astrophysics, in addition to a master's degree in piano performance at the New England Conservatory of Music. "Jupiter, the most massive planet, kicks incoming long-period comets into orbits that bring them very close to the sun."

During close passage to the sun, the comets -- nicknamed "sungrazers" -- can experience powerful tidal forces that break apart pieces of the rock and ultimately produce cometary shrapnel.

"In a sungrazing event, the portion of the comet closer to the sun feels a stronger gravitational pull than the part that is further, resulting in a tidal force across the object," Siraj says. "You can get what's called a tidal disruption event, in which a large comet breaks up into many smaller pieces. And crucially, on the journey back to the Oort cloud, there's an enhanced probability that one of these fragments hit the Earth."

The new calculations from Siraj and Loeb's theory increase the chances of long-period comets impacting Earth by a factor of about 10, and show that about 20 percent of long-period comets become sungrazers.

The pair say that their new rate of impact is consistent with the age of Chicxulub, providing a satisfactory explanation for its origin and other impactors like it.

"Our paper provides a basis for explaining the occurrence of this event," Loeb says. "We are suggesting that, in fact, if you break up an object as it comes close to the sun, it could give rise to the appropriate event rate and also the kind of impact that killed the dinosaurs."

Evidence found at the Chicxulub crater suggests the rock was composed of carbonaceous chondrite. Siraj and Loeb's hypothesis might also explain this unusual composition.

A popular theory on the origin of Chicxulub claims that the impactor originated from the main belt, an asteroid population between the orbits of Mars and Jupiter. However, carbonaceous chondrites are rare among main-belt asteroids yet possibly widespread among long-period comets, providing additional support to the cometary impact hypothesis.

Other similar craters display the same composition. This includes an object that hit about 2 billion years ago and left the Vredefort crater in South Africa, which is the largest confirmed crater in Earth's history, and the impactor that left the Zhamanshin crater in Kazakhstan, which is the largest confirmed crater within the last million years. The researchers say that the timing of these impacts supports their calculations on the expected rate of Chicxulub-sized tidally disrupted comets.

Siraj and Loeb say their hypothesis can be tested by further studying these craters, others like them, and even ones on the surface of the moon to determine the composition of the impactors. Space missions sampling comets can also help.

Aside from the composition of comets, the new Vera Rubin Observatory in Chile may be able to observe tidal disruption of long-period comets after it becomes operational next year.

"We should see smaller fragments coming to Earth more frequently from the Oort cloud," Loeb says. "I hope that we can test the theory by having more data on long-period comets, get better statistics, and perhaps see evidence for some fragments."

Loeb says understanding this is not just crucial to solving a mystery of Earth's history but could prove pivotal if such an event were to threaten the planet.

"It must have been an amazing sight, but we don't want to see that again," he said.

Credit: 
Center for Astrophysics | Harvard & Smithsonian

A new way to look for life-sustaining planets

image: The Very Large Telescope, or VLT, at the Paranal Observatory in Chile's Atacama Desert. VLT's instrumentation was adapted to conduct a search for planets in the Alpha Centauri system as part of the Breakthrough initiatives. This stunning image of the VLT is painted with the colors of sunset and reflected in water on the platform.

Image: 
A. Ghizzi Panizza/ESO

It is now possible to capture images of planets that could potentially sustain life around nearby stars, thanks to advances reported by an international team of astronomers in the journal Nature Communications.

Using a newly developed system for mid-infrared exoplanet imaging, in combination with a very long observation time, the study's authors say they can now use ground-based telescopes to directly capture images of planets about three times the size of Earth within the habitable zones of nearby stars.

Efforts to directly image exoplanets - planets outside our solar system - have been hamstrung by technological limitations, resulting in a bias toward the detection of easier-to-see planets that are much larger than Jupiter and are located around very young stars and far outside the habitable zone - the "sweet spot" in which a planet can sustain liquid water. If astronomers want to find alien life, they need to look elsewhere.

"If we want to find planets with conditions suitable for life as we know it, we have to look for rocky planets roughly the size of Earth, inside the habitable zones around older, sun-like stars," said the paper's first author, Kevin Wagner, a Sagan Fellow in NASA's Hubble Fellowship Program at the University of Arizona's Steward Observatory.

The method described in the paper provides more than a tenfold improvement over existing capabilities to directly observe exoplanets, Wagner said. Most studies on exoplanet imaging have looked in infrared wavelengths of less than 10 microns, stopping just short of the range of wavelengths where such planets shine the brightest, Wagner said.

"There is a good reason for that because the Earth itself is shining at you at those wavelengths," Wagner said. "Infrared emissions from the sky, the camera and the telescope itself are essentially drowning out your signal. But the good reason to focus on these wavelengths is that's where an Earthlike planet in the habitable zone around a sun-like star is going to shine brightest."

The team used the Very Large Telescope, or VLT, of the European Southern Observatory in Chile to observe our closest neighbor star system: Alpha Centauri, just 4.4 light-years away. Alpha Centauri is a triple star system; it consists of two stars - Alpha Centauri A and B - that are similar to the sun in size and age and orbit each other as a binary system. The third star, Alpha Centauri C, better known as Proxima Centauri, is a much smaller red dwarf orbiting its two siblings at a great distance.

A planet not quite twice the size of Earth and orbiting in the habitable zone around Proxima Centauri has already been indirectly detected through observations of the star's radial velocity variation, or the tiny wobble a star exhibits under the tug of the unseen planet. According to the study's authors, Alpha Centauri A and B could host similar planets, but indirect detection methods are not yet sensitive enough to find rocky planets in their more widely separated habitable zones, Wagner explained.

"With direct imaging, we can now push beneath those detection limits for the first time," he said.

To boost the sensitivity of the imaging setup, the team used a so-called adaptive secondary telescope mirror that can correct for the distortion of the light by the Earth's atmosphere. In addition, the researchers used a starlight-blocking mask that they optimized for the mid-infrared light spectrum to block the light from one of the stars at a time. To enable observing both stars' habitable zones simultaneously, they also pioneered a new technique to switch back and forth between observing Alpha Centauri A and Alpha Centauri B very rapidly.

"We're moving one star on and one star off the coronagraph every tenth of a second," Wagner said. "That allows us to observe each star for half of the time, and, importantly, it also allows us to subtract one frame from the subsequent frame, which removes everything that is essentially just noise from the camera and the telescope."

Using this approach, the undesired starlight and "noise" - unwanted signal from within the telescope and camera - become essentially random background noise, possible to further reduce by stacking images and subtracting the noise using specialized software.

Similar to the effect of noise-canceling headphones, which allow soft music to be heard over a steady stream of unwanted jet engine noise, the technique allowed the team to remove as much of the unwanted noise as possible and detect the much fainter signals created by potential planet candidates inside the habitable zone.
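The chop-and-subtract scheme can be sketched in a few lines of NumPy. This is a toy illustration of the idea rather than the team's actual reduction pipeline; the frame sizes, noise levels and injected source are all made up:

    import numpy as np

    # Frames alternate between star A and star B behind the coronagraph. Subtracting
    # consecutive frames cancels the slowly varying thermal background; averaging many
    # differenced pairs beats down the remaining random noise.
    rng = np.random.default_rng(0)
    n_pairs, ny, nx = 1000, 64, 64
    background = 1000.0 + rng.normal(0, 50, (ny, nx))      # shared sky/telescope emission
    planet = np.zeros((ny, nx))
    planet[40, 22] = 5.0                                   # faint injected source near star A

    frames_A = background + planet + rng.normal(0, 20, (n_pairs, ny, nx))
    frames_B = background + rng.normal(0, 20, (n_pairs, ny, nx))

    diff_stack = (frames_A - frames_B).mean(axis=0)        # background cancels, noise averages down
    print("Recovered signal at the injected pixel:", round(float(diff_stack[40, 22]), 2))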

The team observed the Alpha Centauri system for nearly 100 hours over the course of a month in 2019, collecting more than 5 million images. They collected about 7 terabytes of data, which they made publicly available at http://archive.eso.org.

"This is one of the first dedicated multi-night exoplanet imaging campaigns, in which we stacked all of the data we accumulated over nearly a month and used that to achieve our final sensitivity," Wagner said.

After removing so-called artifacts - false signals created by the instrumentation and residual light from the coronagraph - the final image revealed a light source designated as "C1" that could potentially hint at the presence of an exoplanet candidate inside the habitable zone.

"There is one point source that looks like what we would expect a planet to look like, that we can't explain with any of the systematic error corrections," Wagner said. "We are not at the level of confidence to say we discovered a planet around Alpha Centauri, but there is a signal there that could be that with some subsequent verification."

Simulations of what planets within the data are likely to look like suggest that "C1" could be a Neptune- to Saturn-sized planet at a distance from Alpha Centauri A that is similar to the distance between the Earth and the sun, Wagner said. However, the authors clearly state that without subsequent verification, the possibility that C1 might be due to some unknown artifact caused by the instrument itself cannot be ruled out just yet.

Finding a potentially habitable planet within Alpha Centauri has been the goal of the initiative Breakthrough Watch/NEAR, which stands for New Earths in the Alpha Centauri Region. Breakthrough Watch is a global astronomical program looking for Earthlike planets around nearby stars.

"We are very grateful to the Breakthrough Initiatives and ESO for their support in achieving another steppingstone towards the imaging of Earthlike planets around our neighbor stars," said Markus Kasper, lead scientist of the NEAR project and a co-author on the paper.

The team intends to embark on another imaging campaign in a few years, in an attempt to catch this potential exoplanet in the Alpha Centauri system at a different point along its path and to see whether its motion is consistent with models of its expected orbit. Further clues may come from follow-up observations using different methods.

The next generation of extremely large telescopes, such as the European Southern Observatory's Extremely Large Telescope and the Giant Magellan Telescope, for which the University of Arizona produces the primary mirrors, are expected to increase direct observations of nearby stars that might harbor planets in their habitable zones by a factor of 10, Wagner explained. Candidates to look at include Sirius, the brightest star in the night sky, and Tau Ceti, which hosts an indirectly observed planetary system that Wagner and his colleagues will try to directly image.

"Making the capability demonstrated here a routine observing mode - to be able to pick up heat signatures of planets orbiting within the habitable zones of nearby stars - will be a game changer for the exploration of new worlds and for the search for life in the universe," said study co-author Daniel Apai, a UArizona associate professor of astronomy and planetary science who leads the NASA-funded Earths in Other Solar Systems program that partly supported the study.

Credit: 
University of Arizona

Embry-Riddle alumna helps unravel key mysteries of rare stars

image: Dr. Noel Richardson, assistant professor of Physics and Astronomy at Embry-Riddle, mentored now-alumna Laura M. Lee, who helped determine the visual orbit and dynamical mass of the Wolf-Rayet 133 binary system as part of her capstone thesis project.

Image: 
Embry-Riddle/Jason Kadah

Within the constellation Cygnus, an elderly star and its massive companion are having one last hurrah, flinging off mass at an incredible rate before they explode as supernovae and collapse into a black hole.

Now, researchers including recent Embry-Riddle Aeronautical University graduate Laura M. Lee have mapped the elderly star's orbit around its oversized and equally ancient partner. In a scientific first, they have also determined the dynamical mass of both stars that make up a binary system called Wolf-Rayet 133.

The team's findings, published Feb. 9, 2021, in the Astrophysical Journal Letters, mark the first-ever visually observed orbit of a rare type of star called a nitrogen-rich Wolf-Rayet (WN) star. The WN star in question is half of the starry dance duo in the WR 133 binary.

The WN star pirouettes around its partner star, an O9 supergiant, every 112.8 days - a relatively brief orbit, indicating that the two stars are close together, researchers reported. The WN star has 9.3 times more mass than our Sun, while the O9 supergiant is a whopping 22.6 times more massive, the team found.

Imagining the Early Universe

The research opens a new window to the distant past when stars and planets were first beginning to form.

Wolf-Rayet type stars, so named for the astronomers who discovered them in 1867, are massive stars near the end of their lives, said Lee's faculty mentor Dr. Noel Richardson, assistant professor of Physics and Astronomy at Embry-Riddle. They're very hot, a million times more luminous than the Sun, and stellar winds have stripped off their hydrogen envelopes. That has made it difficult to measure their mass - a vital step toward modeling the evolution of stars - until now.

Because the pair of stars in the WR 133 binary are tightly coupled, they've likely exchanged mass, Richardson noted. "In the early universe, we think most stars were very, very massive and they probably exploded early on," he said.

"When these types of binary stars are close enough, they can transfer mass to each other, possibly kicking up space dust, which is necessary for the formation of stars and planets. If they're not close enough to transfer mass, they're still whipping up a huge wind that shoots material into the cosmos, and that can also allow stars and planets to form. This is why we want to know more about this rare type of star."

Lee was still an undergraduate at Embry-Riddle when Richardson invited her to help solve an intriguing astronomy riddle, as part of her senior capstone project. Richardson had been analyzing data from the CHARA Array, a collection of six telescopes positioned across California's Mount Wilson. The array, operated by Georgia State University's Center for High Angular Resolution Astronomy, can pluck out celestial details smaller than the angular size of a dime in New York City as seen from the telescopes near Los Angeles.

Lee's specific task was to make sense of about 100 spectra - barcode-like graphs that reveal how much light a star is giving off. To better understand WR 133's spectra, provided by Grant M. Hill of the Keck Observatory in Hawaii, Lee used computer code that allowed the team to measure how the two stars were moving. "These measurements are a necessary step because they tell us how the stars move back and forth from us, while the CHARA measurements told us how they move across the sky," Richardson explained. "The combination gives us the ability to see a three-dimensional orbit, which then tells us the masses."
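As a rough, back-of-the-envelope illustration (not part of the published analysis), Kepler's third law ties the reported 112.8-day period and the combined dynamical mass of roughly 9.3 + 22.6 solar masses to the physical size of the orbit:

```python
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # solar mass in kg
AU = 1.496e11            # astronomical unit in m

P = 112.8 * 86400.0                  # orbital period in seconds
M_total = (9.3 + 22.6) * M_SUN       # combined dynamical mass reported by the team

# Kepler's third law: a^3 = G * M_total * P^2 / (4 * pi^2)
a = (G * M_total * P**2 / (4.0 * math.pi**2)) ** (1.0 / 3.0)
print(f"semi-major axis ~ {a / AU:.2f} AU")   # roughly 1.4 AU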

At the time, Lee was laser-focused on earning her Embry-Riddle degree. "I didn't really realize how big of an impact we were making in this field," said Lee, a member of the Sigma Pi Sigma physics honors society who now holds an Astronomy degree with a Mathematics minor. "It was pretty exciting to be a part of the project, especially as an undergraduate student."

'A Blue Marble in Space'

At the Armagh Observatory & Planetarium in Northern Ireland, one of the many institutions involved in the project, Andreas A.C. Sander said the team's findings were somewhat surprising and will prompt researchers to rethink key assumptions. "The results are very interesting as they yield a lower mass than expected for such a star," Sander noted.

"While this might sound like a detail, it will change our perception of the Black Holes resulting from collapsing Wolf-Rayet stars, a crucial ingredient in the astrophysical context of gravitational wave events."

Gail Schaefer of the CHARA Array noted that Richardson's observations using the Georgia State University (GSU) telescopes on Mount Wilson - made possible through an open-access program at the facility - "will help improve our understanding of how binary interactions impact the evolution of these massive stars."

Astronomer Jason Aufdenberg of Embry-Riddle, who has also used the CHARA Array, said that "the kind of work Noel is doing, establishing orbits, is very important because they can get the masses of these things. Knowing about these very hot stars, how many there were and their luminosities is all part of understanding what happened in our universe after the Big Bang."

Now at the beginning of her career, Lee said she hopes to keep learning and being amazed by our universe. "We are on a blue marble floating in space," she said. "It's important to learn more about the complexities of the universe around us. Humans are born to learn. Any knowledge we can gain is a gift."

Credit: 
Embry-Riddle Aeronautical University

Can super-Earth interior dynamics set the table for habitability?

image: An illustration showing how a combination of static high-pressure synthesis techniques and dynamic methods enabled the researchers to probe the magnesium silicate bridgmanite, believed to be predominant in the mantles of rocky planets, under extreme conditions mimicking the interior of a super-Earth.

Image: 
Image is courtesy of Yingwei Fei. Sandia Z Machine photograph by Randy Montoya, courtesy of Sandia National Laboratories.

Washington, DC-- New research led by Carnegie's Yingwei Fei provides a framework for understanding the interiors of super-Earths--rocky exoplanets between 1.5 and 2 times the size of our home planet--which is a prerequisite to assess their potential for habitability. Planets of this size are among the most abundant in exoplanetary systems. The paper is published in Nature Communications.

"Although observations of an exoplanet's atmospheric composition will be the first way to search for signatures of life beyond Earth, many aspects of a planet's surface habitability are influenced by what's happening beneath the planet's surface, and that's where Carnegie researcher's longstanding expertise in the properties of rocky materials under extreme temperatures and pressures comes in," explained Earth and Planets Laboratory Director Richard Carlson.

On Earth, the interior dynamics and structure of the silicate mantle and metallic core drive plate tectonics, and generate the geodynamo that powers our magnetic field and shields us from dangerous ionizing particles and cosmic rays. Life as we know it would be impossible without this protection. Similarly, the interior dynamics and structure of super-Earths will shape the surface conditions of the planet.

With the exciting discoveries of a diversity of rocky exoplanets in recent decades, a key question arises: are much-more-massive super-Earths capable of creating conditions that are hospitable for life to arise and thrive?

Knowledge of what's occurring beneath a super-Earth's surface is crucial for determining whether or not a distant world is capable of hosting life. But the extreme conditions of super-Earth-mass planetary interiors challenge researchers' ability to probe the material properties of the minerals likely to exist there.

That's where lab-based mimicry comes in.

For decades, Carnegie researchers have been leaders at recreating the conditions of planetary interiors by putting small samples of material under immense pressures and high temperatures. But sometimes even these techniques reach their limitations.

"In order to build models that allow us to understand the interior dynamics and structure of super-Earths, we need to be able to take data from samples that approximate the conditions that would be found there, which could exceed 14 million times atmospheric pressure," Fei explained. "However, we kept running up against limitations when it came to creating these conditions in the lab. "

A breakthrough occurred when the team--including Carnegie's Asmaa Boujibar and Peter Driscoll, along with Christopher Seagle, Joshua Townsend, Chad McCoy, Luke Shulenburger, and Michael Furnish of Sandia National Laboratories--was granted access to the world's most powerful, magnetically-driven pulsed power machine (Sandia's Z Pulsed Power Facility) to directly shock a high-density sample of bridgmanite--a high-pressure magnesium silicate that is believed to be predominant in the mantles of rocky planets--in order to expose it to the extreme conditions relevant to the interior of super-Earths.

A series of hypervelocity shockwave experiments on representative super-Earth mantle material provided density and melting temperature measurements that will be fundamental for interpreting the observed masses and radii of super-Earths.

The researchers found that under pressures representative of super-Earth interiors, bridgmanite has a very high melting point, which would have important implications for interior dynamics. Under certain thermal evolutionary scenarios, they say, massive rocky planets might have a thermally driven geodynamo early in their evolution, then lose it for billions of years when cooling slows down. A sustained geodynamo could eventually be re-started by the movement of lighter elements through inner core crystallization.

"The ability to make these measurements is crucial to developing reliable models of the internal structure of super-Earths up to eight times our planet's mass," Fei added. "These results will make a profound impact on our ability to interpret observational data."

Credit: 
Carnegie Institution for Science

Drop the stress

image: High-resolution confocal microscopy images of mCherry-tagged NELF-A protein (red) in human HeLa cells. The wild-type NELF-A protein forms stress-induced condensates upon heat shock (left), whereas a protein lacking the IDR region fails to form these condensates (right). Scale bar: 5μm.

Image: 
© MPI of Immunobiology and Epigenetics, P. Rawat

All life on Earth has evolved multiple layers and networks for ensuring survival upon catastrophic events. Even cells have their emergency plan: the heat shock response. Triggered by multiple stress stimuli such as heat, toxins, or radiation, this cellular safety program tries to prevent permanent damage to the organism. The response resembles the "lockdown" strategies widely adopted during the global coronavirus pandemic: only essential activities are permitted, and resources are diverted towards measures that minimize the impact of the crisis.

Under normal conditions, RNA polymerase II rushes down the DNA. At the correct places, the DNA is transcribed into mRNA, which is then translated into proteins. In a crisis, however, this transcription activity must largely come to a standstill to shut down or minimize the production of proteins that are not essential during stress conditions. "This move releases necessary capacities to ramp up the production of RNA and proteins called molecular chaperones, which help to cope with the threat and effects of stress. The question remains: how do you place an entire cell under lockdown?" says Ritwick Sawarkar, group leader at the MPI of Immunobiology and Epigenetics and the University of Cambridge.

NELF condensation upon stress ensures attenuation of gene transcription

Earlier studies by the Sawarkar lab gave first insights into what happens in cells when they switch from normal to emergency mode. Stress causes the accumulation of the negative elongation factor (NELF) in the nucleus and stops transcription at a vast number of genes. But exactly how the transcriptional regulator NELF executes this so-called Stress-Induced Transcriptional Attenuation (SITA) remained unknown.

"At the start of this project, we tried to visualize the NELF protein with live-cell imaging to understand its role and regulation better. Surprisingly, we discovered that NELF forms puncta or droplets upon stress whereas the same protein remains diffused under no stress conditions. We called these droplets NELF condensates", says Prashant Rawat, first-author of the study. Together with the Lab of Patrick Cramer at the MPI for Biophysical Chemistry who could recapitulate the same NELF droplets in vitro with recombinant purified proteins, the teams propose that the stress induced biomolecular condensation facilitates an enhanced recruitment of NELF to the promoter regions of genes. Here, the NELF droplets presumably block the activity of the polymerase and drive the downregulation of gene expression.

NELF tentacle-driven condensation

NELF subunits contain so-called intrinsically disordered regions (IDRs). IDRs are parts of proteins with no fixed structure that act as tentacles. The Max Planck scientists were able to show that interactions between these NELF tentacles are essential for condensation. "Many individual NELF molecules come together and their tentacles engage strongly with each other to form the droplet, just like holding each other's hands. But what puzzled us the most was that NELF always contains IDRs as part of its structure, yet only undergoes condensation upon stress," says Prashant Rawat.

Using genome- and proteome-wide molecular and biochemical approaches, the team identified specific post-translational modifications (PTMs) that are essential for NELF condensation. PTMs are changes made to proteins after their synthesis and are often used by cells to respond to environmental stimuli. The results show that two different modifications make NELF condensation possible. "We found that stress-contingent changes in NELF phosphorylation and further SUMOylation govern NELF condensation," says Ritwick Sawarkar.

NELF condensation relevant for cellular fitness

Cells that fail to form the NELF droplets because of an impaired IDR or a SUMOylation deficiency also fail to downregulate genes and transcription upon stress. "If cells fail to go into lockdown through NELF condensation and transcriptional downregulation, they risk their fitness. Our data show significantly higher death rates in cells lacking proper NELF condensation during stress," says Prashant Rawat.

For Ritwick Sawarkar, these results also highlight the collaborative aspects of life at the Max Planck Institutes. "This research only became possible because of close cooperation. Andrea Pichler's lab at the MPI-IE was key to understanding the SUMO machinery's role, while another collaboration with Patrick Cramer's lab at the MPI-BPC in Göttingen could recapitulate the same NELF droplets in vitro with recombinant purified proteins," says Ritwick Sawarkar, lead author of the study.

Stress-induced transcriptional downregulation is already speculated to be associated with neurological disorders such as Huntington's disease. "We have already generated mouse models at the institute to extend our findings in vivo and to relevant disease models," says Prashant Rawat. Exploring the role of NELF condensates in different diseases is an exciting avenue for future research in the lab.

Credit: 
Max Planck Institute of Immunobiology and Epigenetics

What did the Swiss eat during the Bronze Age?

The Bronze Age (2200 to 800 BC) marked a decisive step in the technological and economic development of ancient societies. People living at the time faced a series of challenges: changes in the climate, the opening up of trade and a degree of population growth. How did they respond to changes in their diet, especially in Western Switzerland? A team from the University of Geneva (UNIGE), Switzerland, and Pompeu Fabra University (UPF) in Spain has for the first time carried out isotopic analyses on human and animal skeletons together with plant remains. The scientists discovered that manure use had become widespread over time to improve crop harvests in response to demographic growth. The researchers also found that there had been a radical change in dietary habits following the introduction of new cereals, such as millet. In fact, the spread of millet reflected the need to embrace new crops following the drought that ravaged Europe during this period. Finally, the team showed that the resources consumed were mainly terrestrial. The research results are published in the journal PLOS ONE.

Today, archaeological resources for studying the Bronze Age are limited. "This is partly down to changes in funeral rituals," begins Mireille David-Elbiali, an archaeologist in the Laboratory of Prehistoric Archaeology and Anthropology in the F.-A. Forel Department in UNIGE's Faculty of Sciences. "People gradually abandoned the inhumation practice in favour of cremation, thereby drastically reducing the bone material needed for research. And yet the Bronze Age signals the beginning of today's societies with the emergence of metallurgy." As the name suggests, societies began working with bronze, an alloy consisting of copper and tin. "And this development in metallurgy called for more intensive trade so they could obtain the essential raw materials. This increased the circulation of traditional crafts, prestigious goods, religious concepts and, of course, people between Europe and China," continues the archaeologist.

Diet imprinted in bones

The Neolithic Age marked the inception of animal husbandry and the cultivation of wheat and barley. But what about the diet in the subsequent Bronze Age? Archaeobotany and archaeozoology have been routinely used to reconstruct the diet, environment, agricultural practices and animal husbandry of the Bronze Age, but these methods only provide general information. "For the first time, we decided to answer this question precisely by analysing human and animal skeletons directly. This meant we could study the stable isotopes from the collagen of the bones and teeth and define the living conditions of these individuals," continues Alessandra Varalli, a researcher in UPF's Department of Human Sciences and the study's first author. "In fact, we are what we eat," points out Marie Besse, a professor in the Laboratory of Prehistoric Archaeology and Anthropology in the F.-A. Forel Department at UNIGE. "Biochemical analyses of bones and teeth tell us what types of resources have been consumed." Forty-one human skeletons, 22 animal skeletons and 30 plant samples from sites in Western Switzerland and Haute-Savoie (France) were studied, ranging from the beginning to the end of the Bronze Age.
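The stable isotopes measured in collagen studies of this kind are conventionally the carbon and nitrogen isotope ratios, reported in delta notation relative to international standards. The sketch below is a generic illustration of that convention; the reference ratios are approximate and the sample values hypothetical, not taken from this study.

```python
# Delta notation used in collagen stable-isotope work:
#   delta = (R_sample / R_standard - 1) * 1000   (in per mil)
# Reference ratios below are the conventional standards
# (VPDB for 13C/12C, atmospheric N2 "AIR" for 15N/14N), given approximately.

R_VPDB = 0.011180   # approximate 13C/12C of the VPDB standard
R_AIR = 0.003676    # approximate 15N/14N of atmospheric N2

def delta(r_sample: float, r_standard: float) -> float:
    """Return the delta value in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Hypothetical measured ratios for a bone-collagen sample:
print(f"d13C = {delta(0.010960, R_VPDB):+.1f} per mil")
print(f"d15N = {delta(0.003713, R_AIR):+.1f} per mil")
```

Higher nitrogen delta values in plants, for example, are the kind of signal the team interpreted as evidence of manuring.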

No differences between men, women and children

The study's first outcome showed that there was no difference between the diets of men and women, and that there were no drastic changes in diet between childhood and adulthood for these individuals. "So, there was no specific strategy for feeding children, just as men didn't eat more meat or dairy products than women. What's more, when it comes to the origin of the proteins consumed, it was found that although Western Switzerland is home to a lake and rivers, the diet was mainly based on terrestrial animals and plants, to the exclusion of fish and other freshwater resources," adds Dr Varalli. But the main interest of the study lies in the plants, which reveal societal upheavals.

Agriculture adapted to climate change

"During the early Bronze Age (2200 to 1500 BC), agriculture was mainly based on barley and wheat, two cereals of Near Eastern origin that were grown from the Neolithic Age in Europe», explains Dr Varalli. "But from the late Late Bronze Age (1300 to 800 BC), we note that millet was introduced, a plant from Asia that grows in a more arid environmen." In addition, nitrogen isotopes revealed that manuring was used more intensively. "The analysis of several plant species from different phases of the Bronze Age suggests that there was an increase in soil fertilisation over time. This was most likely to boost the production of agricultural crops."

These two discoveries combined seem to confirm the general aridity that prevailed in Europe during this period, which meant agriculture had to adapt, and that there was heightened trade with different cultures, such as those of Northern Italy or the Danube region, leading to the introduction of millet into Western Switzerland. These new cereals might have played an important role in the security of supply, and perhaps contributed to the population increase observed in the Late Bronze Age. In fact, these cereals grow more quickly and are more resistant to drought, at a time when the climate was relatively warm and dry. Finally, the use of fertiliser went hand-in-hand with a general improvement in techniques, both agricultural and artisanal. "This first study on changes in diet in Western Switzerland during the Bronze Age corroborates what we know about the period. But it also demonstrates the richness of the widespread intercultural exchanges," states Professor Besse with enthusiasm. We still have much to learn about this millennium, in spite of the scientific problems related to the paucity of available material. "This is one of the reasons that led me to excavate the Eremita cave with UNIGE students. Located in the Piedmont region of Italy, it is dated to the Middle Bronze Age, around 1600 BC," concludes Professor Besse.

Credit: 
Université de Genève