Washington, D.C.—For several decades, scientists have thought that the Solar System formed when a shock wave from an exploding star—a supernova—triggered the collapse of a dense, dusty gas cloud, which then contracted to form the Sun and the planets. But detailed models of this formation process have only worked under the simplifying assumption that the temperatures during the violent events remained constant. Now, astrophysicists at the Carnegie Institution's Department of Terrestrial Magnetism (DTM) have shown for the first time that a supernova could indeed have triggered the Solar System's formation under the more likely conditions of rapid heating and cooling. The results, published in the October 20, 2008, issue of the Astrophysical Journal, resolve this long-standing debate.
"We've had chemical evidence from meteorites that points to a supernova triggering our Solar System's formation since the 1970s," remarked lead author, Carnegie's Alan Boss. "But the devil has been in the details. Until this study, scientists have not been able to work out a self-consistent scenario, where collapse is triggered at the same time that newly created isotopes from the supernova are injected into the collapsing cloud."
Short-lived radioactive isotopes—versions of elements with the same number of protons, but a different number of neutrons—found in very old meteorites decay on time scales of millions of years and turn into different (so-called daughter) elements. Finding the daughter elements in primitive meteorites implies that the parent short-lived radioisotopes must have been created only a million or so years before the meteorites themselves were formed. "One of these parent isotopes, iron-60, can be made in significant amounts only in the potent nuclear furnaces of massive or evolved stars," explained Boss. "Iron-60 decays into nickel-60, and nickel-60 has been found in primitive meteorites. So we've known where and when the parent isotope was made, but not how it got here."
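The timing argument above is simple exponential decay: a parent isotope's abundance falls by half every half-life, so finding its daughter locked inside a meteorite means the rock solidified before the parent had vanished. The sketch below is a minimal illustration, not a calculation from the study; it assumes a half-life of about 2.6 million years for iron-60 (roughly the currently measured value), and the exact number does not change the conclusion.

```python
def remaining_fraction(elapsed_myr, half_life_myr):
    """Fraction of a radioactive parent isotope still present after
    elapsed_myr million years, from N(t) = N0 * 0.5**(t / t_half)."""
    return 0.5 ** (elapsed_myr / half_life_myr)

# Assumed half-life for iron-60, in millions of years (illustrative value).
FE60_HALF_LIFE_MYR = 2.6

# If the meteorites formed only ~1 Myr after the supernova, most of the
# iron-60 was still "live" and could decay into nickel-60 inside the rock;
# after ~100 Myr essentially none would remain to leave that trace.
for elapsed_myr in (1, 5, 10, 100):
    frac = remaining_fraction(elapsed_myr, FE60_HALF_LIFE_MYR)
    print(f"{elapsed_myr:>4} Myr after the supernova: "
          f"{frac:.2e} of the original iron-60 remains")
```

About three-quarters of the iron-60 survives a 1-million-year gap, while a gap of 100 million years would leave less than a trillionth of it, which is why the nickel-60 excess points to a supernova shortly before the meteorites formed.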
Previous models by Boss and former DTM Fellow Prudence Foster showed that the isotopes could be deposited into a pre-solar cloud if a shock wave from a supernova explosion slowed to 6 to 25 miles per second and the wave and cloud had a constant temperature of -440°F (10 K). "Those models didn't work if the material was heated by compression and cooled by radiation, and this conundrum has left serious doubts in the community about whether a supernova shock started these events over four billion years ago or not," remarked Harri Vanhala, who found the negative result in his Ph.D. thesis work at the Harvard-Smithsonian Center for Astrophysics in 1997.
Using an adaptive mesh refinement hydrodynamics code, FLASH2.5, designed to handle shock fronts, as well as an improved cooling law, the Carnegie researchers considered several different situations. In all of the models, the shock front struck a pre-solar cloud with the mass of our Sun, consisting of dust, water, carbon monoxide, and molecular hydrogen; the shock-heated gas reached temperatures as high as 1,340°F (1,000 K). In the absence of cooling, the cloud could not collapse. With the new cooling law, however, they found that after 100,000 years the pre-solar cloud was 1,000 times denser than before, and that heat from the shock front was rapidly lost, leaving only a thin layer with temperatures close to 1,340°F (1,000 K). After 160,000 years, the cloud center had collapsed to become a million times denser, forming the protosun. The researchers found that isotopes from the shock front were mixed into the protosun in a manner consistent with their origin in a supernova.
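The role of the cooling law can be illustrated with a deliberately crude, zero-dimensional parcel model. This is only a sketch of the competition between compressive heating and radiative cooling, not the three-dimensional FLASH2.5 calculation or its actual cooling law; the adiabatic index and the `cooling_efficiency` parameter are assumptions chosen for illustration. Without cooling, squeezing the gas raises its temperature and pressure until collapse stalls; with efficient cooling, the compressed gas stays near its original 10 K and gravity can win.

```python
# Toy, zero-dimensional sketch: an adiabatically compressed ideal gas heats
# as T = T0 * (rho/rho0)**(gamma - 1), but if radiation carries that heat
# away quickly, the parcel stays cold and thermal pressure cannot halt the
# collapse.
GAMMA = 5.0 / 3.0        # monatomic ideal gas (assumed for illustration)
T_INITIAL_K = 10.0       # pre-shock cloud temperature quoted in the article

def adiabatic_temperature(compression_factor):
    """Temperature after compression with no radiative losses."""
    return T_INITIAL_K * compression_factor ** (GAMMA - 1.0)

def cooled_temperature(compression_factor, cooling_efficiency):
    """Crude stand-in for a cooling law: a fraction `cooling_efficiency`
    of the compressive heating above T_INITIAL_K is radiated away."""
    t_hot = adiabatic_temperature(compression_factor)
    return T_INITIAL_K + (1.0 - cooling_efficiency) * (t_hot - T_INITIAL_K)

# Density increases of the order reported in the article.
for compression in (10, 1_000, 1_000_000):
    hot = adiabatic_temperature(compression)
    cold = cooled_temperature(compression, cooling_efficiency=0.999)
    print(f"{compression:>9,}x denser: no cooling ~{hot:,.0f} K, "
          f"with strong cooling ~{cold:,.0f} K")
```

In this toy picture, a thousandfold compression without cooling heats the gas to roughly 1,000 K, while with strong cooling it stays near 10 K, mirroring the article's contrast between the non-collapsing, uncooled case and the thin hot layer left when heat is radiated away quickly.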
"This is the first time a detailed model for a supernova triggering the formation of our solar system has been shown to work,'' said Boss. "We started with a Little Bang 9 billion years after the Big Bang."
Source: Carnegie Institution