More than 90 years ago, astronomer Edwin Hubble observed the first hint of the rate at which the universe expands, called the Hubble constant.
Almost immediately, astronomers began arguing about the constant's actual value, and over time they realized that measurements based on the early universe disagreed with measurements based on the late universe.
Early in the universe's existence, light moved through a hot plasma--there were no stars yet--and from oscillations in that plasma, similar to sound waves, scientists deduced that the Hubble constant was about 67 kilometers per second per megaparsec. This means the universe expands about 67 kilometers per second faster for every 3.26 million light-years (one megaparsec) of distance.
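To make that figure concrete, here is a minimal sketch, in Python with illustrative numbers, of Hubble's law, v = H0 * d, which is exactly what "67 kilometers per second per megaparsec" expresses:

```python
# Minimal sketch of Hubble's law, v = H0 * d: recession velocity grows
# linearly with distance. H0 = 67 km/s/Mpc is the early-universe value
# quoted above; one megaparsec is about 3.26 million light-years.

H0_EARLY = 67.0  # km/s per megaparsec

def recession_velocity(distance_mpc: float) -> float:
    """Recession velocity in km/s of a source at the given distance in Mpc."""
    return H0_EARLY * distance_mpc

# A galaxy 100 Mpc (about 326 million light-years) away recedes at ~6,700 km/s.
print(recession_velocity(100.0))  # 6700.0
```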
But the measured value differs when scientists look at the universe's later life, after stars were born and galaxies had formed. The gravity of these objects causes what's called gravitational lensing, which distorts light traveling between a distant source and its observer.
Other phenomena in this late universe include extreme explosions and events tied to the ends of stars' lives. Based on these later-life observations, scientists calculated a different value, around 74 kilometers per second per megaparsec. This discrepancy is called the Hubble tension.
Now, an international team including a University of Michigan physicist has analyzed a database of more than 1,000 supernova explosions, supporting the idea that the Hubble constant might not actually be constant.
Instead, it may change as the universe expands, growing over cosmic time. This explanation likely requires new physics to account for the increasing rate of expansion, such as a modified version of Einstein's gravity.
The team's results are published in the Astrophysical Journal.
"The point is that there seems to be a tension between the larger values for late universe observations and lower values for early universe observation," said Enrico Rinaldi, a research fellow in the U-M Department of Physics. "The question we asked in this paper is: What if the Hubble constant is not constant? What if it actually changes?"
The researchers used a dataset of supernovae--spectacular explosions that mark the final stage of a star's life. When supernovae shine, they emit a specific type of light; the researchers focused on one class, Type Ia supernovae.
Type Ia supernovae were used to discover that the universe's expansion is accelerating, Rinaldi said, and they are known as "standard candles," like a series of lighthouses with the same lightbulb. If scientists know their intrinsic luminosity, they can calculate their distance by measuring how bright they appear in the sky.
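The standard-candle logic can be stated as a one-line calculation. The sketch below assumes the inverse-square law for flux; the luminosity and flux values are illustrative, not taken from the study:

```python
import math

# Standard-candle sketch, assuming the inverse-square law
# F = L / (4 * pi * d**2): knowing the intrinsic luminosity L and measuring
# the flux F gives the distance d = sqrt(L / (4 * pi * F)).
# All numbers below are illustrative, not from the paper.

def distance_from_flux(luminosity_watts: float, flux_watts_per_m2: float) -> float:
    """Distance in meters to a source of known luminosity and measured flux."""
    return math.sqrt(luminosity_watts / (4.0 * math.pi * flux_watts_per_m2))

# A Type Ia supernova peaks at roughly 1e36 W (order of magnitude); a measured
# flux of 1e-14 W/m^2 then implies a distance of about 2.8e24 m (~91 Mpc).
print(distance_from_flux(1e36, 1e-14))
```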
Next, the astronomers used what's called "redshift" to calculate how the universe's rate of expansion might have increased over time. Redshift is the name of the phenomenon that occurs when light stretches as the universe expands.
The essence of Hubble's original observation is that the farther a source is from the observer, the more its light's wavelength is lengthened--like a Slinky tacked to a wall, stretching as you walk away holding the other end. Redshift and distance are therefore related.
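As a minimal sketch, the redshift z referred to here is the fractional stretching of a wavelength; the wavelengths in the example are illustrative:

```python
# Redshift as assumed here: z = (observed - emitted) / emitted wavelength.
# Cosmic expansion stretches wavelengths, so more distant sources show larger z.

def redshift(lambda_observed_nm: float, lambda_emitted_nm: float) -> float:
    """Fractional stretch of a spectral line's wavelength."""
    return (lambda_observed_nm - lambda_emitted_nm) / lambda_emitted_nm

# Hydrogen-alpha emitted at 656.3 nm and observed at 721.9 nm gives z ~ 0.1.
print(redshift(721.9, 656.3))  # ~0.09995
```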
In their analysis, the researchers separated the supernovae into intervals of redshift. They placed the stars in one interval of distance in one "bin," then an equal number of stars in the next interval in another bin, and so on. The closer the bin is to Earth, the younger the stars in it are. Each bin has a fixed reference value of redshift, and by comparing the bins, the researchers could extract a Hubble constant for each one.
"If it's a constant, then it should not be different when we extract it from bins of different distances. But our main result is that it actually changes with distance," Rinaldi said. "The tension of the Hubble constant can be explained by some intrinsic dependence of this constant on the distance of the objects that you use."
Additionally, the researchers found that letting the Hubble constant change with redshift allows them to smoothly "connect" the value of the constant from early universe probes with the value from late universe probes, Rinaldi said.
"The extracted parameters are still compatible with the standard cosmological understanding that we have," he said. "But this time they just shift a little bit as we change the distance, and this small shift is enough to explain why we have this tension."
The researchers say there are several possible explanations for this apparent change in the Hubble constant--one being the possibility of observational biases in the data sample. To help correct for potential biases, astronomers are using Hyper Suprime-Cam on the Subaru Telescope to observe fainter supernovae over a wide area. Data from this instrument will increase the sample of observed supernovae from remote regions and reduce the uncertainty in the data.