It has become abundantly clear in recent months that NASA’s James Webb Space Telescope does exactly what it was intended to do. Just as its creators hoped, the multibillion-dollar machine “unfolds the universe,” flawlessly revealing cosmic light we cannot see with our own eyes, and its results make even the most unlikely stargazers feel alive.
Because of this gold-plated telescope, Twitter went wild one day over a fuzzy red dot. For 48 hours, people all over the world marveled at a galaxy born shortly after the birth of time itself. It seems that, thanks to JWST’s technological prowess, humanity is united over stardust.
But here’s the thing.
In the midst of all that reverence, researchers from the Massachusetts Institute of Technology warn that we should consider one decisive scientific consequence of having a superhero telescope.
If JWST is a zero-to-100 upgrade, they wonder, do our science models also need a zero-to-100 reboot? Could the models scientists have used for decades fail to match the power of the instrument, and therefore fail to reveal what its data are trying to tell us?
“The data we will get from JWST will be incredible, but … our insights will be limited if our models do not match it in quality,” Clara Sousa-Silva, a quantum astrophysicist at the Center for Astrophysics, Harvard & Smithsonian, told CNET.
And according to a new study she co-authored, published Thursday in the journal Nature Astronomy, the answer is yes.
More specifically, the paper suggests that some of the light analysis tools scientists normally use to understand exoplanet atmospheres are not fully equipped to handle JWST’s exceptional light data. In the long run, that obstacle could affect the most massive JWST quest of all: the hunt for extraterrestrial life.
“Currently, the model we use to decipher spectral information is not on par with the precision and quality of the data we have from the James Webb Telescope,” Prajwal Niraula, a PhD student in MIT’s Department of Earth, Atmospheric and Planetary Sciences and co-author of the study, said in a statement. “We have to step up our game.”
Here’s one way to think about the conundrum.
Imagine pairing the latest, most powerful Xbox console with the very first iteration of a TV. (Yes, I’m aware of how extreme this hypothetical is.) The Xbox would try to send the TV amazing high-definition, colorful, beautiful graphics, but the TV wouldn’t have the capacity to display any of it.
I wouldn’t be surprised if the TV exploded right away. But the point is, you wouldn’t know what the Xbox was trying to show you unless you got an equally high-definition TV.
Similarly, when it comes to exoplanet discoveries, scientists feed deep-space light data, or photon data, into models that test for “opacity.” Opacity measures how easily photons pass through a material, and it varies with things like the light’s wavelength and the material’s temperature and pressure.
This means that each such interaction leaves behind a distinct signature in the light, and therefore, in the case of exoplanets, reveals what kind of chemical atmosphere those photons passed through before reaching the light detector. This is how scientists work backward from light data to determine what an exoplanet’s atmosphere consists of.
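To make the idea concrete, here is a minimal toy sketch of how absorption leaves a signature in light, using the Beer-Lambert law. The gas names, line positions, and abundances below are all invented for illustration; real opacity models draw on vast databases of measured spectral lines.

```python
import numpy as np

wavelength = np.linspace(1.0, 5.0, 200)  # microns (arbitrary grid)

def line_profile(center, width):
    # Hypothetical absorption cross section: one Gaussian line per gas.
    return np.exp(-((wavelength - center) ** 2) / (2 * width ** 2))

# Two invented gases: line position/width and abundance are placeholders.
abundances = {"gas_A": 0.25, "gas_B": 0.05}
lines = {"gas_A": (2.0, 0.1), "gas_B": (4.0, 0.2)}

# Optical depth tau is the abundance-weighted sum of the cross sections;
# the Beer-Lambert law gives the transmitted flux as I = I0 * exp(-tau).
tau = sum(abundances[g] * line_profile(*lines[g]) for g in abundances)
transmitted = np.exp(-tau)  # with I0 = 1

# The depth of each dip encodes which gas is present and how much of it
# there is -- retrieval models run this calculation in reverse.
dip_A = 1.0 - transmitted[np.argmin(np.abs(wavelength - 2.0))]
dip_B = 1.0 - transmitted[np.argmin(np.abs(wavelength - 4.0))]
print(dip_A > dip_B)  # the more abundant gas carves the deeper dip
```

Working backward, as retrieval models do, means starting from the observed dips and inferring the abundances that produced them, which only works if the assumed cross sections are accurate.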
In this case, the detector in question is aboard the James Webb Space Telescope. But in the team’s new study, after testing the most widely used opacity model, the researchers saw the JWST light data hit what they call an “accuracy wall.”
The model was not sensitive enough to analyze things like whether a planet has an atmospheric temperature of 300 or 600 Kelvin, the researchers say, or whether a particular gas takes up 5% or 25% of the atmosphere. Such a difference is not only statistically significant, but, according to Niraula, it is also “important for us to constrain mechanisms of planet formation and reliably identify biosignatures.”
That is, evidence of alien life.
“We have to work on our interpretive tools,” Sousa-Silva said, “so we don’t find ourselves seeing something amazing through the JWST but not knowing how to interpret it.”
Furthermore, the team found that the models can hide their own uncertainty: a few parameter adjustments can easily shrink the apparent error, making results seem to fit well even when they are inaccurate.
“We found that there are enough parameters to adjust, even with the wrong model, to still get a good fit, which means you wouldn’t know that your model is wrong and that what it says is wrong,” Julien de Wit, assistant professor at MIT’s EAPS and study co-author, said in a statement.
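Here is a toy numerical sketch of that pitfall, using entirely made-up data rather than the study’s actual models: a “wrong” model with no physics in it at all (a plain polynomial) can still fit a simulated absorption spectrum down to the noise level, so goodness of fit alone does not tell you the model, or what it says, is right.

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)
wavelength = np.linspace(1.0, 5.0, 100)  # microns (arbitrary grid)
noise_level = 0.005

# "True" spectrum: a single Gaussian absorption feature, plus noise.
true_spectrum = 1.0 - 0.3 * np.exp(-((wavelength - 3.0) ** 2) / 0.5)
observed = true_spectrum + rng.normal(0.0, noise_level, wavelength.size)

# "Wrong" model: a degree-10 polynomial with no physical content.
# Polynomial.fit rescales the x-domain internally for numerical stability.
wrong_model = Polynomial.fit(wavelength, observed, deg=10)
fitted = wrong_model(wavelength)

# With enough free parameters, the unphysical model fits down to the
# noise level -- so a good fit alone cannot expose the wrong model.
rms_residual = np.sqrt(np.mean((observed - fitted) ** 2))
print(rms_residual < 2 * noise_level)
```

The polynomial’s coefficients say nothing true about the atmosphere, yet the residuals look just as good as a correct model’s would, which is exactly the trap de Wit describes.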
Going forward, the team calls for opacity models to be improved to keep pace with JWST’s spectacular revelations, in particular through crossover studies between astronomy and spectroscopy.
“There is so much that could be done if we knew perfectly how light and matter interact,” says Niraula. “We know enough about Earth’s conditions, but as soon as we move into different types of atmospheres, things change, and there is a lot of data, with increasing quality, that we risk misinterpreting.”
De Wit compares the current opacity model to the Rosetta Stone, the ancient artifact that let scholars translate Egyptian hieroglyphs, explaining that this Rosetta Stone has performed well so far, for example with the Hubble Space Telescope.
“But now that we’re going to the next level with Webb’s precision,” the scientist said, “our translation process will prevent us from capturing important subtleties, like those that make the difference between a planet being habitable or not.”
As Sousa-Silva puts it, “it’s a call to improve our models, so we won’t miss the subtleties of the data.”