The accepted practice is instead to adjust the model so that it continues to agree with the lack of empirical support.
This very Zen statement is part of a commentary by theoretical physicist Sabine Hossenfelder -- in Nature, no less (so it must be fashionable, if not also true) -- who writes candidly about the crisis in her field (and its neighbours: astrophysics and cosmology): Science needs reason to be trusted [Caution: paywall]. She calls it a crisis of "overproduction" (i.e., abundance) of theories, but I like to think of it as a crisis of "producibility" of experimental data.
In recent years, trust in science has been severely challenged by the reproducibility crisis. This problem has predominantly fallen on the life sciences where, it turns out, many peer-reviewed findings can't be independently reproduced. Attempts to solve this have focused on improving the current measures for statistical reliability and their practical implementation. Changes like this were made to increase scientific objectivity or — more bluntly — to prevent scientists from lying to themselves and each other. They were made to re-establish trust.
The reproducibility crisis is a problem, but at least it's a problem that has been recognized and is being addressed. From where I sit, however, in a research area that can be roughly summarized as the foundations of physics — cosmology, physics beyond the standard model, the foundations of quantum mechanics — I have a front-row seat to a much bigger problem.
I work in theory development. Our task, loosely speaking, is to come up with new — somehow better — explanations for already existing observations, and then make predictions to test these ideas. We have no reproducibility crisis because we have no data to begin with ... [Bold emphasis added]
Here's something that will make your jaw not just drop, but go into a tailspin:
In December 2015, the LHC collaborations CMS and ATLAS presented evidence for a deviation from standard-model physics at approximately 750 GeV resonant mass [2, 3]. The excess appeared in the two-photon decay channel and had a low statistical significance. It didn't look like anything anybody had ever predicted. By August 2016, new data had revealed that the excess was merely a statistical fluctuation. But before this happened, high-energy physicists produced more than 600 papers to explain the supposed signal. Many of these papers were published in the field's top journals. None of them describes reality.
1 Comment:
No, my own jaw easily remained well in place.
Having worked in the Foundations of Quantum Physics (but without the pressure of having to send papers to journals) for over a decade, I guess I'd gotten used to it quite some time back. There are many others who aren't surprised, too. Roger Schlafly, e.g., compares book-burning with the black-hole information loss paradox: https://en.wikipedia.org/wiki/Black_hole_information_paradox, and
http://blog.darkbuzz.com/2017/04/why-worry-about-black-hole-info.html
Best,
--Ajit