Thanks to John Dupuis (Confessions of a Science Librarian) for calling attention to these links.
The "Unnatural" Standard Model
Waiting for the Revolution
The "Unnatural" Standard Model
by Peter Woit
Not Even Wrong
May 25, 2013
http://www.math.columbia.edu/~woit/wordpress/?p=5946
The Standard Model is a physical theory of a spectacularly successful sort. It is built on beautiful and deep mathematics, covers almost all known physical phenomena, and agrees precisely with the result of every single experiment ever done to test it. It leaves open a very small number of questions: why this specific combination of small symmetry groups and their representations? What determines the parameters of the model (18 if you ignore neutrino masses, 7 more if you include them)? What about gravity? Does it need to be extended to account for dark matter?
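For reference, one conventional bookkeeping that reproduces these numbers (the exact tally depends on conventions, for instance whether the QCD vacuum angle is counted and whether the neutrino masses are taken to be Dirac) is

\[
\underbrace{3}_{\text{gauge couplings}}
+ \underbrace{2}_{\text{Higgs mass and self-coupling}}
+ \underbrace{9}_{\text{charged fermion masses}}
+ \underbrace{4}_{\text{CKM angles and phase}} = 18,
\]
\[
\underbrace{3}_{\text{neutrino masses}}
+ \underbrace{4}_{\text{PMNS angles and phase}} = 7 .
\]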
For several decades now, there has been a very active and heavily advertised field of “Beyond Standard Model” physics, the study of extensions of the standard model that remain consistent with experimental bounds. While BSM models have played a role in guiding experimentalists towards things to look for that are not already ruled out by what is known, they have never come anywhere near fulfilling the hope that they might provide some insight into the SM itself. They provide no explanation of the unexplained aspects of the Standard Model, instead adding a great deal of additional unexplained structure. Perhaps the simplest and most widely studied example is the minimal supersymmetric extension of the SM, which not only explains none of the 25 undetermined SM parameters, but adds more than 100 additional such parameters to the list.
Theorists have traditionally followed what has been described as “Albert Einstein’s dream that the laws of nature are sublimely beautiful, inevitable and self-contained”, and the SM is our closest approach so far to Einstein’s dream. If you shared this dream, the known BSM models would never have much appealed to you, since they just added complexity and extra unexplained parameters. You also would not have been at all surprised by the strong negative results about such models that are one of the two major achievements so far of the LHC (the other is the Higgs discovery). If you’re a follower of Einstein’s dream, the obvious reaction to the LHC results so far would be to rejoice in the vindication of this dream, welcome the triumph of the simplicity of the SM, and hope that further study of the Higgs sector will somehow provide a hint of a better idea about where the SM parameters come from (almost all of them are Higgs couplings).
Remarkably, a very different story is being sold to the public by those who had a great deal invested in now failed BSM models. In this story, the BSM models were the ones of Einstein’s dream: they were “natural”, and their failure leaves us with the “unnatural” Standard Model.
An article entitled Is Nature Unnatural? is the source of the above quote about Einstein, and it tells us that
Decades of confounding experiments have physicists considering a startling possibility: The universe might not make sense…
In peril is the notion of “naturalness,” Albert Einstein’s dream that the laws of nature are sublimely beautiful, inevitable and self-contained. Without it, physicists face the harsh prospect that those laws are just an arbitrary, messy outcome of random fluctuations in the fabric of space and time…
“The universe is impossible,” said Nima Arkani-Hamed, 41, of the Institute for Advanced Study, during a recent talk at Columbia University [more about this talk here].
What is behind this sort of claim that down is up is abuse of the English word “naturalness”, which in this particular case has been adopted by theorists to refer to a technical property better described as “not quadratically sensitive to the cut-off scale”. There’s a lot to be said (and a lot that has been said on this blog) about the precise technical issue here. It’s a real one, and likely an important hint about the true nature of the Higgs sector of the SM and where all those undetermined parameters come from. Getting rid of this technical problem, though, by invoking hundreds of new undetermined parameters is not the sort of thing Einstein was dreaming about. He would see the LHC results as vindication and encouragement: as we investigate new energy scales we find the universe to be as simple as possible. It’s remarkable to see this great discovery being promoted as telling us that we have to give up on Einstein’s dream and adopt a pseudo-scientific research program based on the idea that physical “laws are just an arbitrary, messy outcome of random fluctuations in the fabric of space and time”.
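For concreteness, the quadratic sensitivity at issue is the standard one-loop estimate of the correction to the Higgs mass-squared from the top quark (textbook material, quoted here for illustration rather than taken from any of the articles discussed):

\[
\delta m_H^2 \sim -\frac{3\,y_t^2}{8\pi^2}\,\Lambda^2 ,
\]

where \(y_t\) is the top Yukawa coupling and \(\Lambda\) the cut-off scale. Pushing \(\Lambda\) up toward the Planck scale while keeping the physical Higgs mass near 125 GeV then requires cancellations of order one part in \(10^{30}\) or worse, and it is this fine-tuning that “naturalness” arguments are designed to avoid.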
Update: The Science News story has now appeared at Scientific American, with the title New Physics Complications Lend Support to Multiverse Hypothesis. The “New Physics Complications” are the LHC seeing only pure SM behavior. If the LHC had seen a complicated SUSY spectrum, that would have been “natural”, but somehow seeing the simplest possibility has become a new “Complication”. It is a “complication”, but a sociological one, not a physics one. SUSY theorists do have an answer for the complication of their ideas failing: the Multiverse did it.
Waiting for the Revolution
An Interview with the Nobel Prize-winning Physicist David J. Gross.
by Peter Byrne
Simons Science News
May 24, 2013
https://www.simonsfoundation.org/features/science-news/waiting-for-the-revolution/
In the early 1970s, David J. Gross exposed the hidden structure of the atomic nucleus. He helped to reinvent string theory in the 1980s. In 2004, he shared the Nobel Prize in Physics. And today he struggles mightily to describe the basic forces of nature at the Planck scale (billions of times smaller than a proton), where, string theorists hope, the equations of gravity and quantum mechanics mesh.
Gross, H. David Politzer and Frank Wilczek were awarded the Nobel for discovering asymptotic freedom, a defining property of the strong force that binds the components of the atomic nucleus, the protons and neutrons. Forty years ago, their counterintuitive calculations plugged an important gap in the Standard Model of physics, which describes the 61 known elementary particles. This theoretical work revitalized the nearly moribund quantum field theory and gave birth to QCD (quantum chromodynamics), the theory of the strong interactions.
These days, Gross enjoys challenging young physicists as they chalk equations at the Kavli Institute for Theoretical Physics, the think tank funded by the National Science Foundation that he ran from 1997 until stepping down last year. He is eager for younger scientists to surpass his achievements, to break the impasse of underdetermination that currently troubles particle physics, whereby competing theories predict the same physical results and may therefore be immune to experimental verification within the lifetime of the universe.
Gross characterizes theoretical physics as rife with esoteric speculations, a strange superposition of practical robustness and theoretical confusion. He has problems with the popularizing of “multiverses” and “landscapes” of infinite worlds, which are held up as emblematic of physical reality. Sometimes, he says, science is just plain stuck until new data, or a revolutionary idea, busts the status quo. But he is optimistic: Experience tells him that objects that once could not be directly observed, such as quarks and gluons, can be proven to exist. Someday, perhaps the same will be true for the ideas of strings and branes and the holographic boundaries that foreshadow the future of physics.
Reductionism is not dead, he insists.
Simons Science News caught up with Gross at KITP, where he keeps his office door open and the espresso machine hot. An edited and condensed version of the interview follows.
SIMONS SCIENCE NEWS: Why physics, David?
DAVID GROSS: At age 13, I read a wonderful book by Albert Einstein and Leopold Infeld called “The Evolution of Physics.” I was enormously excited by the possibility of tackling fundamental questions about the universe.
Fast forward to the mid-1960s: I was a graduate student at the University of California, Berkeley, which was the center of S-matrix theory, the string theory of the day. At the university’s Rad Lab, experimentalists were constantly discovering new atomic particles with their equipment. But the theorists were having a hard time keeping up with the pace of discovery: We were particularly clueless about the structure of the atomic nucleus.
“We were all looking for the next overthrow, and we were willing to sacrifice existing theories at the drop of a hat.”
Part of the problem was that while quantum field theory had spectacularly reconciled special relativity and quantum mechanics, it was failing to do the heavy lifting in particle physics. It was not telling us if particles were composite entities or elemental units.
In the lab, we could not see or physically describe the mathematical objects that we called quarks, which we suspected were the key to unlocking the dynamics of the strong force that binds together the clump of protons and neutrons at the center of the atom. And quantum field theory was failing to calculate how this could be so.
A lot of people became frustrated, saying, “Quantum field theory does not work. It is unphysical. Throw it away!”
What did they mean by “unphysical”?
That scientists cannot talk about a thing that cannot be physically observed. If quantum field theory cannot predict particle behaviors based on prior experimental observations, that is, by physical measurements of quantum objects in motion, then field theory is useless as a tool, it was said.
A field is a dynamical object that has a value at every point in the space it occupies. It can be thought of as an object that has position and internal structure. For example, the lines of force one sees when sprinkling iron filings around a magnet depict the shape of a field in physical space.
For quantum mechanical theories to be consistent with the constraints of relativity, we picture the interactions between charged particles as flowing through a quantum mechanical field, a spatial field. Ripples in the field can be treated as electromagnetic waves or radiation or light. And these ripples can also be described as particles that transmit the forces of nature through space.
When I was at Berkeley, the framework of quantum field theory could calculate the dynamics of electromagnetism. It could roughly describe the weak nuclear force, which is responsible for radioactive decay. But it hit a brick wall with the strong interaction, the binding force.
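(A compact way to state the ripple/particle correspondence described above, added here for reference and standard in any quantum mechanics text: a field ripple of frequency \(\omega\) and wavevector \(\vec k\) is carried by quanta of energy and momentum

\[
E = \hbar\omega, \qquad \vec p = \hbar\vec k ,
\]

so for the electromagnetic field the ripples are light waves and the quanta are photons.)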
Were theorists looking for a more effective theory of elementary particles?
My advisor, Geoffrey Chew, went even further, declaring that there are no elementary particles, but only causally connected interactions that obey probabilistic laws. Thus, in the S-matrix framework, particles created particles, with no particle type being more elemental than any other type. Chew called this non-theory “nuclear democracy” or “the bootstrap.”
For some people, the bootstrap was a revolutionary approach — a new philosophy of physics. One could start with a proton and produce other particles and vice versa. And patterns that reified that idea began to emerge in data generated by proton collisions in the linear accelerators in California.
Was the revolution confined to Berkeley?
Since the founding of quantum mechanics in the 1920s, theoretical physics had nurtured an extremely radical tradition. Because relativity and quantum mechanics had revolutionized physics, we were all looking for the next overthrow, and we were willing to sacrifice existing theories at the drop of a hat.
Was revolution a young person’s game?
Not at all. The older physicists were the most radical! The founders of quantum mechanics — Werner Heisenberg, Paul Dirac, Niels Bohr — were all convinced that in order to explain the nuclear force, there had to be another revolution in the foundations of physics. There were all sorts of crazy ideas: Space had to break up at the nuclear scale; quantum mechanics was really nonlinear.
“New discoveries tend to be intuitive, just on the borderline of believability. Later, they become obvious.”
Remarkably, the building of the Standard Model — the theory of how particles and forces interact — was the success of the conservatives. It required no revolution at the foundational level. Normal physics, the kind that goes on experiment after experiment, produced the Standard Model.
What clued you in to the existence of asymptotic freedom?
Desperation. I had set out to disprove quantum field theory — and the opposite occurred! I was shocked.
The story is that the experimenters were banging protons together, hoping to find direct evidence of the ephemeral “quarks.” Protons are bags of quarks, but there is no such thing as an individual quark. We glimpse them only indirectly, by measuring the energies and momenta emerging from proton collisions.
Using quantum field theory, my colleagues and I successfully predicted the emergence of certain patterns in the proton collision detritus. To our surprise, the calculations showed that the invisible quarks are not purely mathematical abstractions, but objects, particles, that can move about freely inside the proton when they are close together. And, astonishingly, we learned that as the distance between the quarks increases, the force binding them together also increases. It’s like stretching a rubber band.
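(For reference, the “rubber band” behavior described here corresponds to the one-loop running of the strong coupling, a standard QCD result rather than anything quoted in the interview:

\[
\alpha_s(Q^2) = \frac{12\pi}{(33 - 2 n_f)\,\ln\!\left(Q^2/\Lambda_{\rm QCD}^2\right)} ,
\]

where \(n_f\) is the number of quark flavors light enough to matter at the momentum transfer \(Q\). The coupling shrinks as \(Q\) grows, so quarks close together interact only weakly, and it grows as \(Q\) falls toward \(\Lambda_{\rm QCD}\), so quarks pulled apart feel an ever stronger force.)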
Were your predictions within a range of uncertainty?
Enormous uncertainty, but they were true, nonetheless. And since then, they’ve been repeatedly validated through experiment. New discoveries tend to be intuitive, just on the borderline of believability. Later, they become obvious.
How long did it take before the new theory was accepted?
For some people, it was bam! Fast. Because it was the only explanation of the strong force that could be calculated. Others found it philosophically objectionable to base a theory on objects that couldn’t really be seen. There were two groups: younger physicists, like Steve Weinberg and Lenny Susskind, who immediately believed it, and a group of older scientists, who didn’t know very much quantum field theory, which is highly technical. It took a while for them to fully accept a theory based upon indirect observations.
Were they waiting for the revolution?
Quantum mechanics remains our latest revolution. That said, some scientists are waiting for an irreducible final theory. Reductionism has proven to be an extraordinarily successful method of investigation. The Standard Model is a very precise, reductionist theory. But it wasn’t radical.
When you chaired the 25th Solvay Conference in 2011, you observed, in your opening remarks, that there is “confusion at the frontiers of physics.” Why?
A scientific “frontier” is defined as a state of confusion. Nonetheless, we have a big problem: Physics explains the world around us with incredible precision and breadth. But further explanation is highly constrained by what we already know. Theories of quantum gravity, for instance, represent serious challenges to our current theoretical framework.
“I do not view the present situation as a crisis, but as the kind of acceptable scientific confusion that discovery eventually transcends.”
String theory strives to unite all four fundamental forces: electromagnetism, the weak nuclear force, the strong nuclear force and gravity.
First of all, string theory is not a theory. The Standard Model is a theory. String theory is a model, a framework, part of quantum field theory. It’s a set of rules and tricks for constructing consistent quantum states, a lot of them.
At Solvay, you said the hope that string theory would produce a unique dynamical description of reality appears to be a “mirage.”
String theory is not as revolutionary as we once hoped. Its principles are not new: They are the principles of quantum mechanics. String theory is part and parcel of quantum field theory.
The theoretical structure of modern physics is a lot bigger and richer than we thought, because it’s a theory of dynamical space-time, which must incorporate gravity, a force that is not yet integrated into the Standard Model.
There are frustrating theoretical problems in quantum field theory that demand solutions, but the string theory “landscape” of 10^500 solutions does not make sense to me. Neither does the multiverse concept or the anthropic principle, which purport to explain why our particular universe has certain physical parameters. These models presume that we are stuck, conceptually.
Is there a crisis in physics?
I do not view the present situation as a crisis, but as the kind of acceptable scientific confusion that discovery eventually transcends.
What does it mean to say that space-time is an emergent phenomenon?
[Chuckles.] That is a very sophisticated concept, which takes from about birth until the age of two to grasp. We do not really experience space-time; it’s a model. It describes how to get that piece of food that’s on the rug over there: crawl.
Our model of space-time, as amended by Einstein, is extremely useful, but perhaps it is not fundamental. It might be a derived concept. It seems to emerge from a more fundamental physical process that informs the mathematical pictures drawn by string theory and quantum field theory.
Is it possible to falsify string theory/quantum field theory? Or is that a purely philosophical question?
The question of how we decide whether our theories are correct or wrong or falsifiable has a philosophical aspect. But in the absence of empirical data, can we really judge the validity of a theory? Perhaps. Can philosophy by itself resolve such an ontological quandary? I doubt it. Philosophers who contribute to making physics are, thereby, physicists!
“In science, it is essential never to be totally certain.”
Now, in the last century, great physicists such as Ernst Mach, Bohr and Einstein were also philosophers who were concerned with developing theories of knowledge. Einstein famously criticized Heisenberg for focusing only on observable entities, when there can be indirect evidence for entities that cannot be seen. It may be the same with string theory.
Is the revolution at hand?
Those of us in this game believe that it is possible to go pretty far out on a limb, if one is careful to be logically consistent within an existing theoretical framework. How far that method will succeed is an open question.
Does the broad public understand the role of uncertainty in science?
The public generally equates uncertainty with a wild guess. Whereas, for a scientist, a theory like the Standard Model is incredibly precise and probabilistic. In science, it is essential never to be totally certain. And that lesson is hammered into every scientist and reader of history. Scientists measure uncertainty using probability theory and statistics. And we have comfort zones when making predictions, error bars. Living with uncertainty is an essential part of science, and it is easily misunderstood.
Is there an objective reality independent of human consciousness?
I believe that there is a real world, out there, and that we see shadows of it: our models, our theories. I believe that mathematics exists. It may be entirely real in a physical sense; it may also contain “things” that are ideal. But, to be clear, the human mind is a physical object. It’s put together by real molecules and quarks.