The Higgs discovery at the LHC was a giant leap for experimenters (and some spacetime geometers) but a small step for theoreticians (so far)
What elementary particle physics program should follow the probing of the 13 TeV scale?
A possibly naive answer to this important question can be found at the end of this post. But let us start with a more reliable view from a beloved high energy physics blogger currently silent on his website but still tweeting when he doesn't whistle:
The possibility that the LHC will only further confirm the Standard Model is often referred to as the nightmare scenario. The puzzles that emerge are not the nightmare; physicists love difficult problems. On the contrary, it is the indefinite persistence of the current confusing situation that is considered nightmarish.

The most efficient method developed thus far for revealing the fundamental secrets of nature has been to increase beam energy to probe increasingly small distances. Larger and more powerful colliders are seen as the solution. The design and construction of the LHC was a gargantuan task that required decades of work and billions of dollars. Such an undertaking will only become more difficult in the future. Would a doubling of energy be sufficient for any new collider project? Is a factor-of-ten increase needed? It may be the case that the answers we seek are to be found at energy levels that are simply unattainable for the foreseeable future. It is also unclear whether a bigger collider would resolve currently unanswered questions...
As was the case with the space program following the moon landing, there is on the one hand a grandiose plan, and on the other, more modest proposals with clear goals. There are two ways to investigate matter at very small scales. The first is to channel vast energies into a small volume, so that it can be converted into the creation of new particles. This is the most straightforward method and results can be interpreted with minimal ambiguity.
The second approach involves taking advantage of the uncertainty principle in quantum mechanics. According to this principle, very heavy particles can be continuously created from the vacuum. These particles exist for a short time, during which they may affect the behavior of known particles, such as electrons, muons, Higgs bosons, and so on. By measuring the properties of known particles with great precision, and comparing the results to theoretical predictions, insights can be gained into physical laws at energies inaccessible to collider experiments.
Many such experiments are currently being conducted. Examples include research into the magnetic and electric properties of elementary and composite particles, such as muons, tau leptons, protons, neutrons, and kaons. The MEG, Muon g-2, nEDM, NA62, and Qweak experiments, among others, indirectly probe physics at energies well above what can be reached at the LHC, or, for that matter, a future one-hundred-kilometer collider. In many cases, a modest budget can improve precision by orders of magnitude within just a few years. Precision experiments also touch upon many distinct areas of physics—atomic physics, laser physics, condensed matter physics, and nuclear physics—promoting collaboration between particle physics and other domains of science...
Which is it to be: a one-hundred-kilometer collider, or one hundred precision experiments at CERN? This is a serious question. Not only the future but possibly the survival of particle physics is at stake. Shifting the focus away from high-energy colliders toward precision experiments may be the most efficient way to continue exploration of fundamental interactions in the decades ahead. It may even allow particle physics to emerge stronger from its current crisis.
Adding the usual advisory and this blog's obstinate reminder
The LHC experiment is not yet done—far from it. The collider is scheduled to operate for another fifteen years, accumulating something like one hundred times more data than it has to date. Physicists will be scrutinizing the data for any signs of new particles. The Higgs boson will be studied with an eye towards determining how well it conforms to the specific predictions of the Standard Model. A breakthrough may happen at any moment...
Reading the following extracts, I'd like to add to what Adam wrote: a breakthrough may have already happened, but it is not an experimental one, and its value for model building has yet to be acknowledged by the high-energy theoretical physics community.
The noncommutative geometry dictated by physics is the product of the ordinary 4-dimensional continuum by a finite noncommutative geometry ... The compatibility of the model with the measured value of the Higgs mass was demonstrated in [13] due to the role in the renormalization of {an ultra-heavy Higgs-like singlet} scalar field already present in {our model before Higgs discovery}[12]. In [14, 15], with Chamseddine and Mukhanov, we gave the conceptual explanation of the finite noncommutative geometry from Clifford algebras and obtained a higher form of the Heisenberg commutation relations between p and q, whose irreducible Hilbert space representations correspond to 4-dimensional spin geometries. The role of p is played by the Dirac operator and the role of q by the Feynman slash of coordinates using Clifford algebras. The proof that all spin geometries are obtained relies on deep results of immersion theory and ramified coverings of the sphere. The volume of the 4-dimensional geometry is automatically quantized by the index theorem; and the spectral model, taking into account the inner automorphisms due to the noncommutative nature of the Clifford algebras, gives Einstein gravity coupled with a slight extension of the standard model, which is a Pati-Salam model. This model was shown in our joint work with A. Chamseddine and W. van Suijlekom [17, 18] to yield unification of coupling constants.
Alain Connes (Submitted on 7 Mar 2017)
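For readers who would like to see the formula behind these words, here is a minimal sketch in LaTeX notation (my own transcription of the Euclidean, one-sided case of the higher Heisenberg relation of the Chamseddine-Connes-Mukhanov papers cited above; normalizations are schematic):

\[
\frac{1}{n!}\,\Big\langle Z\,[D,Z]\cdots[D,Z]\Big\rangle \;=\; \gamma ,
\qquad Z = 2E-1,\quad E=\tfrac{1}{2}\,(1+Y),\quad Y = Y^{A}\Gamma_{A},\ \ Y^{2}=1,
\]

with n commutators [D,Z] and ⟨·⟩ the trace over the Clifford indices: the Dirac operator D plays the role of the momentum p, and the Feynman slash Y of the coordinates plays the role of q. In dimension 4, the two-sided version of this relation, written with the two maps Y and Y′ valued in the Clifford algebras M2(H) and M4(C), is what quantizes the volume,

\[
\mathrm{Vol}(M_{4})\;\propto\;\deg(Y)+\deg(Y')\;\in\;\mathbb{Z},
\]

which is the "quantization by the index theorem" mentioned in the quote.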
Our proposal is to start from an analogue of the Heisenberg commutation relation to quantize the geometry. The Dirac operator plays the role of momentum while the Feynman slash of scalar fields plays the role of coordinates. When the dimension of the noncommutative space... is 2 or 4 there are two possible Clifford algebras with which the scalar fields are contracted with the corresponding gamma matrices. These two Clifford algebras are related to each other through the reality operator J which is an anti-unitary operator that is part of the data defining the noncommutative space. In four dimensions the sum of the two Clifford algebras is M2(H)⊕M4(C) which is the algebra of the finite space that is tensored with the continuous Riemannian space {One of these algebras is associated with the four colors, and the other algebra corresponds to the right-left symmetry}. The quantization condition implies that the volume of the continuous part of the space is quantized in terms of the winding numbers of ... two mappings Y and Y′ from [a spin-manifold] M4 to [the sphere] S4. The presence of two maps instead of one allows for the representation of ... M4, with arbitrary topology and large volume, as the pullback of the two maps, which yields four coordinates given on local charts... The two mapping fields Y and Y′ from the four-manifold to S4 can be considered to be solutions of instanton equations and give the physical picture that coordinates of a point are represented as the localization of instantons with finite energy...
The assumption of volume quantization has consequences on the structure of General Relativity... To have a physical picture of time we have ... considered a four-manifold formed with the topology of R×Σ3, where Σ3 is a three dimensional hypersurface, to allow for space-times with Lorentzian signature. The quantization condition is modified to have two mappings from Σ3 → S3 and a mapping X : R → R. The resulting algebra of the noncommutative space is unchanged, and the three dimensional volume is quantized provided that the mapping field X is constrained to have unit gradient... In the synchronous gauge, this field is identified with the time coordinate and modifies Einstein equations by giving an energy-momentum tensor in the absence of matter, which gives rise to mimetic cold matter [it thus solves, without extra cost, the dark matter problem [33]]. We have shown that this field... can be used to construct realistic cosmological models [34] such as inflation without the need to introduce additional scalar fields. By including terms in the action of the form f(◻X), which do occur in the spectral action as can be seen from considerations of scale invariance, it is possible to avoid singularities in Friedmann, Kasner [35] or black hole solutions [36]. This is possible because the contributions of the field X to the energy-momentum tensor would allow, for special functions f(◻X), to limit the curvature, preventing the singularities from occurring.
We have presented enough evidence that a framework where space-time is assumed to be governed by noncommutative geometry results in a unified picture of all particles and their interactions. The axioms could be minimized by starting with a volume quantization condition, which is the Chern character formula of the noncommutative space and a special case of the orientability condition. This condition determines uniquely the structure of the noncommutative space. Inner fluctuations of the Dirac operator by automorphisms of the algebra extend it to include a connection, which is a one-form defined over the noncommutative space. Components of the connection along the continuous directions are the gauge fields of the resulting gauge group, and the components along the discrete directions are the Higgs fields. The connection then includes all the bosonic fields of a unified field theory, which is a Pati-Salam model with a definite Higgs structure. The Standard Model with neutrinos (and a singlet) is a special case of the Pati-Salam model which satisfies an order-one condition... All bosonic fields in the form of gravity, gauge and Higgs fields are unified in the Dirac operator and all fermion fields are unified in the fundamental representation in the Hilbert space. The dynamics is governed by the spectral action principle where the spectral action is an arbitrary positive function of the Dirac operator valid up to a cutoff scale, which is taken to be near the Planck scale...
The picture is very compelling, in contrast to other constructions, such as grand unification, supersymmetry or string theory, where there is no limit on the number of possible models that could be constructed. The picture, however, is still incomplete, as there are many unanswered questions and we now list a few of them. Further studies are needed to determine the structure and hierarchy of the Yukawa couplings, the number of generations, the form of the spectral function and the physics at the unification scale, as well as to quantize the fields appearing in the spectral action, in particular the gravitational field.
Quanta of Space-Time and Axiomatization of Physics
Ali H. Chamseddine (Submitted on 27 Feb 2017)
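To make the "unit gradient" condition and its mimetic consequences a little more concrete, here is a schematic transcription in LaTeX (my paraphrase of the mimetic-gravity papers cited as [33-36], using the standard Lagrange-multiplier implementation of the constraint; signs and normalizations depend on conventions):

\[
g^{\mu\nu}\,\partial_{\mu}X\,\partial_{\nu}X = 1,
\qquad
S \simeq \int d^{4}x\,\sqrt{-g}\,\Big[-\tfrac{1}{2}R
+\lambda\,\big(g^{\mu\nu}\partial_{\mu}X\,\partial_{\nu}X-1\big)
+ f(\Box X)\Big].
\]

In the synchronous gauge the constrained field X is identified with the time coordinate; the Lagrange-multiplier term contributes to Einstein's equations the energy-momentum tensor of a pressureless dust even in vacuum (the "mimetic cold matter" of the quote), while suitable choices of f(\Box X) bound the curvature and can remove the Friedmann, Kasner and black-hole singularities.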
Building a new articulation of quantum matter and gauge interactions through an updated spacetime geometry, to forge new experimental tools with the Higgs scalar boson as a ruler!
This is my "educated" guess, based on the spectral noncommutative geometric insight exposed above and also inspired from the epistemological hindsight described by the late french mathematician and philosopher Gilles Châtelet below:
Every mathematical physics is founded on a protocol articulating geometric meaning and physical meaning... Why then prefer {the} certainty ... [that {I} can close {my} fist around the electron there in front of me through the mere knowledge of its position and velocity] to the richness of the algebra of observables, which says much more about the experimental game: symmetry, compatibility of certain practices, rules of the game of measurement... The uncertainty relations ... are to be understood ... as the simple refusal of the physical to give itself up to a {commutative} Cartesian grasp. The latter is powerless to understand the roles, the places, the intrigues of the ... quantum theatre.
Gilles Châtelet
Will the LHC bring physicists "closer to the moon", namely to a grand unification of fundamental interactions, while no sign of new "natural physics" shows up beside the Higgs boson? A positive answer could rely on the building of a new natural spacetime model offering a proper geometric interpretation of the scalar particle discovered at the LHC. The spectral model of particle physics is a quite successful example of such a program, one that learnt the lesson of the 125 GeV mass measured in 2012. It has recently led to the laying of a possible foundation stone for the challenging quantisation of the geometry of spacetime, based on the same formalism that made it possible to quantise radiation and matter. Last but not least, it establishes a potential link with mimetic dark matter and dark energy models, and more generally with mimetic gravity.
(Background picture from the CERN image library shows the area under which the tunnel for the LHC can be found, near Geneva and lac Léman. The French Alps with Mont Blanc provide a natural backdrop and a reminder of the past discoveries of particles like the positron, muon, meson and kaon... that occurred in high-altitude cosmic ray laboratories. The drawing is the currently experimentally proven and conceptually understood version of the great loop or grand tour of physics.)
Dear reader, the blogger hopes this post has interested you, or at least that you noticed something (ir)relevant in it; if you are in the mood, do not hesitate to share your enlightened opinion with other readers like me!