(partial) Unification without (grand) unification / unified, not yet unified?

Play (high-speed recap)


ca. 440 BC: The pre-Socratic Greek philosophers Leucippus and Democritus were the first to put forward the idea that everything is composed of atoms, physically indivisible and in constant motion in empty space. This is a quite elegant solution to the problem of the maintenance of identity in a world in constant transformation. 
ca. 1650: The scientific revolution laid the basis for Newton’s mechanics and the law of universal gravitation, whose applicability everywhere in the Universe made it possible to systematize knowledge of the whole physical world. 
1865: Maxwell’s equations unify electricity, magnetism and optics into a single theory of electromagnetism. 
1905: Einstein’s Special Relativity (SR) unifies electromagnetism and a generalized mechanics, an approach that allows the whole of physics to be expressed in a frame-independent way, invariant under Lorentz transformations. 
1908: Minkowski shows that SR is more meaningful with a further unification of space and time, so as to treat physical phenomena in a space-time continuum (see Refs. [1, 2] for thorough discussions). 
1915: Einstein’s General Relativity (GR) unifies SR, gravity and Riemannian differential geometry, imposing that physics should be invariant under general coordinate transformations, i.e., diffeomorphisms. This is achieved by promoting the space-time metric, g_µν, which determines distances in space-time,

ds² = g_µν dx^µ dx^ν ,   (1)

into a dynamical variable that solves the so-called Einstein field equations for a given matter-energy configuration described by the energy-momentum tensor T_µν:

R_µν − (1/2) g_µν R = (8πG/c⁴) T_µν .   (2)
General Relativity assumes that space-time admits a symmetric, metric-compatible connection, g_µν;λ = 0, the Levi-Civita connection. The Riemann tensor is built from this connection, and the contracted Bianchi identity implies that the energy-momentum tensor is covariantly conserved. This formulation emphatically states that physics is a science of space-time (see Ref. [3] for an extensive discussion). Furthermore, the theory accounts for all known observational facts, from the solar system to the largest observed scales, provided that, on the latter, dark energy and dark matter are included in the cosmological description (see Ref. [4] for a thorough discussion).  
Somewhat later, in 1917, while discussing for the first time the cosmological implications of GR, Einstein [5] rejected a legitimate solution of his equations that predicted the expansion of the Universe and, rather unfortunately, concluded that the field equations were incomplete and should be supplemented with a constant term, the cosmological constant, Λ:

R_µν − (1/2) g_µν R + Λ g_µν = (8πG/c⁴) T_µν .   (3)

One could assume that the observed expansion of the Universe would render the cosmological constant redundant. However, the recently detected acceleration of the expansion of the Universe [6] brings back the cosmological constant which, suitably adjusted, is the simplest possible explanation for this observational fact...
(Submitted on 17 May 2010 (v1), last revised 14 Jul 2010 (this version, v3))

 

(too) Fast forward (?)

Unitary Field Theories

1918: Weyl’s dilatation unitary field theory... 
1919, 1925: Kaluza-Klein approach 
The Finnish physicist Gunnar Nordström was the first to speculate, in 1909, that space-time could have more than four dimensions. A concrete construction based on this idea was put forward by Theodor Kaluza in 1919, who showed that a unified theory of gravity and electromagnetism could be achieved through a 5-dimensional version of general relativity, provided the extra dimension was compact and fairly small, and could therefore have passed undetected. This approach was reexamined by Oskar Klein in 1925, who showed that, after correcting some of Kaluza’s assumptions, the resulting 4-dimensional theory contained general relativity, electromagnetism and a scalar field theory. This approach was very dear to Einstein till late in his life [9], and has been followed in the most important attempts to unify all four known interactions of Nature. In a more modern language, the basic underlying feature of the Kaluza-Klein approach is the assumption that the Universe is described by Einstein’s general relativity on a 5-dimensional Riemannian manifold, M5, that is the product of a 4-dimensional Riemannian manifold, M4, our world, and a tiny one-dimensional sphere (a circle), S¹. Much later, in the 1970s, it was shown to be impossible to encompass gauge groups larger than U(1) in 4 dimensions starting from D-dimensional general relativity. Furthermore, besides the impossibility of this “monistic” approach, i.e., the route from higher-dimensional gravity down to 4-dimensional gravity plus gauge theories, some “no-go theorems” have shown that no non-trivial gauge field configurations can be obtained in 4 dimensions from a D-dimensional gravity theory without higher-curvature terms (see e.g. Ref. [10] for the case of N = 1 supergravity in D = d + 4 = 10 dimensions). Another quite relevant result is that, in order to obtain chiral fermions in our 4-dimensional world, D must be even if all extra dimensions are compact [11]... 
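To see just how thoroughly a Planck-size circle would have “passed undetected”, here is a minimal numerical sketch of ours (not from the quoted text): a 5-dimensional field on M4 × S¹ decomposes into Fourier modes around the circle, and the n-th mode behaves as a 4-dimensional particle of mass m_n = n/R. The radius and collider-reach values below are illustrative order-of-magnitude inputs.

```python
# Kaluza-Klein mass tower on M4 x S^1.
# A 5-d field phi(x, y), with y ~ y + 2*pi*R, expands in Fourier modes
# phi_n(x) * exp(i*n*y/R); each 4-d mode carries a mass m_n = |n|/R.

HBAR_C = 1.9733e-16            # hbar*c in GeV*m, converts 1/length to energy

def kk_mass(n, radius_m):
    """Mass (in GeV) of the n-th Kaluza-Klein mode for a circle of radius radius_m."""
    return abs(n) * HBAR_C / radius_m

planck_length = 1.6e-35        # m (illustrative value)
lhc_reach = 1.3e4              # GeV, rough collider energy scale (illustrative)

m1 = kk_mass(1, planck_length) # lightest excitation of a Planck-size circle
print(f"m_1 = {m1:.2e} GeV")   # about 1e19 GeV: hopelessly beyond experiment
print(f"m_1 / collider reach = {m1 / lhc_reach:.1e}")
```

Only the constant (n = 0) mode survives at accessible energies, which is why the extra dimension hides so effectively.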
1921: Eddington’s Affine Geometry approach... 
1922-1923, 1928: Independently, Cartan and Einstein developed the distant-parallelism (teleparallelism) approach, in which torsion plays a central role. 
1931-1932: The Italian mathematician Paolo Straneo, from Genova, considered a generalized connection, so that different contractions of the resulting Riemann curvature would describe gravity and electromagnetism, respectively... 
Id.

Pause

... it is interesting to mention Aureliano Mira Fernandes’ thoughts on the unification principle. These appeared in a book published in 1933, “Modernas Concepções da Mecânica” [18], based on lectures he presented at the Instituto de Altos Estudos of the Academia das Ciências de Lisboa on February 1 and 4, 1933: “One has then reached Riemann’s vision according to which physical phenomena should be governed by an underlying geometrical concept; and conversely, that geometry is determined by the content of space. The unity principle, as a scientific and philosophical goal, is philosophically boosted and dignified. And in the process, domains of mathematical thinking get better equipped for future requirements of the physical theories.”... 
Let us take off from this quote, reflect a bit upon the unification principle and add some thoughts of our own on the matter. Unification seems to be a natural trend in science as, from time to time, knowledge acquired about a class of natural phenomena allows for unifying theories and models that explain apparently distinct phenomena. Unification is achieved in the sense that qualitatively distinct phenomenological manifestations can be explained by a common underlying theory or model. This qualitative change is possible if, and only if, broad and detailed experimental data are available. Furthermore, the unification of theories and models leads to conceptual simplicity, even though most often at the expense of mathematical complexity. 
It is important to point out that unification is more than a methodological procedure: the unity of Nature imposes that all its phenomenological manifestations be regarded as a whole, without artificial divisions, and hence have a correspondence with particular elements of any serious theoretical framework that aims at describing the most salient features of the Universe. From this perspective, and given the lack of decisive phenomenological facts to guide their efforts, it is not surprising that the enterprise carried out by Einstein, Weyl, Eddington, Cartan and their followers could not succeed in its aim of unifying gravity and electromagnetism. One could add that their disregard of Quantum Mechanics and of its methods for addressing the nuclear and subnuclear interactions was also a highly questionable methodological choice. 
Despite these shortcomings, in a broad sense their contributions were nevertheless valuable, as they threaded their way through unknown theoretical landscapes, mapped the fruitless regions they encountered and, perhaps more relevantly, devised methods that have been useful in many areas of physics. Their failure also illustrates that a purely theoretical approach is unlikely to achieve the unification goal through the labyrinth of theoretically and mathematically valid possibilities. Phenomenological guidelines are essential and cannot be disregarded in the process of building new theories and models. For sure, phenomenology on its own is no more than an inert compilation of data, notwithstanding the recent emergence of increasingly powerful computers and huge databases, which have led to the idea that models and theories can be built algorithmically... This author is skeptical whether these methods can be straightforwardly applied to physics, given that, most often, physical theories go much beyond the phenomenology available at the time they are proposed. It is precisely for these reasons that we believe that the spirit of unification works as a sort of Taoist wu wei principle, that is to say, an intuition, a feeling about “acting or not acting”, about “doing, but not overdoing”, in the process of building new unified physical theories. 
Id.

Rewind 

The Quantum Revolution 
... Somewhat after the development of General Relativity, the understanding of microphysics (atoms, molecules, nuclear physics and beyond) made possible the emergence of Quantum Mechanics (QM). The theory itself went through a process of methodological unification from 1924 till the 1930s. Indeed, the set of principles put forward by Bohr and Heisenberg (the matter-wave complementarity principle and the uncertainty principle) provided abundant insight into the need to pursue a fresh new vision of Nature as far as microphysical phenomena were concerned. Two distinct formulations were then put forward: matrix mechanics, proposed by Heisenberg, Born and Jordan in 1925; and, in 1926, wave mechanics, based on Schrödinger’s wave equation. These two formulations were shown to be unitarily equivalent, a result known as the Stone-von Neumann theorem. 
In 1928, Dirac presented the relativistic version of the wave equation and showed that its solutions describe both particles and antiparticles: states with the same mass, but with opposite charge and baryon (lepton) number. 
In 1932, the discovery of the positron (the antielectron) by Carl Anderson confirmed the physical validity of Dirac’s equation, which unifies QM with SR and the concept of spin. Actually, somewhat earlier, in 1926, Dirac suggested that analytical classical mechanics was connected with QM through a formal relationship between Poisson brackets and the commutator of physical observables... This connection requires the introduction of Planck’s constant, a distinctive feature of QM, and of the imaginary unit, which arises as a fundamental entity of QM: the wave function that solves Schrödinger’s wave equation is not an observable, being hence undetermined up to a complex phase (see e.g. Ref. [20] for an extremely elegant presentation). 
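Dirac’s correspondence, {x, p}_PB = 1 mapping to [x, p] = iħ, can be checked concretely on truncated harmonic-oscillator matrices. The following Python sketch (an illustration of ours, with ħ = m = ω = 1) builds x and p from ladder operators and shows that the commutator equals iħ on the diagonal, except for the last entry, which is an artifact of truncating the infinite-dimensional Hilbert space.

```python
import math

# Truncated harmonic-oscillator check of Dirac's correspondence (hbar = m = omega = 1):
# the classical Poisson bracket {x, p}_PB = 1 maps to the operator statement [x, p] = i*hbar.

N = 6       # truncation dimension
HBAR = 1.0

def zeros():
    return [[0j] * N for _ in range(N)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(N)) for j in range(N)]
            for i in range(N)]

# Annihilation operator: a|n> = sqrt(n) |n-1>
a = zeros()
for n in range(1, N):
    a[n - 1][n] = math.sqrt(n)
adag = [[a[j][i].conjugate() for j in range(N)] for i in range(N)]   # creation operator

# x = sqrt(hbar/2) (a + a_dag),  p = i sqrt(hbar/2) (a_dag - a)
s = math.sqrt(HBAR / 2)
x = [[s * (a[i][j] + adag[i][j]) for j in range(N)] for i in range(N)]
p = [[1j * s * (adag[i][j] - a[i][j]) for j in range(N)] for i in range(N)]

xp, px = matmul(x, p), matmul(p, x)
comm = [[xp[i][j] - px[i][j] for j in range(N)] for i in range(N)]

for n in range(N):
    # ~ i*hbar for n < N-1; the last diagonal entry is a pure truncation artifact
    print(comm[n][n])
```

In the full infinite-dimensional space the anomalous last entry disappears and [x, p] = iħ holds exactly, which is Dirac’s point.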
In 1932, Wigner introduced the so-called quasi-probability distribution while studying the quantum corrections to classical statistical mechanics. This distribution allows one to link Schrödinger’s wave function to a probability distribution in phase space, as it maps real phase-space functions to Hermitian operators. Thus, a formal unification of QM with statistical mechanics is achieved. 
Later, in 1949, Moyal showed that this distribution is the basis for encoding all quantum expectation values, and hence quantum mechanics, in phase space. ... 
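As an illustration of this phase-space picture (a numerical sketch of ours, not from the quoted text), one can evaluate the defining Wigner integral for the harmonic-oscillator ground state and compare it with the closed-form Gaussian it should reproduce.

```python
import math, cmath

# Wigner quasi-probability distribution of the harmonic-oscillator ground state
# (hbar = 1), computed from its defining integral:
#   W(x, p) = (1/pi) * Integral dy  psi(x+y) psi(x-y) exp(2ipy)

def psi(x):
    """Ground-state wave function, psi(x) = pi^(-1/4) exp(-x^2 / 2) (real)."""
    return math.pi ** -0.25 * math.exp(-0.5 * x * x)

def wigner(x, p, half_width=6.0, n=2400):
    """Trapezoidal quadrature of the Wigner transform at the phase-space point (x, p)."""
    dy = 2 * half_width / n
    total = 0j
    for k in range(n + 1):
        y = -half_width + k * dy
        w = 0.5 if k in (0, n) else 1.0
        total += w * psi(x + y) * psi(x - y) * cmath.exp(2j * p * y)
    return (total * dy / math.pi).real

# For a Gaussian state the result is the (everywhere positive) phase-space Gaussian
#   W(x, p) = exp(-x^2 - p^2) / pi
for x, p in [(0.0, 0.0), (1.0, 0.5), (-0.7, 1.2)]:
    exact = math.exp(-x * x - p * p) / math.pi
    print(f"W({x}, {p}) = {wigner(x, p):.6f}   exact: {exact:.6f}")
```

For non-Gaussian states W can turn negative, which is why it is only a quasi-probability; for this state it is a genuine phase-space density.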
Finally, in 1948, Feynman developed his path-integral formalism and showed its equivalence with QM. Other relevant developments towards a unified view of Nature include: 1934: Fermi’s theory of the weak force ... 
In the late 1940s and early 1950s, Feynman, Schwinger, Tomonaga, Dyson and others developed the basic tools of quantum field theory and showed that QED, the quantum field theory of charges and photons, was a fully consistent theory that could achieve, through the renormalization procedure, an impressive predictive power (see e.g. Ref. [24] for an insightful account). 
In 1954, Yang and Mills (and, independently, Salam and Shaw) proposed a generalization of Maxwell’s electromagnetism, based on the abelian UEM(1) gauge group, to a theory invariant under the transformations of a non-abelian gauge group G = SU(2)... 
Electroweak (EW) unification: The unification of the electromagnetic and weak interactions was achieved through the work of Glashow (1961), Salam (1968), Weinberg (1967) and others. The main ingredients include the gauge group G = SUL(2) ⊗ UY(1), which breaks down, through the Higgs mechanism, to Maxwell’s UEM(1) theory. The spontaneous symmetry breaking mechanism that endows the Higgs field with a non-vanishing vacuum expectation value (experimentally, ⟨0|H|0⟩ = 246 GeV) is on its own a very interesting example of the unity of Nature, as it is the very process underlying phase transitions in condensed matter physics: the Higgs field vacuum expectation value plays the role of an order parameter, and the Higgs field effective potential is the analogue of Helmholtz’s free energy. 
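The order-parameter analogy can be made concrete in a few lines of Python. This is only a tree-level sketch of ours: the quartic coupling λ ≈ 0.129 is an assumed value (chosen so the numbers land near the measured Higgs mass), and v = 246 GeV is an input, not a prediction.

```python
import math

# Spontaneous symmetry breaking as in a Landau free energy: the (tree-level) potential
#   V(h) = -mu^2 h^2 / 2 + lam h^4 / 4
# has its minimum away from h = 0, at the vacuum expectation value v = mu / sqrt(lam).

v = 246.0                      # GeV, <0|H|0> (input)
lam = 0.129                    # quartic self-coupling (assumed, illustrative)
mu = math.sqrt(lam) * v        # mass parameter fixed by the minimum condition

def V(h):
    return -0.5 * mu**2 * h**2 + 0.25 * lam * h**4

# Check numerically that the minimum sits at h = v, not at the symmetric point h = 0
hs = [0.01 * k for k in range(0, 40001)]          # scan h over [0, 400] GeV
h_min = min(hs, key=V)
print(f"minimum at h = {h_min:.1f} GeV (expected {v} GeV)")

# The curvature at the minimum gives the Higgs mass: m_H = sqrt(2 * lam) * v
m_H = math.sqrt(2 * lam) * v
print(f"m_H = {m_H:.1f} GeV")
```

The same shape of free energy, with temperature driving the sign of the quadratic term, governs e.g. a ferromagnet near its Curie point, which is the unity-of-Nature point being made in the text.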
Quantum Chromodynamics (QCD): In the 1960s-1970s, it was shown that the strong interactions could also be described by a gauge theory, QCD, with gauge group SUc(3), where c stands for the strong charge dubbed “colour”. This was achieved through the understanding that hadrons could be sorted into groups with similar properties and masses: the “eightfold way”, put forward by Gell-Mann and Ne’eman in 1961. Shortly after, Gell-Mann and Zweig proposed that the strong-interaction group structure should be understood through the existence of three distinct “colours” of smaller particles inside the hadrons, the quarks. Thus, the basic components of baryons and mesons are “coloured” quarks and gluons, the vector bosons of the strong interactions, with dynamics ruled by QCD. The theory has remarkable properties such as confinement, meaning that coloured states cannot be directly observed, and asymptotic freedom, discovered by David Gross, David Politzer and Frank Wilczek in 1973, which allows for predictions of many high-energy experiments using the perturbation techniques of quantum field theory. Confinement and asymptotic freedom mean, respectively, that quarks are strongly coupled at low energies and that their coupling gets logarithmically weaker as the energy grows. The description of the electroweak and strong interactions through the gauge principle gave rise to the so-called Standard Model (SM) of particle interactions, which is so far consistent with most of particle physics phenomenology. However, there is evidence that physics beyond the SM is needed to account for the fact that neutrinos seem to be massive, and some phenomenology suggests a new “sterile” neutrino (see e.g. Ref. [27] for an updated discussion).
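The logarithmic weakening of the strong coupling can be illustrated with its one-loop running. The sketch below is ours and deliberately crude (a fixed n_f = 5 and no threshold matching, so the numbers are only indicative):

```python
import math

# One-loop running of the strong coupling:
#   alpha_s(Q) = alpha_s(M_Z) / (1 + alpha_s(M_Z) * b0/(2*pi) * ln(Q / M_Z))
# with b0 = 11 - 2*n_f/3 > 0, the positive sign being what produces asymptotic freedom.

M_Z = 91.19          # GeV
ALPHA_S_MZ = 0.118   # measured input at the Z pole (approximate)
N_F = 5              # assumed fixed number of active quark flavours
B0 = 11 - 2 * N_F / 3

def alpha_s(Q):
    return ALPHA_S_MZ / (1 + ALPHA_S_MZ * B0 / (2 * math.pi) * math.log(Q / M_Z))

for Q in [10.0, M_Z, 1e3, 1e6, 1e12]:
    print(f"alpha_s({Q:>8.3g} GeV) = {alpha_s(Q):.4f}")
# The coupling weakens only logarithmically with energy (asymptotic freedom)
# and grows toward low scales, where perturbation theory, and this formula, break down.
```

The growth of the coupling at low Q is the perturbative shadow of confinement: below a GeV or so the one-loop formula blows up and quarks can no longer be treated as weakly coupled.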
Id.

Fast Forward (once more) 

Grand Unified Theories 
The gauge principle and the SM suggest a natural procedure to build more encompassing models: one considers non-abelian gauge theories with symmetry groups that admit the SM gauge group as a subgroup. These Grand Unified Theories (GUTs) have been extensively discussed from 1973 onwards by Pati, Salam, Georgi, Glashow, Quinn, Weinberg and others (see Ref. [28] for a thorough discussion), and the most studied cases admit the following GUT gauge groups and symmetry breaking pattern:

GUT := SU(5), SO(10), E(6), ... → G_SM = SUc(3) ⊗ SUL(2) ⊗ UY(1) → UEM(1) .

It is important to realize that GUTs are also suggested by the evolution of the coupling constants with energy according to the renormalization group equations. Through these equations it is possible to show that, if there is no intermediate physics between the EW unification scale and the GUT scale (the “great desert” hypothesis), then for the SU(5) GUT [29] the unification scale can be as large as E_GUT ≃ 10^17 GeV [30]. Subsequently, precision data from the LEP collider, the Z “factory” that ran at CERN from 1989 to 2000, revealed that the coupling constants of the electromagnetic, strong and weak interactions would meet at about 10^16 GeV, and hence be consistent with a putative unification, only in the context of the so-called minimal supersymmetric (see below) extensions of the SM [31]. This is further evidence for physics beyond the SM. A distinctive feature of GUTs is the existence of leptoquark particles whose interactions can mediate violation of baryon and lepton numbers. Violation of baryon number, violation of the C and CP discrete symmetries, and out-of-equilibrium processes, which are a natural feature of an expanding Universe, allow for the creation of the baryon asymmetry of the Universe (BAU). The generation of this asymmetry, usually referred to as baryogenesis, is vital to ensure that the Universe does not end up composed only of photons, as argued by Sakharov in 1967 (see Ref. [32] for a review). An alternative scenario to achieve the BAU involves violation of baryon number and of the CPT symmetry [33], which might occur in the context of string theory (see below). 
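The near-meeting of the couplings can be reproduced with a one-loop, “great desert” toy calculation. In the sketch below (our illustration; the M_Z inputs are approximate and the hypercharge coupling is GUT-normalized), the inverse couplings run linearly in ln μ and one solves for the pairwise crossing scales.

```python
import math

# One-loop running of the three SM gauge couplings ("great desert", no thresholds):
#   alpha_i^{-1}(mu) = alpha_i^{-1}(M_Z) - b_i/(2*pi) * ln(mu / M_Z)

M_Z = 91.19
INV_ALPHA_MZ = {"U(1)_Y": 59.0, "SU(2)_L": 29.6, "SU(3)_c": 8.5}   # approximate inputs
B = {"U(1)_Y": 41 / 10, "SU(2)_L": -19 / 6, "SU(3)_c": -7.0}       # SM one-loop coefficients

def inv_alpha(group, mu):
    return INV_ALPHA_MZ[group] - B[group] / (2 * math.pi) * math.log(mu / M_Z)

def crossing_scale(g1, g2):
    """Scale (GeV) at which the couplings of groups g1 and g2 become equal."""
    L = 2 * math.pi * (INV_ALPHA_MZ[g1] - INV_ALPHA_MZ[g2]) / (B[g1] - B[g2])
    return M_Z * math.exp(L)

mu_12 = crossing_scale("U(1)_Y", "SU(2)_L")
mu_23 = crossing_scale("SU(2)_L", "SU(3)_c")
print(f"alpha_1 = alpha_2 near {mu_12:.2e} GeV")
print(f"alpha_2 = alpha_3 near {mu_23:.2e} GeV")
# The pairwise crossings land around 1e13 and 1e17 GeV rather than at a single point:
# with SM matter alone the three couplings only nearly meet, part of the case for
# the supersymmetric extensions discussed in the text.
```

Replacing the b_i by their MSSM values pulls the three lines through (approximately) a single point near 10^16 GeV, which is the LEP-era observation quoted above.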
Supersymmetry
After the pioneering work of Wess, Zumino, Deser, van Nieuwenhuizen, Freedman and others in the mid 1970s, a fundamental new symmetry emerged from the drawing board of the theoreticians: supersymmetry. This symmetry relates bosons and fermions of equal mass and prevents the radiative corrections in the SM from endowing the Higgs field with a mass as high as the Planck mass. This suggests that new physics beyond the SM is needed in order to bridge the gap between the electroweak scale, M_EW ≃ 10^2 GeV, and the quantum gravity scale. This is achieved because supersymmetry allows the divergences arising from boson loops to cancel against the ones arising from fermion loops. If supersymmetry were an unbroken symmetry, this feature would also explain the vanishing of the cosmological constant. This is clearly not the case, as the known bosons and fermions do not come in equal-mass pairs, which means that the spectrum of elementary particles must be much larger than the one unraveled so far. A striking feature of supersymmetry is that the algebra, i.e., the anti-commutator, of its generators is proportional to the energy-momentum operator. Thus, a local supersymmetry transformation corresponds to a general coordinate transformation and hence gravity can be accommodated in a locally supersymmetric theory. For this very reason, local supersymmetric theories are called supergravity theories (see Refs. [44, 45] for extensive reviews). Supergravity is an important step towards the unification of gravity with gauge theories... Supersymmetry is also a fundamental ingredient of superstring theory, the most developed approach to understanding all interactions of Nature in a unified fashion and to harmonizing the description of gravity with quantum mechanics (see below). As mentioned above, phenomenologically, supersymmetry is needed to ensure the rendez-vous of the coupling constants at about 10^16 GeV [31]. Supersymmetry also provides many candidates for the dark matter of the Universe (see Refs. [47, 48] for recent reviews), given that its spectrum contains many neutral, long-lived, weakly interacting massive particles (WIMPs); the linear combinations of neutral superpartners with these features are usually referred to as neutralinos.
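A toy version of the boson/fermion cancellation (our sketch, far cruder than the loop computations meant in the text): summing free-field zero-point energies, +ω/2 per bosonic mode and −ω/2 per fermionic mode, over a common set of modes gives exactly zero for degenerate masses and a nonzero remainder once the masses split.

```python
import math

# Toy illustration of boson/fermion cancellation in the vacuum energy.
# In a free theory each bosonic mode contributes +omega(k)/2 and each fermionic
# mode -omega(k)/2, with omega(k) = sqrt(k^2 + m^2).

def omega(k, m):
    return math.sqrt(k * k + m * m)

modes = [0.1 * k for k in range(200)]   # crude discretization of momentum modes

def vacuum_energy(m_boson, m_fermion):
    return sum(0.5 * omega(k, m_boson) - 0.5 * omega(k, m_fermion) for k in modes)

print(vacuum_energy(1.0, 1.0))   # unbroken SUSY: mode-by-mode cancellation, 0.0
print(vacuum_energy(1.0, 2.0))   # broken SUSY (split masses): nonzero remainder
```

With equal masses the cancellation is exact mode by mode, which is the free-field shadow of the statement that unbroken supersymmetry would force a vanishing cosmological constant.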

Superstring/M-Theory Unification  
The basic assumption of string theory is that the fundamental building blocks of reality are not particles but extended one-dimensional objects, quantum strings. These quantum strings can be open or closed, and particles correspond to the modes of excitation of the strings. Open strings require “branes” of any dimensionality, not just membranes (two-dimensional objects), to “support” their ends. A striking feature of superstring theory is that at these end points the interactions correspond to those of supersymmetric gauge theories. Closed strings, in turn, admit the graviton in their spectrum, besides a scalar particle, the dilaton. Thus, a rather model-independent prediction of string theory is that the effective gravity theories emerging from it are scalar-tensor theories of gravity with higher-order curvature terms [49]. The obstacles encountered in the development of the Kaluza-Klein approach seemed, as discussed above, insurmountable till the “first string revolution” in 1984. Indeed, it was then shown by Green and Schwarz that higher-order curvature terms do allow for non-trivial gauge field configurations after compactification from D = 10 down to 4 dimensions, but also that, to ensure the mutual cancellation of the gauge and gravitational anomalies, the only admissible GUT gauge groups are E8 ⊗ E8 or SO(32) [50]. This breakthrough took place in the context of the supersymmetric string theory, whose consistency (Lorentz symmetry and unitarity) requires D = 10 space-time dimensions (the bosonic string demands D = 26 [51]). Actually, in its very first avatar, string theory was proposed to describe hadronic physics. However, the persistent appearance of massless vector and tensor states in its spectrum, and the fact that D > 4, made the approach untenable for the description of hadrons. 
The first suggestion that string theory should instead be regarded as a unified theory of all interactions was put forward in 1974 by Scherk and Schwarz, based on the fact that the massless vector and tensor particles interact precisely as Yang-Mills gauge fields and the graviton [52]. This identification could be achieved by assuming that the fundamental length scale of the theory ... should be identified with the Planck length... The “first string revolution” suggested a promising scenario for the understanding of our world. Starting from the E8 ⊗ E8 10-dimensional superstring theory, the so-called heterotic string theory [53], it is natural to demand that the dimensional reduction process down to 4 dimensions preserve supersymmetry. This requirement turns out to be quite restrictive, as it demands that the 6 extra space dimensions be compact, have a complex structure, no Ricci curvature and an SU(3) holonomy group. That is, this compact space must be a Calabi-Yau manifold [54], a possibility that was thought to have opened the way to explaining the origins of the SM. A “second string revolution” emerged from the discovery of the deep connections between all string theories. These are established through the so-called S and T dualities and the existence of an encompassing master theory, M-theory, which at low energies can be described by a D = 11, N = 1 supergravity theory [55]. The difficulty of M-theory is that it admits a huge number of solutions, about 10^100 or greater, with every possible value of the cosmological constant and of the coupling constants. The space of all such string theory vacua is often referred to as the landscape [56]. In this context, a quite radical scenario emerges, namely that the multiple vacua of string theory are associated with a vast number of “pocket universes” in a single large multiverse. 
These pocket universes, each like the expanding universe we observe around us, are all beyond any observational capability, as they lie beyond the cosmological horizon [57]. The implications of these ideas are somewhat disturbing: the vacuum that corresponds to our Universe must arise from a selection procedure, to be dealt with via anthropic or quantum cosmological considerations. That is to say, our existence somehow plays a role in the selection process. If, on the one hand, the vast number of vacua in the landscape ensures the reality of our existence, on the other hand a selection process must be invoked. One refers to the anthropic landscape when the vacuum selection is based on anthropic considerations. This interpretation is not free from criticism: indeed, it has been pointed out, for instance, that the impossibility of observing a multiverse implies that its scientific status is questionable; it lies in the realm of metaphysics, rather than of physics [58]. This situation could be altered if the universes could interact...
Id.

Pause 

The ATLAS and CMS collaborations have searched the first 3 fb−1 of data collected at 13 TeV for signs of supersymmetry. The observed data are in good agreement with the expectations from standard model processes. Limits on supersymmetric model parameters and on masses in simplified model spectra have been derived that begin to exceed previous results obtained with data collected at lower centre-of-mass energies. A wide range of signal scenarios, production and decay channels, and SUSY particle mass regions have been studied using sophisticated analysis methods, in particular for the data-driven estimation of the SM backgrounds. The LHC delivered about a further 40 fb−1 of 13 TeV data in 2016, which will be analyzed for the upcoming conferences. Natural supersymmetry is expected at the TeV scale, and the data currently being collected will soon allow these scenarios to be challenged. 

ATLAS and CMS Collaborations 
5th International Conference on New Frontiers in Physics (ICNFP 2016) 
06-14 Jul 2016
//update 1st January 2018
SUSY searches in ATLAS cover a large range of scenarios. Inclusive squark and gluino searches resulted in mass limits extending above 2 TeV. Direct stop limits increased from about 700 GeV (Run-1) to about 950 GeV, and direct sbottom limits from about 650 GeV (Run-1) to about 950 GeV. Chargino mass limits range between 600 and 1100 GeV. Gluino limits reach up to 2 TeV with photon signatures. Searches for long-lived charginos and gluinos, for R-parity violation (RPV) and within the pMSSM resulted in stringent limits. There are excellent prospects for advancing the SUSY searches with LHC Run-2 (completing the 2015-2018 operation), LHC Run-3 (2021-2023) and the high-luminosity HL-LHC (2026-).
Andre Sopczak (Institute of Experimental and Applied Physics, Czech Technical University in Prague) (Submitted on 29 Dec 2017)

Be kind Rewind (again)


I will first provide a historical background on the status of particle physics in May 1972, and then the motivations for the idea of higher unification, which developed over the next two years. This was a time when the electroweak SU(2) × U(1) model [28], based on the Higgs mechanism for symmetry breaking [29], already existed, and the renormalizability of such theories had been proven [30], creating much excitement in the field.  
But there was no clear idea of the origin of the fundamental strong interaction. The latter was thought to be generated, for example, by the vector bosons (ρ, ω, K∗ and φ) along the lines suggested by Sakurai [31], inspired by the beautiful Yang-Mills idea [32], or even by the spin-0 mesons (π, K, η, η′, σ) assumed to be elementary, or by a neutral U(1) vector gluon coupled universally to all the quarks [33].  
By this time, based on the need to satisfy the Pauli principle for baryons treated as three-quark composites, the idea of SU(3)-color as a global symmetry had been introduced, implicitly, with quarks satisfying parastatistics of rank 3, in [34], and explicitly, through quarks obeying the familiar Fermi-Dirac statistics, in [35]. In this context, the suggestion of generating a “superstrong” force by gauging the SU(3)-color symmetry had also been made by Han and Nambu as early as 1965 [35], though in a variant form compared to its present usage (see remarks later). But the existence of the SU(3)-color degree of freedom, even as a global symmetry, was not commonly accepted in 1972, because many thought that it would require an undue proliferation of elementary entities. And, of course, asymptotic freedom had not been discovered yet. Thus the standard model including the SU(3)-color symmetry had not been born. 
Against such a background, inspired by ’t Hooft’s proof of the renormalizability of spontaneously broken gauge theories, there were a number of papers appearing almost daily at the ICTP preprint library which tried to build variants of the SU(2) × U(1) model. For example, there were even attempts [36] to get rid of the weak neutral currents, because experiments at that time (May 1972) hinted at their absence. 
As I was trying to catch up with these papers, it appeared to me that the heart of the matter lay not in finding variants of the SU(2) × U(1) model, but in removing its major shortcomings, first in its gauge sector. These included: (i) the arbitrary choice of the five scattered multiplets for each family of quarks and leptons, with a rather weird assignment of their quantum numbers, including the weak hypercharge, which were put in by hand without a guiding principle; (ii) the lack of a symmetry-based reason for the co-existence of quarks and leptons, and likewise (iii) for that of the three forces: weak, electromagnetic and strong; and (iv) the absence of a compelling reason for the quantization of the electric charge and for the observed charge relation Q_electron = −Q_proton. (v) In addition, I was bothered by the disparity with which the SU(2) × U(1) model treated the left and right chiral fermions (see Eq. (1)). This amounted to putting non-conservation of parity in by hand. I thought (in Pauli’s words) that God can’t be weakly left-handed, and that at a deeper level the underlying theory ought to treat left and right on a par, conserving parity. 
I mentioned these concerns of mine, based on aesthetic grounds, about the SU(2) × U(1) model to Salam at a tea-gathering at the ICTP [3]. I also expressed that, in order to remove these shortcomings, one would need to put quarks and leptons in one common multiplet of a higher symmetry group (so that one may understand the co-existence of quarks and leptons and explain why Q_e− = −Q_p) and gauge such a symmetry group to generate simultaneously the weak, electromagnetic and strong interactions in a unified manner. Now, the idea of putting quarks and leptons in the same multiplet was rather unconventional at that time. Rather than expressing any reservation about it, as some others did, Salam responded immediately by saying: “That seems like an excellent idea! Let us develop it together.” It is this sort of spontaneous appreciation and encouragement from Salam that helped to enrich our collaboration at every step. Thus our collaboration started from that tea conversation.  
Searching for a higher symmetry to incorporate the features noted above, it became clear within about two weeks that quarks and leptons can be united in an elegant manner by assuming that quarks do in fact possess the SU(3)-color degree of freedom, obeying the familiar Fermi-Dirac statistics [35] rather than parastatistics [34], just as electrons do, and by extending SU(3)-color to the gauge symmetry SU(4)-color, which treats lepton number as the fourth color. Within this picture, the neutrino and the electron emerged as just the up and down “quarks” of the lepton color.  
With SU(4)-color, the whole spectrum of quarks and leptons (then consisting of only two families) fitted beautifully into a 4 × 4 structure of a global symmetry group SU(4)-flavor × SU(4)-color operating on four flavors (u, d, c, s) and four colors (r, y, b, l). Such a structure accounted naturally for the vanishing of the sum of the quark and lepton charges, and of the combination (Q_e− + Q_p), as desired. The spontaneous breaking of SU(4)-color at high energies to SU(3)c × U(1)B−L was then suggested to explain the observed distinction between quarks and leptons at low energies as regards their response to the strong interactions; such a distinction must then disappear at sufficiently high energies. 
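The charge bookkeeping behind these statements can be spelled out with exact fractions (a sketch of the counting described in the text, with lepton number as the fourth color):

```python
from fractions import Fraction

# Electric charges of one SU(4)-color family: the "up row" is (u_r, u_y, u_b, nu)
# and the "down row" is (d_r, d_y, d_b, e-), lepton number being the fourth color.

up_row = [Fraction(2, 3)] * 3 + [Fraction(0)]      # three quark colors + neutrino
down_row = [Fraction(-1, 3)] * 3 + [Fraction(-1)]  # three quark colors + electron

print(sum(up_row))               # -> 2
print(sum(down_row))             # -> -2
print(sum(up_row + down_row))    # -> 0: the charge operator is traceless on the multiplet

# Charge quantization and Q_electron = -Q_proton follow from the same bookkeeping:
Q_proton = 2 * Fraction(2, 3) + Fraction(-1, 3)    # proton = uud
Q_electron = Fraction(-1)
print(Q_proton + Q_electron)     # -> 0
```

Tracelessness of the charge operator over a complete multiplet is automatic for a generator of a simple non-abelian group, which is how the unified symmetry turns the curious quark charges of 2/3 and −1/3 into a necessity rather than an input.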
Uniting quarks and leptons by the SU(4)-color gauge symmetry thus naturally implied that the fundamental strong interaction of quarks arises entirely through the octet of gluons generated by its SU(3)-color gauge subgroup, which is exact in the Lagrangian [37, 38, 39].  
In the course of our attempt at a higher unification [6, 7], it thus followed that the effective gauge symmetry describing the electroweak and strong interactions at low energies (below a TeV) must minimally be given by the combined gauge symmetry G(2, 1, 3) = SU(2)L × U(1)Y × SU(3)c. This eventually became known as the standard model (SM) symmetry. It, of course, contains the electroweak symmetry SU(2)L × U(1)Y [28]... 
We wrote up this aspect of our thinking in a short draft, which we submitted to J. D. Bjorken for presentation at the 1972 Batavia conference [6], and then in a paper which appeared in [7]. There we suggested the concept of quark-lepton unification through SU(4)-color. In addition, unknown to many, we also initiated in the same paper (in the third paragraph) the idea of a gauge unification of the three forces in terms of a single coupling constant, without exhibiting explicitly a symmetry to implement this idea. We conjectured that the differing renormalization effects on the three gauge couplings, following spontaneous breaking of the unifying symmetry, might cause the observed differences between these three couplings at low energies. Fortunately, this conjecture (hope) was borne out precisely by the discovery of asymptotic freedom about four months later [41].

(Submitted on 29 Jun 2017)
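The conjecture in the last paragraph of the quote, that renormalization effects could split a single unified coupling into the three observed ones, is easy to illustrate today. A minimal sketch, assuming one-loop SM running with textbook beta coefficients and approximate inputs at the Z mass (my numbers, not the paper's):

```python
import math

# One-loop running of the inverse couplings: d(1/alpha_i)/d(ln mu) = -b_i/(2*pi).
# b_i for the SM with GUT-normalized hypercharge; inputs at M_Z are approximate.
b = {"U(1)_Y": 41 / 10, "SU(2)_L": -19 / 6, "SU(3)_c": -7}
inv_alpha_mz = {"U(1)_Y": 59.0, "SU(2)_L": 29.6, "SU(3)_c": 8.5}
M_Z = 91.2  # GeV

def inv_alpha(name, mu):
    """Inverse coupling 1/alpha_i evolved from M_Z to the scale mu (in GeV)."""
    return inv_alpha_mz[name] - b[name] / (2 * math.pi) * math.log(mu / M_Z)

for mu in (M_Z, 1e6, 1e10, 1e15):
    values = [inv_alpha(n, mu) for n in b]
    print(f"mu = {mu:9.3g} GeV   spread of 1/alpha_i = {max(values) - min(values):5.1f}")
```

With these inputs the spread of the three inverse couplings shrinks by roughly a factor of ten between M_Z and 10^15 GeV: the "differing renormalization effects" run the three couplings toward one another at high scales, exactly as conjectured.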

Play slowly

In this contribution I show that the simple requirement of volume quantization in space-time (with Euclidean signature) uniquely determines the geometry to be that of a noncommutative space whose finite part is based on an algebra that leads to Pati-Salam grand unified models. The Standard Model corresponds to a special case where a mathematical constraint (the order-one condition) is satisfied... There are many consequences of the volume quantization condition which could be investigated. For example, imposing the quantization condition through a Lagrange multiplier would imply that the cosmological constant arises as an integrating constant in the equations of motion. One can also look at the possibility that only the three-volume (space-like) is quantized. This can be achieved provided that the four-dimensional manifold arises from the motion of three-dimensional hypersurfaces, which is equivalent to the 3+1 splitting of a four-dimensional Lorentzian manifold. The three-dimensional space volume will then be quantized, provided that the field X mapping to the real line has a gradient of unit norm... It is known that this condition, when satisfied, gives a modified version of Einstein gravity with integrating functions giving rise to mimetic dark matter [21, 22]. All of this could be considered as a first step towards quantizing gravity.
Quanta of Geometry and Unification
Ali H. Chamseddine (Submitted on 3 Jun 2016)
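The remark in the abstract about the cosmological constant arising as an integrating constant can be sketched in a familiar way. This is my paraphrase with schematic normalizations, not the paper's derivation: add the quantization condition to the action with a Lagrange multiplier,

```latex
S = \frac{1}{2\kappa^2}\int R\,\sqrt{g}\,d^4x + S_{\text{matter}}
  + \lambda\left(\int \sqrt{g}\,d^4x - N v_0\right)
\;\Longrightarrow\;
R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu} + \kappa^2\lambda\, g_{\mu\nu}
  = \kappa^2\, T_{\mu\nu},
```

where N v_0 denotes the quantized volume. The constant λ enters the field equations exactly where a cosmological constant would, but its value is fixed by the equations of motion and the constraint rather than put into the action by hand, i.e., it appears as an integrating constant. Similarly, the unit-norm condition g^{μν} ∂_μX ∂_νX = 1 is the defining constraint of mimetic gravity, whose integrating functions mimic dark matter [21, 22].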

... the point of view adopted in this essay is to try to understand, from a mathematical perspective, how the perplexing combination of the Einstein-Hilbert action coupled with matter, with all its subtleties such as the Brout-Englert-Higgs sector, the V-A and see-saw mechanisms, etc., can emerge from a simple geometric model. The new tool is the spectral paradigm, and the new outcome is that geometry does emerge on the stage where Quantum Mechanics happens, i.e., Hilbert space and linear operators.  
The idea that group representations as operators in Hilbert space are relevant to physics is of course very familiar to every particle theorist since the work of Wigner and Bargmann. That the formalism of operators in Hilbert space encompasses the variable geometries which underlie gravity is the leitmotiv of our approach.  
In order to estimate the potential relevance of this approach to Quantum Gravity, one first needs to understand the physics underlying the problem of Quantum Gravity. There is an excellent article for this purpose: the paper [47] explains how the problem arises when one tries to apply the perturbative method (which is so successful in quantum field theory) to the Lagrangian of gravity coupled with matter. Quoting from [47]: “Quantization of gravity is inevitable because part of the metric depends upon the other fields whose quantum nature has been well established”.  
Two main points are that the presence of the other fields forces one, due to renormalization, to add higher-derivative terms of the metric to the Lagrangian, and this in turn introduces at the quantum level an inherent instability that would make the universe blow up. This instability is instantly fatal to an interacting quantum field theory. Moreover, primordial inflation prevents one from fixing the problem by discretizing space at a very small length scale. What our approach permits is to develop a “particle picture” for geometry; and a careful reading of the present paper should hopefully convince the reader that this particle picture stays very close to the inner workings of the Standard Model coupled to gravity. For now the picture is limited to the “one-particle” description, and there are deep, purely mathematical reasons to develop the many-particle picture.
(Submitted on 7 Mar 2017)


We analyze the one-loop running of the gauge couplings in the spectral Pati-Salam model that was derived in the framework of noncommutative geometry. There are a few different scenarios for the scalar particle content, determined by the precise form of the Dirac operator for the finite noncommutative space. We consider these different scenarios and establish, for all of them, unification of the Pati-Salam gauge couplings. The boundary conditions are set by the usual RG flow of the Standard Model couplings at an intermediate mass scale at which the Pati-Salam symmetry is broken.
(Submitted on 29 Jul 2015)
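The boundary-condition statement in this abstract can be illustrated with the standard Pati-Salam matching relations. This is my sketch with assumed inputs, not one of the paper's actual scenarios: run the SM couplings up to an assumed intermediate scale M_R, identify 1/α_4 = 1/α_3, and extract 1/α_2R from the hypercharge decomposition 1/α_1 = (3/5)(1/α_2R) + (2/5)(1/α_4) (GUT-normalized α_1):

```python
import math

# One-loop SM running (approximate inputs at M_Z, GUT-normalized hypercharge).
M_Z, two_pi = 91.2, 2 * math.pi
b = {"1": 41 / 10, "2L": -19 / 6, "3": -7}
inv_mz = {"1": 59.0, "2L": 29.6, "3": 8.5}

M_R = 1e13  # assumed Pati-Salam breaking scale, for illustration only
t = math.log(M_R / M_Z)
inv = {k: inv_mz[k] - b[k] / two_pi * t for k in b}

# Matching at M_R: SU(4) -> SU(3)_c x U(1)_{B-L} gives 1/alpha_4 = 1/alpha_3,
# and Y = T_3R + (B-L)/2 gives 1/alpha_1 = (3/5)/alpha_2R + (2/5)/alpha_4.
inv_alpha4 = inv["3"]
inv_alpha2R = (5 / 3) * inv["1"] - (2 / 3) * inv_alpha4

print(f"at M_R = {M_R:.0e} GeV:")
print(f"  1/alpha_4  = {inv_alpha4:.1f}")
print(f"  1/alpha_2L = {inv['2L']:.1f}")
print(f"  1/alpha_2R = {inv_alpha2R:.1f}")
```

Above M_R one would then run the three Pati-Salam couplings (α_4, α_2L, α_2R) with the beta coefficients of whichever scalar scenario is chosen and ask whether they meet, which is the exercise carried out in the paper.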
