mercredi 31 août 2016

A View to a Kill (a farewell to continuous spacetime?)

Light rays to illuminate (the geometrodynamics behind) the entangled dark body radiation of black holes  
This post will be the penultimate of my August-series with titles constantly borrowed from the James Bond films list. 
More seriously, it is also the follow-up to the last one, as announced in its ending picture caption. I try once more to convey the physical insight of the 't Hooft vision on black holes in his own words, but using texts I have not quoted yet and which are more synthetic. 

What happens to the quantum information carried by particles that enter the black hole, soon to be absorbed by the central singularity? Do they wiggle their way out again, or at least the quantum information carried along with them? When you take the existing theory and equations literally, there is no chance that they can wiggle out. Should we ignore this?  
First of all, we claim that this problem is a much more elementary and important one than what the literature suggests, and that most of the suggested cures are not well enough thought through. What should be done, is to stretch as much as is possible what one can do using standard, mainstream physics. Then, finally, impose the demand that black holes should behave, as much as possible, as ordinary forms of matter. When doing this properly, this leads to amazing observations.
Only few people noted that the application of existing physical knowledge can bring us much further than any wild speculation. Only recently we discovered a way to calculate things that could have been applied decades earlier. Our observation was [22] that we can use a partial wave expansion to describe energy and momentum entering the black hole, and that, applying Einstein’s equations, the information carried in by each partial wave is carried out again, in a partial wave with the same values of the quantum numbers ℓ and m 
This calculation seems to be as elementary as the calculation of the spectrum of a hydrogen atom using quantum mechanics. As in the hydrogen atom, the boundary conditions are of crucial importance, but in a black hole, these boundary conditions are quite counter intuitive 
What this calculation does, is to expose our problem more clearly than ever before, and now we can see what the answers must be. The general coordinate transformation relating the “inside” of a black hole with what an outside observer sees, must be a topologically nontrivial one. [23,24,25] Only then do the equations make sense.  
This does remarkable things with the fabric of space and time itself, not noticed before. For one thing, space and time must be discrete.
Gerard ’t Hooft (10 June 2016)

For the quantization of gravity, it is crucial to understand the role of (real as well as virtual) black holes. The region in the immediate vicinity of the horizon, is characterised by the fact that time translations are substituted by Lorentz transformations. This means that observables near the horizon are undergoing unlimited processes of Lorentz contraction. 
Light rays are essential for defining the exact location of a black hole horizon: it is the boundary that separates regions from which escape to the outside universe by light rays is still possible, from the domain where all light rays are trapped. We shall now show how to use light rays to describe the backreaction of Hawking particles upon the presence of matter entering the black hole. 
The density matrix, that is, the probability distribution, of Hawking particles basically only depends on mass, charge and angular momentum of the black hole, but the actual configuration of the out-going particles characterises the microstate of the black hole, of which we have a large number (≈ e^(4πM²) in natural units) of distinct elements [The number 4πM², corresponding to the black hole entropy, also represents the total number of Hawking particles emitted by a black hole in its life time [14]]. Whenever a particle enters the hole from outside, transitions to different microstates take place. This happens because a particle entering a black hole interacts with what comes out. The most disruptive interaction is the gravitational one, in spite of its apparent weakness... 
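To get a feel for the size of that microstate count, here is a quick back-of-the-envelope computation; the solar and Planck mass values are standard reference values, and expressing M in Planck masses is what "natural units" means in the quote. This is my own sketch, not part of the quoted paper.

```python
import math

M_sun = 1.989e30          # kg, solar mass
M_planck = 2.176434e-8    # kg, Planck mass
M = M_sun / M_planck      # solar mass in Planck masses, ~9.1e37

# Bekenstein-Hawking entropy in units of k_B, and the microstate count
S = 4 * math.pi * M**2
print(f"S = 4*pi*M^2 ~ {S:.2e}")                      # ~1e77
print(f"number of microstates ~ e^(4*pi*M^2) = e^{S:.2e}")
```

The famous "ten to the seventy-seven" entropy of a solar-mass black hole drops out immediately; the number of distinct microstates is the exponential of that.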
The gravity field surrounding a particle causes a very slight Shapiro effect: a passing light beam is slowed down and dragged towards the particle [15]. For a particle at rest, this effect is very small, but when the particle is Lorentz boosted, its gravity field increases in strength. At the black hole horizon, this Lorentz boost enhances its energy exponentially in time. 
Let pµin(Ω) be the momentum distribution of the in-going particles at the spot Ω=(θ,φ) on the horizon, and δxµout(Ω) the Shapiro displacement for the Hawking particles going out. An elementary calculation shows that 
δxµout(Ω)=8πG ∫d2Ω′f(Ω,Ω′)pµin(Ω'),     (4.1)
where f(Ω, Ω′) is a Green function, defined by ∆S f(Ω,Ω′) = −δ²(Ω,Ω′), and here, the angular operator ∆S is defined by
∆S = ∆ − 1 = −ℓ(ℓ+1) − 1 .                          (4.2) 
Often, we ignore the −1. It was further assumed that the momentum pµin is dominated by the radial component pin, so that, also, its derivatives w.r.t. θ and φ are small. Thus, whatever the positions xµout(Ω) of the Hawking particles in the original microstate were, they are now replaced by the displaced positions, and since the particles emerge with a trajectory that separates itself from the horizon exponentially in Schwarzschild time, the significance of this displacement also increases exponentially in time. This is how information regarding ingoing particles is imprinted in the out-going ones. In Ref [17], a formal expression for the ensuing unitary evolution matrix is derived, to be summarised in the next section. One important feature is seen to emerge: the effect is purely geometrical. Only the momentum distribution pµin(Ω) (mainly the minus component) enters. Consequently, unitarity of this evolution (scattering) matrix implies that these in-going particles are to be entirely characterised by their momenta...  
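A minimal numerical sketch of the statement above: by Eqs. (4.1)–(4.2), each spherical partial wave (ℓ, m) of the in-going momentum is mapped onto an out-going displacement in the same (ℓ, m) mode, with a per-mode transfer factor 8πG/(ℓ(ℓ+1)+1) obtained by inverting ∆S. The natural units G = 1 and the sample mode amplitudes are assumptions of this sketch, not values from the paper.

```python
import math

G = 1.0  # Newton's constant in natural units (assumption of this sketch)

def transfer_factor(l):
    """Per-mode factor 8*pi*G / (l*(l+1) + 1), from inverting the angular
    operator Delta_S = Delta - 1 = -l(l+1) - 1 of Eq. (4.2)."""
    return 8.0 * math.pi * G / (l * (l + 1) + 1)

# hypothetical in-going momentum amplitudes p^-_{lm} for a few modes
p_in = {(0, 0): 0.5, (1, 0): 0.2, (2, 1): 0.1}

# the information comes out again in the SAME (l, m) partial wave,
# as a displacement of the out-going Hawking particles, Eq. (4.1)
dx_out = {lm: transfer_factor(lm[0]) * p for lm, p in p_in.items()}
for (l, m), dx in sorted(dx_out.items()):
    print(f"l={l}, m={m}: delta_x_out = {dx:.4f}")
```

The point of the exercise: the map is diagonal in (ℓ, m), which is why each partial wave can be treated as an independent unitary channel.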
The question is now, how to deduce the black hole microstates from this evolution matrix. Let us simplify the situation a bit by replacing the Schwarzschild black hole metric by Rindler space. This implies that the angular coordinates Ω=(θ,φ) are replaced by two transverse, flat coordinates x=(x,y).... The Green function f(Ω,Ω′) becomes 
f(x,x′) = −(1/2π) log(|x − x′|).                    (5.1) 
Reserving the symbol z for the radial component of the positions, we write the displacement (4.1) due to the Shapiro shift as 
δz−out(x′) = −4G δp− log(|x − x′|/C).   (5.2) 
The constant C will soon drop out. Assume now that a black hole produced by one given initial state |in0⟩, upon its final explosion, leads to a given final state |out0⟩. We then calculate the final state when a slight modification is brought about to the state |in0⟩. Let the modification consist of adding one light particle with momentum δp− entering the Rindler horizon at the transverse position x. All particles at the transverse position x′ in the final state |out⟩ are then dragged along such that their out-coordinate z− is modified by an amount given by Eq. (5.2). 
We write this modification as a property of the black hole scattering matrix:  
S|in0⟩ = |out0⟩  →  S|in0 + δp−(x)⟩ = exp(−i ∫d2x′ p+out(x′) δz−(x′)) |out0⟩,     (5.3)
... now we can reach any other initial state |in⟩, when described by the distribution of the total momentum going in, p−tot(x) (as compared to the original initial state |in0⟩), to find the new final state |out⟩ as a displacement of the original final state |out0⟩: 
⟨out|S|in⟩ = ⟨out0|S|in0⟩ exp[4iG ∫d2x d2x′ log(|x′ − x|/C) p+out(x′) p−in(x)].        (5.4) 
Note that, here, the operators p−in(x) and p+out(x′) both describe the total momenta of all in- and out-going particles as distributions on the Rindler horizon. The important step to be taken now is to postulate that the entire Hilbert space of the in-particles is spanned by the function p−in(x), and the black hole scattering matrix maps that Hilbert space onto the space of all particles going out, spanned by the function p+out(x′). We arrive at the unitary scattering matrix S : 
⟨p+out|S|p−in⟩ = N exp[4iG ∫d2x d2x′ log(|x′ − x|/C) p+out(x′) p−in(x)].              (5.5) 
This procedure leads to one common factor N, which now can be fixed (apart from an insignificant overall phase) by demanding unitarity... From Eq. (5.3) and the expression (5.5) for the scattering matrix, we can now deduce the relations and commutation rules between the operators p−in(x), p+out(x′), z+in(x) and z−out(x′), where the latter can be regarded as the coordinates of in- and out-going particles relative to the Rindler horizon... The algebra ... is quite different from the usual Fock space algebra for the elementary particles. In fact, it resembles a bit more the algebra of excited states of a closed string theory, but even that is not the same...
{As the operators z± and p± form a linear set, one can decompose these new physical degrees of freedom into eigenmodes through decoupled k-waves in Rindler space. This diagonalisation provides in particular a pair of equations that can be seen as boundary conditions on the horizon, implying that the waves moving in are transformed into waves going out (these waves are not to be interpreted as single particles, as each may consist of many of them). This can be interpreted} as a real bounce against the horizon. The information is passed on from the in-going to the out-going particles. We do emphasise that in- and out-going particles were not assumed to affect the metric of the horizon, which is fine as long as they do not pass by one another at distances comparable to the Planck length or shorter; in that case, the gravitational effect of the transverse momenta must be taken into account. For the rest, no other assumptions have been made than that the longitudinal components of the gravitational fields of in- and out-going particles should not be ignored. This must be accurate as long as we keep the transverse distances on the horizon large compared to the Planck length. 
It is also important to emphasise that, even though we describe modes of in-going matter that “bounce back against the horizon”, these bounces only refer to the information our particles are carrying, while the particles will continue their way falling inwards as seen by a co-moving observer. In accordance with the notion of Black Hole Complementarity [20], an observer falling in only sees matter going in all the way, and nothing of the Hawking matter being re-emitted, since that is seen as pure vacuum by this observer. Rather than stating that this would violate no-cloning theorems, we believe that this situation is asking for a more delicate quantum formalism...
One could try to compute the black hole entropy from the contributions of these reflecting modes. For each mode, the result is finite... Using the thermodynamical equations ... one can derive the contribution of each mode with transverse wave number k to the total entropy... The expression we obtained must now be summed over the values k. If we take these to describe a finite part of the black hole horizon area, we see that the summed expression will be proportional to the area, as expected, but the sum diverges quadratically for large k... The explanation for this divergence is that... our expressions are inaccurate at very large k, where transverse gravitational forces should be taken into account 
It is not easy to correct for this shortcoming, but we can guess how one ought to proceed. It was remarked already in Refs. [18], that the algebraic expressions we obtain on the 2-dimensional horizon, take the form of functional integrals very much resembling those of string theory. We did treat the transverse position variables x and wave number variables k very differently from the longitudinal variables z± and p±, but it is clear that we are dealing with the full expressions of an S2 sphere. This sphere should be given two arbitrary coordinates σ=(σ1,σ2), after which these should be fixed by a gauge condition relating them to the transverse coordinates x. We took σ=x, but apparently this fails when the longitudinal variables fluctuate too wildly. As long as a more precise procedure has not been found, we can simply insert a cut-off for the transverse wave numbers k, or equivalently, the angular momentum quantum numbers ℓ for a finite-size black hole. The divergence is quadratic. The expected expression, Hawking’s entropy S=4πM², comes about if we introduce a sharp cut-off at |k|≈MPlanck. Such wave number cut-offs imply that the conjugate variables, x, or equivalently, Ω=(θ,φ), form a discrete lattice [21]... 
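The quadratic divergence of the mode sum is easy to reproduce numerically: counting transverse lattice modes with |k| < Λ, each carrying a constant entropy contribution, the total grows like Λ². The constant per-mode entropy s0 is a placeholder assumption of this sketch; only the scaling with the cut-off is the point.

```python
def entropy_sum(cutoff, s0=1.0):
    """Sum a constant per-mode entropy s0 over integer lattice modes
    (kx, ky) with 0 < kx^2 + ky^2 < cutoff^2 (zero mode excluded)."""
    total = 0.0
    kmax = int(cutoff)
    for kx in range(-kmax, kmax + 1):
        for ky in range(-kmax, kmax + 1):
            if 0 < kx * kx + ky * ky < cutoff ** 2:
                total += s0
    return total

# doubling the cut-off roughly quadruples the entropy: quadratic divergence
for cutoff in (20, 40, 80):
    print(cutoff, entropy_sum(cutoff))
```

A sharp cut-off at |k| ≈ MPlanck then turns this divergence into the finite area law, which is how the quote recovers S = 4πM².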
Note, that we did not apply second quantization, such as in Ref. [17], since now we are not dealing with a quantum field theory. At every value of k, there are exactly two wave functions Ψ...(one at each side of the horizon, which mix). Second quantization would fail here, since the microstates appear to be determined by a single function such as p...
It was observed that the in-going and out-going particles with which we started, produce vertex insertions as in a string world sheet, as if all particles considered should be regarded as closed string loops. It all takes the form of a string theory. Strings were not put in, however, rather, they come out as inevitable objects! But beware, these are not “ordinary” strings. The black hole horizon is the string world sheet [18]. If ordinary strings were to be Wick rotated to form space-like string world sheets, all factors i would disappear from the action, whereas our expressions are still in the complex plane, as if the string slope parameter α′ should have the purely imaginary value 4Gi. In most string treatments of black holes, the string world sheets are assumed to be in the longitudinal direction, that is, the world sheets are taken to be orthogonal or dual to the horizon.  
Our analysis appears to be closely related to ideas using the BMS approach [22], although there, the emphasis seems to be especially on the in- and out-going gravitational waves, while we focus on all particle types entering or leaving the black hole. Secondly, ... we attribute the black hole properties to the immediate surroundings of the future and past event horizon. Also, one may note that both approaches now focus on light-like geodesics, which justifies attempts to employ conformally invariant (or covariant) descriptions of quantum gravity.

What is Nature's required bookkeeping system?
A (last) mid-summer night's dream (or day's homework) could (have) be(en) to (start to) harness its potential for the construction of the proper geometric framework for quantum gravity. 
I am amazed by the physical intuition of 't Hooft that shines through his prospective quantum modelling of black hole dynamics. I must confess I am also intrigued by the vague but pushy analogy one might tentatively establish between the physicist's computing scheme outlined in bold characters in the text above and the recent new degrees of freedom devised by spectral noncommutative geometers to describe tentative quanta of geometry. Since the quantum information of incoming matter is supposed to impinge on the outgoing one only through a momentum operator, while two position operators are required to describe outgoing matter in a unitary way, could it be that a set made of a Dirac operator D and two Feynman-slashed position operators Y and Y′, as defined by noncommutative geometry, is Nature's bookkeeping system to track the entangled spacetime-matter dynamics of black hole horizons?

Laws of gravity, as they are known today, then suggest that all forms of matter will be geometric: the way they affect the curvature of space and time is the only form of information that is conserved [5]–[7]. We think we drew the important conclusion that observations of the sort mentioned here, will be the only way to reconcile finiteness of the degrees of freedom with an unbounded group of local Lorentz transformations. This is extremely important, if true. It means that Fock space will not be the appropriate language; rather, we get something that resembles a bit more string theory, which is also basically geometric. String theories known today, however, seem not yet to be based on very sound book keeping... 
The demand from string theory that space and time themselves must feature either 10 or 26 dimensions, however, seems to be too restrictive. If indeed, as we suspect, physical degrees of freedom form discrete sets, then dimensionality of space and time may be less well-defined notions, so that such ‘predictions’ from string theory may well be due to some mathematical idealisation having little to do with reality. All in all, we are badly in need of a more orderly listing of all conceivable configurations of physical variables in a small region of space and time... 
String theory was an interesting guess, but may well have been a too wild one. We are guessing the mathematical structures that are likely to play a role in the future, but we fall short on grasping their internal physical coherence and meaning. For this, more patience is needed.
(Submitted on 29 Apr 2016)

Comment: the scrutinising reader will notice that in the first text of 't Hooft I quote today I chose not to select his commentary about noncommutative geometry. This is because I think it addresses specifically the noncommutative field theory program or, to say it more geometrically, it refers to the construction of noncommutative spaces via deformation of commutative algebras, while I am interested in this post (and in this blog) in chronicling advances in the isospectral deformation of classical manifolds, what I usually call the spectral noncommutative geometric program (see chap 19 in this review for details).

To (almost) conclude this 2016 August-series
There is real adventure to be had at a time in which pure mathematics, theoretical physics, astronomy, philosophy and experiment are all coming together in a manner unseen for almost a hundred years. You probably have to go back still further to the seventeenth-century Scientific Revolution to get a full sense of what I mean, to the era of ‘natural philosophy’, Copernicus, Galileo, Newton. I do sometimes hear some of my physicist colleagues lamenting the old days when physics was young, when there was not a vast mountain of theory or technique that a young researcher had to climb before even starting out... actually we are at the birth of a new such era right now.
On Space and Time, Shahn Majid, 2007

Now it seems that the empirical notions on which the metric determinations of Space are based, the concept of a solid body and that of a light ray, lose their validity in the infinitely small; it is therefore quite definitely conceivable that the metric relations of Space in the infinitely small do not conform to the hypotheses of geometry; and in fact one ought to assume this as soon as it permits a simpler way of explaining phenomena. The question of the validity of the hypotheses of geometry in the infinitely small is connected with the question of the basis for the metric relations of space. In connection with this question, which may indeed still be ranked as part of the study of Space, the above remark is applicable, that in a discrete manifold the principle of metric relations is already contained in the concept of the manifold, but in a continuous one it must come from something else. Therefore, either the reality underlying Space must form a discrete manifold, or the basis for the metric relations must be sought outside it, in binding forces acting upon it. An answer to these questions can be found only by starting from that conception of phenomena which has hitherto been approved by experience, for which Newton laid the foundation, and gradually modifying it under the compulsion of facts which cannot be explained by it. Investigations like the one just made, which begin from general concepts, can serve only to insure that this work is not hindered by too restricted concepts, and that progress in comprehending the connection of things is not obstructed by traditional prejudices. This leads us away into the domain of another science, the realm of physics, into which the nature of the present occasion does not allow us to enter.
Bernhard Riemann (10th June 1853)

mardi 30 août 2016

(How not to be afraid of the wormhole) Octopussy (or Medusa)

//The title of this post has been slightly edited on 1st September 2016

Octopus wormhole (or wormhole octopussy?) 
(Olena Shmahalo/Quanta Magazine)
The ER = EPR idea posits that entangled particles inside and outside of a black hole’s event horizon are connected via wormholes.

"The story of Medusa tells you she destroys those who contemplate only her reflection, which is of course horrible, instead of contemplating her real face, which is the face of wholeness" 
Patrick Moran, Swimming in Stone

Contemplating quantum gravity in its full nonlocal glory... without fearing for local burns at the black hole horizon?

The modern representation of Karl Schwarzschild’s spherically symmetric solution of Einstein’s equations reads

ds2 = −(1 − 2M/r) dt2 + dr2/(1 − 2M/r) + r2(dθ2 + sin2θ dϕ2) . (1.1) 
[Note: In Schwarzschild’s original work [1], the coordinate r in Eq. (1.1) was called R, while he chose another radial coordinate r such that the point R=2M corresponds to r=0 , since it seemed to be obvious to expect a singular mass distribution at the origin of the coordinate frame. Today, we know that this was unnecessary, for two reasons: first, one is free to choose the most convenient coordinate system anyway, and secondly, the surface r=2M does not represent a physical singularity at all, but just a coordinate singularity, much like the north pole of the Earth. It is the black hole horizon.]
As we now know very well, matter can enter the black hole through the horizon, defined by the surface r=2M , while in the standard, unquantised theory, nothing can emerge out of it. The horizon is a one way door. [Note: On some web pages, these facts are still being disputed, which we can only attribute to ignorance. Schwarzschild, who wrote his paper in less than two months after Einstein’s discovery, could be excused for not immediately realising the rather subtle features of black hole horizons, which required several years to be cleared up, but today’s experts cannot afford to make such mistakes.] In the coordinates of Eq. (1.1), the point r = 0 is a real physical singularity. 
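One way to make the coordinate-vs-physical distinction concrete is to compare the diverging metric component g_rr with a curvature invariant: the Kretschmann scalar K = 48M²/r⁶ of the Schwarzschild solution stays finite at r = 2M and blows up only at r = 0. A minimal sketch of this, in natural units G = c = 1 (an assumption of the sketch):

```python
M = 1.0  # black hole mass in natural units G = c = 1 (assumption)

def g_rr(r):
    """Radial metric component 1/(1 - 2M/r) of Eq. (1.1)."""
    return 1.0 / (1.0 - 2.0 * M / r)

def kretschmann(r):
    """Curvature invariant K = 48 M^2 / r^6 of the Schwarzschild metric."""
    return 48.0 * M**2 / r**6

print(g_rr(2.0001 * M))      # huge just outside r = 2M: coordinate singularity
print(kretschmann(2.0 * M))  # 0.75: perfectly finite at the horizon
print(kretschmann(1e-3))     # grows without bound only as r -> 0
```

The metric component can be made regular at r = 2M by a better choice of coordinates (Kruskal–Szekeres, for instance), while no coordinate change can tame the r = 0 divergence of K.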
Even though the horizon appears to be a regular region of space-time, we do have a problem with it. According to Hawking’s well-known result [2], it is due to vacuum fluctuations that a distant observer will observe particles leaving the black hole: Hawking radiation. These particles have a thermal spectrum, independent of the black hole formation process.
Hawking’s original conclusion was that this result must imply that a black hole as a physical object violates the laws of quantum mechanics: even if it originates from matter in a single quantum state, it ends up in a thermal, that is, a quantum mechanically mixed state. How could it be that a derivation that uses quantum mechanics can yield a result violating the laws of this theory? Hawking particles are now understood to be formed at the horizon, not, as was originally thought, somewhere near the r=0 singularity in its past...
(Submitted on 13 Nov 2015)

Hawking particles emitted by a black hole are usually found to have thermal spectra, if not exactly, then by a very good approximation. Here, we argue differently. It was discovered that spherical partial waves of in-going and out-going matter can be described by unitary evolution operators independently, which allows for studies of space-time properties that were not possible before. Unitarity dictates space-time, as seen by a distant observer, to be topologically non-trivial. Consequently, Hawking particles are only locally thermal, but globally not: we explain why Hawking particles emerging from one hemisphere of a black hole must be 100 % entangled with the Hawking particles emerging from the other hemisphere. This produces exclusively pure quantum states evolving in a unitary manner, and removes the interior region for the outside observer, while it still completely agrees locally with the laws of general relativity. Unitarity is a starting point; no other assumptions are made. Region I and the diametrically opposite region II of the Penrose diagram represent antipodal points in a PT or CPT relation, as was suggested before. On the horizon itself, antipodal points are identified.
(Submitted on 14 Jan 2016 (v1), last revised 14 Apr 2016 (this version, v4))

To make a long story short: 't Hooft proposes to solve the information loss paradox of quantum evaporating black holes by arguing that a pure state of collapsing matter forming a black hole does indeed evaporate through the radiation of Hawking particles out of the horizon, but that globally this radiation is not thermal: it is made of entangled pairs of Hawking particles emitted from antipodal points on the two hemispheres of the horizon, thus restoring the unitary evolution of black hole evaporation.
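The antipodal identification itself is simple to write down: a horizon point Ω = (θ, φ) is paired with (π − θ, φ + π). A small sketch (my own, assuming this standard antipodal map) checking that the map is an involution with no fixed points, which is exactly what is needed to pair each Hawking particle with a distinct entangled partner:

```python
import math

def antipode(theta, phi):
    """Antipodal point of Omega = (theta, phi) on the horizon sphere."""
    return math.pi - theta, (phi + math.pi) % (2 * math.pi)

# the map is an involution: applying it twice returns the original point
theta, phi = 0.7, 1.9
t2, p2 = antipode(*antipode(theta, phi))
print(abs(t2 - theta) < 1e-12 and abs(p2 - phi) < 1e-12)  # True

# and it has no fixed points: even on the equator (theta = pi/2),
# the azimuth is shifted by pi, so no point is paired with itself
print(antipode(math.pi / 2, 0.0) == (math.pi / 2, 0.0))   # False
```

Having no fixed points is what makes the identification consistent on the whole horizon sphere, unlike, say, a reflection through the equator.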

Another speculative way to "feel" quantum gravity with just a snow ball's chance in hell against quantum fluctuations
The following work I am going to highlight now might be based on less solid ground than (and potentially be incompatible with) the 't Hooft work, but I find it an evocative enough outline of tentative ideas about quantum gravity to be worth a quote here:

Here we use a thought experiment, based on a comparison of Hawking radiation with Unruh radiation, to show that these two quantum phenomena imply a small violation of the equivalence principle. The manner in which the equivalence principle is violated by the comparison of these two effects might point toward a resolution of some of the shortcomings of general relativity, such as the existence of singularities for certain space-times and the difficulty in formulating a quantum version of general relativity...
Hawking radiation [8] is the thermal radiation emitted by a black hole of mass, M. It occurs as a consequence of placing quantum fields in the gravitational background of a black hole. An observer who stays at a fixed distance, R, from a black hole of mass, M, will measure a temperature given by [9]
THawking = ħc3/[8πGMkB √(1−2GM/c2R)]
Normally, the Hawking temperature THawking is quoted for an observer a large distance from the black hole (i.e. R → ∞) so the blue shift factor √(1−2GM/c2R) is not written down. It is kept here since it is crucial for seeing how the equivalence principle is violated. 
By the equivalence principle an observer, accelerating through flat, Minkowski spacetime, should also measure thermal radiation. Otherwise the observer could immediately tell the difference between a gravitational field and an accelerating frame – the accelerating frame would be the one in which no thermal radiation is detected. Soon after Hawking’s original paper on black hole radiation it was shown that an accelerating observer (with an acceleration of a = |a|) does detect thermal radiation with a temperature given by 
TUnruh = ħa/(2πckB)

This radiation and TUnruh are known as Unruh radiation and the Unruh temperature respectively [10]. Thus, at least qualitatively, there is no violation of the equivalence principle – an observer in an Einstein elevator fixed at a distance, R, from a black hole will measure both a downward acceleration toward the floor of the elevator and thermal radiation at a temperature THawking ; an observer in an Einstein elevator which is accelerating through flat, Minkowski space-time will also measure both a downward acceleration and thermal radiation. However, looking at this situation quantitatively uncovers a violation of the equivalence principle except in the limit as the observer approaches the event horizon... 
We now ask “What are the possible implications, for gravity, of this violation of the equivalence principle from the above thought experiment?”. In addressing this question we will assume that the strength of the gravitational effects are proportional to or connected with the Hawking temperature and the inertial effects are proportional to or connected with the Unruh temperature. In particular we assume the ratio of the gravitational and inertial masses are connected with the ratio of the Hawking temperature to the Unruh temperature. This assumption is not trivial since the violation of the equivalence principle discussed above deals with the Einstein elevator formulation of the equivalence principle while the distinction between gravitational and inertial masses is a different formulation of the equivalence principle. 
First we look in the limit R ≫ 2GM/c2. Here the gravitational effects dominate the inertial effects since 
THawking → ħc3/[8πGMkB]  ≫  TUnruh → ħ/(2πckB) × (GM/R2)
For example, taking M = MSun ... and R = 1AU ... yields THawking ≈6.2×10−8K and TUnruh≈2.4×10−23K. While THawking is 15 orders of magnitude larger than TUnruh both temperatures are effectively zero when compared to something like the 2.7K cosmic microwave background. We take this as an indication that the violation of equivalence principle, implied by the difference in Hawking and Unruh temperatures for the same accelerations, is a very small effect under normal conditions i.e. low energy density, low gravitational field strength, non-relativistic velocity. Nevertheless, the implication would be that the variation in gravitational mass is slightly larger than the variation in inertial mass. This might have some bearing on the rotation curves of galaxies. The anomalous velocity profile of outlying stars orbiting the galactic center is usually explained by an enhancement of the gravitational force coming from the presence of dark matter. Here, the enhancement of gravity over inertia would come from the slight dominance of gravitational mass over inertial mass at large distances from the galactic center. 
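The two temperatures quoted above are easy to check; a short computation with standard SI constants (my own sketch, not the authors' code):

```python
import math

hbar = 1.054571817e-34  # J s
c    = 2.99792458e8     # m/s
G    = 6.67430e-11      # m^3 kg^-1 s^-2
kB   = 1.380649e-23     # J/K
M    = 1.989e30         # kg, solar mass
R    = 1.496e11         # m, 1 AU

# far-field Hawking temperature (the blueshift factor is ~1 at R = 1 AU)
T_hawking = hbar * c**3 / (8 * math.pi * G * M * kB)

# Unruh temperature for the Newtonian acceleration a = GM/R^2 at 1 AU
a = G * M / R**2
T_unruh = hbar * a / (2 * math.pi * c * kB)

print(f"T_Hawking ~ {T_hawking:.2e} K")  # ~6.2e-08 K
print(f"T_Unruh   ~ {T_unruh:.2e} K")    # ~2.4e-23 K
```

Both values land on the numbers quoted in the paper, about 15 orders of magnitude apart, and both are utterly negligible next to the 2.7 K microwave background.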
 Second we look in the near horizon limit R → 2GM/c2 and find 
 THawking → TUnruh
Thus, in the near horizon region the equivalence principle is restored. (Note that exactly at the horizon both temperatures diverge to the same infinite value due to the blue shift factor. This is as expected since for an observer fixed just above the horizon the local acceleration and Hawking temperature both diverge). This is surprising. One might have guessed that in a region of stronger gravitational field strength, such as near the horizon versus far from the horizon, the violation of the equivalence principle would be more pronounced; that the divergence between quantum mechanics (as represented by Hawking and Unruh radiation) and general relativity would be larger. The fact that this is not the case might be taken as an indication that general relativity and quantum mechanics are more compatible, not less, as the strength of the gravitational field increases...
In the near horizon limit R→2GM/c2 our arguments pick out the event horizon as special – it is the surface where the (local) equivalence principle on which general relativity is based, becomes compatible with (non-local) quantum mechanics as represented by Hawking and Unruh radiation. 
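Under the further assumption that the Unruh formula is fed the static observer's proper acceleration a = GM/(R²√(1 − 2GM/c²R)), while the Hawking temperature carries the same blueshift factor, the constants cancel and the ratio THawking/TUnruh reduces to (R/rs)², which indeed tends to 1 at the horizon. A sketch of this (my own reading of the argument, not the authors' code):

```python
import math

G, c = 6.67430e-11, 2.99792458e8
M = 1.989e30                       # kg, solar mass (example value)
rs = 2 * G * M / c**2              # Schwarzschild radius, ~2.95 km

def temp_ratio(R):
    """T_Hawking(R) / T_Unruh(R); the hbar/kB factors cancel in the ratio."""
    blueshift = math.sqrt(1.0 - rs / R)
    T_h = c**3 / (8 * math.pi * G * M) / blueshift   # blueshifted Hawking T
    a = G * M / (R**2 * blueshift)                   # proper acceleration
    T_u = a / (2 * math.pi * c)                      # Unruh T for that a
    return T_h / T_u

print(f"{temp_ratio(1.496e11):.2e}")    # ~2.6e15 at R = 1 AU
print(f"{temp_ratio(1.001 * rs):.4f}")  # ~1.0020 just outside the horizon
```

The blueshift factors drop out of the ratio, leaving (R/rs)² exactly, so the equivalence principle violation shrinks smoothly to zero as the observer approaches r = rs.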
Finally, behind the horizon (i.e. R<2GM/c2  ) the expressions for Hawking temperature and Unruh temperature break down, and we continue our journey inward based on the following conjecture: Outside the horizon (R>2GM/c2  ) gravitational effects dominate inertial effects; near the horizon ... gravitational effects and inertial effects are equivalent; thus we postulate that inside the horizon ... inertial effects dominate gravitational effects. As R→0 inertial effects will become ever more dominant over gravitational effects... one can say that ... any ... material that has fallen to this point inside the horizon, is “frozen” and non-dynamical, since the inertial mass of the material will have increased to the point where further motion is impossible. This postulated transition of gravity, under conditions of high mass-energy density, to a non-dynamical theory also might have relevance for the difficulty of consistently calculating quantum corrections to the gravitational field i.e. the long standing and unresolved problem of formulating a quantum theory of gravity...
Black holes are often described as a scientific version of Hell – a place of extreme conditions which is inhospitable to any person who falls inside. Taking into account the above postulated picture, if a black hole is to be compared to Hell then it would be the Hell of Dante’s Inferno, the center of which is frozen, rather than the traditional hot Hell of fire and brimstone
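To make the quoted near-horizon claim concrete, here is a little numerical sketch of mine (textbook Schwarzschild formulas, not code from the quoted authors): the Tolman blue-shifted Hawking temperature and the Unruh temperature of a static observer both diverge at the horizon, and their ratio tends to one there.

```python
# Sketch (my own illustration): compare the locally measured (Tolman
# blue-shifted) Hawking temperature with the Unruh temperature of a
# static observer hovering at radius R outside a Schwarzschild black
# hole. Both diverge at the horizon, with ratio -> 1.
import math

G, c, hbar, kB = 6.674e-11, 2.998e8, 1.055e-34, 1.381e-23
M = 10 * 1.989e30                      # 10 solar masses (arbitrary choice)
rs = 2 * G * M / c**2                  # Schwarzschild radius

def temperatures(R):
    f = math.sqrt(1 - rs / R)          # redshift factor
    T_hawking_local = hbar * c**3 / (8 * math.pi * G * M * kB) / f
    a = G * M / (R**2 * f)             # proper acceleration of the static observer
    T_unruh = hbar * a / (2 * math.pi * c * kB)
    return T_hawking_local, T_unruh

for x in (100.0, 2.0, 1.001):          # R in units of rs
    Th, Tu = temperatures(x * rs)
    print(f"R = {x:7.3f} rs   T_H,loc / T_U = {Th/Tu:.4f}")
```

The ratio works out analytically to (R/rs)², so it exceeds one everywhere outside the horizon and tends to one exactly at it, which is the convergence the quoted authors read as a restoration of the equivalence principle.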

A radiolarian (skeleton) with a bilateral symmetry may mimic the radiating horizon of a black hole with its antipodal entangled Hawking particles better than an octopussy ;-) Its lattice structure may also fit pretty well the discreteness of spacetime that emerges from the quantum gravity effects at the black hole horizon, but this is another story for a later post...

lundi 29 août 2016

(Reflections on and off a) GoldenEye (or rather black hole horizon)

Imaging a Black Hole. At left is a model image for Sgr A* using a semi-analytic accretion flow (Broderick et al. 2011). Light is gravitationally lensed by the black hole to form a distinctive “ring” encircling the black hole’s “shadow” (Falcke et al. 2000). The ring diameter is ~5 Schwarzschild radii. The image is bright on the approaching side of the accretion disk and faint on the receding side because of Doppler effects. At right, a sample image shows expected EHT performance in 2017-2018 (Fish, Johnson, et al. 2014). (Source:

The Event Horizon Telescope is an international collaboration to create a worldwide very long baseline interferometry array observing at 1.3-millimeter wavelength. When the EHT is complete, it will be able to make images of black holes with a resolution of 10-20 microarcseconds. This resolution is fine enough to resolve the event horizons of the supermassive black holes in the center of our own Milky Way and of M87. You can see the official EHT website here; a Scientific American blog, written by Seth Fletcher, provides many great articles here. For a lot of great answers to common questions about the EHT, check out this reddit AMA.

Michael D. Johnson, Scintillating Astronomy
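To get a feeling for the orders of magnitude quoted above, one can redo the diffraction-limit arithmetic in a few lines. The Sgr A* mass and distance below are my own rough round numbers, not the collaboration's values:

```python
# Back-of-envelope check: diffraction-limited resolution lambda/D of an
# Earth-sized interferometer at 1.3 mm, versus the apparent size of
# Sgr A*'s photon ring (~5 Schwarzschild radii across, as in the caption).
import math

RAD_TO_UAS = 180 / math.pi * 3600 * 1e6   # radians -> microarcseconds

wavelength = 1.3e-3                       # m, EHT observing wavelength
baseline = 1.27e7                         # m, ~Earth diameter
resolution = wavelength / baseline * RAD_TO_UAS
print(f"EHT resolution ~ {resolution:.0f} microarcsec")

G, c = 6.674e-11, 2.998e8
M = 4.3e6 * 1.989e30                      # Sgr A* mass ~4.3e6 Msun (assumed)
D = 8.0 * 3.086e19                        # distance ~8 kpc in metres (assumed)
rs = 2 * G * M / c**2
ring = 5 * rs / D * RAD_TO_UAS            # angular diameter of the ring
print(f"Sgr A* ring diameter ~ {ring:.0f} microarcsec")
```

The ring (~50 μas) spans a few resolution elements (~20 μas), which is what makes the imaging programme feasible at all.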

Great or temperate quantum gravity forecasts

Quantum information transfer necessary to reconcile black hole evaporation with quantum mechanics, while approximately preserving regular near-horizon geometry, can be simply parameterized in terms of couplings of the black hole internal state to quantum fields of the black hole atmosphere. The necessity of transferring sufficient information for unitarization sets the strengths of these couplings. Such couplings via the stress tensor offer apparently significant advantages, and behave like quantum fluctuations of the effective metric near the horizon. At the requisite strength, these fluctuations, while soft (low energy/momentum), have significant magnitude, and so can deflect near-horizon geodesics that span distances of order the black hole radius. Thus, the presence of such couplings can result in effects that could be detected or constrained by observation: disruption of near-horizon accretion flows, scintillation of light passing close to the black hole, and alteration of gravitational wave emission from inspirals. These effects could in particular distort features of Sgr A* expected to be observed, e.g., by the Event Horizon Telescope, such as the black hole shadow and photon ring.
(Submitted on 26 Jun 2014 (v1), last revised 28 Oct 2014 (this version, v3))

Classical collapse models suggest that a sufficiently massive self-gravitating system will undergo continued collapse until a singularity forms. In 1975, Hawking [1] pointed out that if an event horizon forms and if effective field theory is valid away from a stretched horizon, then radiation from the black hole is produced in a mixed state from the point of view of the observer who remains outside the black hole provided that the freely falling observer detects nothing unusual (“no drama”) while crossing the horizon. Under these conditions information is lost if the black hole evaporates completely, which violates unitarity and led Hawking to propose that quantum mechanics should be modified [2] (he has since changed his mind). To preserve unitarity in quantum mechanics one of two possibilities must be true: (a) Hawking radiation is in fact pure or (b) the evaporation leaves behind a long lived remnant, which preserves all the information that collapsed into the black hole. However, if quantum gravity is CPT invariant then remnants are ruled out and only the first of the two options above remains viable. In 1993, building on the work of ’t Hooft [3] and Preskill [4], Susskind et al. [5] proposed that unitarity could be preserved if information is both emitted at the horizon and passes through the horizon so that an observer outside would see it in the Hawking radiation and an observer who flies into the black hole would see it inside. No single observer would be able to confirm both pictures: one simply cannot say where the information in the Hilbert space is located, so quantum mechanics is saved at the cost of locality. This is the principle of Black Hole Complementarity 

Recently, A. Almheiri  D. Marolf, J. Polchinski and  J. Sully  (AMPS) [6] suggested that the three assumptions of Black Hole Complementarity viz., (a) unitarity of Hawking evaporation, (b) validity of effective field theory outside a stretched horizon and (c) “no drama” at the horizon for a freely falling observer are not self-consistent. Briefly, their argument can be stated as follows. Consider a very large black hole so that a freely falling observer crossing the horizon sees an effectively flat spacetime (on scales much smaller than the horizon length). From the point of view of an observer who stays outside the horizon, the purity of the Hawking radiation implies that late time photons are maximally entangled with some subset of the early radiation. However, these late photons when propagated back from infinity to the near horizon region using effective field theory must be maximally entangled with modes inside the horizon from the point of view of the freely falling observer (this is simply a property of the Minkowski vacuum, appropriate to a freely falling observer). This is not permitted by the strong additivity of entanglement entropy. Assuming then that effective field theory is valid and that Hawking radiation is pure, the paradox can only be avoided if the backward propagated photon is not entangled with a mode behind the horizon. But this would lead to a divergent stress tensor near the horizon, so AMPS concluded that the freely falling observer would burn up before she could cross it. This is the “firewall”.  
Considerable interest has surrounded the proposed firewall ..., all of it assuming that continued collapse will occur, leading to black holes with event horizons. But Hawking has recently raised several objections to the firewall and suggested that the correct resolution of the AMPS paradox is that event horizons do not form, only apparent horizons form [8]. Radiation from the black hole is then deterministic, but chaotic.
... in view of the AMPS paradox, an entirely new picture of the black hole has emerged. Instead of a spacetime singularity covered by an event horizon we will have an essentially quantum object, an extremely compact dark star, which is held up not by any degeneracy pressure but by quantum gravity just as ordinary atoms are sustained by quantum mechanics. Astronomical observations [18, 19] indicate that astrophysical black holes possess dark surfaces and this is consistent with the picture we have just described.
Cenalo Vaz  (Submitted on 19 May 2014)

Whatever EHT might see, a black hole may well be a true attom of quantum gravity... 
... with the double t to underline the pairs of antipodal entangled Hawking particles continuously emitted from the black hole horizon according to the 't Hooft model, and the original word atom to refer to the fact that any black hole would be a kind of elementary microscopic or macroscopic (depending on its varying mass) quantum object in a non-stationary state, with no tangible inside but a frontier endowed with some kind of non-local properties.

If we could do experiments with radiating black holes, this entanglement would have been detectable: if at one side of the black hole a particle emerges that happens to be strongly suppressed by a Boltzmann factor, then at the other side, the same particle will be seen to emerge with probability one – not suppressed at all! 
Indeed, the heat bath mentioned earlier, is a strangely unphysical one: antipodes in 3-space are 100 % entangled. In practice, this means that the stationary Hartle-Hawking state is extremely improbable in describing the universe far from the location of the black hole.
Gerard 't Hooft (Submitted on 17 May 2016)
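To appreciate how strong such a Boltzmann suppression can be, here is a toy estimate of mine; the black hole mass is an arbitrary primordial-scale choice, not a value from 't Hooft's paper:

```python
# Illustration: Hawking emission of a quantum of energy E is suppressed
# by the Boltzmann factor exp(-E / kB T_H). 't Hooft's point is that the
# antipodal partner of such a rare quantum would nevertheless appear
# with probability one.
import math

G, c, hbar, kB = 6.674e-11, 2.998e8, 1.055e-34, 1.381e-23
eV = 1.602e-19

M = 1e12                                   # kg, a small primordial black hole (assumed)
T_H = hbar * c**3 / (8 * math.pi * G * M * kB)
kT_MeV = kB * T_H / eV / 1e6
print(f"T_H ~ {T_H:.2e} K  (kB*T_H ~ {kT_MeV:.1f} MeV)")

E = 140.0                                  # MeV, e.g. a pion-mass quantum
print(f"Boltzmann suppression exp(-E/kT) ~ {math.exp(-E / kT_MeV):.1e}")
```

Even for this rather hot small black hole the emission of a pion-mass quantum is suppressed by about six orders of magnitude, so a partner appearing with probability one at the antipode would be a dramatic correlation indeed.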

An already observed entangled Hawking radiation... but from an atomic Bose-Einstein condensate sonic "analogue" black hole!
Waiting for a hypothetical observation of quantum gravity as soon as 2017, whatever it may look like, let's land smoothly back on Earth for a while and contemplate the amazing ingenuity of physicists in making their dreams come true in the lab before the heavens:

The analogy between sound propagation in nonhomogeneous media and light propagation in curved spacetimes has opened the possibility to detect the analogue of black hole radiation in the lab [1]. Indeed, when sound waves propagate in a flowing medium whose velocity crosses at some point the speed of sound, they experience the analogue of an event horizon. If the phonon state is stationary and regular across this sonic horizon, one expects to obtain a thermal flux of phonons with a temperature k_B T = ℏκ/2π, where κ is the gradient of the flow velocity evaluated at the sonic horizon. Since the analogy works perfectly in the hydrodynamical limit, the above result should be valid at least when κ is much smaller than the critical wave-vector characterizing the dispersion [2, 3, 4, 5]. 
Following the original work of Unruh, various setups were proposed, see [6] for a review. In Refs. [...] the particular case of sound waves in dilute Bose-Einstein condensates (BEC) was considered. These condensates have nice properties both from an experimental and a theoretical point of view. From the first, progress in the manipulation and control of their physical properties is rapid, and from the second, the equations for the condensate and the phonons are well understood.
(Submitted on 22 May 2009 (v1), last revised 2 Oct 2009 (this version, v4))
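Plugging a BEC-scale velocity gradient into the quoted formula gives a feeling for the temperatures at stake; the value of κ below is my own order-of-magnitude guess, not the experimental figure:

```python
# Quick estimate of the quoted formula kB*T = hbar*kappa / (2*pi),
# with a flow-velocity gradient kappa chosen at a typical BEC scale
# (an assumption of mine, not taken from the quoted papers).
import math

hbar, kB = 1.055e-34, 1.381e-23
kappa = 2.0e3            # s^-1, assumed gradient of the flow at the sonic horizon
T = hbar * kappa / (2 * math.pi * kB)
print(f"analogue Hawking temperature ~ {T*1e9:.1f} nK")
```

A few nanokelvin, i.e. comparable to the condensate temperature itself, which is why the phonon Hawking signal sits at the very edge of detectability.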

We observe spontaneous Hawking radiation, stimulated by quantum vacuum fluctuations, emanating from an analogue black hole in an atomic Bose–Einstein condensate. Correlations are observed between the Hawking particles outside the black hole and the partner particles inside. These correlations indicate an approximately thermal distribution of Hawking radiation. We find that the high-energy pairs are entangled, while the low-energy pairs are not, within the reasonable assumption that excitations with different frequencies are not correlated. The entanglement verifies the quantum nature of the Hawking radiation. The results are consistent with a driven oscillation experiment and a numerical simulation.
Observation of quantum Hawking radiation and its entanglement in an analogue black hole, Jeff Steinhauer, Nature Physics (received 23 November 2015; accepted 15 July 2016; published online 15 August 2016)

Jeff Steinhauer - Observation of thermal Hawking radiation...
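The entanglement test behind the abstract above can be sketched in a couple of lines: for Gaussian states, the pair of modes (k, −k) is nonseparable when the cross-correlation beats the product of the occupations. The numbers below are toy values of mine chosen to mimic the reported trend, not data from the paper:

```python
# Minimal sketch of the Cauchy-Schwarz-type criterion used to certify
# entanglement of Hawking pairs in analogue experiments: modes k and -k
# are nonseparable when |<b_k b_-k>|^2 exceeds <n_k><n_-k>.
def entangled(n_k, n_mk, cross):
    """True if the pair correlation exceeds the separable bound."""
    return abs(cross)**2 > n_k * n_mk

# toy values mimicking the reported trend: strong correlation at high
# energy (low occupation), weaker at low energy (large thermal occupation)
print(entangled(0.1, 0.1, 0.25))   # high-energy pair -> True
print(entangled(5.0, 5.0, 4.0))    # low-energy pair  -> False
```

This is the sense in which "the high-energy pairs are entangled, while the low-energy pairs are not": thermal occupation at low energy swamps the cross-correlation.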

vendredi 26 août 2016

(Which particle scares) the living daylights (out of quantum field model builders?)

Looking at the Higgs naturalness problem from a strictly scalar point of view
This post is an attempt to spell out what I had in mind when I wrote the following question in the comment section of the post After the Hangover on the blog Résonaances:
Is the Higgs naturalness problem not the tale of the electroweak scalar frog that wished to be as big as the cosmological constant bull, a ghost haunting the china shop of renormalizable quantum field theories at the daunting Planck scale?
The discovery of the Higgs boson (Aad 2012, Chatrchyan 2012) and nothing else exotic so far (Soni 2013) has put to rest questions of the existence of the Higgs boson, and rejuvenated questions about its viability without additional dynamics beyond the Standard Model. The Higgs boson is unique among the elementary particles in that its quantum corrections are quadratically sensitive to high scales [Giudice (2008)] . This leads to what many perceive to be a naturalness problem for the Higgs boson...

The naturalness argument, first articulated by Susskind (1979), suggests that if the Standard Model is a valid theory up to a very high scale, say Λ ∼ M_Pl ∼ 10^18 GeV, then m²_bare has to be a very large and extraordinarily fine-tuned number to cancel ... the very large contribution {to the Higgs boson mass from the top quark loop}..., thereby reproducing the small Higgs boson mass of 125 GeV. There is no equivalently disquieting equation in particle physics that apparently requires such dramatic fine-tuning of quantum corrections. Only the cosmological constant has perhaps more mystery of such large discrepancies compared to expectations... This problem sometimes also is called the “hierarchy problem”, in that there exists a large hierarchy of 10^16 between the Planck mass and the weak scale, yet the quadratic divergences of the Higgs sector imply that the two scales should be similar.
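The size of the cancellation described here is easy to make explicit with the standard one-loop top-quark estimate δm_H² ≈ 3 y_t² Λ² / (8π²); the following back-of-envelope is mine, using the Planck-scale cutoff quoted in the text:

```python
# Worked magnitude of the fine-tuning: one-loop top-quark contribution
# to the Higgs mass-squared with a Planck-scale cutoff, compared to the
# observed Higgs mass.
import math

y_t = 1.0                 # top Yukawa coupling, ~1
Lam = 1e18                # GeV, Planck-scale cutoff (as quoted above)
m_H = 125.0               # GeV, observed Higgs mass

delta_m2 = 3 * y_t**2 * Lam**2 / (8 * math.pi**2)
print(f"|delta m_H^2| ~ {delta_m2:.1e} GeV^2")
print(f"required cancellation: one part in {delta_m2 / m_H**2:.1e}")
```

The bare mass must cancel the loop contribution to roughly one part in 10^30, which is the "extraordinarily fine-tuned number" of the quote.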
The problem of naturalness as presented above is not without weakness. The core of the argument against naturalness being a serious problem is that there are no observables that cannot be accounted for in the theory. We always have infinities in quantum corrections that are formally cancelled by counter terms embedded in the bare parameter. Furthermore, if we regularize in dimensional regularization, artificially setting the number of dimensions to be n = 4−ε, the infinities show up as 1/ε quantum corrections that are cancelled unceremoniously, in contrast to the seemingly dramatic cancellations of the Λ² corrections that arise in a cutoff regularization method. There is no culture or meaning of declaring that counter-term cancellations with 1/ε corrections are outrageously fine-tuned. It is just a formal bookkeeping process to account for it, and all calculations can be matched to observables, and there is no conflict with the data. Naturalness, in this viewpoint, is unjustified hysteria generated by just one of our artificial means of keeping track of intermediate steps in a quantum field theory calculation. Other authors have addressed this viewpoint (Bardeen 1995, de Gouvêa et al. 2014, Farina et al. 2013).
The above discussion has focused on quantum corrections in the pure Standard Model theory. The naturalness concern rears its head more confidently if we assume that there is new physics with unknown dynamics at a high scale Λ that the Higgs boson couples to, which in turn generates quadratic sensitivities to Λ in the quantum corrections of the Higgs boson mass.
Sometimes it is argued that we know already there is a new scale, the scale of the onset of strong gravity and quantum gravity at M_Pl, and the Higgs boson mass is surely affected by dynamics there. However, it is not a solid argument that the Higgs boson mass must suffer from destabilizing quadratic corrections due to gravity alone. Indeed, there is no obvious separate shift symmetry violating interaction of the Higgs boson with gravity that is not already suppressed by powers of the 1/M_Pl coupling and the original Standard Model couplings. This only would leave corrections that are at most of order the Higgs mass. Furthermore, whatever non-perturbative concerns one might have for the Higgs boson inheriting instability up at the Planck scale due to gravity, it remains uncertain how to account for it. Quantum gravity is a notoriously unsolved mystery, and the naturalness issue of the cosmological constant being so small (10^−47 GeV^4 compared to expectations of 10^72 GeV^4) further highlights our ignorance. It is plausible that any high-scale quantum gravity intuition that we might try to invoke is dramatically wrong, and so it is reasonable to question any quantum gravity argument impugning Higgs naturalness. Since our aim is to test how robust naturalness arguments can be let us banish further thoughts about gravity and the damage it could do to the Higgs boson and the weak scale.
Another line of thinking is to consider the prospects of many new particles at higher scales that are charged under the Standard Model gauge symmetries. For example, the existence of heavy vector-like fermions charged under the Standard Model electroweak symmetries will induce large finite quantum corrections at the two loop level (Martin 1997), and it has been argued that any fermions of this kind that exist above 10 TeV destabilize the Higgs mass scale (Farina et al. 2013). This is a powerful argument in general against the Higgs boson, since there is nothing to prevent arbitrarily heavy and arbitrary number of vectorlike fermions. Their masses are gauge invariant without the need of additional spontaneous symmetry breaking, and the fermions do not contribute to gauge anomalies. Furthermore, in many string constructions there are a large number of vectorlike fermions that can arise in the spectrum.... However, assuming that the additional states have to be charged under the Standard Model for this worry to arise may seem too specialized to some. Perhaps the underlying theory gives the Standard Model gauge groups with pure chiral fermions (i.e., left and right fermions with different charges), whose mass is then necessarily bound to the Higgs boson vacuum expectation value. There are no known vectorlike fermions in nature, and invoking their existence, giving them Standard Model gauge charges, and assuming they exist at very massive scales is maybe too much speculation to convict the Higgs boson theory and the Standard Model.
At this point we have excluded gravity and additional states charged under the Standard Model from the discussion on Higgs boson naturalness. We must ask ourselves what else could create a problem for the stability of the Higgs potential. The leading answer to this is the proliferation of additional heavy scalars in nature. By “proliferation” we mean the existence of additional spin-zero scalar bosons beyond the Higgs boson that was recently discovered. All particles whose fields transform trivially (i.e., spin-zero) under the Lorentz group operations of rotations and boosts are classified as “additional scalar bosons.” The analogous categories are the spin-1/2 fermions, and the spin-1 bosons. There are at least 45 spin-1/2 fermions in nature... As yet, we know of only one spin-zero scalar boson... the Higgs boson, and introducing more of these particles creates additional challenges that are not experienced when increasing the number of spin-1/2 and spin-1 representations. For example, if we introduce into the spectrum a scalar Φ with mass M_Φ one finds that there is no symmetry that forbids a renormalizable coupling between H and Φ in the form of HΦ. Interactions that are not forbidden by a symmetry generically occur in quantum field theory, since a theory does not suffer from self-consistency and completeness questions “as long as every term allowed by symmetries is included” (Weinberg 2009). Therefore we expect this mixing to be present. However, its presence introduces a dangerous correction to the Standard Model Higgs mass, ∆m_H² ∝ M_Φ². If M_Φ² ≫ m_H² the weak scale is destabilized and wants to raise itself to the higher mass scale of M_Φ. 
It is this prospect of additional heavy scalars that is particularly troublesome for naturalness of the Higgs boson mass and the weak scale. In other words, it is not the intrinsic unnaturalness of the Higgs boson of the Standard Model that is necessarily so troubling, but the immediate prospects of destabilization when its kind is proliferated in nature. There may be an intrinsic naturalness issue with the Standard Model, but that is more controversial as explained above. However, the presence of more scalars at hierarchically larger scales in nature leads to a clear instability problem. In the next section we will discuss in more detail this proliferation instability problem of the Higgs boson. We then argue that this is a real concern for the Standard Model, and that testing a theory against proliferation is not an idle speculation but is confronting a generic possibility. Finally, we show how some theories protect against proliferation instability, and in specific we show that supersymmetry is a prime example of one that solves the problem elegantly
James D. Wells (Submitted on 19 Mar 2016)
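The proliferation instability discussed above can be illustrated with a toy 2×2 scalar mass matrix. The numbers below are my own example, not from Wells' paper: even a tiny H-Φ mixing drags the light eigenvalue far away from 125 GeV.

```python
# Toy illustration of the proliferation instability: mixing a 125 GeV
# scalar with a heavy scalar Phi through an allowed off-diagonal term
# destabilizes the light eigenvalue of the 2x2 mass-squared matrix
# [[m_h^2, mu2], [mu2, M_phi^2]] unless mu2 is finely tuned.
import math

m_h, M_phi = 125.0, 1e12       # GeV; heavy-scalar mass is an assumed example
mu2 = 1e-3 * M_phi**2          # off-diagonal H-Phi mixing, a "small" 0.1%

tr = m_h**2 + M_phi**2
det = m_h**2 * M_phi**2 - mu2**2
light = (tr - math.sqrt(tr**2 - 4 * det)) / 2   # lighter eigenvalue
print(f"light mass^2 eigenvalue ~ {light:.2e} GeV^2")
print(f"scale dragged to ~ {abs(light)**0.5:.1e} GeV")
```

Even a 0.1% relative mixing drives the light eigenvalue to about −(mu2/M_Φ)², a scale of order 10^9 GeV rather than 125 GeV, so keeping the Higgs light requires tuning mu2 against M_Φ² to absurd precision.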

For the time being no hint of supersymmetry has been found, so, as I added in the same comment:
... I find it interesting to underline that some minimal nonsupersymmetric SO(10) models "can (i) fit well all the low energy data, (ii) successfully account for unification of the gauge couplings, and (iii) allow for a sufficiently long lifetime of the proton ... [and] once the model parameters are fixed in terms of measured low energy observables, the requirement of successful leptogenesis can fix the only one remaining high energy parameter" (

Now in the words of a late giant of particle physics:
The first LHC runs at 7-8 TeV have led to the discovery of a candidate Higgs boson and to the non observation of new particles or exotic phenomena: no signals of new physics have been found neither by direct production of new particles nor in electroweak precision tests nor in flavour physics... This is surprising since the hierarchy problem [1] and, to some extent, the elegant WIMP (Weakly Interacting Massive Particle) solution of Dark Matter strongly suggested the presence of new physics near the Fermi scale. But as well known the hierarchy problem is one of ”naturalness”: the SM theory is renormalizable, finite, well defined and predictive once the dependence on the cut off is absorbed in a redefinition of masses and couplings. Thus the theory can indeed work in practice and the hierarchy problem only arises at the conceptual level if one looks at the cut off as a parametrization of our ignorance on the new physics that will modify the theory at large energy scales. 
... as the criterium of naturalness has so far failed, we are lacking at present a reliable argument on where precisely the new physics threshold should be located... 
The possibility of accommodating all compelling phenomena that demand new physics below M_GUT in a non SUSY SO(10) model is highly non trivial. In fact, it singles out a particular breaking chain with a Pati-Salam symmetry at an intermediate mass scale M_I ∼ 10^11 GeV. We have shown that a reasonable fit to the data can be obtained in this framework; of course, the price to pay is a very large level of fine tuning. 
Guido Altarelli, Davide Meloni, A non Supersymmetric SO(10) Grand Unified Model for All the Physics below M_GUT, June 9, 2013

I now find it interesting to try to place, in the foregoing context, the advances of the programme of spectral geometrization of physics that I regularly report on this blog. 

Recall that the axioms of noncommutative spectral geometry applied to physics were built in an attempt to understand conceptually the experimental success of the standard model of particle physics, starting from the following fundamental idea: 

the line element best suited to capture physics at the subnuclear scale could be a generalization of the Dirac operator, or, put differently, an extension of the quantum propagator for fermions.

Let us emphasize that the axioms have been effectively refined over the years through a patient and stubborn confrontation with experimental physics, which made it possible to select the right model of spacetime-matter, seen as the product of a continuous 4-dimensional space with a discrete space of vanishing ordinary dimension but endowed with a non-zero, literally singular, abstract dimension. I have deliberately attached the word matter to spacetime to underline the deepening of the relational character between the notion of matter - as it is understood at the attometre scale with the fermions of the standard model: quarks and leptons - and the concept of spacetime - as astrophysics conceives it up to the cosmological scale in the concordance model.

Thanks to the spectral action principle - the outcome of a subtle distillation of the concept of observable in quantum physics and of the equivalence principle drawn from general relativity - it is possible to assess the relevance of noncommutative geometry for particle physics by testing the low-energy phenomenological predictions of spacetime-matter models, assumed to exist at a grand-unification scale, computing their intrinsic dynamics, that is to say the spectrum of all gauge bosons and Higgs scalars with their quantum numbers and all the interaction terms (including those with the known degrees of freedom of gravitation). As the label spacetime-matter model tries to emphasize, the fermion spectrum is an initial datum, but it is itself axiomatically constrained. 

Using the well-tried techniques of the renormalization group, one can then deduce from a given choice of spacetime-matter a set of low-energy phenomenological predictions, such as the 125 GeV mass for the Higgs scalar of the standard model, provided one assumes the existence of a specific coupling to a new scalar, denoted σ, with a mass of order 10^12 GeV. One might of course believe that this prediction results from a fine-tuning of the model parameters, in particular of the unification scale, but that does not seem to me to be the case. On the contrary, it is interesting to note that the phenomenology of this spectral standard model is able, to a first approximation, to fit naturally the one still observed today at the LHC, for a wide range of unification-scale values between 6.5×10^12 and 1.4×10^17 GeV.

More strikingly, it must be emphasized that, far from being introduced in an ad hoc way to adjust the Higgs mass to its experimental value, the new scalar σ - a gauge singlet under the standard model - is also directly responsible for a spontaneous symmetry breaking at the origin of the very small mass of the three known neutrinos (the left-handed ones) via a type I seesaw mechanism, which requires the additional existence of three new right-handed neutrinos with a Majorana mass of the order of the unification scale. Thus the spectral standard model, with the particular geometric formalism from which it derives, is not far from realizing a standard theory able to explain also the only clearly established particle-physics phenomenology beyond the ordinary standard model: neutrino flavour oscillations.
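The type I seesaw arithmetic is pleasantly simple. With an electroweak-scale Dirac mass (my own assumption) and a Majorana mass near the unification scale, the standard estimate m_ν ≈ m_D²/M_R lands right at the observed oscillation scale:

```python
# Minimal type-I seesaw estimate, m_nu ~ m_D^2 / M_R. The Dirac mass is
# an electroweak-scale assumption of mine; the Majorana scale is of the
# order of the unification scale discussed in the text.
m_D = 100.0        # GeV, Dirac mass ~ electroweak scale (assumed)
M_R = 1e14         # GeV, right-handed Majorana mass (assumed, near GUT scale)

m_nu_GeV = m_D**2 / M_R
print(f"light neutrino mass ~ {m_nu_GeV * 1e9:.2f} eV")
```

A tenth of an eV, i.e. just the mass scale suggested by the flavour-oscillation data, which is why the heavy σ-induced Majorana sector does not spoil but rather explains the low-energy picture.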

Even more promising, it must be stressed that the importance of the role of the scalar σ for physics forced the explorers of the noncommutative garden to revisit their axioms, allowing them to reduce (and not increase!) their number, and this in several different ways. The latest developments for high-energy physics lead to the axiomatic selection and spectral distillation of a particular class of Pati-Salam-like grand unified models whose phenomenology is fairly close to that of the nonsupersymmetric SO(10) models quoted above. Other considerations also open the way to a better understanding of the phase transition that might occur at the grand-unification scale.

I have not yet found in the (pre)published scientific literature any work aiming to study systematically the phenomenology of the spectral standard model or of its Pati-Salam extensions. I do not doubt that such work will one day be undertaken. Even if at first sight the task may look thankless, insofar as there is every reason to think it will be very hard to find a crucial subatomic-physics experiment able to provide irrefutable proof of the veracity of this kind of model, I note that there are, for instance, still courageous PhD students and audacious thesis advisors working on models where the particles beyond the standard model sit at inaccessible scales, on Earth at least. The final answer would then come from the sky, hence from astrophysics: what could be more natural for speculations that rest on an understanding of the Higgs scalar as a gauge boson associated with the discrete component of spacetime-matter, a trace of quantum gravity at the electroweak scale... but that is another story.

To come back to the naturalness problem of the Higgs scalar, it seems to me that the search for its solution would benefit from being considered more systematically within the framework of noncommutative geometry. The initiators of the noncommutative spectral geometry programme have already sketched some leads here and there. I find it interesting to note that, as far as I can understand the first one for instance, it reveals the natural presence in the spectral action of a term tied to the right-handed Majorana neutrinos which makes it possible to cancel the quadratic divergence of the radiative corrections to the Higgs boson mass, provided there are at least three of them...

By way of a personal, and therefore very hypothetical, conclusion:
Perhaps "killing certain divergences in perturbative calculations by compensating them generically (to put it naively) with supersymmetric particles is not playing" by the rules of Nature.