I have just spent a long weekend at Emergent Quantum Mechanics (EmQM15). This workshop is organised every couple of years by Gerhard Groessing and is the go-to place if you’re interested in whether quantum mechanics dooms us to a universe (or multiverse) that can be causal or local but not both, or whether we might just make sense of it after all. It’s held in Austria – the home not just of the main experimentalists working to close loopholes in the Bell tests, such as Anton Zeilinger, but of many of the physicists still looking for an underlying classical model from which quantum phenomena might emerge. The relevance to the LBT audience is that the security proofs of quantum cryptography, and the prospects for quantum computing, turn on this obscure area of science.
The two themes that emerged from this year’s workshop are both relevant to these questions: weak measurement and emergent global correlation.
Weak measurement goes back to the 1980s and the thesis of Lev Vaidman. The idea is that you can probe the trajectory of a quantum-mechanical particle by making many measurements of a weakly coupled observable between preselection and postselection operations. This has profound theoretical implications, as it means that the Heisenberg uncertainty limit can be stretched in carefully chosen circumstances; Masanao Ozawa has come up with a more rigorous version of the Heisenberg bound, and in fact gave one of the keynote talks two years ago. Now all of a sudden there are dozens of papers on weak measurement, exploring all sorts of scientific puzzles. This leads naturally to the question of whether weak measurement is any good for breaking quantum cryptosystems. After some discussion with Lev I’m convinced the answer is almost certainly no; extracting information about quantum states takes an exponential amount of work and a great deal of averaging, and works only in specific circumstances, so it’s easy for the designer to forestall. There is however a question around interdisciplinary proofs. Physicists have known about weak measurement since 1988 (even if few paid attention until a few years ago), yet no-one has rushed to tell the crypto community “Sorry, guys, when we said that nothing can break the Heisenberg bound, we kinda overlooked something.”
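To get a feel for why the averaging is the bottleneck, here is a toy simulation (my own sketch, ignoring postselection; the coupling spread s and the sample sizes are illustrative assumptions, not anything from the talks). Each weak measurement of σ_z returns an eigenvalue buried in pointer noise much wider than the eigenvalue gap, so the error of the estimate falls only as s/√N:

    import numpy as np

    rng = np.random.default_rng(1)
    theta = np.pi / 8               # prepare cos(theta)|0> + sin(theta)|1>
    p0 = np.cos(theta) ** 2         # Born weight of the sigma_z = +1 outcome
    true_sz = 2 * p0 - 1            # true expectation <sigma_z>
    s = 20.0                        # pointer spread >> eigenvalue gap: weak coupling

    for n in (10**2, 10**4, 10**6):
        eigenvalues = rng.choice([+1.0, -1.0], size=n, p=[p0, 1 - p0])
        readings = eigenvalues + rng.normal(0.0, s, size=n)   # blurred readouts
        print(f"N={n:>7}: estimate {readings.mean():+.3f} "
              f"(true {true_sz:+.3f}, expected error ~ {s / np.sqrt(n):.3f})")

With s = 20 you need about a million repetitions before the estimate is good to within a few hundredths, which is why a cryptosystem that sends each quantum state only once has little to fear.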
The second theme, emergent global correlation, may be of much more profound interest, to cryptographers and physicists alike.
The core problem of quantum foundations is the Bell tests. In 1935, Einstein, Podolsky and Rosen noted that if you measured one of a pair of particles that shared the same quantum mechanical wave function, this would immediately affect what could be measured about the other, even if it were some distance away. Einstein held that this “spooky action at a distance” was ridiculous, so quantum mechanics must be an incomplete theory. In 1964 the Irish physicist John Bell proved that if particle behaviour were explained by local hidden variables, their effects would have to satisfy an inequality that would be broken in some circumstances by quantum mechanical behaviour. In 1969, Clauser, Horne, Shimony and Holt proved a related theorem (CHSH) that limits the correlation between the polarisations of two photons, assuming that polarisation is carried entirely by and within them. Freedman and Clauser showed in 1972 that this limit is violated experimentally, and were followed by Aspect, Zeilinger and many others. These experimental results, the “Bell tests”, convince many physicists that reality must be weird: non-local, non-causal, or even spread across multiple universes.
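To make the CHSH bound concrete, here is a quick numerical check (my own sketch; the correlation function and angles are the textbook quantum predictions, not anything specific to the workshop). For polarisation-entangled photon pairs in the singlet state, quantum mechanics predicts a correlation E(a,b) = −cos 2(a−b) between polarisers at angles a and b; any local hidden-variable account must keep |S| ≤ 2, yet at the angles below the quantum prediction reaches 2√2:

    import numpy as np

    def E(a, b):
        """Quantum prediction for the polarisation correlation of a singlet pair."""
        return -np.cos(2 * (a - b))

    # Polariser settings (radians) that maximise the violation
    a, a2 = 0.0, np.pi / 4
    b, b2 = np.pi / 8, 3 * np.pi / 8

    S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
    print(f"|S| = {abs(S):.4f}  (local bound 2, Tsirelson bound 2*sqrt(2) = 2.8284)")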
At EmQM15, we had a number of talks from people advancing models, or ideas for models, according to which quantum phenomena emerge from a combination of local action and global correlation. As the Nobel prizewinner Gerard ‘t Hooft put it in his keynote talk, John Bell assumed that spacelike correlations are insignificant, and this isn’t necessarily so. In Gerard’s model, reality is information, emerging from a cellular-automaton fabric operating at the Planck scale, on which the fundamental particles are virtual particles – like Conway’s gliders, but in three dimensions. In the version he presented in 2013, the fabric is regular, and this may provide the needed long-range correlation. The problem with that was that the Lorentz group is non-compact, which seemed to prevent the variables in the automata being bitstrings of finite length. In the 2015 version, the automata are randomly distributed. This was inspired by an idea of Stephen Hawking’s on balancing the information flows into and out of black holes; if the idea’s right, automata may have to be randomly located, which conveniently fixes the Lorentz problem (and Fay Dowker has shown this may be the only way to do so).
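The Conway analogy is easy to play with. Here is a minimal two-dimensional Game of Life glider (my own illustration of the analogy, nothing to do with ‘t Hooft’s actual automaton): a pattern that reassembles itself, shifted one cell diagonally, every four update steps – a persistent “particle” that exists only in the collective state of the fabric.

    import numpy as np

    def life_step(grid):
        """One synchronous Game of Life update on a toroidal grid."""
        n = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
        return ((n == 3) | ((grid == 1) & (n == 2))).astype(int)

    grid = np.zeros((8, 8), dtype=int)
    for y, x in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:   # the classic glider
        grid[y, x] = 1

    for step in range(5):
        print(f"step {step}:\n{grid}\n")
        grid = life_step(grid)   # after 4 steps the glider recurs, shifted (1,1)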
In a second class of emergence models, the long-range order comes from an underlying thermodynamics (God’s bubble-bath rather than God’s clockwork). Gerhard Groessing has a model in which long-range order emerges from subquantum statistical physics; his talk this time was on how these ideas can apply to the double-slit experiment. Ariel Caticha derives QM as entropic dynamics: transitions in configuration space are obtained by maximising appropriate entropies, and potential drift then imposes directionality and correlation. Ana Maria Cetto looks to the zero-point field, and sets out to characterise the active field modes that sustain entangled states. Bei-Lok Hu adds a stochastic term to semiclassical gravity; it’s symmetric, traceless and divergenceless. The effect after renormalisation is nonlocal dissipation with coloured noise; this suggests that we look for the cat in the noise kernel of spacetime.
The quantum crypto pioneer Nicolas Gisin has a new book on quantum chance in which he suggests that the solution might be nonlocal randomness: a random event that can manifest itself at several locations. But who keeps track of who’s entangled with whom – the angels? And we had some philosophy papers too: Jan Walleczek noted that Bell’s 1976 paper distinguished signals from messages and divided beables into controllables and uncontrollables; so you can’t use superluminal propagation to communicate, as it’s an “uncontrollable influence”. This is a compatibilist view of agency, while the Conway–Kochen free will theorem is incompatibilist. But does compatibilism not allow superdeterminism? By 1990 Bell was almost a compatibilist; he allowed deterministic chaos in complicated systems (e.g. computers) to give free-enough variables.
We have a dog in this fight ourselves; Robert Brady and I recently showed that you can easily extend Maxwell’s 1861 model of a magnetic line of force as a flux tube in an ideal fluid so that perturbations in the flux tube act like photons, in that they are emitted and absorbed discretely, obey Maxwell’s equations, and violate the CHSH inequality in exactly the same way that photons do. This leads me to suspect that the quantum vacuum may behave somewhat like a superfluid, where the order parameter gives suitable long-range correlation.
But whether you think the quantum vacuum is God’s computer, God’s bubble bath, or even God’s keystream generator, there is a sense of excitement and progress, of ideas coming together. And once we start understanding the physics, the quantum mysticism will evaporate. With it may well go the hopes for (or fear of) a quantum computer. But the real win will be understanding the universe. Theoretical physics has been stuck for the past forty years, and we need to break the logjam.
There is now a video of a talk I gave on the implications of our superfluid vacuum model at the Crossing conference in Darmstadt in June.