vzn quantum theory research program laid out

Whether you can observe a thing or not depends on the theory which you use. It is the theory which decides what can be observed.–einstein[54]

hi all! the minev experiment really got my neurons buzzing and inspired me to dive deep into a lot of QM lately. so have been looking further into many QM directions that are relatively new, only about ~2 decades old. during this time Quantum Computation has had a big effect on the development of physics research + trends. the age-old problem introduced with the origins of QM, “the measurement problem,” comes front-and-center. QC fundamentally depends on “accurately measuring” qubits, but due to the complementarity identified by Bohr + the heisenberg uncertainty principle, “accurate measurement” is an extremely slippery, subtle concept in QM/QC.

this stuff is some of the hardest in the world to “wrap ones brain around.” the worlds top geniuses are still struggling with it themselves. its a rarefied crowd, an at times esoteric/ arcane area. even physics specialists into QC are not so familiar with some of the deeper ramifications. last month, outlined a bunch of vocabulary related to the Minev work. ah, but its much more comprehensive than that: its an entirely new vocabulary around QM, mostly from the optics subfield. had to try to disentangle all that somehow…

“the devil is in the details…”

wading in deep (in the carrollian sense of a “dizzying rabbit hole”) led me to many outstanding findings, papers and resources, and gave me a big new idea. would like to write it up “professionally” but alas, am intimidated/ overwhelmed by arxiv at times. it would bring some nice personal satisfaction to put a well written paper there, but theres a very monumental hurdle to overcome, so to speak: actually finding an audience.

but, there was a lot of effort to collect refs and organize them, and so am going to write these up semiformally/ informally into these notes/ story/ leadup. its also very useful to go to wikipedia. wikipedia can sometimes leave a lot to be desired in its technical articles, which can be very specialized and assume a lot of background, but its a very valuable resource esp for benchmarking basic ideas, so am going to cite it a lot in the following. its remarkable how narrow/ specialized some of the topics are.

asked about the Minev experiment on SE quantum computing chat room, and got some response from glS and mod Mithrandir, with other/ further discussion with More_Anonymous, Thomas_Klimpel, and bolbteppa also expressing some interest/ reaction.[1] glS helpfully posted a related question mostly asking whether it is “new science”.[2] ACM replied basically that there is no fundamental new breakthru, only significant experimental finesse/ advance, that it uses basic prior-established concepts of eg weak measurement[4], quantum tomography[3], quantum zeno effect[5]. overall the chat dialog lasted over a week on the same/ similar topics which is something of a small miracle on SE. attention is the real scarce commodity these days.

needed to review all these complex topics and try to piece it all together and put in some kind of order; its far from easy. they tend to be quite scattered in a field and across fields, ie crosscutting. was looking at historical developments, and some to lots of this is relatively new. have not really seen it all described and collected neatly in one place; suspect that will happen someday by someone. the closest found by me was the nice survey [32][33] by Cohen-Tannoudji/ Dalibard, “manipulating atoms with photons”.

a lot of this starts with studying the Rabi cycle of atoms,[6] something that was never really part of initial quantum mechanics, which studied the energy-level transitions but not their exact time dynamics. the basic idea is the atom works like a flip-flop between dark and light states, existing in a “rhythmic superposition.” there may be yet another state, a 3-way scenario that leads to what are called “quantum beats,” which are intermediate resonances/ superpositions between 2 other states.[7]
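the flip-flop picture can be captured in a few lines. heres a minimal numpy sketch (my own toy code, not from ref [6]; hbar = 1, on-resonance rotating frame): the excited-state population cycles as sin²(ωt/2) between dark and bright.

```python
import numpy as np

# two-level atom driven on resonance (rotating frame), hbar = 1:
# H = (omega/2) * sigma_x, with omega the Rabi frequency
sx = np.array([[0, 1], [1, 0]], dtype=complex)

def evolve(psi0, omega, t):
    """exact propagator U = cos(omega*t/2) I - i sin(omega*t/2) sigma_x."""
    c, s = np.cos(omega * t / 2), np.sin(omega * t / 2)
    return (c * np.eye(2) - 1j * s * sx) @ psi0

psi0 = np.array([1, 0], dtype=complex)         # start in the ground ("dark") state
omega = 2 * np.pi                              # one full Rabi cycle per unit time
for t in (0.0, 0.25, 0.5, 1.0):
    p_e = abs(evolve(psi0, omega, t)[1]) ** 2  # excited population = sin^2(wt/2)
    print(t, round(p_e, 3))                    # 0 -> 0.5 -> 1 ("bright") -> 0 ("dark")
```

one full period later the atom is dark again: the “rhythmic superposition” in miniature.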

found a lot of this more general material in an outstanding 1997 book, Quantum Challenge by Greenstein/ Zajonc.[8] one of my favorites! this was very helpful to review. the Minev experiment is quite aligned with some of the basic rabi cycle scenarios presented in the book. it helps to understand how old the basic ideas are and to figure out what is old vs what is new.

there is a close connection between the rabi cycle and the jaynes-cummings model of the atom.[9] strictly speaking the jaynes-cummings model is fully quantum (a 2-level atom coupled to a single quantized field mode); its semiclassical cousin, where the driving field is treated classically, reproduces the rabi oscillations. the model is able to describe a lot of the atom dynamics and arguably has more “realism” than bare textbook quantum descriptions.
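heres a toy numpy sketch of the jaynes-cummings model (my own code, hbar = 1, on resonance; the truncation level n_max and couplings are illustrative, not from ref [9]): diagonalizing the hamiltonian shows the “dressed” doublets split by 2g√(n+1), the fingerprint of a quantized field.

```python
import numpy as np

def jc_hamiltonian(n_max, omega, g):
    """Jaynes-Cummings model at resonance (hbar = 1):
       H = omega * a'a + (omega/2) * sz + g * (a s+ + a' s-)
    in the basis |n> (x) {|e>, |g>}, photon number truncated at n_max."""
    dim = n_max + 1
    a = np.diag(np.sqrt(np.arange(1.0, dim)), 1)   # annihilation: a|n> = sqrt(n)|n-1>
    sz = np.diag([1.0, -1.0])                      # atomic inversion, basis [|e>, |g>]
    sm = np.array([[0.0, 0.0], [1.0, 0.0]])        # atomic lowering |g><e|
    return (omega * np.kron(a.T @ a, np.eye(2))
            + 0.5 * omega * np.kron(np.eye(dim), sz)
            + g * (np.kron(a, sm.T) + np.kron(a.T, sm)))

omega, g = 1.0, 0.05
E = np.sort(np.linalg.eigvalsh(jc_hamiltonian(10, omega, g)))
# degenerate bare pairs {|n+1,g>, |n,e>} split into dressed doublets by 2g*sqrt(n+1)
print(E[2] - E[1], 2 * g)                # first doublet: splitting 2g
print(E[4] - E[3], 2 * g * np.sqrt(2))   # second doublet: 2g*sqrt(2)
```

the √(n+1) scaling of the splitting is exactly what distinguishes the quantized-field model from a classical drive.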

the minev experiments are on what are called “artificial atoms” but really they seem to be quite real in their dynamics; via cryogenic cooling and superconductivity they seem to model real atoms almost exactly. there will be some further debate here, but my feeling right now is they are very accurate “models” and almost like nearly perfect miniature replicas/ simulations of real atomic dynamics. they allow one to create rabi cycle systems or 3-level systems with “customized or manufactured properties according to specifications”.

⭐ ⭐ ⭐

the broader area of study here is quantum thermodynamics which considers “non ideal” quantum systems eg those that have diverse aspects of heat + friction.[10] a key area of study in this is dissipation of energy in a quantum system.[11] the main mathematical tools are basically modified schroedinger equations with noise terms, studied in quantum stochastic calculus.[12] looking at noise and dissipation more in general leads to a larger theory, stochastic quantum mechanics.[13]

so a lot of this is based on QM foundations taught in textbooks, but a lot of it is outside the standard/ condensed undergraduate ideas about QM. a crosscutting idea here is that of a “quantum fluctuation”[14] which has long been recognized initially in experiments and then can be modelled in the stochastic theory.

the mathematical theory of “open quantum systems” goes back to the 1970s (eg Davies, Lindblad), but in the early 1990s it was studied much more in depth and formalized better, largely within “quantum optics,” which is also at the center of many QC advances.[15][16] here an open system is basically a nonideal system that gets perturbed due to what might be called “environmental noise.” a key concept that emerged was that of a “quantum trajectory.”[17]

now, the use of the word “trajectory” is tricky here. Bohm talked about trajectories for much of the 20th century, or maybe about ½ of it, but his use of the term was never widely accepted because it was understood to be another shorthand for “hidden variable theory”; so the importation into this area was maybe not at all (directly!) connected to Bohm. Bohm is sometimes persona-non-grata among practicing physicists, although there may be somewhat of a covert/ secret admirer society.

roughly, then, a quantum trajectory is a purely evolving state subject to some noise contributions, almost always (my understanding) from the environment, which can then be modelled mathematically as noise terms in the calculus.[18]

a key innovation in the area was the “quantum jump method”.[19] initially this was the use of stochastic terms in numeric simulations, so here “jump” referred to a numerical, algorithmic technique. but researchers found that these theoretical “jumps” were accurately modelling the noise in real open systems.
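a tiny numpy sketch of the jump method in its simplest setting (my own toy code, not from ref [19]; an undriven decaying atom prepared in the excited state): each trajectory stays excited until a random “click” collapses it, and averaging many jumpy trajectories recovers the smooth exponential decay of the ensemble.

```python
import numpy as np

rng = np.random.default_rng(42)

def jump_average(gamma, dt, n_steps, n_traj):
    """quantum-jump unravelling of spontaneous decay, atom prepared in |e>,
    no drive: conditioned on "no click" the state remains |e>; each step a
    click (jump to |g>) fires with probability gamma*dt. returns the
    trajectory-averaged excited population vs time."""
    pop = np.zeros(n_steps)
    for _ in range(n_traj):
        excited = True
        for k in range(n_steps):
            pop[k] += excited
            if excited and rng.random() < gamma * dt:
                excited = False        # photon "detected": discontinuous jump
    return pop / n_traj

gamma, dt, n_steps = 1.0, 0.01, 300
avg = jump_average(gamma, dt, n_steps, n_traj=2000)
t = dt * np.arange(n_steps)
# the jumpy single trajectories average to the smooth decay exp(-gamma*t)
print(np.max(np.abs(avg - np.exp(-gamma * t))))
```

so the individual runs are all-or-nothing jumps, yet the ensemble reproduces the continuous master-equation curve: the algorithmic trick that turned out to describe real open systems.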

here its helpful to cite neumaier again who has a helpful page/ summary.[37] a ref that comes up around this subject is Plenio/Knight 1997, a notable title: “The quantum-jump approach to dissipative dynamics in quantum optics”.[38][39] in other words, in short, the dissipative dynamics of quantum systems is coupled with quantum jumps! keep that in mind for later!

a lot of this is about creating nearly ideal systems in physical experiments. for example a frequent concept is homodyne (heterodyne) detection.[52] atomic qubit technology is driving a lot of this experimental precision. there are many related concepts; its a very large field, now called NISQ, “noisy intermediate-scale quantum.” what if careful understanding of the NOISE leads to an entirely new theory? here is a massive 96p ref/ survey on measurement of noise in quantum systems by Clerk et al.[53]

⭐ ⭐ ⭐

a key area of contention online was glS/mithrandir (+ACM+knzhou+bolbteppa) (predictably) asserting that the Minev experiment is not making novel predictions. but thats a subtle issue which, in trying to understand it in detail, led me far down the rabbit hole. experimentalists have to be very careful about making such claims, and one needs to read the paper in much more detail. but in interviews, they (Minev + advisor Devoret) are on the record: [20]

The excellent agreement between the predictions of QTT and the experimental results suggests something deeper than the mere fact that the theory works for single quantum systems. It means that the highly abstract “quantum trajectory” that the theory refers to (a term coined in the 1990s by physicist Howard Carmichael, a coauthor of the Yale paper) is a meaningful entity — in Minev’s words, it “can be ascribed a degree of reality.” This contrasts with the common view when QTT was first introduced, which held that it was just a mathematical tool with no clear physical significance.

“Quantum trajectory theory makes predictions that are impossible to make with the standard formulation,” Devoret said. In particular, it can predict how individual quantum objects such as particles will behave when they are observed — that’s to say, when measurements are made on them.

but looking closer, found this evocative quote, which led me to wonder, even doubt somewhat, whether Minev + Devoret + Carmichael[22] are really all clear on the full implications of the experiment.

“It is reassuring that the theory matches perfectly with what is seen” said David DiVincenzo, an expert in quantum information at Aachen University in Germany, “but it’s a subtle theory, and we are far from having gotten our heads completely around it.”[21]

wait, perfectly aligns with WHAT theory? is it possible they missed something? ofc for professional researchers this is an unthinkable question, but on the other hand, its the endlessly surprising quantum mechanics thats at stake here! behind all the formidable mathematical formalism employed, Minev himself seems to be hinting at/ accepting of hidden variables point blank: “quantum trajectories can be ascribed a degree of reality.”

⭐ ⭐ ⭐

this led me to wonder if there are any QM theories that are compatible/ consistent with these findings, and then ran into this general category called “objective collapse” theories; had heard a little but never looked into these in detail.[23] basically the idea is that the collapse of the wavefunction is not defined in QM and maybe it is a real physical process driven by some kind of (universal? environmental?) noise. but it turns out to be extremely difficult to prove this is the case; apparently these theories are “very, even extremely close” to standard QM in their predictions. the 1st objective collapse theory was GRW, formulated in the mid-1980s.[24]

then found the closely related model CSL, continuous spontaneous localization model, and then my brain started buzzing… or “clicking”![25]

thinking thru my own longstanding ideas about QM and then reading, realized that there was a way to mesh them!

basically, had some further inspiration, a “wild and crazy idea”. but my question, had anyone thought of it before? great ideas are like that, they usually have a long lineage. so what was it?

[23] has a remarkable citation. it says:

In the context of collapse models, it is worthwhile to mention the theory of quantum state diffusion.[15]

[15] Gisin, N; Percival, I C (1992). “The quantum-state diffusion model applied to open systems”. Journal of Physics A: Mathematical and General. 25 (21): 5677–5691. Bibcode:1992JPhA...25.5677G. doi:10.1088/0305-4470/25/21/023. ISSN 0305-4470.

wow, its a small world. this is intellectual synchronicity. the rabbit hole deepens. [36] is a later 1997 survey by Gisin, Percival. [40] is a similar ref by Gisin, Brun, the latter wrote a comprehensive survey of Quantum Trajectory Theory cited last post.[68]
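to get a feel for what a diffusive (quantum-state-diffusion style) unravelling does, heres a toy numpy sketch (my own code under one common convention, not taken from the Gisin/ Percival papers): continuous measurement of σz on a qubit reduces, for pure states, to an SDE for the bloch coordinate z = ⟨σz⟩. each run wanders and then localizes (“collapses”) to ±1, and the fraction of up-collapses reproduces the Born rule.

```python
import numpy as np

rng = np.random.default_rng(7)

def qsd_collapse(z0, k, dt, n_steps, n_runs):
    """diffusive unravelling of continuous sigma_z measurement on a qubit,
    reduced to the Bloch coordinate z = <sigma_z>. in one common convention
    the conditioned pure state obeys the Ito SDE
        dz = 2*sqrt(k) * (1 - z**2) * dW
    (no Hamiltonian). z is a bounded martingale: each run localizes at
    z = +1 or z = -1, with P(up) = (1 + z0)/2, i.e. the Born rule."""
    z = np.full(n_runs, float(z0))
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=n_runs)
        z = np.clip(z + 2.0 * np.sqrt(k) * (1.0 - z * z) * dW, -1.0, 1.0)
    return z

finals = qsd_collapse(z0=0.5, k=1.0, dt=0.001, n_steps=5000, n_runs=1000)
print(np.mean(np.abs(finals) > 0.99))   # ~1.0: every run has localized
print(np.mean(finals > 0.0))            # ~0.75: Born fraction, (1 + 0.5)/2
```

note no “collapse postulate” is put in by hand; definite outcomes with Born statistics fall out of the noise term, which is exactly why these unravellings flirt so suggestively with the objective-collapse models.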

ok, time to brush up on some of this. was reminded of Gisin; have been tracking his groundbreaking research since the late 1990s, 1st catching my eye with his long-range bell experiments, which were already “world class” at the time, but maybe lost track of him somewhat more lately (eg embarrassingly, the last ~1½ decades or so!). decided to brush up on his research. holy cow, only 389 refs on arxiv, that shouldnt take too long to scan, right?[26] and turned up this alarming, colorfully dark anecdote he shared, and could personally relate, but thats a topic for another day.[27]

but my luck or force was strong, turned up this excellent survey of QM collapse where Gisin simply argues that existing QM is incomplete due to the collapse postulate. wow, way to go![28] on p7 footnote [19] he goes into fascinating history in which he codiscovered the GRW equations.

[19] In 1988 Professor Alberto Rimini visited Geneva to present a colloquium. He presented the famous GRW paper [53] in the version Bell gave of it [54]. In the GRW theory, the non-linear stochastic terms added to the Schrödinger equation lead to solutions with discontinuous jumps of the wave-packet, i.e. to some sort of spontaneous collapses triggered by nothing but mere random chance, as time passes. Near the end of his colloquium, Rimini mentioned that an open question was to massage the stochastic modifications in such a way that the solutions would be continuous trajectories (in Hilbert space). He also emphasized the need for an equation that would preserve (anti-)symmetric states. He may have added that, with Philip Pearle [55], they have a solution, but for sure he had no time to explain it. Immediately after the colloquium I went to Alberto and told him that I knew how to answer his questions. He encouraged me and I immediately added a small section to a paper already quasifinished [1]. There is no doubt that Philip Pearle found CSL independently. Lajos Diosi, by the way, did also find it [49]. Actually, everyone who, at that time, knew both GRW and Itô stochastic differential calculus would have found it, because it is quite trivial, once you know the tools and the problem. Anyway, Ghirardi and Pearle got very angry that I published my result first and I decided to leave that field. I didn’t like fights and wanted a carrier.[sic]

wait, Bell came up with a GRW related theory? holy cow! looking further leads to details/ analysis in [31].

re Gisin am thinking, what a massive pity he “left the field.” any chance of rejoining? emailed him but didnt hear back yet, lol! he says he “left” because of 2 major problems: the 1st is quoted below, and the 2nd is relativistic extension. and here (p7) is a key quote, containing the key concept/ caveat, closely aligned with my latest thinking and my own idea.

First, one unpleasant characteristic of such a modified dynamics is that the very same equation (2) can also be derived within standard quantum theory by assuming some coupling between the quantum system and its environment and conditioning the system’s state on some continuous measurement outcomes carried out on the environment [57]. This makes it highly non trivial to demonstrate an evolution satisfying equation (2) as a fundamental evolution, as one would have to convincingly show that the system does not interact significantly with its environment.

ok! so that is how practitioners in the field view the idea, and the basic obstacle/ hurdle that has to be overcome. however, a few years after this was written, experiments have been constructed that nearly exactly replicate these conditions outlined eg Minev et al! gisin didnt seem to cite himself here, but this ref by him expands on the idea.[51] ref [57] is [34] by Wiseman, a name that comes up in this area, talking about “unravellings” a term that was noticed last post. wikipedia is not a big help here, but heres a relatively general reference again by Wiseman.[35]

Once one has chosen an interaction, one therefore has the remaining freedom to choose the manner in which to interrogate the environment, and different methods lead to qualitatively different kinds of measurements. These different measurements, which constitute the measurement strategies discussed above, are often referred to as different unravellings of the environmental interaction.[8]

ref [8] is Carmichael. did he invent the term “unravelling” also along with “quantum trajectory”? continuing with Gisin:

I remain convinced that collapse models of the form sketched above, see [56] and references there in, is the best option we have today to solve the unacceptable quantum measurement problem. … Note, however, the following two critical points

⭐ ⭐ ⭐

ref [56] is the following VERY comprehensive 130p article by Bassi, 2012.[29] looking into Bassi immediately pops up this delightful/ engaging 2020 article/ profile in NYT. wow![30] the abstract is not as bold as Gisin in simply declaring that QM must be incomplete; its a little more diplomatic/ couched! but its nearly the same sentiment:

Quantum mechanics is an extremely successful theory that agrees with every experiment. However, the principle of linear superposition, a central tenet of the theory, apparently contradicts a commonplace observation: macroscopic objects are never found in a linear superposition of position states. Moreover, the theory does not really explain as to why during a quantum measurement, deterministic evolution is replaced by probabilistic evolution, whose random outcomes obey the Born probability rule.

p88 echoes Gisins ideas about eliminating the noisy environment:

If instead we experimentally observe a quantum to classical transition such as the collapse of the wave function while convincingly reducing all potential sources of noise, this would strongly hint that an alteration of the fundamental equations of quantum mechanics is needed.

wow, what an encyclopedia. it is encouraging that there is a lot of inquiry into CSL experimental analysis/ detection. my idea is maybe best covered by section E p104, “microspheres and nanoparticles in optical potentials”.

Two criteria to test CSL and other collapse models, are fulfilled: a high mass of the particle and a large size of the superposition which can be comparable to rC

wait, why is a high particle mass required? he goes on to describe an optical trapping experiment VERY CLOSE to some of what Minev et al have already performed, if you consider the artificial atom as the particle. and that is my idea: my suspicion is that tests to prove CSL theory at the atomic scale are already nearly built, and such a test would look nearly the same as Minev. but Minev didnt see it… exactly… because he wasnt looking for it! a near miss! re the quote starting this blog!

Whether you can observe a thing or not depends on the theory which you use. It is the theory which decides what can be observed.–einstein[54]

Bassi is esp interested in the idea that, as expressed in the abstract, “macroscopic objects are never in a linear superposition of position states,” and it SEEMS like this is a key distinction between classical/ quantum that has been pursued in “sideway/ alongside” thinking about QM for decades. the contents p3 are a hair more dramatic/ sharp than the abstract, talking about a “seeming contradiction”:

Yet, there is one apparently innocuous observed phenomenon the theory seems unable to explain, and in fact seems to contradict with. This is the observed absence of superposition of different position states in a macroscopic system.

so Bassi is very interested in the regime between macroscopic and microscopic as a region to find an anomaly, the so called mesoscopic/ mesoscale intermediate region. its a great idea; however, my thinking is that this might be something of a red herring, aka “barking up the wrong tree” in the vernacular. macroscopic superpositions have been demonstrated now for decades! some of the earliest work was with so-called “SQUIDs.” recently it is more with so-called “cantilevers”.

neither Bassi (the clear thought/ action leader in the area) nor anyone else nearby seems to realize how CSL models would have implications for eg microscopic areas, ie qubit measurement experiments, and that, astonishingly, maybe the effects of a CSL have already been measured! CSL needs to be reformulated in that way. my thinking is maybe CSL effects will be 1st measured/ understood at the microscopic range and then scaled upward into mesoscopic and even macroscopic.

⭐ ⭐ ⭐

now presenting without further ado the VZN QM + COLLAPSE + LHV THEORY.

Neumaier [37] writes:

The Lindblad equations, universally used to describe the dynamics of (mixed) states of open systems have dissipative terms, which are the leftover of collapse when averaged over the quantum jumps.

wait, what? who said that? doesnt this explain the collapse itself, mathematically? this is so close to the proposal. the proposal is simply in words, the DETECTOR CAUSES THE COLLAPSE.
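to see neumaiers point concretely, heres a minimal numpy sketch (my own toy code): euler-integrating the Lindblad master equation for spontaneous decay. the dissipative term alone, with no jumps anywhere in sight, produces the same smooth exp(−γt) curve that the jumpy unravellings average to, ie the Lindblad terms really are “the leftover of collapse when averaged over the quantum jumps.”

```python
import numpy as np

def lindblad_decay(gamma, dt, n_steps):
    """euler integration of the Lindblad master equation for spontaneous decay
    (H = 0, one jump operator L = sqrt(gamma)|g><e|):
        drho/dt = L rho L+ - (1/2)(L+L rho + rho L+L).
    basis order [|e>, |g>]; returns the excited population rho_ee vs step."""
    L = np.sqrt(gamma) * np.array([[0, 0], [1, 0]], dtype=complex)
    LdL = L.conj().T @ L
    rho = np.array([[1, 0], [0, 0]], dtype=complex)   # start in |e>
    pop = np.empty(n_steps)
    for k in range(n_steps):
        pop[k] = rho[0, 0].real
        rho = rho + dt * (L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL))
    return pop

gamma, dt, n_steps = 1.0, 0.001, 2000
pop = lindblad_decay(gamma, dt, n_steps)
# dissipative term alone gives the smooth "jump-averaged" decay exp(-gamma*t)
print(abs(pop[-1] - np.exp(-gamma * dt * (n_steps - 1))))  # small euler error
```

the mixed-state (averaged) description is smooth; the individual-record (jump) description is discontinuous. the two are mathematically consistent, which is exactly what makes the “what really happens in a single run” question so slippery.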

GRW theory posits random stochastic discontinuities. those discontinuities are to be attributed to a detector state change from dark to a higher level, which also causes a perturbation on the system. this perturbation has now been measured experimentally (2013); its called BACK-ACTION.[46][49]

it is even possible to build up this back action into a control-system feedback loop. this was done in 2011 by Haroche et al.[47] he has an awesome review of his life work, evocatively titled “controlling photons in a box and exploring the quantum to classical boundary.”[48] another extraordinary experiment by him from 2007, “quantum jumps of light record birth+death of photon in cavity.”[50]

this basic concept of back-action explains why it is impossible to measure something without disturbing it in quantum theory. the measurement causes a perturbation. Haroche et al describe it as “a fundamental difficulty: the sensor measurements cause a random back-action on the system.” here sensor is simply DETECTOR.

this back-action sure sounds a lot like the “flashes” in the GRW-Flash theory considered by Bell et al.!

these ideas about real models of the atom dynamics seem to be closely related to the concept of “light dressed states”[43] that shows up in jaynes-cummings theory[9] which wiseman has studied closely both theoretically and experimentally.[42]

⭐ ⭐ ⭐

searching on the wiseman ref on arxiv, then came up with these startling refs. WHIZ, BANG, KAPOW! it appears wiseman is already very close to the idea. [44] (2011) gives experimental tests for the idea with a not-so-high required efficiency, only 58%. [45] (2013) expands on the idea with qubits and a threshold efficiency of only 37%. there is also reference to EPR-steering, which is probably quite similar to the dynamics of the Minev experiment. wiseman does not seem to realize how close these equations/ ideas are to the CSL theory, but it gives them further credence.

Since the advent of quantum trajectory theory some two decades ago [1–5], it has been widely accepted that the stochastic dynamics of individual open quantum systems (e.g. the quantum jumps of atoms) are not inherent to the system, but rather depend upon the presence, and nature, of a distant, macroscopic detector (e.g. a photodiode). However, old ideas, namely Bohr’s [6] and Einstein’s [7] original conceptions of a quantum jump as an objective microscopic event, die hard.

?!? wait, WHAT? “it has been widely accepted”? that is great/ fabulous news, but news to me! where is the (single?) ref for that? there is conspicuously no ref! is this really not an accurate statement? and language is again very tricky here. if a detector is influencing the system, why is that not still “objective”? yes, words/ technical vocabulary are a big part of the challenge as Bohr long understood/ insisted.

ok, anyway, this is very, very close to my idea in my head for several decades, and am amazed it wasnt proposed decades ago. its amazing to finally see it written out on paper. there are a few other ingredients to add.

there is a tricky/ key concept of “dead time” for macroscopic detectors.[55] afaik (and need to research this further) almost nobody has related it to atomic transition time and the rabi frequency. my key idea is that maybe dead time is a fundamental property of matter & detectors.

this has various implications. a big one is that 100% optical efficiency is in fact a mirage, unachievable.[56] it is possible to get good efficiency for the “bright” range of the rabi cycle, but the dark cycle (“state”) will always be present. maybe using an array of atoms in different cycles, one can indeed get 100% efficiency, but the idea is that with a single detector atom, it can only be something like half efficient in “detecting” incoming photons.
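to make the duty-cycle intuition concrete, heres a toy numpy model (purely my own speculation in code form; the “bright when sin² exceeds a threshold” rule and all numbers are illustrative assumptions, not from ref [56]): a single detector atom is bright for only half its rabi cycle, while an array of atoms staggered in phase covers the dark gaps.

```python
import numpy as np

def bright_fraction(n_atoms, samples=100000, threshold=0.5):
    """toy duty-cycle model of the "dead time" idea: a detector atom is
    'bright' (able to fire) when its Rabi-cycle excited population
    sin^2(theta/2) exceeds a threshold. atoms are staggered evenly in phase;
    returns the fraction of the cycle covered by at least one bright atom."""
    theta = np.linspace(0, 2 * np.pi, samples, endpoint=False)
    phases = 2 * np.pi * np.arange(n_atoms) / n_atoms
    bright = np.zeros(samples, dtype=bool)
    for ph in phases:
        bright |= np.sin((theta + ph) / 2) ** 2 > threshold
    return bright.mean()

print(bright_fraction(1))   # ~0.5: a lone atom is dark half the cycle
print(bright_fraction(4))   # 1.0: the staggered array covers the whole cycle
```

again, this is only a cartoon of the hypothesis; whether real detector dead time behaves anything like a rabi duty cycle is exactly the open question.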

this is a really big deal for other reasons. nobody has yet proposed a very solid model for Bell entanglement that fits all experimental findings, although theres probably one buried in the literature somewhere. but it is known for over 2 decades that the detector loophole allows LHV (local hidden variable) theories (thx again Gisin!).[57] the model has been revised/ expanded in 2015 to allow for “arbitrary limited detection efficiency.”[58]

also, there are a lot of misconceptions about bell experiments and what they do and do not prove. the simple summaries are not too accurate. the technical detail is that LHV models have probably never really been definitively ruled out.[59] wikipedia has some ideas on loopholes also.[60]

⭐ ⭐ ⭐

so heres what the experimentalists/ theoreticians still SEEM to be missing at this point, except maybe Wiseman, who is the closest to the idea (it is possible this is outlined somewhere, but havent seen it yet). if one has an environment and a single atom, yes, there is noise and decoherence, an interchange, COUPLING. very much in the sense of control theory!

but what if one shrinks ones detector down to a single atom, and the measured system down to a single atom? and what if the only interaction possible is BETWEEN them, and “tracked,” aka (as some are referring to it, eg Minev) “all information captured”? this is very close to the optical experiments on artificial atoms and qubits that are now being performed. it seems then that one may be able to successfully discover/ MEASURE the collapse of the wavefunction as a fundamental physical property of the universe, which not surprisingly has eluded both the worlds top theoreticians and experimenters for over a century.

in the presence of space-waves (EM waves) the detector probabilistically “FIRES” and causes BACKACTION onto the measured system. “GRW-FLASH!” the detector may or may not detect space-waves running over its span, this is the fundamentally dissipative/ unmeasurable character of QM.

but both Gisin/ Bassi and maybe even Wiseman at times are talking about “isolating” the quantum system. STOP THAT, GIVE UP ON THAT NOW! the system cannot be fully isolated from the detector, QM apparently correctly predicts that. the near identical nature of the stochastic schroedinger equation to a system that is conditioned on measurements IS NOT A BUG, ITS A FEATURE. it appears researchers have already largely discovered the “correct” answer, they just dont see it yet! also Bassi maybe needs to realize while macroparticles make sense, maybe theyre not “NECESSARY” in a sense. FIGURE OUT THE CSL MODEL FOR INDIVIDUAL ATOMS, ASAP! DESIGN A SPECIFIC EXPERIMENT FOR IT!

the answer is (apparently, at this time!) to shrink the supposed “open system” aka ENVIRONMENT down to a SINGLE DETECTOR (aka ATOM!) and its inevitable effect on the measured entity. sound familiar? this is simply QC QUBIT MEASUREMENT at heart, and nearly exactly what Minev constructed/ analyzed. in short, consider 2 INTERACTING ATOMS ONLY 1 of which leads to the photo-avalanche measurement cascade. and the control/ feedback loop for detection would probably be employed.

so Gisin/ Bassi et al talk about isolation of the system from the environment, but this is the very subtle issue at stake, and these terms have been too ambiguously thrown around ever since the beginnings of QM. an optical cavity is indeed exactly an “isolation chamber” in a sense. but what is the environment, what is the system? these are the zen “one hand clapping” koan terms that need to be very subtly deconstructed/ reexamined.

in a classical control system, what is the system, what is the environment? in this scenario these are not clearly distinguishable, and the terms are not typically used there; the only exception that comes to mind might be a thermostat. the concepts from control theory are very relevant, have already been applied, will be helpful here, and need to be further expanded. in a control system the basic idea is, like in QM, COUPLING.

and re qubit measurement, yes, “very simply,” (in the sense of deep zen paradoxes!) the AVALANCHE/ CASCADE, right in front of their eyes for nearly a century, is the fundamental macro-microscopic interface that has been eluding scientists for so long! so @#%& can we now stop talking about consciousness and wigners friend and the multiverse for awhile now? (which even “engineers” like Gisin are apt to in moments of weakness or reverie, lol! [28]) and now just get down to brass tacks and qubits and build some KICKASS quantum computers more powerful than anything imaginable?

so yeah, it looks like the COMPUTER SCIENTISTS or COMPUTER ENGINEERS are finally going to come up with the correct theory of nature as a side effect of building the worlds greatest + most sophisticated computers, lol!

what are the details of such an experiment? not sure exactly, Wiseman seems to be the closest to sketching them out. but he suggests having a detector far away from the measured atom to rule out the relativistic effect. my idea is that maybe even with the detector close, one could come up with an experiment to show that QM is incomplete wrt collapse.

however, QM is very slippery. it is not exactly an incorrect theory, only INCOMPLETE. a provably new theory would have to make a fundamental prediction of the universe, say maybe related to heisenbergs constant or some new one that cant be extracted from existing QM. that is a very challenging thought experiment, but it looks like workers in the field are very close to coming up with the arrangement, if it hasnt already been done, buried in some paper somewhere that eludes my google searches… so far! as QM teaches us, not all things that exist are immediately knowable!

⭐ ⭐ ⭐

a few further ideas. Schroedinger in his 1952 article “are there quantum jumps” compared them to epicycles.[61] a subtle point. this is a 9pg article in The British Journal for the Philosophy of Science with no references, something that would not be very endearing to a practicing “shut up and calculate” physicist. schroedinger here is not saying that quantum mechanics is wrong, or even that it is incomplete. (ok, maybe he is, have to read the whole thing.)

but there is a tricky concept of epicycles in the history of science. they were neither exactly wrong nor exactly incomplete. dont recall who exactly made this point carefully in the literature, but semiclassical pointed it out to me in a long/ engaging/ extended online chat on the subject once. it is possible to approximate elliptical orbits with epicycles to an arbitrary degree of accuracy, so they are not exactly incorrect. and epicycle theory is not exactly incomplete either, if the only goal is calculation! but if the goal is human understanding of the reality of the universe, they fall short; they are incomplete in that sense. however, this is not something that can be mathematically proven; the epicycle theory could in some limit be highly accurate.

so framing QM in terms of epicycles points to something else. it relates to the concept of occams razor, 1st articulated by william of ockham, a 14th century franciscan friar.[62] this principle is sometimes cited in scientific and physical contexts. am not a blind adherent to this principle, but think it has some validity/ utility in some contexts, and in others its possibly a red herring. it does relate to the modern concept of falsifiability devised by popper.[63]

but as the epicycle example shows, falsifiability may be too strong a criterion to crucially separate scientific theories of reality. occams razor seems more applicable here. occams razor may be problematic in physics and science; as facebook might say of relationships, “its complicated”.[64][65] my feeling is the main problem is that simplicity is a human concept. how can one judge which of 2 concepts is simpler? that can be very subtle. still, it seems reasonable in some cases to make a case one way or the other.

but, now need to make that case. many terms have been put forth for describing the measurement of qubits, many of the key ones already mentioned. scientists/ physicists do not currently find any glaring contradictions with the theory, although one might argue the measurements are now so fine as to resemble counting angels dancing on the head of a pin, so to speak.

my thinking is that this is due to the standard QM theory being extended in a way that might be called “scope creep” or “mission creep”.[66][67] various high-precision experiments are now being done on qubits and the practitioners themselves indicate they dont see any anomaly with the theory. but it is their JOB to make sure there are no anomalies. a measurement anomaly to a physicist causes significant cognitive dissonance.

so imagine in a sense a so-called “physicist” (its a modern term that is newer than epicycles) describing planetary trajectories in terms of epicycles. here the “scope/ mission creep” is adding on multiple epicycles to improve accuracy, and further creep in the idea that the theory accurately depicts reality.

the basic problem here is what might be called the “center of the theory”. the epicycle theory, while workable and even accurate, does not recognize the heliocentric system. so what is the analogy to QM? it is that there is some other central feature that is being missed, that the theory fails to “orient around”. that central feature, apparently, is the REAL stochastic component: the collapse of the wavefunction as a fundamental property of matter, ie, basically objective collapse theory. quantum trajectory theory seems to be making correct predictions, but maybe its still missing something central.
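for concreteness, here is a bare-bones sketch of the quantum jump/ monte carlo wavefunction method that quantum trajectory theory builds on (my own toy illustration, not the minev or wiseman setup; all parameter values are arbitrary): a resonantly driven two-level atom evolves smoothly under an effective non-hermitian hamiltonian between randomly timed “jumps” (photon detections) that collapse it to the ground state. averaging such trajectories reproduces the master-equation emission rate Γρ_ee — the theory works, which is exactly why distinguishing it from an objectively stochastic collapse is so hard.

```python
import numpy as np

rng = np.random.default_rng(1)

# arbitrary example parameters, in units of the decay rate
gamma = 1.0      # spontaneous emission rate
omega = 2.0      # resonant rabi drive strength
dt    = 1e-3     # time step
steps = 200_000  # total evolution time = 200 decay times

# basis |g> = [1,0], |e> = [0,1]
# H_eff = (omega/2) sigma_x - (i/2) gamma |e><e|
H_eff = np.array([[0.0,        omega/2.0],
                  [omega/2.0, -0.5j*gamma]])

psi = np.array([1.0+0j, 0.0+0j])   # start in the ground state
jumps = 0
for _ in range(steps):
    # probability of detecting a photon (a quantum jump) this step
    p_jump = gamma * abs(psi[1])**2 * dt
    if rng.random() < p_jump:
        psi = np.array([1.0+0j, 0.0+0j])   # collapse to |g>
        jumps += 1
    else:
        # no-jump evolution under H_eff, then renormalize
        psi = psi - 1j*dt*(H_eff @ psi)
        psi = psi / np.linalg.norm(psi)

rate = jumps / (steps*dt)
print("simulated emission rate:", rate)
```

for these parameters the standard steady-state prediction is Γρ_ee = Γ(Ω²/4)/(Ω²/2 + Γ²/4) ≈ 0.44 photons per decay time, and the simulated rate fluctuates statistically around that value.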

eg if a new fundamental constant of matter can be discovered/ replicated specifically via the new theory, and not with the prior one, that would be close to experimental proof that the standard theory is an epicycle theory, and schroedingers cat, wigners friend, everetts multiverses etc would go the way of the epicycles! one might object that these were never part of QM per se, but is that really the case? if a theory is an epicycle theory, isnt it EXPECTED that a lot of confusing metaphysical baggage accrues around it, as increasingly desperate interpretations proliferate and fail to really stick? isnt that proliferation itself circumstantial evidence of an epicycle theory?

so am in basic agreement with Bassi, Gisin et al that standard QM theory seems to be incomplete. however, quantum trajectory theory amends/ extends/ revises standard QM theory substantially, in ways that are not easy to extricate. at this point, at such high experimental precision/ finesse, it may even be making “correct predictions” for what are basically CSL-type phenomena. in that case, the theory has crossed into an “epicycle”-like situation. is that really the case? further development/ investigation is needed, but myself, think there is a real possibility of it at this point.

  • [1] SE quantum computing classical channel dialog on Minev experiment
  • [2] Is the recent Nature paper by Minev et al. evidence of new physics? / glS
  • [3] quantum tomography, wikipedia
  • [4] weak measurement, wikipedia
  • [5] quantum zeno effect, wikipedia
  • [6] Rabi cycle, wikipedia
  • [7] quantum beats, wikipedia
  • [8] The Quantum Challenge: Modern Research on the Foundations of Quantum Mechanics (Physics and Astronomy), 2nd Edition
  • [9] jaynes cummings model for 2-level atom, wikipedia
  • [10] quantum thermodynamics, wikipedia
  • [11] quantum dissipation, wikipedia
  • [12] quantum stochastic calculus, wikipedia
  • [13] stochastic quantum mechanics, wikipedia
  • [14] quantum fluctuation, wikipedia
  • [15] open quantum system, wikipedia
  • [16] quantum optics, wikipedia
  • [17] quantum trajectory theory, wikipedia
  • [18] quantum stochastic calculus trajectories, wikipedia
  • [19] quantum jump method, wikipedia
  • [20] The Quantum Theory That Peels Away the Mystery of Measurement/ Ball, quanta
  • [22] Howard Carmichael, wikipedia
  • [23] objective collapse theories, wikipedia
  • [24] GRW collapse theory, wikipedia
  • [25] continuous spontaneous localization model, wikipedia
  • [26] N Gisin paper search, arxiv
  • [27] THOUGHT POLICE – ON ARXIV? By Nicolas Gisin
  • [28] Collapse. What else? / Gisin
  • [29] Models of Wave-function Collapse, Underlying Theories, and Experimental Tests/ Bassi
  • [30] The Rebel Physicist on the Hunt for a Better Story Than Quantum Mechanics
  • [31] The GRW flash theory: a relativistic quantum ontology of matter in space-time? / Esfeld, Gisin
  • [32] Manipulating atoms with photons/ Tannoudji, Dalibard 32p
  • [33] Manipulating atoms with photons, 13p
  • [34] Robust unravelings for resonance fluorescence/ Wiseman, Brady
  • [35] Classical Robustness of Quantum Unravellings/ Atkins, Wiseman
  • [36] Quantum State Diffusion: from Foundations to Applications/ Gisin, Percival
  • [37] Are there quantum jumps? / Neumaier
  • [38] Plenio/ Knight
  • [39] Plenio/ Knight Rev Mod Phys
  • [40] From Quantum to Classical: the Quantum State Diffusion Model/ Gisin, Brun
  • [41] H. M. Wiseman and G. E. Toombes
  • [42] Quantum jumps in a two-level atom/ Wiseman, arxiv
  • [43] light dressed state, wikipedia
  • [44] Are dynamical quantum jumps detector-dependent? / Wiseman
  • [45] Quantum Jumps Are More Quantum Than Quantum Diffusion/ Daryanoosh, Wiseman
  • [46] Quantum Back-Action of an Individual Variable-Strength Measurement/ Hatridge
  • [47] Real-time quantum feedback prepares and stabilizes photon number states
  • [48] serge haroche 2012 nobel prize review, “controlling photons in a box and exploring the quantum to classical boundary”
  • [49] quantum back action, wikipedia
  • [50] Quantum jumps of light recording the birth and death of a photon in a cavity
  • [51] Quantum trajectories for Brownian motion/ Strunz
  • [52] optical heterodyne detection
  • [53] Introduction to Quantum Noise, Measurement and Amplification/ Clerk et al 96p
  • [54] wikiquote/ einstein
  • [55] dead time, wikipedia
  • [56] quantum efficiency, wikipedia
  • [57] A local hidden variable model of quantum correlation exploiting the detection loophole/ Gisin
  • [58] Quantum nonlocality with arbitrary limited detection efficiency/ Gisin
  • [59] Challenging preconceptions about Bell tests with photon pairs/ Gisin
  • [60] loopholes in bell test experiments
  • [61] are there quantum jumps? Schroedinger, 1952
  • [62] occam’s razor, wikipedia
  • [63] falsifiability/ popper, wikipedia
  • [64] Why Occam’s Razor Doesn’t Apply to Physics/ space.com
  • [65] The Tyranny of Simple Explanations/ Atlantic, Ball
  • [66] scope creep, wikipedia
  • [67] mission creep, wikipedia
