We analyze the explicit and implicit assumptions in the theorems of J. von Neumann and J. Bell.
The modern development of quantum technologies based on quantum information theory (in particular, laser-based quantum information technologies) has stimulated the analysis of proposed computational, cryptographic, and teleportation schemes from the viewpoint of quantum foundations. It is evident that not all mathematical calculations performed in the complex Hilbert space can be directly realized in physical space. Recently, analyzing the original EPR paper, we found that their argument was based on a misuse of von Neumann’s projection postulate. In contrast to von Neumann, Einstein, Podolsky, and Rosen (EPR) applied this postulate to observables represented by operators with degenerate spectra, which is forbidden by von Neumann’s axiomatics of quantum mechanics. It is impossible to repeat the EPR considerations within von Neumann’s framework. Here we analyze quantum teleportation by taking von Neumann’s projection postulate into account. Our analysis shows that so-called quantum teleportation is impossible within von Neumann’s framework.
Our aim is to understand the role of the implicit assumptions used by Einstein, Podolsky, and Rosen (EPR) in their famous article [Phys. Rev., 47, 777 (1935)] devoted to the so-called EPR paradox. We find that the projection postulate plays a crucial role in the EPR argument. It seems that EPR made a mistake in this paper: the projection postulate was applied not in its original form (as formulated in von Neumann's book [Mathematical Foundations of Quantum Mechanics, Princeton University Press (1955)]) but in the form that was later formalized as Lüders' postulate [Ann. Phys., Lpz. 8, 322 (1951)]. Von Neumann's postulate was crucially modified by extending it to observables with degenerate spectra. This modification is the real source of “quantum nonlocality.” The use of the original von Neumann postulate eliminates this problem: instead of an action-at-a-distance nonlocality, we obtain a classical measurement nonlocality, related to the synchronization of the two measurements (performed on the two parts of a composite system). If one uses von Neumann's projection postulate correctly, no “elements of reality” can be assigned to entangled systems.
Our aim in this paper is to explore the possibility of treating quantum mechanics as emergent from a kind of classical physical model, in spite of the recent remarkable experiments demonstrating a violation of the Bell inequality. To proceed in a rigorous way, we use the methodology of ontic–epistemic modeling of physical phenomena. This methodology is rooted in the old Bild conception of theoretical and observational models in physics, elaborated in the fundamental works of Hertz, Boltzmann, and Schrödinger. Our ontic model (generating the quantum model) is of the random-field type: prequantum classical statistical field theory (PCSFT). We present a brief review of its basic features without overloading the presentation with mathematical details. Then we show that the Bell inequality can be violated not only at the epistemic level, i.e., for observed correlations, but even at the ontic level, for classical random fields. An important part of the paper is devoted to an analysis of the internal energy structure of prequantum random fields and their coupling with the background field of subquantum fluctuations. Finally, we present a unified picture of the microworld based on the composition of prequantum random fields from elementary fluctuations. Since quantum systems are treated as the symbolic representation of prequantum fields, this picture leads to a unified treatment of all quantum systems as special blocks of elementary fluctuations carrying negligibly small energies.
We consider the old foundational problem of quantum/classical optics: the indivisibility of the photon. Quantum theory predicts that in experiments on coincidence detection double clicks are impossible (up to noise); on the other hand, the known semiclassical and classical models predict a relatively high rate of coincidences. We present a model of the classical (random) wave type which predicts that, just as in quantum optics, a coincidence of clicks is a rare event. However, this model makes a new prediction compared to quantum optics, namely, that the rate of double clicks depends substantially on the discrimination threshold of a detector. We propose to perform new detailed tests to check the threshold dependence predicted by our model. Experiments on coincidence detection performed to date, for example, the Grangier experiment, do not contain statistical data on the threshold dependence.
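The kind of threshold dependence at stake here can be illustrated with a toy simulation. The distributions and parameters below (exponential pulse energies, a uniform beam-splitter division, the particular thresholds) are hypothetical stand-ins, not the model of the paper: a classical pulse of fluctuating intensity is split between two detectors, and each detector clicks when its share exceeds the discrimination threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

def coincidence_rate(threshold, n_pulses=100_000):
    """Toy classical-field model: each pulse splits its fluctuating
    intensity between two detectors; a detector clicks when its
    share exceeds the discrimination threshold."""
    intensity = rng.exponential(scale=1.0, size=n_pulses)  # fluctuating pulse energy
    split = rng.uniform(0.0, 1.0, size=n_pulses)           # random division at the beam splitter
    i1, i2 = intensity * split, intensity * (1.0 - split)
    clicks1, clicks2 = i1 > threshold, i2 > threshold
    singles = 0.5 * (clicks1.mean() + clicks2.mean())      # average single-click rate
    doubles = (clicks1 & clicks2).mean()                   # coincidence (double-click) rate
    return singles, doubles

for thr in (0.5, 1.0, 2.0):
    s, d = coincidence_rate(thr)
    print(f"threshold={thr}: singles={s:.4f}, doubles={d:.4f}")
```

Raising the threshold suppresses double clicks much faster than single clicks, which is the qualitative effect a threshold-resolved experiment could look for.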
Schrödinger already tried to proceed towards a purely wave theory of quantum phenomena. However, he had to give up and accept Born’s probabilistic interpretation of the wave function. A simple mathematical fact was behind this crucial decision: the wave function of a composite system S = (S_1, S_2) belongs to the tensor product of two L_2 spaces, not to their Cartesian product. It was impossible to consider it as a vector function ψ(x) = (ψ_1(x), ψ_2(x)), x ∈ R^3. Here we solve this problem. It is shown that there exists a mathematical formalism that makes it possible to describe composite quantum systems without appealing to the tensor product of the Hilbert state spaces; one can proceed with their Cartesian product. This may have important consequences for the understanding of entanglement and for applications to quantum information theory. It seems that “quantum algorithms” can be realized on the basis of classical wave mechanics. However, the interpretation of the proposed mathematical formalism is a difficult problem and needs additional study.
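The mismatch behind Schrödinger's difficulty can be stated compactly (this is a standard textbook observation, not a result of the paper):

```latex
% Composite state space in standard QM: the tensor product
L_2(\mathbb{R}^3) \otimes L_2(\mathbb{R}^3) \cong L_2(\mathbb{R}^6),
\qquad \psi = \psi(x_1, x_2), \quad x_1, x_2 \in \mathbb{R}^3 .
% The Cartesian product consists of pairs of one-particle waves:
L_2(\mathbb{R}^3) \times L_2(\mathbb{R}^3) \ni (\psi_1(x), \psi_2(x)),
\qquad x \in \mathbb{R}^3 .
% A generic entangled state, e.g.
\psi(x_1, x_2) = c_1\,\phi_1(x_1)\chi_1(x_2) + c_2\,\phi_2(x_1)\chi_2(x_2),
% is not encoded by any single pair (\psi_1, \psi_2).
```

A function of six variables carries far more freedom than a pair of functions of three variables, which is why a naive vector-valued wave function fails for composite systems.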
We show that the basic equation of the theory of open systems, the Gorini–Kossakowski–Sudarshan–Lindblad (GKSL) equation, as well as its linear and nonlinear generalizations, has a natural classical probabilistic interpretation within the framework of prequantum classical statistical field theory (PCSFT). The latter gives an example of a classical probabilistic model (with random fields as subquantum variables) reproducing the basic probabilistic predictions of quantum mechanics.
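As a concrete instance of the GKSL dynamics mentioned above, here is a minimal numerical sketch. The qubit, the decay rate, and the plain Euler integrator are illustrative choices, not taken from the paper: a two-level system with Hamiltonian H = σ_z/2 and a single damping operator L = σ₋, for which the evolution preserves the trace of ρ while the excited-state population decays as e^(−γt).

```python
import numpy as np

# Single-qubit amplitude damping: H = sigma_z / 2, jump operator L = sigma_-
# (basis convention here: index 0 = excited |e>, index 1 = ground |g>)
H = 0.5 * np.diag([1.0, -1.0]).astype(complex)
L = np.array([[0, 0], [1, 0]], dtype=complex)   # lowers |e> -> |g>
gamma = 0.2                                     # hypothetical decay rate

def gksl_rhs(rho):
    """Right-hand side of the GKSL equation:
    d rho / dt = -i [H, rho] + gamma (L rho L+ - 1/2 {L+ L, rho})."""
    Ld = L.conj().T
    comm = H @ rho - rho @ H
    diss = L @ rho @ Ld - 0.5 * (Ld @ L @ rho + rho @ Ld @ L)
    return -1j * comm + gamma * diss

def evolve(rho0, t, steps=10_000):
    """Plain Euler integration -- a sketch, not a production integrator."""
    rho, dt = rho0.astype(complex), t / steps
    for _ in range(steps):
        rho = rho + dt * gksl_rhs(rho)
    return rho

rho0 = np.diag([1.0, 0.0]).astype(complex)      # start in the excited state
rho_t = evolve(rho0, 5.0)
print("trace:", np.trace(rho_t).real)           # trace-preserving dynamics
print("excited population:", rho_t[0, 0].real)  # ~ exp(-gamma * t)
```

Both the commutator and the dissipator are traceless, so the trace of ρ is conserved at every step; this is the structural property any classical probabilistic reinterpretation of the GKSL equation must reproduce.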
Based on prequantum classical statistical field theory (PCSFT), we present the results of numerical simulations of a model with hidden variables of the field type reproducing the probabilistic predictions of quantum mechanics (QM). PCSFT is combined with a measurement theory based on detectors of the threshold type; the latter describes the discrete events corresponding to the continuous field model. Using numerical modeling, we show that classical Brownian motion (the Wiener process valued in the complex Hilbert space), producing clicks upon approaching the detection threshold, gives the detection probabilities predicted by the QM formalism (as well as by PCSFT). This numerical result is important, since the transition from PCSFT to threshold detection has a complex mathematical structure (within the framework of classical random processes) and was previously modeled only approximately. We also perform a numerical simulation of the PCSFT value of the coefficient of second-order coherence. Our result matches the prediction of quantum theory well. Thus, in contrast to semiclassical theory, PCSFT cannot be rejected on the basis of measurements of g^{(2)}(0). Finally, we analyze the output of the recent experiment performed at NIST questioning the validity of some predictions of PCSFT.
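The general shape of such a threshold-detection simulation can be mimicked with a toy model. The dimension, weights, threshold, and step size below are all hypothetical illustration parameters, not those of the paper: each channel carries a complex Wiener process whose diffusion rate is set by a weight (playing the role of a diagonal element of a density matrix), and the first channel whose squared amplitude crosses the threshold registers the click.

```python
import numpy as np

rng = np.random.default_rng(1)

def threshold_detection(weights, threshold=0.5, dt=5e-3, n_runs=500):
    """Toy threshold-detection scheme: channel i carries a complex
    Wiener process with diffusion rate weights[i]; the first channel
    whose squared amplitude reaches the threshold produces the click."""
    weights = np.asarray(weights, dtype=float)
    counts = np.zeros(len(weights))
    for _ in range(n_runs):
        phi = np.zeros(len(weights), dtype=complex)
        while True:
            # complex Gaussian increment, variance weights[i] * dt per channel
            dW = rng.normal(0.0, np.sqrt(dt / 2), (len(weights), 2))
            phi += np.sqrt(weights) * (dW[:, 0] + 1j * dW[:, 1])
            energy = np.abs(phi) ** 2
            if energy.max() >= threshold:
                counts[np.argmax(energy)] += 1   # first crossing wins
                break
    return counts / n_runs

freqs = threshold_detection([0.8, 0.2])
print("click frequencies:", freqs)
```

In this toy version the higher-weight channel crosses the threshold first far more often, so relative click frequencies track the diffusion weights qualitatively; reproducing the quantitative Born-rule match reported in the paper requires the full PCSFT detection scheme.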
Recently it was shown that the main distinguishing features of quantum mechanics (QM) can be reproduced by a model based on classical random fields, the so-called prequantum classical statistical field theory (PCSFT). This model makes it possible to represent averages of quantum observables, including correlations of observables on subsystems of a composite system (e.g., entangled systems), as averages with respect to fluctuations of classical (Gaussian) random fields. We consider some consequences of PCSFT for quantum information theory. They are based on our previous observation that classical Gaussian channels (important in classical signal theory) can be represented as quantum channels. Now we show that quantum channels can be represented as classical linear transforms of classical Gaussian signals.
Last year, the first experimental tests closing the detection loophole (also referred to as the fair-sampling loophole) were performed by two experimental groups, one in Vienna and the other in Urbana-Champaign. To violate Bell-type inequalities (the Eberhard inequality in the first test and the Clauser–Horne inequality in the second), one has to optimize a number of parameters involved in the experiment (the angles of the polarization beam splitters and the quantum-state parameters). We study this problem for the Eberhard inequality in detail, using an advanced method of numerical optimization, namely, the Nelder–Mead method.
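The Nelder–Mead method itself is easy to sketch. The simplex logic below (reflection, expansion, contraction, shrink with standard coefficients) is the general-purpose part; the toy quadratic objective is a hypothetical stand-in for the actual quantity being optimized (the Eberhard-inequality expression as a function of polarization angles and state parameters, which is not reproduced here).

```python
import numpy as np

def nelder_mead(f, x0, step=0.1, tol=1e-8, max_iter=2000):
    """Minimal Nelder-Mead simplex minimizer with the standard
    coefficients: reflection 1, expansion 2, contraction 0.5, shrink 0.5."""
    n = len(x0)
    simplex = [np.asarray(x0, float)]           # initial simplex: x0 plus
    for i in range(n):                          # a step along each axis
        v = simplex[0].copy()
        v[i] += step
        simplex.append(v)
    fvals = [f(v) for v in simplex]
    for _ in range(max_iter):
        order = np.argsort(fvals)               # sort: best first, worst last
        simplex = [simplex[i] for i in order]
        fvals = [fvals[i] for i in order]
        if abs(fvals[-1] - fvals[0]) < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)
        xr = centroid + (centroid - simplex[-1])            # reflection
        fr = f(xr)
        if fr < fvals[0]:
            xe = centroid + 2.0 * (centroid - simplex[-1])  # expansion
            fe = f(xe)
            simplex[-1], fvals[-1] = (xe, fe) if fe < fr else (xr, fr)
        elif fr < fvals[-2]:
            simplex[-1], fvals[-1] = xr, fr
        else:
            xc = centroid + 0.5 * (simplex[-1] - centroid)  # contraction
            fc = f(xc)
            if fc < fvals[-1]:
                simplex[-1], fvals[-1] = xc, fc
            else:                                           # shrink toward best
                simplex = [simplex[0]] + [simplex[0] + 0.5 * (v - simplex[0])
                                          for v in simplex[1:]]
                fvals = [fvals[0]] + [f(v) for v in simplex[1:]]
    return simplex[0], fvals[0]

# Hypothetical stand-in objective; the real Eberhard J-value would replace it.
def objective(x):
    return (x[0] - 0.3) ** 2 + 2.0 * (x[1] + 0.1) ** 2

x_best, f_best = nelder_mead(objective, np.array([1.0, 1.0]))
print("minimum near:", x_best, "value:", f_best)
```

Because the method is derivative-free, it is well suited to experimental objective functions where gradients with respect to the angles and state parameters are unavailable or noisy.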