1. Introduction
Quantum mechanics (QM), notable for its empirical success, encounters foundational debates, often focusing on phenomena such as wavefunction collapse, the unique role of time, entropy in measurement processes, and the quantum-classical transition. The canonical formalism of QM is based on five principal axioms [1,2]:
- Axiom 1 (State Space): Each physical system corresponds to a complex Hilbert space, with the system’s state represented by a ray in this space.
- Axiom 2 (Observables): Physical observables correspond to Hermitian operators within the Hilbert space.
- Axiom 3 (Dynamics): The time evolution of a quantum system is dictated by the Schrödinger equation, where the Hamiltonian operator signifies the system’s total energy.
To bridge theory and experiment, QM introduces two additional postulates:
- Axiom 4 (Measurement): The act of measuring an observable results in the system’s transition to an eigenstate of the associated operator, with the measurement value being one of the eigenvalues.
- Axiom 5 (Probability Interpretation): The likelihood of a specific measurement outcome is determined by the squared magnitude of the state vector’s projection onto the relevant eigenstate.
Contrastingly, statistical mechanics (SM), the other statistical pillar of physics, derives its probability measures through entropy maximization, informed by the empirical finding that energy measurements at thermodynamic equilibrium average to a specific value, $\langle E \rangle$:
To maximize entropy while satisfying this constraint, the theory uses a Lagrange multiplier approach [3].
This gives rise to the well-known Gibbs measure.
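For reference, a minimal sketch of this standard maximum-entropy step in common textbook notation (the symbols below follow the usual statistical-mechanics conventions rather than the paper's own equation numbering):

```latex
% Maximize the Shannon entropy subject to normalization and the average-energy constraint
\begin{align*}
\mathcal{L} &= -\sum_{q} p_q \ln p_q
  + \lambda \Big( 1 - \sum_{q} p_q \Big)
  + \beta \Big( \langle E \rangle - \sum_{q} p_q E_q \Big), \\
\frac{\partial \mathcal{L}}{\partial p_q} &= 0
  \quad\Longrightarrow\quad
  p_q = \frac{e^{-\beta E_q}}{Z},
  \qquad Z = \sum_{q} e^{-\beta E_q}.
\end{align*}
```

The stationary point of the Lagrangian is the Gibbs measure, with the multiplier $\beta$ conjugate to the energy constraint.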
Inspired by Gibbs’ methodological innovation in statistical mechanics, this study introduces a novel formulation of QM that tackles its foundational aspects through a process of systematic derivation rather than axiomatic stipulation. While fully compatible with the canonical axioms of QM, which are derived as theorems within this framework, the Prescribed Observation Problem (POP) formulation distinguishes itself by directly incorporating the relevant empirical constraints into the derivation process. The subsequent Results section outlines this formal mathematical procedure, emphasizing a core observation – the phase anti-constraint – as a fundamental empirical constraint, followed by entropy maximization. This approach theorematically resolves to QM’s axioms, intrinsically integrating the theory’s empirical basis into its foundation.
The Discussion section explores the implications of this derivation, drawing parallels to Gibbs’ transformative impact on statistical mechanics, with the aim of clarifying and addressing enduring debates within QM. The additional empirical basis in the POP formulation allows for the promotion of axioms to theorems, increasing the stringency of the theory and permitting the ruling out of alternative interpretations. By founding theory construction on measurement outcomes and entropy maximization techniques, we propose the first fully consistent QM formulation that invalidates competing interpretations, offering a compelling resolution to the interpretational and foundational dilemmas that have long plagued QM.
2. Results
In statistical mechanics, the founding observation is that energy measurements of a thermally equilibrated system tend towards an average value. Comparatively, in quantum mechanics (QM), the founding observation involves the interplay between the systematic elimination of complex phases in measurement outcomes and the presence of interference effects in repeated measurement outcomes. To represent this observation, we introduce the Phase Anti-Constraint:
At first glance, this expression may seem to reduce to a tautology equating zero with zero, suggesting it imposes no restriction on energy measurements. However, this appearance is deceptive. Unlike a conventional constraint that limits the solution space, this expression serves as a formal device to expand it, allowing for the incorporation of complex phases into the probability measure. The expression’s role in broadening, rather than restricting, the solution space leads to its designation as an "anti-constraint."
We recognize that the anti-constraint may initially appear abstract at this stage of the derivation. However, its significance will become evident upon the completion of the optimization problem. For the moment, this expression can be conceptualized as the correct expression that, when incorporated as an anti-constraint within an entropy-maximization problem, theorematically resolves into the axioms of quantum mechanics (Footnote 1).
In the Prescribed Observation Problem (POP) framework, a ’prescribed observation’ serves as the foundational constraint for entropy maximization. Constructing this constraint involves a comprehensive synthesis of empirical evidence, which is obtained by considering the ensemble of all possible measurement outcomes of a quantum system. To gather this evidence, repeated measurements are performed on a collection of identically-prepared quantum systems. The founding observation then encapsulates the empirical data from the ensemble to establish the foundational constraint for entropy maximization.
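As a rough illustration of how such an ensemble of outcomes is assembled in practice, here is a small Python sketch; the state, measurement basis, and sample size are arbitrary illustrative choices, not quantities taken from the derivation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical preparation: a qubit state a|0> + b|1> (illustrative amplitudes).
psi = np.array([np.sqrt(0.7), np.sqrt(0.3) * np.exp(1j * 0.4)])

# Born probabilities for a measurement in the computational basis.
probs = np.abs(psi) ** 2

# Repeated measurements on identically prepared copies build the empirical ensemble.
n_copies = 10_000
outcomes = rng.choice([0, 1], size=n_copies, p=probs)

# Empirical frequencies approximate the prescribed observation used as a constraint.
empirical = np.bincount(outcomes, minlength=2) / n_copies
print("Born probabilities:   ", probs)
print("Empirical frequencies:", empirical)
```

The empirical frequencies converge to the underlying probabilities as the number of identically prepared copies grows, which is the sense in which the ensemble encapsulates the founding observation.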
Our next procedural step involves solving the corresponding Lagrange multiplier equation, mirroring the methodology employed in statistical mechanics. For that, we deploy the following Lagrange multiplier equation (Footnote 2):
We solve this prescribed observation problem as follows:
The partition function, Z, is obtained as follows:
Finally, the probability measure is:
Though initially unfamiliar, this form effectively establishes a comprehensive formulation of quantum mechanics, as we will demonstrate.
Upon examination, we find that phase elimination is manifestly evident in the probability measure: since the trace evaluates to zero, the probability measure simplifies to classical probabilities, aligning precisely with the Born rule’s exclusion of complex phases:
However, the significance of this phase elimination extends beyond this mere simplicity. As we will soon see, the partition function Z gains unitary invariance, allowing for the emergence of interference patterns and other quantum characteristics under appropriate basis changes.
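The role of basis changes can be illustrated numerically. The following Python sketch uses standard textbook amplitudes (not the POP measure itself) to show that a relative phase, invisible in one basis, produces interference after a unitary change of basis, while the inner product playing the role of Z is unchanged; the state and unitary are arbitrary illustrative choices:

```python
import numpy as np

# Illustrative two-level state with a relative phase; normalization plays the role of Z.
phi = 0.8
psi = np.array([1.0, np.exp(1j * phi)]) / np.sqrt(2)

# In the original basis the phase is invisible: both outcomes have probability 0.5.
print("Probabilities, original basis:", np.abs(psi) ** 2)

# A unitary change of basis (Hadamard) exposes the phase as interference fringes.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
psi_rot = H @ psi
print("Probabilities, rotated basis: ", np.abs(psi_rot) ** 2)  # depends on phi

# The inner product <psi|psi> (the partition-function analogue) is unchanged.
print("Norm before:", np.vdot(psi, psi).real, " after:", np.vdot(psi_rot, psi_rot).real)
```

Varying the phase phi leaves the original-basis probabilities untouched but shifts the rotated-basis probabilities, which is the interference behavior referred to above.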
We will begin by aligning our results with the conventional quantum mechanical notation. As such, we transform the representation of complex numbers from the matrix representation to the standard complex form. For instance, the exponential of a complex matrix is:
Then, we associate the exponential trace with the complex norm using the following identity:
Finally, performing the analogous substitution and applying the complex-norm representation to both the numerator and the denominator consolidates the Born rule, normalization, and initial state probability into a coherent probability measure:
We are now in a position to explore the expanded solution space that "POPs out" of the optimization problem.
The wavefunction is delineated by decomposing the complex norm into a complex number and its conjugate. It is then visualized as a vector within a complex n-dimensional Hilbert space. The partition function acts as the inner product. This relationship is articulated as follows:
where the prior term represents the probability associated with the initial preparation of the wavefunction.
We also note that Z is invariant under unitary transformations.
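In standard Dirac notation, the relationships just described can be summarized as follows; this is a sketch assuming the conventional identification of the partition function with the inner product:

```latex
% Probability measure written in terms of the wavefunction; Z as the inner product
\begin{align*}
P(q) &= \frac{\psi_q^{*}\,\psi_q}{Z},
&
Z &= \langle \psi | \psi \rangle = \sum_{q} \psi_q^{*}\,\psi_q, \\
% Unitary invariance of Z
Z &\;\longrightarrow\; \langle \psi | U^{\dagger} U | \psi \rangle
  = \langle \psi | \psi \rangle = Z .
\end{align*}
```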
Let us now investigate how the axioms of quantum mechanics are recovered from this result:
The entropy maximization procedure inherently normalizes the state vectors. This normalization links the wavefunction to a unit vector in Hilbert space. Furthermore, as the POP formulation of QM associates physical states with its probability measure, and the probability is defined only up to a phase, we conclude that physical states map to rays within Hilbert space. This demonstrates Axiom 1.
- In Z, an observable must satisfy:
Any self-adjoint operator satisfying this condition satisfies the above equation, which demonstrates Axiom 2.
- Upon transforming Equation (23) out of its eigenbasis through unitary operations, we find that the energy typically transforms in the manner of a Hamiltonian operator:
The system’s dynamics emerge from differentiating the solution with respect to the Lagrange multiplier. This is manifested as:
which is the Schrödinger equation. This demonstrates Axiom 3.
- From Equation (23) it follows that the possible microstates of the system correspond to specific eigenvalues of the observable. An observation can thus be conceptualized as sampling from the probability measure, with the post-measurement state being the occupied microstate q. Consequently, when a measurement occurs, the system invariably emerges in one of these microstates, which directly corresponds to an eigenstate of the observable. Measured in the eigenbasis, the probability distribution is:
In scenarios where the probability measure is expressed in a basis other than its eigenbasis, the probability of obtaining a given eigenvalue is given as a projection onto the corresponding eigenstate:
Here, this probability signifies the squared magnitude of the amplitude of the state when projected onto the eigenstate. As this argument holds for any observable, this demonstrates Axiom 4.
Finally, since the probability measure (Equation 21) replicates the Born rule, Axiom 5 is also demonstrated.
Revisiting quantum mechanics with this perspective offers a coherent and unified narrative. Specifically, the phase anti-constraint (Equation 4) is sufficient to entail the foundations of quantum mechanics (Axioms 1, 2, 3, 4, and 5) through the principle of entropy maximization. Equation 4 becomes the formulation’s sole axiom, and Axioms 1, 2, 3, 4, and 5 now pop out as theorems.
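For readers who want the dynamical step above (Axiom 3) in the familiar textbook form, a minimal sketch in standard notation, keeping ħ explicit and with t playing the role of the Lagrange multiplier, as in the derivation:

```latex
% Solution parametrized by the Lagrange multiplier t, written in a general basis
\begin{align*}
|\psi(t)\rangle &= e^{-\frac{i}{\hbar} H t}\, |\psi(0)\rangle, \\
% Differentiating with respect to t recovers the Schrödinger equation
\frac{\partial}{\partial t}\, |\psi(t)\rangle &= -\frac{i}{\hbar}\, H\, |\psi(t)\rangle
  \quad\Longleftrightarrow\quad
  i\hbar\, \frac{\partial}{\partial t}\, |\psi(t)\rangle = H\, |\psi(t)\rangle .
\end{align*}
```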
3. Discussion
The POP framework, drawing on the principles of entropy maximization pioneered by Josiah Willard Gibbs, introduces an innovative approach to the construction of physical theories. Distinct from conventional formulations that rely on axiomatic declarations, the POP framework derives fundamental principles as theorems entailed by a prescribed observation. This methodology presents several notable advantages:
- Empirical Grounding: The core theorems of the theory are directly derived from a prescribed observation, ensuring that the theoretical framework is strictly anchored in empirical reality.
- Internal Consistency: The derivation process itself is an essential aspect of the theory, providing insight into its genesis and ultimate justification, ensuring a high degree of internal consistency.
- Unifying Basis: Echoing Gibbs’ contributions, the POP framework offers a coherent interpretative basis that is applicable across various physical theories, facilitating a greater unity in physics.
- Optimal Inference: The theoretical constructs, formulated as solutions to optimization problems, inherently represent the least biased representations achievable within the constraints of the available data.
- Adaptability: The framework allows for seamless theoretical adjustments in response to new empirical findings without revising its foundation. This adaptability, rooted in its use of founding observations for theory construction, ensures the framework can smoothly integrate new data, potentially making it a dynamic tool that evolves alongside scientific progress.
- Interpretational Stringency: The additional stringency of the POP formulation, containing its own empirical basis and derivation procedure, can be used to rule out a large class of interpretations that are inconsistent with the theorem, its basis, and its derivation, providing a more definitive resolution to interpretational ambiguities.
This discussion will explore the interpretive implications of the POP approach and its potential to resolve longstanding debates in quantum mechanics.
3.1. QM and SM as Inferred Solutions: A Shared Interpretive Foundation
The POP framework offers a shared interpretive foundation valid across the domains of both SM and QM.
- (1) A Prescribed Observation as the Sole Axiom:
- Statistical Mechanics: The founding observation in SM is that energy measurements of a system in thermodynamic equilibrium converge to an average value, $\langle E \rangle$. This observation is prescribed as the core constraint that leads to the derivation of the Gibbs measure, recognized as the least biased probability measure consistent with the constraint. The theory is encapsulated in the following solution:
- Quantum Mechanics: The founding observation in the POP formulation of QM identifies the systematic elimination of complex phases and the occurrence of interference effects in measurement outcomes. This observation is prescribed as a constraint, enabling the application of entropy maximization to derive a probability measure aligned with the principles of quantum mechanics. Consequently, traditional axioms of QM are reformulated as theorems, demonstrating that QM, akin to SM, is fundamentally informed by a prescribed observation. The theory is encapsulated in the following solution:
- (2) Ontological Status of the Wavefunction:
- Statistical Mechanics: In SM, the Gibbs measure serves as a predictive instrument, facilitating statistical forecasts in situations where full information about the system’s state is unavailable. This role does not ascribe to it any inherent ontological significance.
- Quantum Mechanics: The POP formulation of QM conceptualizes the wavefunction similarly to SM’s Gibbs measure, as a construct for probabilistic forecasting. This interpretation alleviates the wavefunction from ontological responsibilities.
- (3) Rationalization of the Born Rule:
- Statistical Mechanics: The derivation of the Gibbs measure from empirical observations affirms its theoretical solidity, eliminating any notion of arbitrariness.
- Quantum Mechanics: The POP formulation of QM elucidates the Born rule as a natural outcome of entropy maximization, integrating it seamlessly into quantum theory. This reimagines the Born rule as a theorem derived from the theory’s foundation.
- (4) Role of Entropy in Measurements:
- Statistical Mechanics: The inherent entropy in the Gibbs measure reflects the uncertainty about a system’s exact microstate configuration, symbolizing the informational limitations on complete system knowledge.
Specifically, the entropy is given as follows:
- Quantum Mechanics: In the POP formulation of QM, entropy serves to quantify the uncertainty associated with measurement outcomes. For instance, for a photon prepared in a given polarization state, the entropy for measurements within its eigenbasis is:
However, the measurement entropy applies to measurements both within and outside the eigenbasis. For measurements outside the eigenbasis, where the state is expressed in terms of coefficients in the rotated basis, the resulting entropy may not equal S. The measurement entropy of the POP formulation of QM, foundational in theory, matches the quantity of random information generated in quantum cryptography [6], where a string of random bits can be generated from quantum-mechanical measurements.
- (5) Emergence of Time:
- Statistical Mechanics: Temperature emerges in SM from the collective interactions of particles, each occupying discrete energy states. When the system’s average energy is constrained and entropy is maximized, the Lagrange multiplier is derived, which is inversely related to temperature. This relation underscores temperature as an intensive property, arising not from individual particles but from the ensemble’s overall statistical behavior. Unlike direct energy measurements that perturb specific microstates, an ideal thermometer equilibrates with the system, reflecting the macroscopic thermal state without disturbing or selecting individual microstates. This equilibrium allows the thermometer to measure temperature as a collective property, indicative of the system’s overall energy distribution rather than discrete particle states.
- Quantum Mechanics: The POP formulation of QM reconceptualizes time as a Lagrange multiplier, t, challenging its traditional portrayal as an external parameter. This approach likens the measurement of time to that of temperature by an ideal thermometer in statistical mechanics. Clocks, in this analogy, do not force the quantum system into a specific "time eigenstate" but rather synchronize with the system’s evolution to measure time as an emergent, systemic property resulting from the system’s statistical configuration. This perspective positions time as an emergent, quantifiable property within quantum mechanics, akin to how temperature emerges from macroscopic equilibration in statistical mechanics. Time, thus understood, governs the probabilistic evolution of quantum states through unitary transformations, paralleling the regulatory function of temperature in determining energy state distributions at thermodynamic equilibrium.
- (6) Microscopic Equation of State:
- Statistical Mechanics: By taking the total derivative of the entropy (Equation 36), we derive a macroscopic equation of state of the form $dS = dE/T$, which quantifies thermodynamic cycles involving transitions between states of thermodynamic equilibrium using energy (E) and temperature (T) as macroscopic variables. Such cycles typically involve:
- (a) The system transitions from state ($E_1$, $T_1$) to ($E_2$, $T_2$).
- (b) A return to the original state ($E_1$, $T_1$), characterizing these cycles by entropy changes, usually an increase, in line with the second law of thermodynamics.
- Quantum Mechanics: In the POP formulation of QM, unlike SM where the average energy changes with respect to the temperature, there is no macroscopic equation of state, because the expectation values of "macroscopic variables" such as the energy and other observables are unitarily invariant. As such, they do not depend on time, the intrinsic parameter. Instead, the Schrödinger equation, derived from the differentiation of the wavefunction with respect to the Lagrange multiplier t (Equation 29), acts as a microscopic equation of state. This equation underpins the temporal evolution of quantum states via unitary transformations, facilitating the forward and backward progression (from $t_1$ to $t_2$, or from $t_2$ back to $t_1$) of quantum states.
In the context of quantum mechanics, the Von Neumann entropy serves as another macroscopic variable that remains invariant under the unitary transformations described by the Schrödinger equation. Given a quantum state represented by a density matrix $\rho$, the Von Neumann entropy is defined as:
The existence of a microscopic equation of state in QM grants a unique form of internal freedom, as allowed by the system’s macroscopic description. It enables the internal configurations of quantum states to change without affecting the macroscopic variables.
However, it is crucial to distinguish between the invariance of the Von Neumann entropy under unitary transformations and the changes in the measurement (Shannon) entropy during the measurement process. The measurement entropy, which will be discussed in the following section, quantifies the uncertainty associated with the outcomes of a specific measurement and can change depending on the choice of measurement basis (which can change with time). In contrast, the Von Neumann entropy characterizes the overall uncertainty of the quantum state and remains constant under unitary evolution when no measurements are performed.
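For reference, the invariance statement above can be written compactly; this is the standard identity, using $\ln(U\rho U^{\dagger}) = U(\ln\rho)U^{\dagger}$ and the cyclic property of the trace:

```latex
% Von Neumann entropy and its invariance under a unitary U
\begin{align*}
S(\rho) &= -\operatorname{Tr}\!\left(\rho \ln \rho\right), \\
S\!\left(U \rho\, U^{\dagger}\right)
  &= -\operatorname{Tr}\!\left(U \rho\, U^{\dagger}\, U (\ln \rho)\, U^{\dagger}\right)
   = -\operatorname{Tr}\!\left(\rho \ln \rho\right) = S(\rho).
\end{align*}
```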
- (7) Chronodynamical Transformation:
- Statistical Mechanics: In SM, conceptual tools such as Maxwell’s Demon and the Szilard Engine elucidate the interaction between information and thermodynamics. Maxwell’s Demon, a thought experiment, seemingly contravenes the second law of thermodynamics by selectively reducing system entropy without energy expenditure. Yet, further analysis, particularly by Charles Bennett [7], shows that the demon’s information processing incurs an entropy cost, effectively reconciling the apparent paradox with the second law. Similarly, the Szilard Engine demonstrates that converting information into work does not violate thermodynamic principles, given the overall entropy, including information erasure, is considered, thereby preserving the net entropy of the universe.
- Quantum Mechanics: In the POP formulation of QM, the challenge of entropy reduction in SM is paralleled by the phenomenon of wavefunction collapse, which theoretically reduces non-zero Shannon entropy to zero upon producing a measurement outcome. Utilizing the conceptual frameworks of Maxwell’s Demon and the Szilard Engine, we critically examine the traditional notion of wavefunction collapse within quantum measurements, focusing on the entropy implications of a measurement apparatus (MA):
- (a) MA as a Maxwell Demon: In this scenario, the MA acts to reduce entropy to zero during measurement, mirroring the role of Maxwell’s Demon. This suggests a reduction of entropy without an equivalent exchange, posing a challenge to the principles of information conservation and the laws of statistical mechanics.
- (b) MA as a Szilard Engine: Here, the necessity for the MA to possess prior knowledge of outcomes to counteract the reduction in wavefunction entropy during measurement mirrors the Szilard Engine. However, this condition implies the existence of hidden variables, which is contradicted by Bell’s inequality.
- (c) MA as an Entropy Reshuffler: In this case, the reduction of entropy in the wavefunction by the MA is compensated by an equivalent increase elsewhere, suggesting reversible transformations. This scenario, where no net information is created, contrasts with the concept of irreversible wavefunction collapse described by Axiom 4.
The traditional interpretation of the MA, as potentially violating thermodynamic and quantum principles, calls for a critical reassessment.
In response, we propose a "chronodynamical transformation," which applies thermodynamic principles to the temporal dimension of quantum measurement, conceptualizing the measurement as an information-preserving transformation. This approach ensures coherence and consistency in handling quantum information and measurement outcomes, without introducing a new ontological layer, by adhering to the following principles:
- (a) Future Hidden Variables: We introduce hidden variables in the system’s future state, and suggest that outcomes of quantum measurements are determined by them. These variables are formal devices that help reconcile the flow of information and entropy in quantum measurements without contradicting the results of Bell’s inequality (since they are in the future, not in the past), offering a novel way to circumvent traditional constraints. Because they lie in the future, these variables do not represent a new ontological layer; they are equivalent to throwing the dice whenever a measurement outcome is registered. However, they facilitate correct bookkeeping of entropic changes.
- (b) The Flow of Time Becomes the Measurement Apparatus: In this model, the progression of time itself is responsible for transforming future hidden variables into observed physical states. This process redistributes the potential entropy from the wavefunction’s measurement outcomes to the actual states observed in the universe, ensuring the conservation of total information.
- (c) Time-based Szilard Analogy: Time acts as a continuous engine processing future-encoded outcomes via a time-analogous Szilard process and manifesting them within existing system degrees of freedom.
- (d) Reduced Role of Conventional Measurement Devices: In this proposal, the traditional function of measurement devices as agents of wavefunction collapse is reevaluated. Instead, these devices primarily act as transcoders, converting quantum information into formats accessible to humans. For example, an audible "click" from a measurement device does not trigger a measurement in the quantum system. Rather, this sound indicates the device’s function in presenting an outcome, which emerges from the system’s interaction with temporally defined hidden variables, to the observer. This interpretation shifts the perceived role of measurement devices from direct participants in the collapse process to intermediaries that convert the outcomes from one representation to another.
- (e) Alignment with Physical Principles: The proposal aligns with the foundational principles of thermodynamics, information theory, and quantum mechanics, and, because the hidden variables are in the future, respects the limitations imposed by Bell’s inequality. It presents a unified narrative that integrates the concept of time into quantum measurement evolution, emphasizing the preservation of information and entropy.
To illustrate with an example, consider a polarized photon prepared in an equal superposition of two polarization basis states. Exploring a measurement transformation from time $t_1$ to $t_2$: at $t_1$ the system embodies 1 bit of measurement entropy and possesses 1 bit of future hidden variable information, resulting in a net entropy of 0 bits. By $t_2$, the system has exhausted 1 bit of future hidden variable information, allowing it to transition to 0 bits of measurement entropy, maintaining overall entropy balance. This exemplifies the chronodynamical transformation approach, illustrating how quantum systems evolve over time within a framework that preserves the fundamental principles of quantum mechanics, thermodynamic cycles and transformations, and information conservation.
- (8) The Arrow of Time:
- Statistical Mechanics: SM delineates a statistically-favored arrow of time through entropy, adhering to the second law of thermodynamics that posits an inevitable increase in entropy during macroscopic state transformations, signifying the irreversibility of natural processes. This principle is vividly demonstrated in thermodynamic cycles like the Carnot cycle, where transitions between equilibrium states result in an overall entropy increase, mirroring the unidirectional flow of time towards greater disorder. In SM, the arrow of time is thus unequivocally linked to entropy, reinforcing the macroscopic phenomena of time’s irreversible progression as a natural tendency towards increased entropy.
- Quantum Mechanics: The POP formulation of QM elucidates a sophisticated understanding of time’s arrow, also grounded in entropy but with distinctions that more closely mirror human experience of time. It suggests that while future inference of measurement outcomes is hidden by entropy, the backward inference of quantum states is open to reconstruction via calculation. This conceptualization aligns with everyday experience—where the present is directly experienced, the past is reconstructed or inferred based on memory and evidence, and the future remains largely unknown, accessible only through predictions based on current knowledge and the elimination of some possibilities within a broad ensemble of potential outcomes.
- (a) Backward Time Inference: This inferential process, facilitated by known measurement outcomes of the present, allows for the reconstruction of past quantum systems through calculations involving the Schrödinger equation. This is possible since no new measurement outcomes are produced as we infer backwards, and the inference operates from the basis of established outcomes. Thus, the present holds all necessary information for this reconstruction. This mirrors our capacity to infer or reconstruct the past from present knowledge and evidence.
- (b) Forward Time Experience: Conversely, the experience of moving forward in time is marked by a transition into the unknown, with future outcomes realized as they occur. Since the measurement entropy is greater than zero, this progression faces an intrinsic "entropy barrier" when attempting to predict realized future outcomes from the present, underscoring the asymmetry between our inability to infer the future and our ability to infer the past.
3.2. A Contention-Free Formulation of QM
The POP reformulates QM by prioritizing measurement outcomes and inferential techniques over traditional axiomatic declarations. This approach aims for "inferential completeness", deriving the existence and properties of quantum entities directly from empirical observations rather than presupposing them through ontological assertions. By inferring quantum theory, the POP formulation seeks to mitigate theoretical contention and philosophical disputes.
- Presentism and Inferential Completeness
The framework fosters a form of "presentism," where reality is defined by the presently available measurement outcomes. From this minimal basis, we aim for "inferential completeness." The concept of time as a continuous parameter (with past, present, and future states), along with its conjugate unitary transformations (yielding quantum systems), is inferred using entropy maximization techniques. This approach achieves "inferential completeness" from the presently available measurement outcomes. It enables a coherent interpretation of temporal evolution in quantum systems, infused with entropy and consistent with the human experience of time. In it, the present is directly experienced as it yields comprehensive information through known measurement outcomes, while the past and future are reconstructed or anticipated based on present knowledge, respectively. This contrasts with the "block universe" perspective, which considers time (past, present, and future) and space as ontological entities. From the POP perspective, the ontology of the "block universe" hypothesis is deemed inferentially redundant.
- Flexible Adaptation to the Evidence
The inference of the universe’s least biased theoretical model, whether it be quantum or classical, is directly shaped by the nature of available measurement outcomes. In this context, the distinction between a "complete measurement ensemble" (a comprehensive dataset of measurement outcomes, requiring multiple copies of identically-prepared systems, that facilitates the derivation of a wavefunction through entropy maximization) and singular events (for instance the "Oh-My-God" particle) becomes crucial. Singular events, despite their quantum origins, provide isolated snapshots of phenomena and, if they significantly outnumber complete measurement ensembles, limit our ability to infer the quantum mechanics at play in nature. Applying entropy maximization techniques to a singular event entails a single-element "probability measure" akin to a classical description. Therefore, in a scenario where the sequence of measurement outcomes that defines the present state of the universe is heavily skewed towards singular events rather than complete measurement ensembles for each given system, we are compelled to infer a primarily classical history for the universe. This inference is not a dismissal of quantum mechanics but a reflection of the empirical evidence predominantly available to us. The principle of seeking the least biased model, grounded in the available data, naturally leads to a classical interpretation under these conditions, yet allows for the "discovery" of quantum mechanics when complete measurement ensembles are available within the measurement outcomes, such as in controlled laboratory conditions. Consequently, the puzzling transition between quantum and classical descriptions may be a simple consequence of inferring the least biased model starting from complete measurement ensembles, single occurrences, or anything in between, depending on the occasion and availability of evidence. This perspective offers a compelling explanation for the quantum-classical transition, rooted in the nature of the available empirical evidence and the application of the POP framework’s inferential methodology.
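To make the claim about singular events concrete, a minimal sketch, assuming the single recorded outcome is treated as the entire constraint:

```latex
% A single recorded outcome q_0 forces a one-element "probability measure"
\begin{align*}
p(q) = \delta_{q,\,q_0}
\quad\Longrightarrow\quad
H = -\sum_{q} p(q)\, \log_2 p(q) = -1 \cdot \log_2 1 = 0 \text{ bits},
\end{align*}
```

that is, zero measurement entropy and a deterministic, classical-looking description of that event.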
- Empirical Evidence of Measurement Entropy
Quantum randomness generation [8], where outcomes are inherently unpredictable, provides compelling empirical support for our framework. Consider a standard setup: an electron, polarized along the vertical axis, is measured at a 45-degree angle. Each 45-degree angle measurement generates a random outcome with equal probability. This randomness, foundational in theory, underpins practical applications such as quantum cryptography [6], where unpredictability is essential.
The POP formulation of QM emphasizes Shannon entropy as the key tool to quantify the information content in these random measurement outcomes. Importantly, quantifying the information within experimentally generated sequences aligns precisely with predictions stemming from this methodology, demonstrating its empirical validity for a process fundamental to the understanding and applications of quantum physics.
Let us now compare the conventional von Neumann entropy to our measurement entropy involving Shannon entropy:
- Von Neumann Entropy for a Pure State: Given a pure state $|\psi\rangle$, the density matrix is $\rho = |\psi\rangle\langle\psi|$. The von Neumann entropy $S(\rho)$ is:
For pure states, $S(\rho) = 0$, reflecting a lack of statistical uncertainty about the system.
- Shannon Entropy for a 45° Measurement: Assuming equal probability (0.5) for both outcomes after measuring an electron at a 45° angle, the Shannon entropy H is:
- Shannon Entropy for an Arbitrary Angle: When measuring an electron at an arbitrary angle, the probabilities of the outcomes may be unequal, resulting in fractional bits of information. For example, if the probabilities are 0.9 and 0.1, the Shannon entropy H is:
- Shannon Entropy for a 90° Measurement on a Vertically Polarized Electron: For a vertically polarized electron measured at a 90° angle, the outcome becomes deterministic rather than random. Therefore, the Shannon entropy is:
The distinction between von Neumann entropy and Shannon entropy becomes evident when quantifying the information generated in quantum randomness experiments. While von Neumann entropy assigns zero entropy to a pure state, the POP formulation of QM effectively quantifies the unpredictability inherent in scenarios like the 45-degree measurements. By explicitly incorporating Shannon entropy as the measure of measurement uncertainty, it seamlessly captures the true informational content, including fractional bits generated by measurements at different angles.
Notably, one can generate an indefinite amount of random information from a single electron by repeatedly measuring it along different axes. This raises the question of where this seemingly endless pool of information comes from. The chronodynamical transformation, proposed in the POP formulation of QM, addresses this issue by suggesting that the random sequence of bits is read from future hidden variables, ensuring the conservation of total information whilst respecting the constraints imposed by Bell’s inequality. This perspective offers a compelling resolution to the apparent paradox of generating an indefinite amount of random information from a single quantum system.
The POP formulation of QM embeds the probability measure quantifying the measurement entropy, as verified by quantum randomness generation experiments, from the outset. Moreover, it provides a coherent explanation for the origin of the seemingly unlimited information content in quantum systems through the concept of chronodynamical transformations.
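To make the entropy comparison above concrete, here is a short numerical sketch in Python; the probabilities are the illustrative ones used in the text, and the helper functions are ours:

```python
import numpy as np

def shannon_entropy(probs):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def von_neumann_entropy(rho):
    """Von Neumann entropy in bits, from the eigenvalues of the density matrix."""
    eigvals = np.linalg.eigvalsh(rho)
    return shannon_entropy(eigvals)

# Pure state |0> (e.g., vertical polarization): von Neumann entropy is 0.
psi = np.array([1.0, 0.0])
rho = np.outer(psi, psi.conj())
print("Von Neumann entropy (pure state):", von_neumann_entropy(rho))  # 0.0

# 45-degree measurement: equal outcome probabilities give 1 bit.
print("Shannon entropy, 45 deg:", shannon_entropy([0.5, 0.5]))        # 1.0

# Arbitrary angle with probabilities 0.9 / 0.1: fractional bits (~0.469).
print("Shannon entropy, skewed:", shannon_entropy([0.9, 0.1]))        # ~0.469

# 90-degree measurement of a vertically polarized state: deterministic, 0 bits.
print("Shannon entropy, 90 deg:", shannon_entropy([1.0, 0.0]))        # 0.0
```

The pure-state von Neumann entropy vanishes in every case, while the Shannon (measurement) entropy tracks the basis-dependent unpredictability described in the text.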
- Automatic Mitigation of Ontological Misrepresentations
The canonical formulation of QM places infinite-dimensional Hilbert spaces on the same ontological footing as finite-dimensional Hilbert spaces. This assertion has led to debates among physicists and philosophers [9], with some questioning the validity of treating space as continuous [10], arguing that it may be fundamentally discrete, given that only a finite number of measurements of the wavefunction’s position can be made at any given time. Others maintain that infinite-dimensional Hilbert spaces provide a fundamental and ontologically real description of the quantum wavefunction in space.
Let us now investigate how the POP formulation of QM infers infinite-dimensional Hilbert spaces. We recall that in the POP formulation of QM the construction of a prescribed observation involves performing measurements on multiple copies of identically prepared quantum systems to obtain a comprehensive set of measurements and construct the ensemble. Consistent with this approach, let us now consider the extension of the entropy maximization problem from the discrete sum $\sum$ to the continuum $\int$:
where n is the number of subintervals, $\Delta x$ is the width of each subinterval, and $x_i^{*}$ is a point within the i-th subinterval, often chosen to be the midpoint. A conversion factor is also required to transform the energy into an energy density, as required for integration (Footnote 3).
By taking this limit, we have applied the definition of the Riemann sum to the Lagrange equation, which yields an integral:
Solving this optimization problem yields a wavefunction defined in an infinite-dimensional Hilbert space and parametrized over the continuum.
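Schematically, the limiting step is the ordinary Riemann-sum construction; for a generic integrand f over an interval [a, b], standing in for the entropic and constraint terms of the Lagrange equation:

```latex
% Generic Riemann-sum limit over an interval [a, b]
\lim_{n \to \infty} \sum_{i=1}^{n} f(x_i^{*})\, \Delta x
  = \int_{a}^{b} f(x)\, \mathrm{d}x,
\qquad
\Delta x = \frac{b-a}{n},
\quad
x_i^{*} \in [x_{i-1},\, x_i].
```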
What does this tell us about the ontology of infinite-dimensional Hilbert space?
The subtlety is in the limiting process. By introducing the limit to define the Riemann sum and obtain the integral, we extend the ensemble size of measurement outcomes to the continuum. However, in the lab we do not perform measurements on uncountably-many identically prepared quantum systems; we only test a finite number.
Due to laboratory limitations, the prescribed observation associated with the continuum is obtained by performing a finite number of measurements on n identically prepared systems (n being finite) and then assuming, by induction, that the observed patterns continue to hold all the way to infinity. With this assumption, we mathematically complete the set of discrete measurements, achieving the smoothness of the continuum, and solve the optimization problem to infer an infinite-dimensional Hilbert space.
The goal of this analysis is not to question the existence of infinite-dimensional Hilbert spaces as mathematical entities but to highlight that the POP formulation of QM assigns different ontological statuses to infinite-dimensional and finite-dimensional Hilbert spaces. Specifically, the former requires an induction assumption (encapsulated in the limit of a Riemann sum) that the latter does not. Thus, in the POP formulation of QM, the continuum cannot occupy the same ontological layer of certainty as the discrete case.
By carefully distinguishing between the mathematical representation and the underlying ontology, the POP framework offers a more transparent and philosophically consistent foundation for quantum mechanics. It acknowledges the utility of infinite-dimensional Hilbert spaces as mathematical constructs while recognizing the induction assumption involved in extending discrete measurements to the continuum. The POP approach, as it embeds the genesis of the theory within its foundation, automatically acquires the tools to mitigate ontological misrepresentations.
- Making Measurements Maximally Informative
The POP framework’s inference of quantum mechanics through entropy maximization reveals a profound insight: when an observer is given a sequence of measurement outcomes and is free to formulate a theory to explain these outcomes, the theory that makes the measurements maximally informative emerges as the most effective choice for understanding and predicting observable phenomena. Remarkably, this theory resolves to quantum mechanics, which has been highly successful in describing the behavior of physical systems.
To appreciate the significance of this finding, let us consider the spectrum of theories an observer could formulate. At one end of the spectrum lies a deterministic theory, where the outcomes of measurements are completely predictable, given the initial conditions and the laws governing the system. In a deterministic framework, the Shannon entropy associated with measurement outcomes is zero, indicating that no new information is gained from the measurements. In other words, a deterministic theory assigns no information content to measurement outcomes, as they are fully determined by the pre-existing state of the system.
At the other end of the spectrum lies the theory that maximizes the information content of measurements, which is precisely what quantum mechanics does, as inferred through the POP framework. Between these two extremes, there exists a continuum of theories that assign varying levels of information to measurement outcomes. However, the theory that maximizes the information content of measurements is unique and stands out as the optimal choice for the observer seeking to extract the most information from empirical data. This suggests that an observer, given the freedom to formulate a theory based on measurement outcomes, would naturally arrive at quantum mechanics as the most effective choice.
3.3. Ruling Out Some Alternative Interpretations
A common misconception in the foundations of quantum mechanics is that any interpretation consistent with the mathematical formalism, i.e., the axioms, is equally valid. However, the POP framework challenges this notion by demonstrating that the axioms themselves are not fundamental postulates but rather theorems derived from a more foundational empirical basis. As such, for an interpretation to be considered valid, it must not only align with the axioms but also be consistent with its empirical genesis.
The POP framework’s reformulation of quantum mechanics, grounded in inferential reasoning based on measurement outcomes, provides a stringent criterion for assessing the validity of various interpretations. By tracing the logical flow from empirical data to the mathematical structure of quantum theory, the POP approach exposes the inadequacies of interpretations that fail to consider the theory’s empirical genesis.
In this light, interpretations that introduce additional ontological elements or propose mechanisms not directly derived from the foundational empirical constraints can be seen as superfluous and, in some cases, even contradictory to the core principles of quantum mechanics. Furthermore, by recognizing the wavefunction as a derived entity, the POP approach exposes the circularity in interpretations that treat it as a fundamental aspect of reality.
In the following items, we will critically examine several prominent interpretations of quantum mechanics in light of the POP framework’s insights. By evaluating their consistency with the empirical basis and the inferential structure of quantum theory, we will demonstrate how the POP approach can effectively rule out interpretations that fail to meet these criteria, thereby providing a more solid foundation for our understanding of quantum reality.
- Circular Fallacies
The POP framework exposes a fundamental fallacy in interpretations that propose wavefunction collapse, such as the Copenhagen interpretation. Interpretations within this class assume that the wavefunction exists prior to measurement and that measurement causes a collapse of the wavefunction.
This line of reasoning is circular because it fails to recognize that the wavefunction itself is inferred from the statistical regularities observed in measurement outcomes. In the POP framework, the logical flow is from measurement outcomes to the wavefunction, not the other way around. The wavefunction is a mathematical tool derived from empirical data.
By attempting to explain measurement outcomes as a consequence of the wavefunction’s collapse, these interpretations are essentially trying to use the wavefunction to explain the very empirical data from which it is derived. This circular reasoning arises from treating the wavefunction as a fundamental entity that exists independently of measurements, rather than recognizing it is inferred from measurement outcomes.
The POP framework resolves this inconsistency by properly acknowledging the logical flow from measurement outcomes to the wavefunction. It treats the wavefunction as an inferred entity, derived as a consequence of the statistical regularities observed in measurement outcomes, and does not assume its existence prior to measurement. By avoiding the circular reasoning of collapse interpretations, the POP framework provides a more logically consistent and empirically grounded description of quantum phenomena.
- Superfluous Structures
The POP framework exposes the problem of redundant structures in interpretations of quantum mechanics, such as the Many-Worlds Interpretation (MWI) and Pilot Wave Theory. These interpretations introduce additional elements that are not derived from the empirical constraints of measurement outcomes and are ultimately superfluous to the description of quantum mechanics.
The Many-Worlds Interpretation proposes that every quantum measurement splits the universe into multiple branches, each representing a different outcome. However, the POP framework demonstrates that this interpretation arises from a fundamental misunderstanding of the empirical basis of quantum mechanics. In the POP framework, measurement outcomes are sufficient to entail quantum mechanics. If the branching of multiple worlds were required for the axioms to be derived, the POP framework would not be able to derive them merely from the ’clicks’ registered in this universe alone. The fact that the POP framework yields the complete foundation of quantum mechanics without invoking multiple universes within the founding empirical basis indicates that the multi-universe structure must necessarily be irrelevant to quantum mechanics.
Similarly, Pilot Wave Theory introduces the concept of quantum potential and particle positions to explain the behavior of quantum systems. According to this interpretation, particles have well-defined positions and are guided by a quantum potential determined by the wavefunction. However, the POP framework demonstrates that these additional elements are not required to derive the foundation of quantum mechanics from the empirical constraints of measurement outcomes and are thus superfluous to the description of quantum mechanics. The POP framework can derive the complete structure of quantum mechanics, including the wavefunction and its evolution according to the Schrödinger equation, without invoking particles piloted by the wavefunction.
Both the Many-Worlds Interpretation and Pilot Wave Theory introduce unnecessary additional structures that are not required to derive the basis of quantum mechanics. The POP framework in this context illustrates the importance of including the empirical genesis of a physical theory as its foundational theorem, which automatically enforces parsimony by revealing superfluous structures.
- Conflation of Domains
The Ensemble Interpretation of quantum mechanics states that the wavefunction describes an ensemble of similarly prepared systems rather than individual systems. While this interpretation correctly recognizes the importance of considering an ensemble of systems, the POP framework reveals that it mischaracterizes the role of the ensemble in the derivation of the wavefunction and its subsequent application.
In the POP framework, an ensemble of identically prepared systems is indeed required to evaluate the full set of measurements that can be made on the system. By considering the outcomes of measurements performed on this ensemble, one can derive the wavefunction through the entropy maximization procedure. However, the crucial point is that the derived wavefunction applies to a single upcoming identically prepared system, not to the ensemble as a whole.
The Ensemble Interpretation conflates the necessity of an ensemble for deriving the wavefunction with the application of the wavefunction itself. While an ensemble is required to infer the wavefunction, the wavefunction, once derived, provides a description of the probabilistic behavior of a single identically-prepared system subjected to measurement.
4. Conclusion
The POP framework introduces a revolutionary approach to quantum theory construction, distinguishing itself from traditional methodologies by inferring theoretical constructs from measurement outcomes rather than declaring them via axioms. Central to this framework is the anti-constraint, which allows for the elimination of complex phases and the production of interference effects and which, supported by a century of empirical validation, serves as the founding observation from which quantum mechanics is inferred. By solving an entropy maximization problem, the POP framework successfully reconstructs the complete structure of quantum theory from the ground up (deriving Axioms 1, 2, 3, 4, and 5 as theorems), using a single axiom (Equation 4). Such an approach not only mitigates the longstanding ontological and interpretive debates that have characterized the field of quantum mechanics but also paves the way for significant advancements in our understanding of quantum foundations.
Statements and Declarations
Competing Interests: The author declares that he has no competing financial or non-financial interests that are directly or indirectly related to the work submitted for publication.
Data Availability Statement: No datasets were generated or analyzed during the current study.
During the preparation of this manuscript, we utilized a Large Language Model (LLM) for assistance with spelling and grammar corrections, as well as for minor improvements to the text to enhance clarity and readability. This AI tool did not contribute to the conceptual development of the work, data analysis, interpretation of results, or the decision-making process in the research. Its use was limited to language editing and minor textual enhancements to ensure the manuscript met the required linguistic standards.
References
- Dirac, P.A.M. The Principles of Quantum Mechanics; Number 27, Oxford University Press, 198.
- Von Neumann, J. Mathematical Foundations of Quantum Mechanics: New Edition; Vol. 53, Princeton University Press, 2018.
- Reif, F. Fundamentals of Statistical and Thermal Physics; Waveland Press, 2009.
- Kullback, S.; Leibler, R.A. On information and sufficiency. The Annals of Mathematical Statistics 1951, 22, 79–86. [CrossRef]
- Shannon, C.E. A mathematical theory of communication. Bell System Technical Journal 1948, 27, 379–423. [CrossRef]
- Wiesner, S. Conjugate coding. ACM SIGACT News 1983, 15, 78–88.
- Bennett, C.H. The thermodynamics of computation-a review. International Journal of Theoretical Physics 1982, 21, 905–940. [CrossRef]
- Bell, J.S. On the Einstein Podolsky Rosen paradox. Physics Physique Fizika 1964, 1, 195.
- Aristotle. Physike Akroasis (Natural Hearing); c. 350 BCE.
- Zeno of Elea. Zeno’s Paradoxes; c. 450 BCE.
Footnote 1: The Phase Anti-Constraint was derived through a combination of physical intuition and mathematical exploration. The wavefunction’s association with a probability measure via the Born rule suggested that it could be derived from a maximum entropy principle, analogous to other probability measures in physics. The specific form of the anti-constraint was obtained by seeking a mathematical expression that, when incorporated into an entropy maximization problem, would yield the fundamental principles of quantum mechanics.
Footnote 2: The relative Shannon entropy includes a reference probability measure, often called the prior, which will represent the probability associated with the system’s wavefunction in its initial state.
Footnote 3: The presence of the density-conversion factor in the prescribed observation is a necessary feature of the continuum. It allows the conversion of the energy into an energy density, which, together with the integration measure, remains invariant with respect to an integration change of basis. This is the same reason why the relative Shannon entropy (as opposed to merely the Shannon entropy) is the correct entropy to use in the continuum case: the relative Shannon entropy is invariant with respect to an integration change of basis, whereas the Shannon entropy is not.