1. Introduction
The simulation hypothesis, which asks us to consider the possibility that we are living in a simulation, is slowly gaining momentum as a view worthy of serious scientific consideration. Some of its advocates are convinced that it is true and speak of its near certainty [1,2]; others are more measured in their claims [3]. The idea has also drawn its share of backlash, with some in the scientific community dismissing it as impractical [4] or pseudoscientific [5]. Though claims of near certainty are unfounded, the hypothesis can cogently be assigned a non-zero probability. There is no such thing as an idea that is ahead of its time: as with many ideas that present possible existential risks (e.g. the technological singularity), how early we grapple with the simulation hypothesis will determine the extent to which we can mitigate any threats that may issue from it.
2. Type I and Type II Simulations
Simulations may be placed in one of two categories: Type I simulations, which do not aim to alter the contents or physical laws of the base reality, and Type II simulations, which do have such intentions. Since it costs more bits of the simulating reality to produce one bit of the simulated reality for a Type II simulation than for a Type I, all else being equal, Type II simulations have lower resolution than Type I. That is, if we aim to maximize the resolution of a simulation, we need only consider Type I simulations. Because a Type I simulation does not aim to alter the contents or physical laws of the base reality, it may be regarded as a sampling of the quantum wave function of the simulating reality.
3. Type I Simulations & the Nyquist-Shannon Sampling Theorem
The complementary terms Nyquist rate f_s and Nyquist frequency f are commonly used in the information sciences to specify limits on how a signal must be sampled. The Nyquist rate defines the minimum sampling rate required if a discrete counterpart of a sampled continuous signal is to be free of aliasing, while the Nyquist frequency refers to the maximum frequency component of the sampled signal that can be preserved and recreated. It is well established that the alias-free condition is f_s ≥ 2f, or equivalently f ≤ ½f_s. This is the Nyquist-Shannon sampling theorem. In our universe, as far as we know, the maximum temporal frequency f_max is set by the Planck time t_P, i.e. f_max = f_P = 1/t_P (where f_P is the Planck frequency). This frequency is therefore the universe's maximum sampling frequency. Accordingly, the maximum frequency of the simulating reality that can be preserved in the simulation (i.e. the Nyquist frequency), or more simply the maximum frequency of the simulation, is half the maximum sampling frequency, i.e. f = ½f_P. Similar reasoning leads to an analogous relationship for spatial frequency. The consequence is that the spatial and temporal resolution of a simulation (indicated by the inverse of the Planck length l_P and Planck time t_P, respectively) can be no more than half that of the simulating reality.
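To make the numbers concrete, the following short Python sketch computes the Planck (sampling) frequency from the Planck time and the corresponding Nyquist frequency available to a first-level simulation; the Planck-time value is the standard CODATA figure and the snippet is purely illustrative.

```python
# Illustrative only: Planck-time value from CODATA; a simulation one level down
# can preserve at most half of the simulating reality's maximum frequency.
T_PLANCK = 5.391e-44            # Planck time t_P in seconds
f_planck = 1.0 / T_PLANCK       # maximum sampling frequency f_P ≈ 1.855e43 Hz
f_nyquist = f_planck / 2.0      # Nyquist frequency of a first-level simulation

print(f"f_P       = {f_planck:.3e} Hz")
print(f"f_Nyquist = {f_nyquist:.3e} Hz")
```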
4. Bostrom Distance of Nested Simulations
The implications of the Nyquist-Shannon theorem for the resolution of a simulation allow us to conveniently specify the degree of nestedness of a simulation relative to another simulation or to a presumed base reality. This parameter shall be named the Bostrom distance (in honor of Swedish philosopher Nick Bostrom, who has done much work in highlighting the importance of the Simulation Hypothesis). Since the temporal resolution halves with each level of nesting, the Bostrom distance N_B between a simulating reality X and a simulated reality Y lying N levels lower is calculated as follows:

$$N_B = \log_2\!\left(\frac{t_{P,Y}}{t_{P,X}}\right)$$

where t_{P,X} and t_{P,Y} are the Planck time in the simulating reality X and that in the simulated reality Y (N levels lower), respectively. The equation may just as well be stated in terms of the Planck length l_P. Furthermore, it does not matter whether the Planck time is ultimately the smallest possible time; what matters is the linearity between the two corresponding times.
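As an illustration of the definition above (and assuming the logarithmic form reconstructed there from the halving-per-level relation), a simulated reality whose Planck time is, say, eight times that of its simulating reality lies 3 Bostrom distances below it:

```python
import math

def bostrom_distance(t_planck_simulated: float, t_planck_simulating: float) -> float:
    """N_B implied by the halving of temporal resolution per nesting level:
    t_P,Y = 2**N_B * t_P,X, hence N_B = log2(t_P,Y / t_P,X)."""
    return math.log2(t_planck_simulated / t_planck_simulating)

# A simulated reality whose Planck time is 8x that of its simulating reality:
print(bostrom_distance(8 * 5.391e-44, 5.391e-44))   # -> 3.0
```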
5. Bostrom Limit of Nested Simulations
An argument has been put forward that it is near certain that we are living in a simulation, given that there may be an infinite or at least indefinitely large number of nested simulations issuing from a base reality [1,2]. But this argument may be undermined by coupling the progressive halving of the resolution of nested simulations with the expectation that there is a limit to how pixelated a simulated reality can be while remaining consistent with the capacity to create a further simulation. Such a limit may be readily framed in terms of intelligence: we can readily conceive of a simulation so temporally or spatially pixelated, or so poorly resolved, that it is incompatible with the emergence of the threshold intelligence required to create a further simulation. By definition this simulation (to be called the Bostrom simulation) would be the last in the lineage of nested simulations starting with the base reality. Let us call the Bostrom distance of this final simulation (relative to the base reality) the Bostrom limit N_B*, and the corresponding temporal resolution of that simulation the time t_{P,B*}. Then we may write:

$$N_{B^*} = \log_2\!\left(\frac{t_{P,B^*}}{t_{P,0}}\right)$$

where t_{P,0} is the Planck time in the base reality (which may be termed the 0th simulation). An analogous equation may be tendered for the spatial aspect in terms of the Planck length. Since the Planck length scales linearly with the Planck time, the value for N_B* calculated spatially must equal that calculated temporally.
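The following sketch illustrates the idea with a purely hypothetical threshold: if the coarsest time step compatible with simulation-building intelligence were of the order of 10^-30 s (a number chosen only for illustration), the Bostrom limit measured from a base reality with our Planck time would be about 44 levels.

```python
import math

def bostrom_limit(t_planck_threshold: float, t_planck_base: float) -> int:
    """N_B*: number of halvings between the base reality's Planck time and the
    coarsest time step still compatible with simulation-building intelligence."""
    return math.floor(math.log2(t_planck_threshold / t_planck_base))

t_p_base = 5.391e-44     # Planck time of the presumed base reality, s
t_p_threshold = 1e-30    # hypothetical sapience-compatible threshold, s (illustration only)
print(bostrom_limit(t_p_threshold, t_p_base))   # -> 44
```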
A convincing case can be made for the existence of a base reality: within the model thus established, if there is no base reality (i.e. there are simulating levels all the way up), then the limit is a reality of infinitely high information density, which is forbidden both intuitively and from the perspective of quantum/digital physics.
6. Probability that we are Living in a Simulation
The notion that we are almost certainly living in a simulation is gaining some traction in the scientific community. Though the simulation hypothesis may legitimately be accorded a non-zero probability, that probability is nowhere close to certainty. This is so for two main reasons: first, the bold assumption of near certainty must contend with the Bostrom limit; and second, the probability of generating a simulation of maximum resolution is expected to decay non-linearly with Bostrom distance.
The proposition of near certainty that we are living in a simulation follows from the assumption that the lineage of nested simulations issuing from the base reality is infinite or at least indefinitely large. For N simulations nested in a single base reality, the probability P(N) that we are living in a simulation is:

$$P(N) = \frac{N}{N+1} \qquad (6.1)$$

As N approaches infinity, this probability approaches certainty, and this is largely the present state of thinking about the Simulation Hypothesis.
With the inclusion of the constraint of the Bostrom limit, the probability P reduces to:

$$P = \frac{N_{B^*}}{N_{B^*}+1} \qquad (6.2)$$
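The contrast between the two cases can be checked numerically; the sketch below evaluates eq. 6.1 for a very large N and eq. 6.2 for the illustrative Bostrom limit of 44 levels used earlier (both numbers are assumptions for illustration only).

```python
def p_sim_unbounded(n_levels: int) -> float:
    """Eq. 6.1: probability of living in a simulation given N nested levels."""
    return n_levels / (n_levels + 1)

def p_sim_bostrom_limited(n_bostrom_limit: int) -> float:
    """Eq. 6.2: the same probability once nesting is capped at the Bostrom limit N_B*."""
    return n_bostrom_limit / (n_bostrom_limit + 1)

print(p_sim_unbounded(10**9))       # -> 0.999999999 (approaches 1 as N grows)
print(p_sim_bostrom_limited(44))    # -> ~0.978 for the illustrative N_B* = 44
```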
If we can somehow simulate a reality and also, paradoxically, include the process of simulation itself (a strange looping), then because the simulation takes place in the base reality, it must ripple into all nested simulations down to the Bostrom simulation. In this case, the equation above would be applicable. However, if we discard this strange occurrence of the sampled wave function also including information about the very act of sampling, then we have nested Type I simulations of the base reality minus the information related to the sampling process. The wave function of the simulation 1 Bostrom distance from the base reality will be identical to that of the base reality but with half the resolution. Given the similarity of the wave functions (notwithstanding the indeterminacy of quantum mechanics and the halving of the resolution), there will still be a high probability that this 1st simulation arrives at an isomorphic simulation of its own. This new simulation (2 Bostrom distances from the base) stands in the same relation to its immediate predecessor. The expectation is that this leads to a probability of living in a simulation of Bostrom distance N_B (relative to the base reality) that decays exponentially as a function of Bostrom distance (i.e. the probabilities of living in any one of the N levels of a nested lineage are not equal). The probability P(N) of living in the Nth level of the simulation is given by:

$$P(N) = \frac{e^{-\lambda N}}{\sum_{k=0}^{N_{B^*}} e^{-\lambda k}} \qquad (6.3)$$
If we set λ = 0, eq. 6.3 reduces to the equiprobable distribution, and summing it over the simulated levels recovers eq. 6.2. The decay constant λ is a function of the following quantities: T, the difference between the age of the base reality and the age of the wave function used to create the 1st simulation; f, the Planck frequency of the 1st simulation; f_0, the Planck frequency of the base reality; and α, a constant which captures the uncertainty due to quantum indeterminism (or, more generally, analogues of indeterminism that may lie deeper than the quantum scale) as well as the loss of resolution (and any alteration in the structure of reality, stemming from that loss of resolution, which affects the probability of producing a simulation).
The probability P(0) of living in the 0th level of the lineage (i.e. the base reality) is given by:

$$P(0) = \frac{1}{\sum_{k=0}^{N_{B^*}} e^{-\lambda k}}$$

The probability of living in the Bostrom simulation (i.e. the final simulation in the nested lineage) is given by:

$$P(N_{B^*}) = \frac{e^{-\lambda N_{B^*}}}{\sum_{k=0}^{N_{B^*}} e^{-\lambda k}}$$

The probability of living in a simulation (i.e. in any one of the N_B* simulated levels) is given by:

$$P_{\mathrm{sim}} = \sum_{N=1}^{N_{B^*}} P(N) = 1 - P(0)$$
Since the probabilities are normalized to unity, the sum of the probability of living in a simulation and that of not living in one (i.e. of living in the base reality) must be 1; likewise, the sum of the probabilities of living in any one of the levels of the nested lineage (from the base reality, or 0th simulation, to the Bostrom limit) must also equal 1.
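These relations are easy to check numerically. The sketch below builds the exponentially decaying distribution of eq. 6.3 for an illustrative Bostrom limit and decay constant (both values assumed purely for illustration), reads off P(0), the probability of the Bostrom simulation, and the probability of living in any simulation, and verifies the normalization.

```python
import math

def level_probabilities(n_bostrom_limit: int, lam: float) -> list:
    """Eq. 6.3: P(N) proportional to exp(-lam*N) for N = 0..N_B*, normalized to 1."""
    weights = [math.exp(-lam * n) for n in range(n_bostrom_limit + 1)]
    z = sum(weights)
    return [w / z for w in weights]

probs = level_probabilities(n_bostrom_limit=44, lam=0.5)   # both values illustrative
p_base = probs[0]              # P(0): living in the base reality
p_bostrom = probs[-1]          # P(N_B*): living in the Bostrom simulation
p_any_sim = 1.0 - p_base       # probability of living in some simulation

print(f"P(0)    = {p_base:.3f}")      # ~0.393
print(f"P(N_B*) = {p_bostrom:.2e}")   # ~1.1e-10
print(f"P(sim)  = {p_any_sim:.3f}")   # ~0.607
assert abs(sum(probs) - 1.0) < 1e-12  # probabilities over all levels sum to 1
```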
The form of the probability function will not change if we sample a wave function corresponding to an earlier time in the base reality; only the decay constant will be larger. Furthermore, even if the exponential form specified here is not the exact form of the probability function, the true form would still constitute a geometric series (with a common ratio greater than 0 but less than 1), and once determined it can take the place of the proposed exponential, with the same kinds of consequences.
Lastly, it must be observed that if the base reality does not meet the technological requirements to produce a simulation of temporal and spatial resolution above that of the Bostrom simulation, then the probability of creating even the first simulation is zero, and consequently so is that of any downstream simulation.
7. Implications for the Laws of Physics
If the maximum spatial and temporal resolution of a simulation must be half that of the simulating reality, an immediate consequence is that the conjugates of space and time, namely momentum and energy, must also be halved; i.e. the maximum photon energy and momentum observable in the simulation must be half those of the simulating reality. This does not directly alter the laws of physics (the Heisenberg Uncertainty Principle is preserved, for example, as is the speed of light), but two caveats apply. First, in such a digital framework, the laws of physics may emerge naturally out of an information-theoretic treatment of the physical data. Second, the laws of physics may be compromised to the extent that they depend on resolution. Take, for example, the phenomenon of quark confinement. While such confinement can theoretically be overcome in our universe (at the Hagedorn temperature [6]), it is reasonable to suppose that there are analogous phenomena (resolvable in simulations of lower Bostrom distance) which can never be resolved in a given simulation simply because the energy required is not permissible there. How deeply we can probe the structure of reality (and the past, i.e. the early universe) depends on the photon energies we can harness technologically; the greater the Bostrom distance of a simulation, the lower its maximum photon energy and therefore the lower its probing capacity. In relation to the Bostrom limit, if sapience relies on a rootedness in quantum mechanics, and a simulation is so poorly resolved that it is not sufficiently rooted, then that simulation will be incompatible with sapience.
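A small numerical illustration of the energy consequence, using the standard value of the Planck energy and assuming the halving-per-level relation discussed above:

```python
E_PLANCK = 1.956e9    # Planck energy in joules (standard value)

def max_photon_energy(n_bostrom: int) -> float:
    """Maximum photon energy observable at Bostrom distance N, assuming the
    halving-per-level of the conjugate quantities discussed above."""
    return E_PLANCK / (2 ** n_bostrom)

for n in (0, 1, 2, 10):
    print(f"N = {n:2d}: E_max ≈ {max_photon_energy(n):.3e} J")
```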
An interesting corollary of what has been established is that the detection of photon energies higher than the Planck energy (barring erroneous detection) would point to one of two possibilities (both of great consequence):
1. The detection of a realm of physics deeper than the quantum realm.
2. The detection of a glitch in our simulation which permits the transient influx of energy/information from the simulating reality. Could this be permitted by a higher-order version of the Heisenberg Uncertainty Principle?
8. Bostrom Scales for Gauging Level of Technological Advancement of a Civilization
In 1964, the Russian astronomer Nikolai Kardashev proposed a scale (based on energy consumption) for gauging the level of technological advancement of a civilization [7]. Here, the author wishes to propose a Kardashev-like rating of the technological advancement of a civilization based on the maximum temporal and spatial frequencies it uses in computing (in relation to the maximum counterparts in its universe/simulation). The spatial frequency would of course be the inverse of the dimension of the smallest technologically possible pixel. Such a scale may be called the Bostrom scale, with three variants of differing robustness. The Bostrom A scale (BA scale) is naively based on the highest photon energy detected in the universe in relation to the Planck energy. The Bostrom B scale (BB scale), more robust than the BA scale, is based on the highest photon energy technologically produced in the universe in relation to the Planck energy E_P. Lastly, the Bostrom C scale (BC scale) is based on the highest processing frequency and the highest spatial frequency (smallest pixel dimension) used in computing. Since improvements in the temporal and spatial frequencies used in computing are not necessarily proportional, the BC scale may be further divided into a spatial component BCσ and a temporal component BCτ. This last scale measures how much information can be extracted from space-time.
The formulae for rating civilizations on the BA, BB, BCσ, and BCτ scales are as follows:

$$B_A = \log_{10}\!\left(\frac{E_P}{E_d}\right)$$

where E_d is the maximum photon energy detected;

$$B_B = \log_{10}\!\left(\frac{E_P}{E_c}\right)$$

where E_c is the maximum photon energy created/produced technologically;

$$B_{C\sigma} = \log_{10}\!\left(\frac{l_{min}}{l_P}\right)$$

where l_min is the dimension of the smallest pixel;

$$B_{C\tau} = \log_{10}\!\left(\frac{f_P}{f_{max}}\right)$$

where f_max is the maximum processing frequency and f_P is the Planck frequency.
As of December 18th 2022, the values for the above indices in our universe are as follows:
Table 1. Present indices for the four flavors of the Bostrom scale.

| Index | Max Value of Reference Parameter [8,9,10,11] | Value of Index |
|-------|----------------------------------------------|----------------|
| BA    | E_d = 7.2 × 10⁻⁵ J                           | 13.4           |
| BB    | E_c = 1.0 × 10⁻⁶ J                           | 15.3           |
| BCσ   | l_min = 5.6 × 10⁻⁷ m                         | 28.5           |
| BCτ   | f_max = 8.8 × 10⁹ Hz                         | 33.3           |
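The tabulated indices can be reproduced from the reference parameters with the logarithmic definitions given above; the short Python check below uses standard values for the Planck energy, length, and time.

```python
import math

# Planck-scale reference values (standard figures):
E_PLANCK = 1.956e9            # Planck energy, J
L_PLANCK = 1.616e-35          # Planck length, m
F_PLANCK = 1.0 / 5.391e-44    # Planck frequency, Hz

# Reference parameters as of 18 December 2022 (Table 1):
E_d   = 7.2e-5    # highest photon energy detected, J
E_c   = 1.0e-6    # highest photon energy produced technologically, J
l_min = 5.6e-7    # smallest technologically produced pixel, m
f_max = 8.8e9     # highest processing frequency, Hz

BA       = math.log10(E_PLANCK / E_d)      # -> 13.4
BB       = math.log10(E_PLANCK / E_c)      # -> 15.3
BC_sigma = math.log10(l_min / L_PLANCK)    # -> 28.5
BC_tau   = math.log10(F_PLANCK / f_max)    # -> 33.3

print(f"BA = {BA:.1f}, BB = {BB:.1f}, BC_sigma = {BC_sigma:.1f}, BC_tau = {BC_tau:.1f}")
```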
9. Conclusion
With the claims thus made, we may have done enough to place the simulation hypothesis indisputably in the realm of science. First, by applying the Nyquist-Shannon sampling theorem, it was shown that the resolution of the Nth simulation down a lineage degrades by a factor of (1/2)^N. Coupling this with a limit to how far the resolution of a simulation can be degraded while remaining consistent with the production of a further simulation, we arrive at a finite Bostrom distance from the base reality (the Bostrom limit, N_B*). This immediately annihilates the prospect of infinite nestedness of simulations. Using the Bostrom limit, equations for the probability that our universe is a simulation were generated under the assumptions of:
1. infinite nestedness per base reality (yielding a probability of unity);
2. finite nestedness (constrained by the Bostrom limit) with equiprobability among levels (yielding a probability of N_B*/(N_B* + 1));
3. Bostrom-limited nestedness as in the previous case, but with the consideration that the probability of living in the Nth level of a nested lineage of simulations decays exponentially with Bostrom distance from the base reality. This deviates significantly from the equiprobable distribution, giving the highest weighting to the base reality and the lowest to the Bostrom simulation.
The implications for the laws of physics were noted, namely that the halving of resolution halves the maximum energy allowable in the simulation (versus the simulating reality) and therefore limits how deeply reality can be probed. It could also affect the laws of physics by creating situations analogous to quark confinement (where, in simulations closer to the base reality, no such confinement would be observed). Lastly, a Kardashev-like scale, named the Bostrom scale, was put forward in three flavors to gauge the level of technological advancement of a civilization based on the extent to which it can extract information from reality.
Statements and Declarations
On behalf of all authors, the corresponding author states that there is no conflict of interest.
References
- "Joe Rogan & Elon Musk - Are We in a Simulated Reality?".
- "Neil deGrasse Tyson Explains the Simulation Hypothesis".
- Bostrom N. 2003. “Are You Living in a Computer Simulation?” Philosophical Quarterly. 53(211): 243-255.
- Ellis, George (2012). "The multiverse: conjecture, proof, and science". Retrieved 18th December 2022.
- Hossenfelder, Sabine (February 13, 2021). "The Simulation Hypothesis is Pseudoscience". BackReAction. Retrieved 18th December, 2022.
- Gaździcki, Marek; Gorenstein, Mark I. (2016), Rafelski, Johann (ed.), "Hagedorn's Hadron Mass Spectrum and the Onset of Deconfinement", Melting Hadrons, Boiling Quarks – From Hagedorn Temperature to Ultra-Relativistic Heavy-Ion Collisions at CERN, Springer International Publishing, pp. 87–92.
- Kardashev, N.S. (1964). "Transmission of information by extraterrestrial civilizations".
- Amenomori M, et al. (Tibet ASγ Collaboration). 2019. "First Detection of Photons with Energy beyond 100 TeV from an Astrophysical Source". Phys Rev Lett. 123(5):051101.
- Webb, Jonathan (21 May 2015). "LHC smashes energy record with test collisions". Retrieved 18th December 2022.
- O'Donnell, Deirdre (19 February 2022)."OmniVision announces the development of the world's smallest pixel for mobile image sensors". Retrieved 18th December 2022.
- Asus (02 January 2013) "8.79GHz FX-8350 is the Fastest Ever CPU | ROG - Republic of Gamers Global". Retrieved 18th December 2022.