Physical Sciences

Article
Physical Sciences
Astronomy and Astrophysics

Gary Jarvis

Abstract: In this paper we conduct a model-free analysis of the expansion of the universe using stellar luminosity data available for redshifts z < 1.8. Our results yield an expansion velocity of (6.87 ± 0.36) × 10⁶ m s⁻¹ and a Hubble constant of 70.9 ± 3.7 km/s/Mpc, consistent with other determinations. This analysis leads us to a new theory to explain the expansion of the universe that augments general relativity to create a container within which quantum effects can be explained by treating time as an artefact of a fourth, expanding, spatial dimension. We show that the theory can be applied not only to explain mass creation, the speed-of-light limit, gravity, black holes without singularities and other macroscopic effects, but also to interpret physical effects at the subatomic level such as wave-particle duality and electron spin. It provides a solution to the double-slit conundrum and can explain how quantum-entangled partners behave in a quantum way and pass seemingly time-defying information. The theory also provides a quantitative link from the expansion velocity to the fine-structure constant.
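The abstract does not spell out the model-free fit itself; as a rough illustration of the kind of low-redshift analysis involved, a least-squares Hubble-law slope can be recovered from distance-velocity pairs. The data below are synthetic and the scatter is hypothetical; this is only a sketch of the fitting step, not the paper's method.

```python
import numpy as np

# Hypothetical low-redshift sample: luminosity distances (Mpc) and
# recession velocities (km/s) generated from an assumed H0 = 70.9 km/s/Mpc.
rng = np.random.default_rng(0)
d = rng.uniform(20.0, 400.0, size=50)            # Mpc
v = 70.9 * d + rng.normal(0.0, 200.0, size=50)   # km/s, with measurement scatter

# Least-squares slope of v = H0 * d through the origin:
# H0 = sum(d*v) / sum(d*d)
H0 = np.sum(d * v) / np.sum(d * d)
print(f"Recovered H0 ≈ {H0:.1f} km/s/Mpc")
```

With realistic scatter the slope estimate lands within a fraction of a km/s/Mpc of the input value, which is why low-z Hubble fits are comparatively model-independent.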
Article
Physical Sciences
Theoretical Physics

Gregor Herbert Wegener

Abstract: The Supra-Omega Resonance Theory (SORT) is presented as a closed structural architecture that unifies multiple scientific domains under an invariant mathematical core. The framework is constructed around a finite and closed set of 22 idempotent resonance operators, a global consistency projector, and a calibrated projection kernel. Together, these elements define a mathematically frozen architecture that admits no arbitrary extensions and precedes empirical integration by design. Version 6 of SORT establishes architectural completion. The operator algebra is closed under composition, global consistency is enforced via a light-balance condition, and validation bounds are defined as invariant thresholds. The same mathematical core is realized across distinct domains, including cosmology, artificial intelligence systems, quantum systems, and complex systems, each interpreting the invariant structure through domain-specific semantics while preserving algebraic identity. Empirical confrontation is positioned as a subsequent phase rather than a present objective. The decision to complete the architecture prior to data integration is methodological, ensuring that future empirical validation is reproducible, unambiguous, and structurally grounded. The MOCK v4 environment enforces deterministic execution, cryptographic reproducibility, and layered consistency verification as architectural features rather than auxiliary tooling. This article constitutes a programmatic statement for the SORT research program. It documents a structurally complete theory architecture prepared for empirical validation while remaining independent of any specific phenomenological application.
Article
Physical Sciences
Theoretical Physics

Michael B. Heaney

Abstract: The conventional formulation of quantum mechanics explains the Einstein, Podolsky, and Rosen (EPR) experiments with "spooky action at a distance" and wavefunction collapse. A time-symmetric and retrocausal formulation of quantum mechanics explains the same experiments without spooky action at a distance or wavefunction collapse. An experiment that can distinguish between the conventional and time-symmetric formulations is described.
Article
Physical Sciences
Astronomy and Astrophysics

Gennady S. Bisnovatyi-Kogan

,

E. A. Patraman

Abstract: Models of neutron and strange stars are considered in the approximation of a uniform density distribution. A universal algebraic equation, valid for any equation of state, is used to find the approximate mass of a star of a given density without resorting to the integration of differential equations. Equations of state for neutron stars were taken for a degenerate neutron gas and for more realistic ones used by Bethe, Malone, and Johnson (1975). Models of homogeneous strange stars for the equation of state in the "quark bag model" have a simple analytical solution. The solutions presented in the paper for various equations of state differ from the exact solutions obtained by numerical integration of the differential equations by at most ∼ 20%. The formation of strange stars is examined as a function of the deconfinement boundary (DB), at which quarks become deconfined. Existing experimental data indicate that matter reaches very high densities in the vicinity of the DB. This imposes strong constraints on the maximum mass of strange stars and prohibits their formation at the final stages of stellar evolution, because the limiting mass of neutron stars is substantially higher and corresponds to considerably lower matter densities.
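The paper's universal algebraic equation is not reproduced in the abstract. A related textbook estimate in the same spirit, sketched here purely for illustration, is the maximum mass of a uniform-density sphere at the Buchdahl compactness bound GM/(Rc²) = 4/9; this stand-in formula is not the authors' equation, and the chosen density is only a representative value.

```python
import math

G = 6.674e-11       # m^3 kg^-1 s^-2
c = 2.998e8         # m/s
M_sun = 1.989e30    # kg

def max_mass_uniform(rho):
    """Maximum mass (kg) of a uniform-density sphere at the Buchdahl
    compactness bound GM/(R c^2) = 4/9, using R = (3M / (4 pi rho))^(1/3).
    Illustrative stand-in only, not the paper's universal equation."""
    beta = 4.0 / 9.0
    k = (4.0 * math.pi * rho / 3.0) ** (1.0 / 3.0)
    # Solve beta = G * M^(2/3) * k / c^2 for M.
    return (beta * c**2 / (G * k)) ** 1.5

rho = 1.0e18  # kg/m^3, roughly a few times nuclear saturation density
M = max_mass_uniform(rho)
print(f"M_max ≈ {M / M_sun:.2f} M_sun at rho = {rho:.1e} kg/m^3")
```

The point of such algebraic closures is visible here: a one-line formula gives a mass of order a few solar masses at supranuclear density, without integrating the stellar-structure equations.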
Article
Physical Sciences
Applied Physics

Dorin Bibicu

,

Luminița Moraru

Abstract: This study presents two-dimensional numerical simulations of acoustic wave scattering involving a simplified human body model placed inside an enclosed cabin. The simulations utilise the µ-diff backscattering algorithm in MATLAB, which is suitable for modelling frequency-domain interactions with multiple scatterers under penetrable boundary conditions. The body is represented as a cluster of penetrable, tangent circular cylinders with acoustic properties mimicking muscle, fat, bone, and clothing layers. Hidden PVC cylinders are embedded to simulate concealed objects. Several configurations were examined, varying the number of PVC inclusions (two to four), the frequency range, and the presence of an absorbing cabin wall. Sound pressure level (SPL) distributions around the body and at a 1-meter distance were analysed. Polar plots reveal distinct differences between the baseline body model and those incorporating PVC inclusions. The most pronounced effects occur near 160 Hz when an absorbing wall is present within the acoustic enclosure. The presence of an absorbing wall modifies wave behaviour, producing enhanced directional attenuation. The results demonstrate how object composition, spatial arrangement, and enclosure geometry influence acoustic backscattered fields. These findings highlight the potential of wave-based numerical modelling for detecting concealed items on the human body in confined acoustic environments, supporting the development of non-invasive security screening technologies. This work presents the first study addressing the 2D simulation of multiple acoustic waves scattering by a human body model within an acoustically enclosed environment for detecting hidden items on the human body.
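The SPL distributions above rest on the standard decibel definition of sound pressure level. A minimal sketch of that conversion, using the conventional 20 µPa airborne reference (this is generic acoustics, not tied to the µ-diff toolbox):

```python
import math

P_REF = 20e-6  # Pa, standard reference pressure for airborne sound

def spl_db(p_rms):
    """Sound pressure level in dB re 20 uPa: SPL = 20 log10(p_rms / p_ref)."""
    return 20.0 * math.log10(p_rms / P_REF)

# Example: a 1 Pa rms pressure corresponds to roughly 94 dB SPL.
print(f"{spl_db(1.0):.1f} dB")
```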
Essay
Physical Sciences
Quantum Science and Technology

Jiqing Zeng

Abstract: The double-slit experiment, as a cornerstone experiment of quantum mechanics, has long been regarded as the ultimate proof of wave-particle duality. However, results from high-precision experiments conducted in recent years by teams including Tonomura and Bach, as well as from the "recoiling slit" experiment by Jianwei Pan's team, have revealed profound contradictions with mainstream quantum mechanical interpretations. These contradictions expose systematic biases in conceptual definitions and the interpretation of physical mechanisms within the mainstream narrative. Based on particle flow scattering theory and incorporating the design details and results of Pan's team's experiment, this paper critiques the mainstream quantum mechanical narrative that mystifies the "cumulative effect of particle flow scattering" as "wave-particle duality" and "wave function collapse." It argues that the essence of the bright and dark fringes in the double-slit experiment is the statistical distribution of particles after their interaction with slit matter, rather than wave interference. Research indicates that the core dilemma of mainstream quantum mechanical interpretation stems from a misreading of the physical essence of experiments and conceptual confusion. Reconstructing a physical picture based on classical scattering theory and statistical laws is the inevitable path for quantum mechanics to overcome its interpretational predicament.
Article
Physical Sciences
Nuclear and High Energy Physics

He Liu

,

Peng Wu

,

Hong-Ming Liu

,

Peng-Cheng Chu

Abstract: We investigate temperature fluctuations in hot QCD matter using a 3-flavor Polyakov-loop extended Nambu--Jona-Lasinio (PNJL) model. The high-order cumulant ratios $R_{n2}$ ($n>2$) exhibit non-monotonic variations across the chiral phase transition, characterized by slight fluctuations in the chiral crossover region and significant oscillations around the critical point. In contrast, distinct peak and dip structures are observed in the cumulant ratios at low baryon chemical potential. These structures gradually weaken and eventually vanish at high chemical potential as they compete with the sharpening of the chiral phase transition, particularly near the critical point and the first-order phase transition. Our results indicate that these non-monotonic peak and dip structures in high-order cumulant ratios are associated with the deconfinement phase transition. This study quantitatively analyzes temperature fluctuation behavior across different phase transition regions, and the findings are expected to be observed and validated in heavy-ion collision experiments through measurements of event-by-event mean transverse momentum fluctuations.
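The ratios R_{n2} are cumulant ratios C_n/C_2 of the fluctuating quantity. As a minimal illustration of how such ratios are estimated from event-by-event samples, the sketch below uses the standard central-moment formulas on Gaussian toy data; it does not implement the PNJL model itself.

```python
import numpy as np

def cumulant_ratios(samples, nmax=4):
    """Return R_n2 = C_n / C_2 for n = 3..nmax from sample data.
    Cumulants from central moments: C2 = m2, C3 = m3, C4 = m4 - 3 m2^2."""
    x = np.asarray(samples, dtype=float)
    d = x - x.mean()
    m2 = np.mean(d**2)
    m3 = np.mean(d**3)
    m4 = np.mean(d**4)
    C = {2: m2, 3: m3, 4: m4 - 3.0 * m2**2}
    return {n: C[n] / C[2] for n in range(3, nmax + 1)}

# For a Gaussian, all cumulants above C2 vanish, so R_32 and R_42
# should be statistically consistent with zero.
rng = np.random.default_rng(1)
R = cumulant_ratios(rng.normal(0.0, 1.0, 200_000))
print(R)
```

Departures of such ratios from their Gaussian baseline are exactly the kind of non-monotonic peak and dip structures the abstract associates with the phase transition.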
Article
Physical Sciences
Theoretical Physics

Amin Al Yaquob

Abstract: We present the electroweak sector of Geometric Design (codename GD-313) in a form suitable for referee audit. The framework selects the 3–13 vacuum Gr(3, 16) via a bounded integer corridor defined by two publicly tabulated anchors. On the dynamical side, we specify an explicit embedding of SU(2)_L × U(1)_Y into the U(16) structure compatible with the (3, 13) split and compute the induced one-loop coupling ratio from the Grassmannian coset sector. Under clearly stated assumptions (background-field gauge and cancellation of the universal prefactor in the ratio), the index computation yields sin²θ_W = 3/13 at the declared matching convention. We provide an auditable appendices package: embedding generators, index computation, and a reproducibility checklist.
Article
Physical Sciences
Astronomy and Astrophysics

Huang Hai

Abstract: Based on a unified non-perturbative quantum gravity framework, this paper systematically elaborates the cross-scale universality of a quantum gravitational correction term containing a logarithm. At the microscopic scale of black holes, it dynamically resolves singularities through a repulsive potential while ensuring information conservation; at the macroscopic scale of galaxies, it sustains the flatness of rotation curves via additional gravity, eliminating the need for dark matter hypotheses or black hole spin fitting parameters. With quantum vortices (statistical average topological structures of microscopic particles) and nested AdS/CFT duality as the physical core, the framework derives a modified gravitational potential containing a logarithmic term; the term ln r is the key to realizing the cross-scale "short-range repulsion and long-range attraction" effect. Double verification through black hole shadow observations (Sgr A*, M87*) and galactic rotation curve data (Milky Way, Andromeda Galaxy, NGC 2974) demonstrates that the framework achieves high observational consistency in both strong gravitational fields (black holes) and weak gravitational fields (galaxies). For the first time, it realizes a unified description of gravity from the microscopic to the macroscopic scale, providing observable and reproducible empirical support for quantum gravity theory.
Article
Physical Sciences
Theoretical Physics

Azzam AlMosallami

Abstract: We investigate Planck-scale compact objects and gravitationally induced quantum correlations within the framework of Causal Lorentzian Theory (CLT), a flat-spacetime, Lorentz-invariant field theory of gravitation with explicit causal propagation and localized gravitational field energy. In CLT, gravitational phenomena arise from conformal time dilation rather than spacetime curvature, eliminating event horizons and curvature singularities. Point-like sources are regularized through smooth mass distributions, yielding finite gravitational fields at all scales. We analyze Planck-scale compact objects, derive a finite horizon-free gravitational energy emission mechanism, and compute gravitationally induced quantum phase shifts arising from conformal time dilation. Extending the analysis to multi-particle systems, we construct causal gravitational phase-correlation networks that mimic entanglement-like signatures without quantizing gravity or introducing gravitons. The framework provides concrete, testable predictions for micro-scale interferometry and optomechanical experiments, offering a consistent semi-classical bridge between gravitation and quantum mechanics.
Article
Physical Sciences
Theoretical Physics

Jaime Melo

Abstract: The Entropic Framework (EDF) reinterprets the entropy arrow and the dimension hierarchy, identifying in the current paradigm the cause of open issues and singularities yet to be solved by particle physics and cosmology. In the EDF asymptotic approach, dimensional mapping finds a natural limit point in a pregeometric zero-dimensional constraint. The same perspective locates the maximum entropy in 0D. The Asymptotic Equipartition Property (AEP) for this maximum is S₀ = ln 2. The time arrow tells us entropy should increase toward higher-level dimensions, therefore from 4D to 1D, opposite to the longstanding view. EDF formalizes this through four functorial projections Pₙ (n = 0, 1, 2, 3) with non-trivial kernels, incorporating supersymmetric Golden algebras, Fibonacci divisors, braid group representations, and writhe saturation conditions. The framework derives three fermion generations from soliton triplication; Planck's constant ℏ = S_cycle/12, from a twelve-fold angular kernel; color confinement, from braid closure at twelve crossings; Einstein gravity G ∝ 144⁻¹, from writhe saturation and strict ultraviolet finiteness at all loop orders; and a monotonic entropic descent S₀ > S₁ > S₂ > S₃ > S₄ → 0 ruled by thermodynamics. EDF provides four testable predictions: 1) twelve-fold spectral resonances; 2) tunable gravitational coupling in analogues; 3) discrete attosecond temporal bins; 4) entropy drift in quantum systems.
Article
Physical Sciences
Nuclear and High Energy Physics

Engel Roza

Abstract: A structure-based analysis of the pion's decay path reveals that neutrinos show up in three flavours, each built up from three identical mass eigenstates. It requires a proper understanding of the nature of charged leptons, such as why the loss of binding energy stops the lepton generations at the tauon level. The analysis reveals fundamental interrelationships between mesons, charged leptons and neutrinos. It is shown that the results of the theoretical model for neutrinos developed in the article agree with those of the phenomenological PMNS model. The article ends with a discussion of the pros and cons of a structure-based theory developed from first principles versus phenomenological modelling.
Article
Physical Sciences
Theoretical Physics

Mohamed Sacha

Abstract: In the QICT programme, mass is not an intrinsic invariant but an operational certification cost governed by an audit depth and an information-copy (certification) latency. Separately, a QICT Golden Relation for the singlet-scalar mass yields a structural reference band centered on m₀ = 58.1 ± 1.5 GeV. Collider searches for type-III seesaw heavy leptons, however, report a sequence of characteristic mass limits (e.g. 335 GeV, 470 GeV, 790 GeV, 870 GeV, 910 GeV in ATLAS; 840 GeV and 880 GeV in CMS), all quoted at 95% CL and accompanied by "no significant excess" statements. This note explains, in a logically closed manner, why an analysis may foreground a particular high scale such as 470 GeV rather than the lower structural value: (i) the collider numbers are limits, not directly measured resonance peaks, and (ii) in QICT, the reconstructed event-scale mass corresponds to a regime-dependent effective mass m_eff that can occupy stable plateaus when the certification latency compresses toward a speed-limit bound. The existence of multiple reported scales strengthens the defense by showing that the highlighted number depends systematically on channel content, luminosity, and statistical procedure, consistent with a regime/plateau picture rather than a unique intrinsic mass.
Article
Physical Sciences
Particle and Field Physics

Andrew Michael Brilliant

Abstract: Peer review of empirical patterns in high-precision, low-dimensionality parameter spaces relies on implicit evaluation standards. When N = 3 parameters at 2% precision permit thousands of statistically significant formulas, reviewers must distinguish structure from coincidence, but the criteria for doing so remain unarticulated. We found no published record of community debate establishing explicit standards, despite decades of informal application. This paper proposes one such articulation: seven criteria emphasizing temporal convergence through timestamped predictions. We offer specific thresholds not because we believe them correct, but because explicit proposals can be calibrated while implicit standards cannot. The need for explicit standards is timely. Lattice QCD has only recently achieved the precision necessary for discriminatory tests of quark mass relations. Historical precedents from lepton phenomenology (Koide, Gell-Mann–Okubo) provide limited guidance: leptons offer ∼35,000× greater discriminatory power than light quarks, involve no RG running, and constitute a fundamentally different measurement regime. The historical record is further compromised by survivorship bias: patterns that diverged are largely unrecorded. Historical cases motivate the problem by illustrating why implicit evaluation proved adequate for leptons but may prove inadequate for quarks. They cannot validate the proposed solution. Validation is prospective by design: starting from this publication, patterns evaluated under this framework will be tracked publicly. The framework succeeds if it proves predictively useful; it fails if it requires constant post-hoc adjustment, judged by its own temporal convergence criterion. If this proposal provokes disagreement that leads to better criteria, it will have served its purpose. If it is ignored, the current system of implicit evaluation continues unchanged. We consider both engagement and refinement to be success.
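The claim that a few parameters at percent-level precision admit a flood of coincidental formulas is easy to illustrate numerically. The toy enumeration below uses hypothetical parameter values and an arbitrary formula family (small-integer-power products compared against simple rationals); both choices are ours, not the paper's, and the exact count depends on them.

```python
import itertools
import random

random.seed(42)
# Three hypothetical "measured" dimensionless parameters, each of order unity.
x = [random.uniform(0.5, 2.0) for _ in range(3)]

TOL = 0.02  # 2% relative precision
matches = 0
# Candidate "formulas": products x1^a x2^b x3^c with small integer exponents,
# compared against simple rationals p/q with p, q in 1..12.
for a, b, c in itertools.product(range(-3, 4), repeat=3):
    val = x[0]**a * x[1]**b * x[2]**c
    for p, q in itertools.product(range(1, 13), repeat=2):
        target = p / q
        if abs(val - target) / target < TOL:
            matches += 1

print(f"{matches} formula/target coincidences within 2%")
```

Even this tiny search space yields hundreds of agreements, which is the abstract's point: without explicit criteria, percent-level matches carry little evidential weight.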
Article
Physical Sciences
Astronomy and Astrophysics

Hai Huang

Abstract: We propose a non-perturbative quantum gravity framework using quantum vortices (statistical average topological structures of microscopic particles) embedded in AdS/CFT holographic duality, resolving black hole singularities without renormalization. Thus, this constitutes a singularity-resolution mechanism grounded in physical processes rather than mathematical techniques. The quantum vortex field generates a repulsive potential within the critical radius r∗ ≈ 8.792 × 10⁻¹¹ m, dynamically preventing matter from reaching r = 0 and avoiding curvature divergence. The derived Huang metric (Schwarzschild metric with quantum corrections) enables parameter-free prediction of black hole shadow angular diameters, without post-observation fitting of Kerr black hole spin. Observational verification shows: the theoretical shadow of Sgr A* is 53.3 μas (EHT: 51.8 ± 2.3 μas), and that of M87* is 46.2 μas (EHT: 42 ± 3 μas), resolving contradictions of the Kerr model. This framework unifies singularity elimination, information conservation, and shadow prediction, providing a testable quantum gravity paradigm.
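For comparison with the quoted quantum-corrected values, the uncorrected Schwarzschild shadow has angular diameter θ = 6√3 GM/(c²D). A sketch of that baseline using published Sgr A* parameters (the Huang-metric correction itself is not reproduced here):

```python
import math

G, c = 6.674e-11, 2.998e8
M_sun, kpc = 1.989e30, 3.086e19         # kg, m
RAD_TO_UAS = math.degrees(1) * 3600e6   # radians -> microarcseconds

def schwarzschild_shadow_uas(mass_msun, dist_kpc):
    """Angular diameter of a Schwarzschild black hole shadow:
    theta = 2 * 3*sqrt(3) * GM / (c^2 * D)  (no spin, no quantum correction)."""
    r_shadow = 3.0 * math.sqrt(3.0) * G * mass_msun * M_sun / c**2
    return 2.0 * r_shadow / (dist_kpc * kpc) * RAD_TO_UAS

# Sgr A*: M ≈ 4.15e6 M_sun, D ≈ 8.18 kpc (EHT measured 51.8 ± 2.3 uas).
print(f"Sgr A* shadow ≈ {schwarzschild_shadow_uas(4.15e6, 8.18):.1f} uas")
```

The uncorrected value of about 52 μas already sits inside the EHT band, which is the benchmark any correction term must respect.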
Article
Physical Sciences
Quantum Science and Technology

Mohamed Haj Yousef

Abstract: We formulate a geometric framework in which observable spatial geometry and temporal directionality emerge from the intersection of two orthogonal Lorentzian temporal domains, identified as objective (physical) and subjective (informational). Each domain carries a dual-time structure consisting of a generative temporal coordinate and a manifest temporal coordinate, and is modeled using split-complex geometry that encodes conjugate Lorentzian temporal orientations. Observation is described as an intersection process in which the two Lorentzian domains meet at a Euclidean interface: oppositely oriented manifest temporal components cancel, while generative components combine into an effective temporal magnitude. This intersection yields a three-dimensional Euclidean spatial geometry accompanied by a scalar temporal parameter. The interaction between the domains is formulated using a bi-fibered temporal bundle equipped with independent temporal gauge connections. The associated gauge curvatures encode generative desynchronization, geometric phases, and topological sectors. A discrete temporal interchange symmetry exchanging the two domains is spontaneously broken by a composite temporal order parameter, resulting in an emergent arrow of time. Variation of the action yields effective gravitational field equations in which spacetime curvature receives contributions from the temporal gauge and phase fields. This construction provides a consistent geometric setting in which Euclidean space arises as an observational intersection of conjugate Lorentzian temporal structures, while temporal asymmetry, gauge curvature, and topological quantization emerge from the underlying bi-temporal geometry.
Article
Physical Sciences
Theoretical Physics

Sergiu Vasili Lazarev

Abstract: We present New Subquantum Informational Mechanics (NMSI), a comprehensive theoretical framework proposing that information—not matter or energy—constitutes the fundamental substrate of physical reality. The framework introduces the Riemann Oscillatory Network (RON), comprising N ≈ 10¹² nodes corresponding to non-trivial zeros of the Riemann zeta function ζ(s), serving as the computational substrate underlying observable physics. Central to NMSI is the π-indexing mechanism, wherein blocks of decimal digits from π provide deterministic addresses into RON. We derive the architectural threshold L* = 2·log₁₀(N) = 24, demonstrating that for block lengths L > 24, collision frequencies undergo structural transition from statistical independence to correlated behavior. This threshold emerges not as an arbitrary choice but as a mathematical necessity dictated by finite register addressing in RON. The framework introduces the DZO-OPF-RON triad as the minimal irreducible architecture for coherent physical systems: the Dynamic Zero Operator (DZO) provides dynamic regulation maintaining balance condition G[Ψ*] = 0, the Operational Phase Funnel (OPF) implements geometric mode selection via Gabriel Horn topology with aperture A(x) = A₀/x², and RON supplies the finite oscillatory substrate. We prove via six-case exhaustive analysis that elimination of any component leads either to persistent chaos or trivial collapse. Physical implementations include: CMB low-ℓ anomalies as OPF transition signatures at ℓc ≈ 24, where spectral entropy H(ℓ) exhibits regime change; BAO drift as DZO cyclic regulation with amplitude ε ≈ 1% tied to cosmic cycle parameter Z ∈ [−20, +20]; and early JWST high-redshift galaxies at z > 10 as structures inherited from previous cosmic cycles through a baryon recycling mechanism at turnaround Z = −20. The tornado vortex serves as a terrestrial laboratory for validating the predicted constraint accumulation integral J(rc) = 55.26 ± 10 nats at the coherence transition radius, where J(r) = ∫ |∂Ω/∂r|/Ωref dr measures accumulated geometric constraint. Three coherence indicators I₁ (turbulence intensity), I₂ (normalized shear), and Ω (enstrophy) simultaneously satisfy threshold criteria at rc, providing direct experimental access to OPF-DZO dynamics. We provide twelve falsifiable predictions testable during 2025–2035 using DESI, JWST, LISA, CMB-S4, and Einstein Telescope, with explicit numerical thresholds and statistical confidence levels. Three computational tests using publicly available π digits (10¹² available) and CMB data (Planck 2018) are executable immediately: (1) CMB spectral entropy transition at ℓc = 24 ± 5, (2) π-block χ² transition at L = 24 ± 2, (3) π-ζ GUE correlation emergence for L ≥ 26. The framework challenges ΛCDM cosmology not through modification but through fundamental replacement, offering coherent alternatives to dark matter, dark energy, and the Big Bang singularity through cyclic informational dynamics.
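The threshold L* = 2·log₁₀(N) is the standard birthday-collision crossover: blocks of L decimal digits address a space of 10^L registers, and colliding pairs among N draws become rare once 10^L greatly exceeds N². A scaled-down sketch of that arithmetic (N = 10³ here, so L* = 6; the paper's N ≈ 10¹² gives L* = 24):

```python
def expected_collisions(n_draws, space):
    """Expected number of colliding pairs among n_draws uniform samples
    from `space` addresses (birthday bound): n(n-1) / (2 * space)."""
    return n_draws * (n_draws - 1) / (2 * space)

N = 1_000       # scaled-down register count (the paper's N is ~1e12)
L_star = 6      # 2 * log10(N): crossover block length at this scale
for L in (4, 6, 8):
    print(f"L = {L}: expected colliding pairs ≈ {expected_collisions(N, 10**L):.4g}")
```

Below L* collisions are plentiful, at L* the expectation is of order one, and above L* it falls off by a factor of 100 per two digits, which is the structural transition the abstract attributes to finite register addressing.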
Article
Physical Sciences
Astronomy and Astrophysics

Yuxuan Zhang

,

Weitong Hu

,

Wei Zhang

Abstract: We propose an algebraic framework constructed from a 19-dimensional Z3-graded Lie superalgebra g = g0 ⊕ g1 ⊕ g2 (dimensions 12+4+3), featuring exact closure of the graded Jacobi identities (verified symbolically in key sectors and numerically in a faithful matrix representation, with residuals ≲ 10⁻¹² across 10⁷ random combinations) and a unique (up to scale) invariant cubic form on the grade-2 sector, driving a triality symmetry on the vacuum sector. Interpreting the grade-2 sector as the physical vacuum state, we explore whether representation-theoretic invariants and contractions within this algebraic structure can account for observed Standard Model parameters—including fermion masses, mixing angles, and gauge couplings—as well as the magnitude of the cosmological constant, black-hole entropy scaling, and certain qualitative features of quantum entanglement. The framework yields twelve quantitative predictions amenable to experimental scrutiny at forthcoming facilities such as the High-Luminosity LHC, Hyper-Kamiokande, DARWIN/XLZD, and LiteBIRD.
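The numerical verification strategy described (residuals ≲ 10⁻¹² over random combinations in a matrix representation) can be illustrated on a familiar algebra. The sketch below checks the ordinary Jacobi identity for su(2) in its Pauli-matrix representation; it is a methodological illustration only, not the authors' 19-dimensional graded superalgebra.

```python
import numpy as np

# Pauli-matrix basis of su(2), scaled by -i/2 so the structure constants are real.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
basis = [-0.5j * s for s in (sx, sy, sz)]

def comm(a, b):
    return a @ b - b @ a

rng = np.random.default_rng(7)
worst = 0.0
for _ in range(1000):
    # Random algebra elements as real linear combinations of the basis.
    A, B, C = (sum(c * m for c, m in zip(rng.normal(size=3), basis))
               for _ in range(3))
    # Jacobi identity: [[A,B],C] + [[B,C],A] + [[C,A],B] = 0.
    J = comm(comm(A, B), C) + comm(comm(B, C), A) + comm(comm(C, A), B)
    worst = max(worst, np.abs(J).max())

print(f"max Jacobi residual over 1000 random trials: {worst:.2e}")
```

In a faithful matrix representation an exact identity fails only at floating-point roundoff, so residuals at the 10⁻¹² level or below are the expected signature of closure.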
Article
Physical Sciences
Astronomy and Astrophysics

Junli Chen

Abstract: This article reviews the same-frequency mutual-interference explanation of gravitational light bending and gravitational lensing, analyzes the observational data on the HerS-3 Einstein Cross, and uses the same-frequency mutual-interference explanation of light bending to deduce the rationality of the formation of the HerS-3 Einstein Cross. This article holds that when light passes by a massive luminous planet (galaxy), it bends due to the influence of the electromagnetic waves continuously emitted by that massive luminous planet (galaxy). The degree of bending of the light is directly proportional to the brightness of the massive luminous planet and inversely proportional to the shortest distance between the light and the planet, regardless of the mass of the planet. Generally, the mass-to-light ratio (excluding dark matter) of planets (galaxies) in the universe is much smaller than that of the sun. Therefore, the degree of light bending calculated using the gravitational lens principle is much smaller than the actual value, and non-existent dark matter has to be invoked to make up the difference. The derivation of dark matter from the HerS-3 Einstein Cross is another such example. Using the same-frequency mutual-interference explanation of light bending, however, does not lead to dark matter, which is consistent with observational reality.
Article
Physical Sciences
Mathematical Physics

Wawrzyniec Bieniawski

,

Andrzej Tomski

,

Szymon Łukaszyk

,

Piotr Masierak

,

Szymon Tworz

Abstract: Assembly theory defines structural complexity as the minimum number of steps required to construct an object in an assembly space. We formalize the assembly space as an acyclic digraph of strings. Key results include analytical bounds on the minimum and maximum assembly indices as functions of string length and alphabet size, and relations between the assembly index (ASI), assembly depth, depth index, Shannon entropy, and expected waiting times for strings drawn from uniform distributions. We identify patterns in minimum- and maximum-ASI strings and provide construction methods for the latter. While computing ASI is NP-complete, we develop efficient implementations that enable ASI computation for long strings. We establish a counterintuitive, inverse relationship between a string's ASI and its expected waiting time. Geometric visualizations reveal that ordered decimal representations of low-ASI bitstrings of even length N naturally cluster on diagonals and oblique lines of squares with sides equal to 2^(N/2). Comparison with grammar-based compression (Re-Pair) shows that ASI provides superior compression by exploiting global combinatorial patterns. These findings advance complexity measures with applications in computational biology (where DNA sequences must violate Chargaff's rules to achieve minimum ASI), graph theory, and data compression.
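Although computing ASI is NP-complete in general, very short strings can be handled exactly by iterative deepening over concatenation steps: start from the string's characters, and at each step join any two previously built substrings, reusing earlier products freely. A minimal sketch of that search (naive and exponential, for illustration only; not the paper's efficient implementation):

```python
def assembly_index(target):
    """Exact assembly index by iterative deepening: fewest concatenation
    steps needed to build `target`, starting from its single characters
    and reusing any previously built string. Exponential; short strings only."""
    alphabet = frozenset(target)

    def dfs(built, depth):
        if target in built:
            return True
        if depth == 0:
            return False
        # Every useful intermediate string is a contiguous substring of target.
        cands = {a + b for a in built for b in built}
        cands = {s for s in cands if s in target and s not in built}
        return any(dfs(built | {s}, depth - 1) for s in cands)

    d = 0
    while not dfs(frozenset(alphabet), d):
        d += 1
    return d

print(assembly_index("abababab"))  # doubling: ab, abab, abababab -> 3
print(assembly_index("abcd"))      # e.g. ab, cd, abcd -> 3
```

The first example shows the hallmark of low ASI: reuse of the doubled block "abab" lets a length-8 string be assembled in only three joins, the logarithmic lower bound.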

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.

© 2025 MDPI (Basel, Switzerland) unless otherwise stated