We prove that the probability of “A OR B” (where OR is the logico-probabilistic operator and A and B are events or hypotheses that are recursively dependent) is given by a “Hyperbolic Sum Rule” (HSR) relation isomorphic to the hyperbolic-tangent double-angle formula, and that this HSR is Maximum Entropy (MaxEnt). The possibility of recursive probabilities is excluded by the “Conventional Sum Rule” (CSR), which we also prove MaxEnt (within its narrower domain of applicability). The concatenation property of the HSR is exploited to enable analytical, consistent and scalable calculations for multiple recursive hypotheses: such calculations are not conveniently available under the CSR, yet are intrinsic to current artificial intelligence and machine learning architectures, which are presently considered intractable to analytical study and methodological validation. We also show that it is as reasonable to state that “probability is physical” (it is not merely a mathematical construct) as it is to state that “information is physical” (now recognised as a truism of communications network engineering). We relate this treatment to the physics of Quantitative Geometrical Thermodynamics, which is defined in complex hyperbolic (Minkowski) spacetime, and show how the HSR is isomorphic to other physical quantities, including certain components important for digital signal processing.
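The concatenation property claimed above can be sketched numerically. This is a minimal illustration, assuming the HSR takes the tanh-addition form P(A OR B) = (P(A) + P(B)) / (1 + P(A)·P(B)); the function name `hsr` and the sample probabilities are illustrative choices, not taken from the paper.

```python
import math

def hsr(p: float, q: float) -> float:
    """Combine two probabilities via the assumed Hyperbolic Sum Rule,
    isomorphic to tanh(a + b) = (tanh a + tanh b) / (1 + tanh a * tanh b)."""
    return (p + q) / (1.0 + p * q)

# Concatenation: combining three hypotheses pairwise in either order
# gives the same result, which is what makes the rule scalable.
p1, p2, p3 = 0.3, 0.5, 0.7
left = hsr(hsr(p1, p2), p3)
right = hsr(p1, hsr(p2, p3))
assert math.isclose(left, right)

# Equivalently, probabilities add in "hyperbolic coordinates" atanh(p),
# mirroring the tanh double-angle isomorphism.
additive = math.tanh(math.atanh(p1) + math.atanh(p2) + math.atanh(p3))
assert math.isclose(left, additive)
```

Because the rule is associative (addition under atanh), any number of recursive hypotheses can be folded in sequentially without the result depending on the order of combination.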