Article
Version 3
Preserved in Portico. This version is not peer-reviewed.
The Entropy Function for Non Polynomial Problems and Its Applications for Turing Machines
Version 1 : Received: 28 January 2020 / Approved: 30 January 2020 / Online: 30 January 2020 (10:53:55 CET)
Version 2 : Received: 28 February 2020 / Approved: 2 March 2020 / Online: 2 March 2020 (15:26:03 CET)
Version 3 : Received: 4 March 2020 / Approved: 5 March 2020 / Online: 5 March 2020 (15:06:23 CET)
How to cite: Santana Lima, M. The Entropy Function for Non Polynomial Problems and Its Applications for Turing Machines. Preprints 2020, 2020010360. https://doi.org/10.20944/preprints202001.0360.v3
Abstract
We present a general process for the halting problem, valid regardless of the time and space computational complexity of the decision problem. It can be interpreted as the maximization of entropy for the utility function of a given Shannon-Kolmogorov-Bernoulli process. Applications to non-polynomial problems are given. The new interpretation of information rate proposed in this work is a method that models the solution-space boundaries of any decision problem (and non-polynomial problems in general) as a communication channel by means of Information Theory. We describe a sort method that orders objects using the intrinsic information content distribution of the elements of a constrained solution space, modeled as messages transmitted through a communication system. The limits of the search space are defined by the Kolmogorov-Chaitin complexity of the sequences encoded as Shannon-Bernoulli strings. We conclude with a discussion of the implications for general decision problems on Turing machines.
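The sort method sketched in the abstract can be illustrated as follows. This is a minimal, hedged sketch only: it assumes that the "intrinsic information content" of an element is approximated by the empirical Shannon entropy of its string encoding, and the function names are illustrative, not taken from the paper.

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Empirical Shannon entropy of a string, in bits per symbol.

    Illustrative proxy (our assumption) for the 'intrinsic
    information content' that the paper's sort method uses.
    """
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def sort_by_information_content(strings):
    """Order candidate solutions by estimated information content,
    lowest (most compressible) first."""
    return sorted(strings, key=shannon_entropy)

# A uniform string carries 0 bits/symbol; a string over 8 distinct
# symbols carries log2(8) = 3 bits/symbol.
candidates = ["abcdefgh", "aaaaaaaa", "aabbaabb"]
print(sort_by_information_content(candidates))
```

True Kolmogorov-Chaitin complexity is uncomputable, so any practical ordering must rely on a computable surrogate such as this entropy estimate or a compressor-based approximation.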
Keywords
Computational Complexity; Information Theory; Machine Learning; Computational Statistics; Kolmogorov-Chaitin Complexity; Kelly criterion
Subject
Computer Science and Mathematics, Computer Science
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.