Abstract
Communication, compression of information, transmission of information through noisy channels, interconnection of different information systems, cryptography, gate construction: these areas all depend on classical information theory. Contrary to Shannon, we show that, in classical terms, the semantic aspects of communication are not irrelevant to the engineering problem, and that they affect the message intended to be transmitted. We revisit and capture this through an analogy to trust: semantic aspects are essential to the proper use of the channel, yet cannot be transferred through that same channel without risk of flaws. Information is then described by at least a tri-state system, not by binary logic. The trust-analogy semantics can be encoded via the Curry-Howard relationship, connecting computer code with structural logic by way of different categories. Two-state, Boolean logic (Shannon semantics) was used classically with Shannon theory, but without the trust-analogy semantics, which we find to be a sine qua non condition. Logic with more than two states is already familiar in classical gate construction with physical systems, e.g., in Verilog and SystemVerilog. The applications to computation and quantum theory are further explored. We propose that the most fundamental entity in today's theory of information use at least three logical states, not bits, in all applications, including cyber-physical systems, devices, computation, and quantum theory.
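
To make the tri-state claim concrete, the following is a minimal Haskell sketch, assuming Kleene-style three-valued connectives; the type Tri, the constructors F, U, T, and the function andT are illustrative names, not definitions taken from this work. Under the Curry-Howard reading, a data type plays the role of a proposition and its constructors are the distinct ways of evidencing it; Tri admits three, where a two-state Bool admits only two.

  -- A hypothetical three-valued logic type: the third state U marks
  -- "undetermined" information that a two-state bit cannot carry.
  data Tri = F | U | T
    deriving (Show, Eq)

  -- Kleene-style strong conjunction (one common choice of connective;
  -- the connectives used in this work may differ).
  andT :: Tri -> Tri -> Tri
  andT F _ = F
  andT _ F = F
  andT T T = T
  andT _ _ = U   -- any remaining case involves U

  main :: IO ()
  main = mapM_ print [andT T U, andT F U, andT T T]   -- prints U, F, T

Note how the third state propagates through andT unless falsity already decides the result, a behavior that a two-valued Boolean algebra cannot express.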