Article
Version 3
This version is not peer-reviewed
Introduction to the E-Sense Artificial Intelligence System
Version 1 : Received: 5 September 2023 / Approved: 6 September 2023 / Online: 6 September 2023 (04:13:53 CEST)
Version 2 : Received: 9 January 2024 / Approved: 10 January 2024 / Online: 10 January 2024 (04:25:56 CET)
Version 3 : Received: 28 October 2024 / Approved: 29 October 2024 / Online: 30 October 2024 (10:29:42 CET)
How to cite: Greer, K. Introduction to the E-Sense Artificial Intelligence System. Preprints 2023, 2023090370. https://doi.org/10.20944/preprints202309.0370.v3
Abstract
This paper describes the E-Sense Artificial Intelligence system. It comprises a memory model with two levels of information and a more neural layer above that. The lower memory level stores source data in an unweighted Markov (n-gram) structure. A middle ontology level is then created through a further three phases of aggregating the source information. Each phase restructures the data from an ensemble into a tree, where the information may be transposed from horizontal, set-based sequences into more vertical, type-based clusters. The base memory is essentially neutral: any weighted constraints or preferences should be stored in the calling module. The success of the ontology typing is open to question, but the results produced answers based more on use and context. The third level is more functional, where each function can represent a subset of the base data and learn how to transpose across it. The functional structures are shown to be quite orthogonal, or separate, and are made from nodes with a progressive type of capability, including unordered to ordered. Comparisons can be made with the columnar structure of the neural cortex, and the idea of ordinal learning, or learning only relative positions, is introduced. While this is still a work in progress, the system offers a different architecture to the current favourites and may be able to give different views of the data from what they can provide.
Keywords
brain model; memory model; neural model; cortex; statistical clustering
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.