Preprint
Article

Structure and Requirements for a Knowledge Architecture

This version is not peer-reviewed

Submitted: 17 September 2024
Posted: 21 September 2024

Abstract
Knowledge is not merely the characteristic of certain pieces of information being deemed true, believed and justifiable, but rather a process. The sentences that reach the status of knowledge form a universe that provides an interpretation of the reality one aims to describe. Language exchanges information through communication, and justification, if it concludes successfully with respect to a sentence, elevates that information to the status of knowledge. We provide a model in which we define the fundamental elements of communication and grammar. As a result, knowledge must be "open" (everyone must be able to acquire it), "free" (everyone must be able to justify it) and "tidy" (everyone must be able to retrace the justification process in all its parts recursively). Ultimately, the development of knowledge processing, referred to as "knowmatic", represents the evolution of the information technologies (IT).
Keywords: 
Subject: Computer Science and Mathematics - Logic

1. Introduction

The problem of knowledge is anything but recent: philosophy (particularly gnosiology and epistemology) [1], economics and logic [2] each face their own declination of the problem of knowledge. Among the most notable examples is the Gettier problem [3], which highlights situations where justified true belief fails to constitute knowledge. Gettier's counterexamples illustrate instances where someone may hold a belief that is both justified and true, yet this seems insufficient to claim knowledge. This dilemma prompts philosophers to refine their understanding of knowledge, emphasizing the importance of additional criteria beyond mere justification and truth. The ongoing debate surrounding the Gettier problem continues to shape contemporary epistemology, probing the features and boundaries of what constitutes genuine knowledge. The problem has been analyzed from many different points of view and the literature on the subject is more extensive than we could present in this paper. We use [MWE] for Merriam-Webster Dictionary [4], [CAM] for Cambridge Dictionary [5], [COL] for Collins Dictionary [6], [BRI] for Encyclopedia Britannica [7], [PLA] for The Stanford Encyclopedia of Philosophy [8] and [IEP] for the Internet Encyclopedia of Philosophy [9]. Some interesting reviews are "The Analysis of Knowledge" in [PLA] and in [???]. Furthermore, every civilization describes knowledge within its own culture by providing its own interpretation, and our digital era is no different [?].
This paper is divided into 4 sections. We provide a model of knowledge from the fundamental elements of language up to communication, information, knowledge and research. Our challenge is to provide a complete framework that accommodates not only the objective, interdisciplinary, formal and rigorous aspects of knowledge, but also its subjective aspects, to which we believe the model applies equally. We also try to address the problem of the diffusion and dissemination of knowledge in all its forms, in particular knowledge disseminated through scientific articles (since this is, to date, the preferred form of dissemination [10,11]). There are also significant issues associated with massive publication: the enormous amount of information prevents deserving new research from emerging (or even simply being found among the countless other publications) [12]. A brief review on knowledge and cognition introduces the main section of the article. Knowledge encompasses the information, beliefs, and understanding gained through experience and reasoning, while cognition involves the mental processes used to acquire, process, store, and apply information. Observations are influenced by existing knowledge and frameworks; contextual influences affect both the discovery and the justification of knowledge. These contexts shape which theories are developed and accepted, influencing the direction and interpretation of scientific research. Language is the means for knowledge acquisition, as well as a necessary and sufficient condition for it. This authorizes us to analyze language exclusively in order to investigate how sentences can become knowledge, which leads us to the final section, where we outline a specific model of IT that is based not on the generic processing of information (hence the name "automatic information" or "informatics") [13], but on the processing of knowledge. We will demonstrate that the ability of technologies to distinguish knowledge from information is mandatory for the advancement of IT itself.

2. Structure for Knowledge

Although inherently indefinable, knowledge can be described as expressing a unique relationship between the mind and any object, whereby the object exists not only in itself but also for consciousness. It is therefore an active operation of the spirit, occurring under certain conditions and presupposing three elements: a subject that knows, an object known, and a specific relationship between the two (translation by the author of [14]). We found the work of C. Guastella [15] and F. Masci [?] particularly enlightening due to its simplicity and clarity of exposition.

2.1. Language and Communication

 Definition 1.
An alphabet is a finite set of symbols [MWE] [CAM] [COL] [BRI].
Consider, for example, the set of symbols {a, b, …, y, z, σ} made up of the English alphabet, consisting of 26 symbols, plus the space symbol σ.
 Definition 2.
A string is a concatenation of a finite set of symbols from an alphabet [MWE] [CAM] [COL] [BRI].
gjwqtut, tile, ey σ vly, keyboard are examples of strings. The concatenation of strings is a string, and a symbol is a particular type of string.
 Definition 3.
A dictionary is a finite set of strings, called words [MWE] [COL].
 Definition 4.
An expression is a concatenation of a finite set of words.
[MWE] and [COL] define an expression as "something that manifests, embodies, or symbolizes something else". To avoid clutter, we shall no longer indicate the σ character with its explicit representation but with the usual space between words. Then, tile gjwqtut keyboard tile is an example of an expression; this is just string concatenation. A word is a particular type of expression.
 Definition 5.
A grammar is a finite set of rules of concatenation of symbols, words and expressions to form clauses [MWE] [CAM] [COL] [BRI].
The expression tile gjwqtut gjwqtut is not a clause, while keyboard is a tile is a clause (given that each word is within the dictionary and plays the correct role in accordance with the rules of the given grammar).
Clauses are generally defined as "a group of words containing a subject and predicate and functioning as a member of a complex" [MWE] or "the part of a sentence with a subject and a verb" [CAM]. A clause is a particular type of expression for which we cannot yet guarantee that its content is meaningful, as nothing prevents the construction of strings like the thready chair is healed.
 Definition 6.
A sentence is a concatenation of a finite set of clauses according to the rules of grammar.
In [MWE] a sentence is "a word, clause, or phrase or a group of clauses or phrases forming a syntactic unit which expresses an assertion, a question, a command, a wish, an exclamation, or the performance of an action, that in writing usually begins with a capital letter and concludes with appropriate end punctuation, and that in speaking is distinguished by characteristic patterns of stress, pitch, and pauses", and in [COL] "a sentence is a group of words which, when they are written down, begin with a capital letter and end with a full stop, question mark, or exclamation mark". Basically, it is a complete construction contained between distinctive signs that mark its beginning and end, making it complete but still not significant in general. Later we shall define a metric to measure the informative content of sentences or, at least, to compare sentences based on the information they convey.
 Definition 7.
A language is a pair formed by a dictionary and a grammar [MWE] [CAM].
 Definition 8.
Information is the meaning of a sentence [MWE] [CAM] [COL] [BRI].
When a sentence describes qualities or behaviors that the subject does not actually possess, it fails to convey meaningful or useful information, because the message does not accurately reflect the reality of the subject. Such a sentence shall be considered as having zero information content, as it provides no valuable information about the subject (since the object does not belong to the subject, declaring its non-belonging, or an attribute of a non-belonging object, is not informative). On the other hand, a sentence that accurately describes the development, characteristics and behaviors of a subject carries meaningful information.
 Definition 9.
A communication is a finite process by which information is exchanged [MWE] [CAM] [COL] [BRI].
The structure that emerges from the definitions is depicted in Figure 1 in a content-form plane. The alphabet, by definition, has no content and no form, since symbols do not represent anything other than themselves and lack any substructure. A symbol is the grammatical equivalent of a geometric point: it has no parts [16].
It is important to exercise caution when using terms like dictionary, grammar and language, to avoid restricting their meaning to what is commonly understood as spoken or written language. Here, language refers to any manifestation that involves the combination of simpler elements to create more complex ones, which can in turn be combined to form other elements. Syntax establishes the rules of form, grammar and structure, while semantics sets the rules for content, information and interpretation. This definition encompasses formal languages, natural languages, musical notation systems, shorthand writing, choreographic notation, and programming languages. More broadly, it includes any context where a communication process, or transfer of information, can be identified. Hockett [17] identifies 13 characteristics of language, to which we should consider adding the features of "learnability" (the ability to acquire a language), "introspection" (the ability to use language to describe itself), "order" (the importance of word arrangement), "recursiveness" (the presence of sentences within larger sentences), "hierarchy" (the ability to create hierarchical structures within sentences of a text, within words of a sentence, or within symbols of a word), and "distinction" (the ability to identify sets of sentences, words, and symbols with common functions).
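To make the layered structure of Definitions 1-7 concrete, the following is a minimal Python sketch; the alphabet, dictionary and the single toy grammar rule are invented for illustration and are not part of the model.

```python
# Toy instances of Definitions 1-5: alphabet, string, dictionary,
# expression and a one-rule grammar that licenses "<noun> is a <noun>".

ALPHABET = set("abcdefghijklmnopqrstuvwxyz ")           # Definition 1 (space plays the role of sigma)
DICTIONARY = {"keyboard", "tile", "is", "a", "chair"}    # Definition 3

def is_string(s: str) -> bool:
    """Definition 2: a finite concatenation of symbols of the alphabet."""
    return all(ch in ALPHABET for ch in s)

def is_expression(e: str) -> bool:
    """Definition 4: a finite concatenation of words of the dictionary."""
    return all(w in DICTIONARY for w in e.split())

def is_clause(e: str) -> bool:
    """Definition 5 (toy grammar): an expression of the form '<word> is a <word>'."""
    words = e.split()
    return is_expression(e) and len(words) == 4 and words[1:3] == ["is", "a"]

print(is_string("gjwqtut"))             # True: a string, though not a word
print(is_expression("tile gjwqtut"))    # False: gjwqtut is not in the dictionary
print(is_clause("keyboard is a tile"))  # True: respects the toy grammar rule
```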

2.2. Fact

 Definition 10.
A fact is an elementary truth referring to a subject [MWE] [CAM] [COL]. The truth is established by a finite set of criteria of truth.
Criteria of truth refer to the standards or principles by which truth is assessed. If a sentence is true without the need for proof, then it is a fact: a set of criteria of truth establishes the truth or falsehood of sentences, elevating true sentences to facts.

2.3. Knowledge

Knowledge can be innate, acquired by observation, thinking or thinking-plus-observing, and through different sources [IEP]. A widespread but not universally accepted definition states that knowledge is justified true belief (also known as JTB) [18]. Knowledge is thus characterized by three features (the tripartite theory of knowledge): it is a belief, it is true and it is justified. This analysis is often discussed in connection with Gettier [3]. At its core, knowledge involves holding a belief about something: an individual must accept or affirm a sentence. This sentence must be true: a belief might be sincere and strongly held, but if it does not correspond to reality (or to any other criterion of truth), it does not qualify as knowledge. Justification refers to having good reasons or evidence that support the belief. In other words, there should be a rational basis for holding the belief. This criterion helps distinguish knowledge from mere lucky guesses.
The feature of knowledge commanding the widest consensus is truth: one can believe something false or communicate false sentences (definitions 6, 8 and 9), yet one cannot have false knowledge (know something false) [19]. Definition 8 imposes no constraint on the truth of the sentence itself, because the fact is not involved in the process of creating a sentence, nor in the overall communicative process. Ultimately, the main controversy concerns the third feature of knowledge, justification; in particular, all those cases where justification is subjective or where the true belief is not part of common knowledge, such as luck, superstition or scenarios based on some random pattern that is not deterministically reproducible [PLA][18,19,20,21].
The disagreement on the exact definition of knowledge develops along different strands. In [20], JTB is a necessary but not a sufficient condition: there must then be an unknown feature X (which leads to the definition of knowledge as JTBX), where X is a condition, or list of conditions, logically independent of J, T and B. Alternatively, J is replaced with another feature F, such as reliability, leading to FTB. If F = J + X, then FTB = JTBX; if F = J, then FTB = JTB. Hence, FTB is the most general definition. To distinguish it from mere justification J, we shall use the term argumentation to refer to a broader process [22] able to F-ify (J-ify, X-ify, combinations thereof, or others).
 Definition 11.
Given a finite set of facts and a language at least able to convey those facts into communicable sentences, a TB sentence is knowledge if there exists a finite process F able to F-ify the given sentence by means of that set of facts in the given language.
Knowledge is a believed and true fact (or finite set of facts), based on criteria of truth, around which arguments, evidence, and demonstrations can be presented in a given language. Given a language and a finite set of facts, finite knowledge is obtained by argumentation of true beliefs on the given facts and can be incorporated into another set of facts to acquire new knowledge. Hence, new knowledge proceeds by extension of the previous one. When it is not possible to extend previous knowledge, new knowledge stands alongside it with no intersection.
In Figure 3 there are two sets of facts (solid line): the set $Facts_1 = \{F_1, \dots, F_4\}$ and the set $Facts_2 = \{F_4, F_5\}$. Given a language, a finite set of true beliefs can be argued by using only facts $F_1$ and $F_2$ to acquire knowledge $Knowledge_1$ (dotted line). Adding fact $F_3$, new knowledge can be acquired from $Knowledge_1$ and $F_3$ (or from $F_1$, $F_2$ and $F_3$), and when $F_4$ is also added, all the knowledge allowed by $Facts_1$ is acquired. Using $F_3$, $F_4$ and $F_5$, knowledge $Knowledge_2$ is acquired. It is possible to use $Knowledge_1$ and $Knowledge_2$ to acquire $Knowledge_3$, just as it is possible to acquire $Knowledge_3$ from $F_1, \dots, F_5$.
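The incremental acquisition sketched around Figure 3 can be mimicked with plain sets; in the toy sketch below, derive is a hypothetical stand-in for the F-process and merely records which facts (or earlier knowledge) a piece of knowledge rests on.

```python
# Toy model of Figure 3: knowledge as argumentation over finite sets of facts.
facts1 = frozenset({"F1", "F2", "F3", "F4"})   # Facts_1 in Figure 3
facts2 = frozenset({"F4", "F5"})               # Facts_2 in Figure 3

def derive(label: str, *premises: frozenset) -> tuple[str, frozenset]:
    """Stand-in for the F-process: new knowledge is recorded together with
    the facts (or earlier knowledge) it was argued from."""
    return label, frozenset().union(*premises)

knowledge1 = derive("Knowledge1", facts1 & {"F1", "F2"})
knowledge2 = derive("Knowledge2", (facts1 | facts2) & {"F3", "F4", "F5"})
# Knowledge3 can be argued either from Knowledge1 and Knowledge2 ...
knowledge3a = derive("Knowledge3", knowledge1[1], knowledge2[1])
# ... or directly from the facts F1..F5: the supporting facts coincide.
knowledge3b = derive("Knowledge3", facts1 | facts2)
print(knowledge3a[1] == knowledge3b[1])        # True
```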

2.4. Research

[CAM] defines research as "a detailed study of a subject, especially in order to discover (new) information or reach a (new) understanding". The definition makes no reference to the choice of criteria of truth because research is a process of study, exploration and discovery, much like communication. In fact, research can take various forms, such as scientific, personal or spiritual, and can involve different phases and different criteria of truth. We need to take a step further and narrow down the definition in the scientific context to assert that scientific research is "a method of investigation in which a problem is first identified and observations, experiments, or other relevant data are then used to construct or test hypotheses that purport to solve it" [COL]. Given the set of all facts known through the observation and measurement of nature, the process of arguing true beliefs through these known facts is science. In our convention, science is commonly defined as "the finite set of knowledge acquired through a methodical and rigorous process that organizes knowledge in such a way as to provide verifiable explanations and predictions about nature" [CAM][COL][BRI].
 Definition 12.
Given a language, a finite set of facts and a sentence, a research in the given language is the process of FTB of the sentence by means of that set of facts [MWE][CAM][PLA].
Knowledge is acquired through research. The process of arguing precedes knowledge itself, in the sense that the success of the argumentation marks the acquisition of new knowledge. In the event of a failure in the process, no knowledge has been acquired and the process itself is knowledge [23,24,25].
Figure 4. Research argues true beliefs through facts and a language and produces knowledge: it is acquired by research and research is the process of acquisition.
 Definition 13.
A universe is a triad formed by a finite set of facts, a research and knowledge.
As stated, research is a given process (hence a fact) that processes facts through language in an attempt to concatenate them and raise them to knowledge. Given the finiteness of the set of facts and of the research process, knowledge is also finite. However, new knowledge is added to previous knowledge, expanding the universe. It might be useful to distinguish the universe composed of the knowledge actually produced from the universe composed of all the knowledge that can be produced (having unlimited time to argue and unlimited space to store it). The parallelism with cosmology is interesting: the observable universe (o-universe) is the universe composed of the knowledge actually produced, by analogy with "the region of space that humans can actually or theoretically observe [...]. [...] Unlike the observable universe, the universe is possibly infinite and without spatial edges" [BRI].

2.4.1. Scientific research

“Subjective knowledge” is that which emerges from individual perspectives and facts unique to a single cognitive subject. “Intersubjective knowledge” arises from shared facts between two or more cognitive subjects, where these common facts can be connected to acquire shared knowledge. A special case of intersubjective knowledge is “objective knowledge”, emerging from facts that are universally common among all cognitive subjects within a given set. The goal of scientific research is to identify facts that are as universally common as possible, thereby producing knowledge that is robust and widely accepted.

3. Requirements for knowledge

Society is shaped by the language that shapes society [26,27]. Society then decides how to encode facts in its own language and share information [28]. Likewise, humanity creates research and, with it, knowledge. The ability to acquire knowledge therefore depends on the quality and effectiveness of research and language. Gibson [28] and Piantadosi [29] have conducted intense work on the interaction between learnability and efficiency, the study of vocabulary, the learning process and the efficiency of language. Knowledge is a partial understanding of the universe (an o-universe). This is why there are so many descriptions of the same universe (so many different o-universes in the same universe). Even if there is just one finite set of facts, the research may change from person to person, from society to society, from context to context. Even if there is just one research, facts may change, leading to different knowledge (different o-universes). For one and only one o-universe, it is necessary to fix facts, research, and knowledge. Every degree of freedom alters the knowledge. It is possible that o-universes composed of different facts and research intersect; hence, it is possible to acquire the same knowledge from different facts and research.
The most general and comprehensive interpretation of knowledge should include every empirical, sensory, spiritual and cognitive process, and everything should be traceable back to a particular case of knowledge as long as some requirements are met; even the interpretation itself. We do not exclude that other requirements may be imposed to ensure compliance with certain needs; however, finding no other evidence, we assert that the following is a necessary condition for establishing a knowledge architecture, for going back from knowledge to facts, for justifying sentences in a given research and for investigating the result of the research itself. Knowledge is open, free, tidy.

3.1. Open

 Requirement 1.
A universe is open: always and continuously readable.
The first requirement focuses on accessibility to knowledge. The attribute open directly aligns with the widely embraced concept of open access (OA), which advocates for literature that is "digital, online, free of charge, and largely unrestricted by copyright and licensing restrictions" [?].
OA refers to the practice of accessing existing knowledge in order to acquire it. There are various types of OA depending on the degree of accessibility to knowledge [10]: gold, green [?], hybrid [?], bronze, diamond [32], and black [?], depicted in Figure 5. The "free for readers" set is an example of open knowledge.
Scientific research is disseminated through journals, papers, conferences and other academic channels. It is important to emphasize that requirement 1 has a broad significance: it is not only scientific research that is open, nor just knowledge. Rather, the universe itself is open, and so are facts, research, language and knowledge.

3.2. Free

 Requirement 2.
A universe is free: always and continuously writable.
Requirement 2 ensures the ability to share acquired knowledge (to partake in it): there is freedom to share it. The "free for authors" set is an example of a free universe.

3.3. Tidy

 Requirement 3.
A universe is tidy: it makes its structure explicit in an open and free way.
Tidy means "methodical, precise, well ordered and cared for" [MWE]. Given a finite set of facts and a research, one and only one knowledge is obtained (Figure 4). But from a knowledge it is possible to go back to several sets of facts and/or several researches or several languages (see definition 8 and Figure 6). Hence, it is necessary to declare explicitly from which facts and with which research a given knowledge was acquired. Tidy points to the structured and ordered character of knowledge. It implies that the relationships between knowledge, information and research have a structure and are well organized, ensuring clear and well-defined dependencies within the knowledge network. Research proceeds from facts to knowledge, so knowledge cannot be used by research to generate information (unless that information is itself knowledge). This implies that knowledge does not contain loops and is directed from one node to another (research goes from facts to knowledge and not vice versa). Therefore knowledge can be expressed through a directed loopless graph [34]. A nonempty "peer-reviewed" set implies a tidy universe. Then, Diamond and Green OA sets are examples of open, free and tidy knowledge.
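Because tidiness requires the dependency structure to be a directed loopless graph, the property can be checked mechanically. The sketch below uses the networkx library on an invented fact-to-knowledge graph; it is illustrative only, not part of the architecture itself.

```python
import networkx as nx

# Requirement 3 (tidy): the dependency structure from facts to knowledge
# must be a directed loopless graph. The edges below are invented examples.
g = nx.DiGraph()
g.add_edges_from([
    ("F1", "Knowledge1"), ("F2", "Knowledge1"),          # research: facts -> knowledge
    ("F3", "Knowledge2"), ("F4", "Knowledge2"),
    ("Knowledge1", "Knowledge3"), ("Knowledge2", "Knowledge3"),
])

print(nx.is_directed_acyclic_graph(g))        # True: no loops, facts precede knowledge

# Retracing the justification recursively: everything Knowledge3 rests on.
print(sorted(nx.ancestors(g, "Knowledge3")))
# ['F1', 'F2', 'F3', 'F4', 'Knowledge1', 'Knowledge2']
```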

4. Review on knowledge and cognition

Knowledge refers to the information, beliefs, and understanding that individuals acquire through experience, reasoning, and reflection. Cognition refers to the mental processes through which individuals acquire, process, store, and apply information; these processes include perception, attention, memory, reasoning, and problem-solving. For a subject carrying out a cognitive process (a cognitive individual), justified rather than unjustified beliefs are preferable, as they are more likely to be true [35]. Chisholm, in his theory of knowledge and probability [36,37], defines a sentence as “counterbalanced” for a cognitive individual if and only if the sentence is equally justifiable as its opposite, and “probable” if it is more justified to believe than to disbelieve it.
We can interpret the path of knowledge as an architecture: from the foundation to the roof, from Kant's categories to the connections between them. In [38], Kant analyzes the dynamics of observation, interpretation and nature, focusing on the acquisition of knowledge and the cognitive process. Our understanding is shaped by innate categories of thought, highlighting the active role of the mind in structuring knowledge. Willard Quine challenges the distinction between analytic and synthetic knowledge, arguing that our knowledge is a web of interconnected beliefs where observation is theory-laden [39]. Nelson Goodman discusses the problem of induction and the ways in which we classify and interpret our experiences [40]. In Norwood Russell Hanson [41] the concept of “theory-laden observation” emphasizes that observation is influenced by theoretical frameworks and prior knowledge. Popper [42] introduces the concept of falsifiability, suggesting that scientific theories are never definitively proven but are instead subjected to temporary, though rigorous, testing followed by refutation. Kuhn [43] proposes that scientific progress occurs through paradigm shifts where dominant theoretical frameworks are replaced by new ones that better explain anomalies [44,45]. Feyerabend [46,47] opposes methodological monism, promoting a more pluralistic approach to scientific inquiry, where multiple methodologies coexist. Bogen [48] demonstrates that observation is not neutral but is deeply dependent on the theoretical framework within which it is conducted. We cannot fail to notice that, since its origin, the search for knowledge has moved towards an increasingly holistic vision: the categories that were previously independent elements have become monads within which it is possible to glimpse the entire universe [49]. Knowledge becomes a phenomenon emerging from the connections between the individual elements that constitute it.
A deep consequence is the problem of knowledge acquisition contexts: discovery and justification [24]. Assuming that it is significant to distinguish these two contexts (it is not obvious that these two areas are precisely defined [25,44,45]), it clearly emerges, as a common element among all the main epistemological research, that every stage of the cognitive process is marked by the economic, aesthetic, ethical and political context in which it happens. Kuhn [43] has significantly influenced the understanding of these contexts. Scientific progress is not linear but occurs through paradigm shifts, where established frameworks are replaced by new ones that better explain observed phenomena. This view challenges the notion of a separation between discovery and justification, suggesting that both are intertwined with the prevailing scientific paradigms and socio-cultural influences of the time. Recent analyses, such as those by Foster [25], highlight that context profoundly shapes both the discovery and the justification stages: justification is itself part of discovery, and the two processes are interconnected parts of a more general macro-process which is discovery itself, according to definition 12 (not in the sense of intuition alone but of a set of processes that lead to the acquisition of new knowledge). The economic context can influence research priorities and funding availability, while aesthetic values can guide the selection of theories based on their simplicity or elegance. Ethical and political contexts impact the direction of scientific inquiry and the acceptance of certain methodologies or findings within society. These influences underscore the non-neutrality of the cognitive process. The analysis conducted by Bogen aligns with the broader epistemological view that observation and interpretation are deeply embedded within theoretical frameworks, further influenced by broader contextual factors. Finally, it is worth reflecting on crucial experiments, as they serve as the link from knowledge to requirements 1, 2 and 3.

4.1. Crucial experiments

Crucial experiments act as key points where competing theories or hypotheses are tested against empirical evidence that favors one theory over the others. These experiments are designed to test specific predictions made by competing theories under controlled conditions, to minimize ambiguity and maximize clarity in the results. However, as our earlier discussions have highlighted, the execution and interpretation of crucial experiments are not free from the various requirements and influences we have considered. The theoretical frameworks within which these experiments are conceived and conducted influence their design and implementation, what is observed, and how results are interpreted [50].
Consider an initial condition and two different theories which, at that initial condition, produce two different predicted events. Nature (in the context of scientific research) is questioned at the chosen initial condition and the event it produces is observed. If the predicted event of one of the two theories corresponds, within error and reliability, with the event manifested by nature under the same initial condition, then the theory that predicted that event is considered correct. Formally, let $C$ be an initial condition, $T_i$ be a set of different theories to test, $E_i = T_i C$ be the set of predicted events and $N$ be nature. Let $E = NC$ be the event produced by $N$ at $C$. Then, $T_i$ is correct if and only if $T_i C = NC$. Evidently, two different theories can lead to the same events at $C$, and there can exist a theory $T_{i+1}$, different from any $T_i$, such that $E = T_{i+1} C$. Furthermore, if $T_i C = NC$ for every $C$ tested, then $T_i C = T_{i+1} C$. But there exists at least one $\bar{C}$ for which $T_i \bar{C} \neq T_{i+1} \bar{C}$ and $T_{i+1} \bar{C} = N \bar{C}$. Then, $T_i \neq T_{i+1}$ and $\bar{C}$ is the crucial initial condition, or the crucial experiment, that disproved $T_i$ in favor of $T_{i+1}$. Here the paradigm has changed and there has been a scientific revolution [43,45].
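Read computationally, the criterion above is a simple selection procedure. In the sketch below the two theories and nature are invented toy functions from initial conditions to events, and tol plays the role of the experimental error; none of them refers to real physical theories.

```python
# Toy crucial experiment: a theory T_i is retained at condition C
# if its predicted event T_i(C) matches the observed event N(C) within error.

def theory_a(c):                      # predicted event E_a = T_a(C)
    return 2 * c

def theory_b(c):                      # predicted event E_b = T_b(C)
    return 2 * c + 0.001 * c ** 2

def nature(c):                        # event E = N(C) actually observed
    return 2 * c + 0.001 * c ** 2

def surviving_theories(theories, condition, tol):
    observed = nature(condition)
    return [name for name, t in theories.items()
            if abs(t(condition) - observed) <= tol]

theories = {"T1": theory_a, "T2": theory_b}
print(surviving_theories(theories, 0.01, tol=1e-6))  # ['T1', 'T2']: both agree within error
print(surviving_theories(theories, 10.0, tol=1e-6))  # ['T2']: crucial condition, T1 disproved
```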
The correspondence principle [51] was originally expressed by Niels Bohr as the requirement that any new, more general theory must reduce to the older, well-established theory in the domain where the older theory has been confirmed by experiment. As a consequence of acquiring new knowledge from existing knowledge, theories are literally encapsulated into one another: $\text{Newtonian Mechanics} \subset \text{General Relativity} \subset \text{TOE}$ and $\text{Electromagnetism} \subset \text{Quantum Mechanics} \subset \text{TOE}$ are just examples for a hypothetical Theory Of Everything (TOE) [52]. This is the process of acquiring knowledge from knowledge. This is why we argued for the need to have structured knowledge, so that it is possible to question our knowledge and discover the contradictions it contains or the intellectual impulses it hides.

4.2. Acquisition

Mechanisms of language acquisition have been identified that rely mainly on the segmentation of sentences: the recognition of regular patterns in the distribution of sounds, words, and symbols [53,54]. This process occurs on multiple levels, breaking down sentences into words and words into symbols, and constitutes one of the two fundamental mechanisms of language acquisition [55,56]. Marcus [57], in infants, and more recently Peña [58], in adults, demonstrated that the subjects involved are able to extract structural rules from an artificial language. Language is acquired statistically when possible; if segmentation is made impossible (for example, by adding pauses or using a continuous flow without segmentation cues), language is acquired through the extraction of general rules about the structure of the language itself.
In both cases (transition probabilities through segmentation and the search for syntactic rules), the process involves formal analysis without any reference to content, as the cognitive subjects acquiring the language have no access to the information conveyed. Thus, language acquisition is deeply constrained by syntax rather than semantics (at least in the initial phase), as it proceeds through structuring, segmentation and pattern identification. Both processes are particular cases of a more general statistical analysis focused on individual words in a clause or on sets of them in sentences.
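A toy illustration of transition-probability segmentation, in the spirit of [53,54], follows; the artificial "words" and syllables are invented, but the pattern they exhibit (high transition probability inside words, a drop at word boundaries) is the one exploited by statistical learners.

```python
import random
from collections import Counter

# Three invented trisyllabic words concatenated in random order: transition
# probabilities are ~1.0 inside a word and drop to ~1/3 at word boundaries.
random.seed(0)
words = [["tu", "pi", "ro"], ["go", "la", "bu"], ["pa", "do", "ti"]]
stream = [syl for _ in range(200) for syl in random.choice(words)]

pairs = Counter(zip(stream, stream[1:]))   # counts of adjacent syllable pairs
firsts = Counter(stream[:-1])              # counts of the first element of each pair

def transition_probability(a, b):
    return pairs[(a, b)] / firsts[a]

print(transition_probability("tu", "pi"))  # 1.0: within-word transition
print(transition_probability("ro", "go"))  # ~0.33: word boundary, a segmentation point
```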
Communication, made possible by language acquisition through segmentation, enables the symmetric exchange of information between the parties involved in the process [17]. Moreover, language is expressive enough to allow for introspection, meaning that it is possible to F-ify exchanged information using language itself and thus to communicate knowledge. Ultimately, general statistical analysis, as the main process of language acquisition, enables the exchange of information which, through the introspection of language, can be F-ified and thereby allows the exchange of knowledge. Essentially, any cognitive process is a communication process based on the exchange of information through language, and the acquisition of knowledge is essentially the acquisition of an FTB process.
The final aspect related to language acquisition concerns the interpretability of sentences. The comprehension process involves the construction, analysis, and processing of structures. Acquiring a language requires the ability to assign an analysis to the input, meaning constructing a syntactic representation, but this process requires a grammar [59]. Note that an ambiguous sentence is associated with distinct syntactic representations. As previously mentioned, in order to comprehend sentences, individuals segment the statements that constitute them. This is clearly demonstrated by the experiments conducted in [60,61], as well as by the models provided by [62]. However, both ambiguous and unambiguous representations include a “context”, which triggers connections between elements that are not explicitly present but that the individual must infer in order to assign meaning [63]. In this context, the most well-known interpretative model is the so-called “garden path” model, which is based on the concepts of minimal attachment and late closure [64,65].
The phenomena observed in individuals acquiring language are the same regardless of the language [66] or the age at which acquisition occurs [67,68]. Several experiments demonstrate that the acquisition process is identical in Italian [64] and in English, in adults [66] and in infants [69].
The principles of minimal attachment and late closure, the generation of expectations/assumptions when the context is undefined, semantic presuppositions, and the measurement of comprehension times for ambiguous sentences with disambiguating context [70] all provide strong evidence supporting the principle of computational resource economy [71]. Furthermore, the parallel with human language acquisition principles will become clearly evident when knowledge is linked based on networks that emerge from these same principles.

5. Knowmatic

The universe is a finite set of sentences where facts are TB sentences that we do not want to F-ify because they are elementary or established by criteria of truth, while knowledge consists of FTB sentences. All other sentences are simply waiting to be evaluated T or not T, B or not B, F or not F. IT is based on communication, which requires content to be communicated. Therefore, IT necessitates information, which is nothing more than the meaning of sentences in a given language. Consequently, IT exchanges sentences without any inherent awareness of their status: whether they are facts, knowledge or just sentences.
Beyond the generic processing of information, we aim to process knowledge. Knowledge processing integrates deeper cognitive processes and contextual understanding into computational models, aiming to handle inputs in a way that mirrors human cognition and learning. We shall introduce the concept of "automatic knowledge", from which the name knowmatic derives. It is the systematic analysis, organization, and management of knowledge using methodologies, techniques and frameworks for understanding its structures, acquisition, representation, origins, forms, validation, dissemination, application and utilization.
Computation refers to the finite process of performing a finite sequence of operations to solve problems [73]. In IT, computation refers to the use of algorithms and data to transform inputs into outputs; more generally, information theory considers computation as the processing of data to extract or encode information. In physics, computation describes processes that occur in nature, such as genetic coding in biology or neural processing in the brain. In general, processes are forms of computation [74].
“The Method” by Edgar Morin [75,76,77,78,79,80] is a six-volume exploration of complexity in the natural and social sciences and the humanities. Theoretical aspects of computation include studying the limits of what can be computed and the efficiency of algorithms. Quantum computation leverages quantum mechanics to perform calculations potentially far more efficiently than classical technologies. Overall, computation is fundamental to understanding and modeling both artificial and natural systems in science and technology. Computation is viewed as the driving force behind all changes, with processes defined as evolving entities or phenomena undergoing transformation. There is now a need to capture, formalize and understand these changes, but mathematics appears to lack a model that can adequately describe this complexity. In natural contexts, individual elements not only exhibit distinct behaviors but also give rise to “emergent” properties [81]: behaviors that cannot be deduced solely from the analysis of individual components. Real systems often exhibit non-local communication phenomena, such as the interactions between neurons [82]. Addressing these phenomena requires appropriate mathematical frameworks. Since the 1970s [83], two primary approaches have emerged: statistical mechanics and dynamical systems theory. Statistical mechanics, building on Boltzmann's work, employs physical concepts like “temperature” and “free energy” [84]. Dynamical systems theory relies on differential equations, though these are solvable only in specific cases, often requiring simulations or specialized methods such as stability theory [85], bifurcation theory [86] and chaos theory [87,88].
Computation appears in many forms throughout Morin's work because it provides a framework for understanding the interconnectedness and complexity of various systems. Nature itself is a computing machine. The output of its computation is the natural laws we perceive, the laws scientists describe with mathematics, the study of living things in biology, the results of engineering techniques, the transformations of matter through chemistry and all that we call reality. This computational perspective helps unify diverse fields, showing how biological evolution, cultural dynamics, and ethical considerations can all be understood through the lens of computation. Computation underlies IT, from which the information, that is the result of the computation, emerges [89]:
$$\mathrm{knowmatic} \supset \mathrm{IT} \supset \mathrm{computation} \supset \mathrm{process}$$

5.1. Connecting knowledge

Let $a$ be a symbol and the finite set $A = \{a_1, \dots, a_n\}$ be an alphabet. Let $S = s_i \in A^*$ (Kleene star) be a string and $D = \{d_i\} \in A^{**}$ be a dictionary, where $d_i \in A^*$ is a word. Let $E = e_i \in D^*$ be an expression and $G = \{g_i\}$ be a finite set of rules. Let $C \in D^*_{G_C}$ be a clause (an expression that respects the rules of grammar $G_C$) and $M = \{C \in D^*_{G_C}\}$ be the set of clauses. Let $T \in M^*_{G_T}$ be a sentence (a set of clauses that respects the rules of grammar $G_T$) and $U = \{T \in M^*_{G_T}\}$ be the set of sentences [90,91]. Let $G: U \to \{\{\sigma_1, \rho_1, \omega_1\}, \{\sigma_2, \rho_2, \omega_2\}, \dots\} \in U^{3*}$ be a function that maps a sentence into a non-empty finite set of triples of clauses, where $\{\sigma_i, \rho_i, \omega_i\} \in U^3$ is a triple formed by a subject $\sigma_i \in U$, a relation $\rho_i \in U$ and an object $\omega_i \in U$. For a given sentence $T$, let $\Sigma = \Sigma_T = \{\sigma_1, \dots\}$ be the set of all subjects, $P = P_T = \{\rho_1, \dots\}$ be the set of all relations and $\Omega = \Omega_T = \{\omega_1, \dots\}$ be the set of all objects. $\Omega$ forms a vector space such that, for any $\sigma \in \Sigma$, $\sigma = \rho^i \Omega_i$ (Einstein notation). We can adopt the same notation as in Boole [92], whose aim, stated in the title, was to derive the laws governing the "calculus of thought". By doing so, we move the problem towards an algebraic and formal treatment.
Any $T$ involves relations $\rho_i$ from $\sigma$ to objects $\omega_i$. Then, $T$ gives a partial description of $\sigma$, since it gives some relations $\rho_i$ from $\sigma$ to some objects $\omega_i$. Since every element of $U$ is finite in length (finite, for short) and $T \in U$, any description of $\sigma$ will only consider a subset of all possible relations to all possible objects. Let $S$ be the complete description of $\sigma$, meaning that all relations to all objects have been expressed in $T$. Since $U$ is countably infinite, there are infinitely many objects to relate $\sigma$ to, so $T$ would have to be infinite. But every element of $U$ is finite. Hence, such a $T$ cannot exist.
$S = \sigma$ if and only if the sentence $T$ covers all its relations for all objects ($\Omega$ and $P$ are countably infinite). Any subject can have a countably infinite number of relations associated with a countably infinite number of objects, in an infinite predication needed to completely describe $\sigma$. The structures in Jackendoff [93] and Talmy [94] on cognitive semantics support the conceptualization of subjects having potentially infinite associations with relations and objects. Then, any actual sentence will be incomplete, because it will only cover a finite subset of the infinite possible predications. Dowty [95], on thematic roles and argument selection, emphasizes the selection of a finite set of relations and objects in practical descriptions. Ultimately, only with full knowledge (that is, the ability to state every possible triple) can we claim $S = \sigma$ (full knowledge hypothesis). The complete knowledge of a single $\sigma$ implies complete knowledge of all subjects. Therefore, only with complete knowledge of every subject can we have complete knowledge of a single subject, and complete knowledge of a single subject implies complete knowledge of every subject. Langacker [96] and Talmy [94], with their cognitive grammar and semantic frameworks, further support the idea that complete knowledge requires complete predications about all relations and objects. This provides a basis for understanding the conceptual completeness that would be required for a complete description: any practical description of a subject is inherently incomplete. Completeness is only theoretical, requires exhaustive predication across all relations and objects, and is never achieved in practical linguistic contexts.
Consider the sentence Colorless green ideas sleep furiously [97,98] or our previous The thready chair is healed. As stated before, the key distinction between syntax and semantics is that a sentence can be syntactically correct yet semantically nonsensical. The previous sentences are well formed according to the rules of syntax and adhere to the grammatical structure of the language, but they are semantically nonsensical because the individual words and their combinations do not convey information (a meaningful concept). A semantically nonsensical sentence is a sentence that does not predicate on any object of the subject. Being colored is not an object of ideas, just as being thready is not an object of chairs; being colorless is not an object of green ideas, just as being able to heal is not an object of thready chairs, and so on. It is then possible to create a hierarchy of relations that can be more or less informative (a triple may express something already present in the knowledge).
Let $T$ be a sentence and let $G_T = \{\dots, \{\sigma_i, \rho_i, \omega_i\}, \dots\}$ where $\sigma_i, \rho_i, \omega_i \in U$. Then, $G_{\sigma_i} = \{\dots, \{\sigma^{\sigma}_{ij}, \rho^{\sigma}_{ij}, \omega^{\sigma}_{ij}\}, \dots\}$, $G_{\rho_i} = \{\dots, \{\sigma^{\rho}_{ij}, \rho^{\rho}_{ij}, \omega^{\rho}_{ij}\}, \dots\}$ and $G_{\omega_i} = \{\dots, \{\sigma^{\omega}_{ij}, \rho^{\omega}_{ij}, \omega^{\omega}_{ij}\}, \dots\}$, and so on. Let $T_C$ be the Chomsky sentence and $T_R$ be our sentence; in Figure 7 we depict the structure of $T_C$ and $T_R$, where
  • $G_{T_C} = \{\{\text{sleep}, \text{nsubj}, \text{Colorless green ideas}\}, \{\text{sleep}, \text{advmod}, \text{furiously}\}\}$
  • $G_{\text{sleep}}$ is self-evident since sleep $\in D$, therefore it has no further structure other than the concatenation of the symbols e, l, p and s that compose it. The process of destructuring could continue down to the level of the alphabet rather than stopping at the dictionary, but here it is useless to continue further. $G_{\text{nsubj}}$, $G_{\text{advmod}}$ and $G_{\text{furiously}}$ are self-evident for the same reason.
  • $G_{\text{Colorless green ideas}} = \{\{\text{ideas}, \text{amod}, \text{Colorless}\}, \{\text{ideas}, \text{amod}, \text{green}\}\}$.
Note that each subgraph represents a valid and grammatically correct sentence: Colorless ideas $\subset T_C$, green ideas $\subset T_C$, sleep furiously $\subset T_C$, Colorless green ideas $\subset T_C$ and $T_C$ itself. Two nodes $\sigma$ and $\omega$ and the edge $\rho$ connecting them are linked by $\sigma \rho \omega$. Then, $\rho: \sigma \to \omega$.
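The triples of $G_{T_C}$ listed above can be encoded directly as a labelled directed graph. A minimal sketch follows, using the networkx library; for simplicity the nested clause "Colorless green ideas" is flattened to its head word "ideas", as in $G_{\text{Colorless green ideas}}$.

```python
import networkx as nx

# G_{T_C} as a directed labelled graph: an edge sigma -> omega carries the
# relation rho as its label.
triples_tc = [
    ("sleep", "nsubj",  "ideas"),
    ("sleep", "advmod", "furiously"),
    ("ideas", "amod",   "Colorless"),
    ("ideas", "amod",   "green"),
]

g_tc = nx.DiGraph()
for sigma, rho, omega in triples_tc:
    g_tc.add_edge(sigma, omega, rho=rho)

# Each node's outgoing edges form a grammatical fragment, e.g. around "ideas":
print(list(g_tc.edges("ideas", data="rho")))
# [('ideas', 'Colorless', 'amod'), ('ideas', 'green', 'amod')]
```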
Consider the following abbreviations used in Figure 7 and Figure 8:
  • advmod: adverbial modifier. Adverb modifying a verb, adjective, or adverb.
  • amod: adjectival modifier. Adjective modifying a noun.
  • auxpass: passive auxiliary verb. Auxiliary used in passive voice.
  • det: determiner. Introduces a noun and provides reference information.
  • nsubj: nominal subject. Noun or pronoun performing the action of the verb.
  • nsubjpass: nominal subject in passive voice. Subject of a verb in the passive voice.

5.2. Creation of the Knowmatic Machine

Consider the following sentences
  • T 1 = A plane has a streamlined shape to reduce air
    resistance
  • T 2 = The fuselage of a plane houses the passengers and
    cargo
  • T 3 = The tail section of a plane stabilizes its flight and
    includes the vertical stabilizer
  • T 4 = Planes use ailerons on the wings to control roll
We determined the graphs $G_1, \dots, G_4$ respectively, according to the process shown earlier. Then, we merged these graphs into a single graph $G = G_1 \cup \dots \cup G_4$ (depicted in Figure 8), which we will refer to as the knowledge graph (or KG). We can assume that all our knowledge consists of the four sentences above. Then, the only observations we can make are those permitted by $\sigma$, $\rho$ and $\omega$ in the KG. For example, we know that there exists a subject stabilizer which has the sole object vertical. We might acquire additional knowledge, such as The aircraft autopilot system maintains a horizontal flight path by continuously adjusting the ailerons and elevators to counteract any deviations in pitch and roll, and thereby discover that there exists another object horizontal for the subject stabilizer. But we have no knowledge of the fact, for instance, that these objects are deeply related and that stabilizer is associated with vertical as much as with horizontal. Although this is obvious and fundamental for enabling flight, a universe consisting only of the five sentences cannot know it.
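A hedged sketch of the construction of the KG from $T_1, \dots, T_4$ follows: the triples below are hand-extracted and only approximate the graphs $G_1, \dots, G_4$ of Figure 8, but they show how the merged graph constrains what the universe can observe.

```python
import networkx as nx

# Illustrative, hand-extracted triples for T1..T4; the KG is their union.
# MultiDiGraph allows several relations between the same subject and object.
triples = [
    ("plane", "has", "shape"), ("shape", "amod", "streamlined"),
    ("shape", "purpose", "reduce air resistance"),                       # from T1
    ("fuselage", "part_of", "plane"), ("fuselage", "houses", "passengers"),
    ("fuselage", "houses", "cargo"),                                     # from T2
    ("tail section", "part_of", "plane"), ("tail section", "stabilizes", "flight"),
    ("tail section", "includes", "stabilizer"), ("stabilizer", "amod", "vertical"),  # from T3
    ("plane", "uses", "ailerons"), ("ailerons", "on", "wings"),
    ("ailerons", "purpose", "control roll"),                             # from T4
]

kg = nx.MultiDiGraph()
for s, r, o in triples:
    kg.add_edge(s, o, rho=r)

# The only observations this universe allows are the ones encoded in the KG:
print(kg.in_degree("stabilizer"), kg.out_degree("stabilizer"))
# 1 1 -> "stabilizer" is the object of one triple and the subject of one ("vertical")
```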
Let $G$ be a KG; it is not possible to acquire new knowledge by connecting concepts that have never been connected before, because we would be unable to justify them. From a node in $G$ we can always count the number of incoming and outgoing edges, and thus determine the number $\# G|_{\omega = \bar{\omega}}$ of subjects for which that node $\bar{\omega}$ is an object (incoming edges to $\bar{\omega}$) and the number $\# G|_{\sigma = \bar{\sigma}}$ of objects for which that node $\bar{\sigma}$ is a subject (outgoing edges from $\bar{\sigma}$). Formally, given $\sigma$, $\omega$, and $\rho$, it is possible to know $\sigma \rho \omega$ if and only if
$$\# G\big|_{\omega = \bar{\omega},\, \sigma = \bar{\sigma},\, \rho = \bar{\rho}} = 1 \;\;\lor\;\; \sigma \rho \omega \qquad (2)$$
Therefore, it is possible to know something if and only if either all parts of the knowledge are already known (left-hand side of equation 2) or what is to be known is assumed as a fact (right-hand side). Anything that is already part of the knowledge is either part of prior knowledge or assumed as a fact. Consequently, everything that is known is ultimately assumed as a fact, and knowledge is nothing more than connecting facts.
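Equation 2 then reduces to a membership-or-assumption test over triples. A minimal sketch, with an invented fragment of the KG:

```python
# Sketch of equation 2: sigma-rho-omega is knowable iff the triple is already
# connected in the KG (left-hand side) or is assumed as a fact (right-hand side).
kg = {("tail section", "includes", "stabilizer"),
      ("stabilizer", "amod", "vertical")}

def can_know(triple, kg, assumed_facts=frozenset()):
    return triple in kg or triple in assumed_facts

print(can_know(("stabilizer", "amod", "vertical"), kg))    # True: justified by the KG
print(can_know(("stabilizer", "amod", "horizontal"), kg))  # False: cannot be justified
print(can_know(("stabilizer", "amod", "horizontal"), kg,
               assumed_facts={("stabilizer", "amod", "horizontal")}))  # True: assumed as a fact
```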
Given a sentence $T$ and the set of expressions $T_i$ generated by permutations of the words in $T$, let $G$ be the graph of $T$ and $G_i$ be the graph of $T_i$. Then, $G = G_i \Leftrightarrow T = T_i$. Hence, $\Sigma = \Sigma_G$, $P = P_G$ and $\Omega = \Omega_G$. The knowmatic machine must always be able to measure the amount of information carried by a sentence $T$, associated with a graph $G_T$, for a given KG $G$.
 Definition 14.
Let $T$ be a sentence associated with a graph $G_T$ and $G$ be a given KG. $T$ is said to be meaningless if $G_T \cap G = \emptyset$.
“Meaningless” from an informative point of view is equivalent to “nonsensical” from a semantic point of view, meaning that $\Sigma_{G_T} \cap \Sigma_{G} = P_{G_T} \cap P_{G} = \Omega_{G_T} \cap \Omega_{G} = \emptyset$. Then, the information $I_{T,G} = I_{G_T,G}$ satisfies $I_{G_T,G} = 0 \Leftrightarrow G_T \cap G = \emptyset$. By definition, the function $I$ is symmetric.
Consider the following definition
 Definition 15.
Let $T$ be a sentence associated with a graph $G_T$ and let $G$ be a KG. Also, let $f: \mathbb{R} \to \mathbb{R}$ with $\lim_{x \to 0} f(x) = 0$ and $f \geq 0$. Then the measure of information $I_{G_T,G}$ conveyed by $T$ is
$$I_{G_T,G} = f\!\left(\sum_{\{\bar{\sigma},\bar{\rho},\bar{\omega}\} \in G_T} \#\,(G_T \cap G)\big|_{\rho=\bar{\rho},\,\sigma=\bar{\sigma},\,\omega=\bar{\omega}}\right)$$
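A minimal sketch of Definition 15, choosing $f(x) = x$ (which satisfies $f(0) = 0$ and $f \geq 0$ on counts) and representing both $G_T$ and the KG as lists of triples; the triples are invented for illustration.

```python
def information(triples_t, kg_triples, f=lambda x: x):
    """Definition 15 with f(x) = x: sum, over the triples of G_T, of how many
    matching triples the intersection with the KG contains."""
    return f(sum(kg_triples.count(t) for t in triples_t))

kg_triples = [("stabilizer", "amod", "vertical"), ("plane", "uses", "ailerons")]
t_new      = [("stabilizer", "amod", "vertical")]   # overlaps the KG
t_nonsense = [("chair", "amod", "thready")]         # disjoint from the KG

print(information(t_new, kg_triples))       # 1 -> informative with respect to this KG
print(information(t_nonsense, kg_triples))  # 0 -> meaningless (Definition 14)
```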

6. Future Research

This paper marks the beginning of an extensive series of inquiries we are currently pursuing. In particular, the practical implementation of this architecture has led to the development of a knowledge-based search engine named wide3, from the idea of something "having a greater than usual measure across" [MWE], based on the three requirements we outlined earlier. At present, wide3 is a proof of concept (PoC) of a processing engine capable of structuring knowledge and F-ifying facts in order to acquire new knowledge. Unlike conventional search engines or artificial intelligence, wide3 does not merely relay information but also conveys its justification, making it a knowledge processor. Furthermore, F is customizable, which means that if justification (F = J) is not considered a valid process for acquiring knowledge, an alternative process can be configured.
We argue that knowmatics, or its variations, will have a significant impact on the development of IT because, unlike the current probabilistic approach of AI, we aim to propose a paradigm shift based on the analysis of language as a way of acquiring knowledge and to demonstrate the emergence of semantics from the information exchanged through syntax. Although Chomsky [97] has already shown that syntax does not imply semantics, we can demonstrate that semantics emerges from syntax through the measurement of information.
Finally, if we analyze the technological and educational context, we find that the learning of new language acquisition techniques, supported by technology, will also have an impact on future generations. From an educational point of view, the creation of learning systems based on the logical, grammatical, structural, cognitive and epistemological analysis of information represents a big step forward compared to the current state of pedagogy.

References

  1. Machuca, D.; Reed, B. Skepticism: from antiquity to the present; Bloomsbury Publishing, 2018. [Google Scholar]
  2. Russell, B. The problems of philosophy; OUP Oxford, 2001. [Google Scholar]
  3. Gettier, E.L. Is justified true belief knowledge? analysis 1963, 23, 121–123. [Google Scholar] [CrossRef]
  4. Dictionary, M.W. Merriam-Webster.com; Merriam-Webster. 2022. Available online: https://www.merriam-webster.com.
  5. dictionary.cambridge.org Dictionary. dictionary.cambridge.org; Cambridge Dictionary, 2022. Available online: https://dictionary.cambridge.org.
  6. collinsdictionary.com Dictionary. collinsdictionary.com; Collins Dictionary, 2022. Available online: https://www.collinsdictionary.com.
  7. britannica.com Dictionary. britannica.com; Encyclopædia Britannica, 2022. Available online: https://www.britannica.com.
  8. plato.stanford.edu Encyclopedia. The Stanford Encyclopedia of Philosophy; The Metaphysics Research Lab, 2022. Available online: https://plato.stanford.edu/.
  9. iep.utm.edu Encyclopedia. Internet Encyclopedia of Philosophy; Internet Encyclopedia of Philosophy, 2022. Available online: https://iep.utm.edu/.
  10. Ware, M.; Mabe, M. The STM report: An overview of scientific and scholarly journal publishing; International Association of Scientific, Technical and Medical Publishers, 2015. [Google Scholar]
  11. White, K. Publication Output by Country, Region, or Economy and Scientific Field. 2021. Available online: https://ncses.nsf.gov/pubs/nsb20214/publication-output-by-country-region-or-economy-and-scientific-field (accessed on 19 February 2024).
  12. Meho, L.I. The rise and rise of citation analysis. Physics World 2007, 20, 32. [Google Scholar] [CrossRef]
  13. Wang, Y. On cognitive informatics. In Proceedings First IEEE International Conference on Cognitive Informatics. IEEE, 2002; pp. 34–42.
  14. Ranzoli, C. Dizionario di scienze filosofiche; U. Hoepli, 1926. [Google Scholar]
  15. Schmidt, N. Review of Saggi Sulla Teoria della Conoscenza. Saggio Secondo Filosofia della Metafisica. Parte prima: La Causa Efficiente. The Philosophical Review 1907, 16, 91–94. [Google Scholar] [CrossRef]
  16. Heath, T.L. ; others. The thirteen books of Euclid’s Elements; Courier Corporation, 1956. [Google Scholar]
  17. Hockett, C.F.; Hockett, C.D. The origin of speech. Scientific American 1960, 203, 88–97. [Google Scholar] [CrossRef]
  18. Hannon, M. Knowledge, concept of. Routledge Encyclopedia of Philosophy 2021. Available online: https://www.rep.routledge.com/articles/thematic/knowledge-concept-of/v-2. [CrossRef]
  19. Hetherington, S. Knowledge. Internet Encyclopedia of Philosophy 2021. Available online: https://iep.utm.edu/knowledg/.
  20. Ichikawa, J.J.; Steup, M. The Analysis of Knowledge. In The Stanford Encyclopedia of Philosophy, Summer 2018 ed.; Zalta, E.N., Ed.; Metaphysics Research Lab, Stanford University, 2018. Available online: https://plato.stanford.edu/entries/knowledge-analysis/.
  21. Lehrer, K. Theory Of Knowledge: Second Edition; Taylor & Francis, 2018. Available online: https://books.google.it/books?id=ScJKDwAAQBAJ.
  22. Cyras, K.; Badrinath, R.; Mohalik, S.K.; Mujumdar, A.; Nikou, A.; Previti, A.; Sundararajan, V.; Feljan, A.V. Machine reasoning explainability. arXiv, arXiv:2009.00418.
  23. Bird, A. Thomas Kuhn. In The Stanford Encyclopedia of Philosophy, Spring 2022 ed.; Zalta, E.N., Ed.; Metaphysics Research Lab, Stanford University, 2022. [Google Scholar]
  24. Aufrecht, M. Reichenbach falls-and rises? Reconstructing the discovery/justification distinction. International Studies in the Philosophy of Science 2017, 31, 151–176. [Google Scholar] [CrossRef]
  25. Foster, J.G.; Rzhetsky, A.; Evans, J.A. Tradition and innovation in scientists’ research strategies. American sociological review 2015, 80, 875–908. [Google Scholar] [CrossRef]
  26. Andresen, J.T.; Carter, P.M. Languages in the world: How history, culture, and politics shape language; John Wiley & Sons, 2016. [Google Scholar]
  27. Hoff, E. How social contexts support and shape language development. Developmental review 2006, 26, 55–88. [Google Scholar] [CrossRef]
  28. Gibson, E.; Futrell, R.; Piantadosi, S.P.; Dautriche, I.; Mahowald, K.; Bergen, L.; Levy, R. How efficiency shapes human language. Trends in cognitive sciences 2019, 23, 389–407. [Google Scholar] [CrossRef]
  29. Piantadosi, S.T.; Tily, H.; Gibson, E. Word lengths are optimized for efficient communication. Proceedings of the National Academy of Sciences 2011, 108, 3526–3529. [Google Scholar] [CrossRef]
  30. Florio, M. La privatizzazione della conoscenza: Tre proposte contro i nuovi oligopoli; Gius. Laterza & Figli Spa, 2021. [Google Scholar]
  31. Caso, R.; others. Open Data, ricerca scientifica e privatizzazione della conoscenza 2022.
  32. Costello, E. Bronze, free, or fourrée: An open access commentary. Science Editing 2019, 6, 69–72. [Google Scholar] [CrossRef]
  33. Farquharson, J. Diamond open access venn diagram [en SVG] 2022. Available online: https://figshare.com/articles/figure/Diamond_open_access_venn_diagram_en_SVG_/21598179. [CrossRef]
  34. Bondy, J.A.; Murty, U.S.R.; others. Graph theory with applications; Macmillan: London, 1976; Vol. 290.
  35. Alston, W.P. Concepts of epistemic justification. The monist 1985, 68, 57–89. [Google Scholar] [CrossRef]
  36. Chisholm, R.M. Theory of knowledge; Prentice-Hall Englewood Cliffs: NJ, 1989. [Google Scholar]
  37. Chisholm, R. Probability in the Theory of Knowledge. In Knowledge and Skepticism; Routledge, 2019; pp. 119–130. [Google Scholar]
  38. Kant, I.; Meiklejohn, J.M.D.; Abbott, T.K.; Meredith, J.C. Critique of pure reason; JM Dent London, 1934. [Google Scholar]
  39. Quine, W.V.O.; Ullian, J.S. The web of belief; Random House New York, 1978. [Google Scholar]
  40. Goodman, N.; Douglas, M.; Hull, D.L. How classification works: Nelson Goodman among the social sciences 1992.
  41. Hanson, N.R. Observation and Explanation: A guide to Philosophy of Science. What I Do Not Believe, and Other Essays 2020, pp. 81–121.
  42. Popper, K. The logic of scientific discovery; Routledge, 2005. [Google Scholar]
  43. Kuhn, T.S. The structure of scientific revolutions; University of Chicago press Chicago, 1997. [Google Scholar]
  44. Kuhn, T.S.; et al. The essential tension: Tradition and innovation in scientific research. In The third University of Utah research conference on the identification of scientific talent; University of Utah Press: Salt Lake City, 1959; pp. 225–239. [Google Scholar]
  45. Kuhn, T.S. Objectivity, value judgment, and theory choice. In Arguing about science; Routledge, 2012; pp. 74–86. [Google Scholar]
  46. Preston, J. Paul Feyerabend. In The Stanford Encyclopedia of Philosophy, Fall 2020 ed.; Zalta, E.N., Ed.; Metaphysics Research Lab, Stanford University, 2020. [Google Scholar]
  47. Feyerabend, P. Against method: Outline of an anarchistic theory of knowledge; Verso Books, 2020. [Google Scholar]
  48. Bogen, J. Theory and observation in science 2009.
  49. Leibniz, G.W. The Monadology (1714). Gottfried Wilhelm Leibniz: Philosophical Papers and Letters, Dordrecht 1969.
  50. Achinstein, P. Crucial experiments 1998. [CrossRef]
  51. El’yashevich, M.A. Niels Bohr’s development of the quantum theory of the atom and the correspondence principle (his 1912–1923 work in atomic physics and its significance). Soviet Physics Uspekhi 1985, 28, 879. [Google Scholar] [CrossRef]
  52. Swenson, R. A grand unified theory for the unification of physics, life, information and cognition (mind). Philosophical Transactions of the Royal Society A 2023, 381, 20220277. [Google Scholar] [CrossRef] [PubMed]
  53. Saffran, J.R.; Aslin, R.N.; Newport, E.L. Statistical learning by 8-month-old infants. Science 1996, 274, 1926–1928. [Google Scholar] [CrossRef] [PubMed]
  54. Saffran, J.R.; Johnson, E.K.; Aslin, R.N.; Newport, E.L. Statistical learning of tone sequences by human infants and adults. Cognition 1999, 70, 27–52. [Google Scholar] [CrossRef]
  55. Aslin, R.; Slemmer, J.; Kirkham, N.; Johnson, S. Statistical learning of visual shape sequences. Paper presented at the meeting of the Society for Research in Child Development, Minneapolis, MN, 2001.
  56. Bonatti, L.L.; Peña, M.; Nespor, M.; Mehler, J. Linguistic constraints on statistical computations: The role of consonants and vowels in continuous speech processing. Psychological Science 2005, 16, 451–459. [Google Scholar] [CrossRef]
  57. Fiser, J.; Aslin, R.N. Statistical learning of new visual feature combinations by infants. Proceedings of the National Academy of Sciences 2002, 99, 15822–15826. [Google Scholar] [CrossRef]
  58. Peña, M.; Bonatti, L.L.; Nespor, M.; Mehler, J. Signal-driven computations in speech processing. Science 2002, 298, 604–607. [Google Scholar] [CrossRef]
  59. Fodor, J.D. Learning to parse? Journal of psycholinguistic research 1998, 27, 285–319. [Google Scholar] [CrossRef]
  60. Fodor, J.A.; Bever, T.G. The psychological reality of linguistic segments. Journal of verbal learning and verbal behavior 1965, 4, 414–420. [Google Scholar] [CrossRef]
  61. Garrett, M.; Bever, T.; Fodor, J. The active use of grammar in speech perception. Perception & Psychophysics 1966, 1, 30–32. [Google Scholar]
  62. Jarvella, R.J. Syntactic processing of connected speech. Journal of verbal learning and verbal behavior 1971, 10, 409–416. [Google Scholar] [CrossRef]
  63. Wanner, E. On remembering, forgetting, and understanding sentences: A study of the deep structure hypothesis; Walter de Gruyter GmbH & Co KG, 2019. [Google Scholar]
  64. De Vincenzi, M. Syntactic parsing strategies in Italian: The minimal chain principle; Springer Science & Business Media, 1991. [Google Scholar]
  65. Frazier, L.; Fodor, J.D. The sausage machine: A new two-stage parsing model. Cognition 1978, 6, 291–325. [Google Scholar] [CrossRef]
  66. Gibson, E.; Pearlmutter, N.; Canseco-Gonzalez, E.; Hickok, G. Cross-linguistic attachment preferences: Evidence from English and Spanish 1996.
  67. Clifton Jr, C.; Frazier, L. Comprehending sentences with long-distance dependencies. In Linguistic structure in language processing; Springer, 1989; pp. 273–317. [Google Scholar]
  68. Frazier, L. Theories of sentence processing 1987.
  69. Felser, C.; Marinis, T.; Clahsen, H. Children’s processing of ambiguous sentences: A study of relative clause attachment. Language Acquisition 2003, 11, 127–163. [Google Scholar] [CrossRef]
  70. Altmann, G.; Steedman, M. Interaction with context during human sentence processing. Cognition 1988, 30, 191–238. [Google Scholar] [CrossRef]
  71. Traxler, M.J. Plausibility and subcategorization preference in children’s processing of temporarily ambiguous sentences: Evidence from self-paced reading. The Quarterly Journal of Experimental Psychology: Section A 2002, 55, 75–96. [Google Scholar] [CrossRef]
  72. Gurevich, Y. What is an algorithm. In International conference on current trends in theory and practice of computer science; Springer, 2012; pp. 31–42. [Google Scholar]
  73. De Mol, L. Turing machines 2018.
  74. von Foerster, H.; Fiche, B. Communication Amongst Automata. The American Journal of Psychiatry 1962. [Google Scholar]
  75. Morin, E. La Nature de la Nature; Seuil: Paris, 1992. [Google Scholar]
  76. Morin, E. La Vie de la Vie; Seuil: Paris, 1992. [Google Scholar]
  77. Morin, E. La Connaissance de la Connaissance; Seuil: Paris, 1986. [Google Scholar]
  78. Morin, E. Les Idées: leur habitat, leur vie, leurs mœurs, leur organisation; Seuil: Paris, 1991. [Google Scholar]
  79. Morin, E. L’Humanité de l’Humanité; Seuil: Paris, 2001. [Google Scholar]
  80. Morin, E. Éthique; Seuil: Paris, 2004. [Google Scholar]
  81. Haken, H. Synergetics: An approach to self-organization. Self-organizing systems: The emergence of order 1987, pp. 417–434.
  82. Andreucci, D.; Herrero, M.A.; Velazquez, J.J. On the growth of filamentary structures in planar media. Mathematical methods in the applied sciences 2004, 27, 1935–1968. [Google Scholar] [CrossRef]
  83. Willshaw, D.J.; Von Der Malsburg, C. How patterned neural connections can be set up by self-organization. Proceedings of the Royal Society of London. Series B. Biological Sciences 1976, 194, 431–445. [Google Scholar]
  84. Aarts, E.; Korst, J. Simulated annealing and Boltzmann machines: a stochastic approach to combinatorial optimization and neural computing; John Wiley & Sons, Inc., 1989. [Google Scholar]
  85. Willems, J.L. Stability theory of dynamical systems, 1970.
  86. Marsden, J.E.; McCracken, M. The Hopf bifurcation and its applications; Springer Science & Business Media, 2012. [Google Scholar]
  87. Holden, A.V. Chaos; Princeton University Press, 2014. [Google Scholar]
  88. Degn, H.; Holden, A.V.; Olsen, L.F. Chaos in biological systems; Springer Science & Business Media, 2013. [Google Scholar]
  89. Heath-Carpentier, A. The challenge of complexity: Essays by Edgar Morin; Liverpool University Press, 2022. [Google Scholar]
  90. Hopcroft, J.E.; Ullman, J.D. Formal languages and their relation to automata; Addison-Wesley Longman Publishing Co., Inc., 1969. [Google Scholar]
  91. Lewis, H.R.; Papadimitriou, C.H. Elements of the Theory of Computation. ACM SIGACT News 1998, 29, 62–78. [Google Scholar] [CrossRef]
  92. Boole, G. An investigation of the laws of thought: on which are founded the mathematical theories of logic and probabilities; Walton and Maberly, 1854. [Google Scholar]
  93. Jackendoff, R.S. Semantic structures; MIT press, 1992; Vol. 18. [Google Scholar]
  94. Talmy, L. Toward a cognitive semantics, volume 1: Concept structuring systems; MIT press, 2003. [Google Scholar]
  95. Dowty, D. Thematic proto-roles and argument selection. Language 1991, 67, 547–619. [Google Scholar] [CrossRef]
  96. Langacker, R.W. Foundations of cognitive grammar: Volume I: Theoretical prerequisites; Stanford university press, 1987. [Google Scholar]
  97. Chomsky, N. Three models for the description of language. IRE Transactions on information theory 1956, 2, 113–124. [Google Scholar] [CrossRef]
  98. Chomsky, N. The logical structure of linguistic theory 1975.
1. We chose the term “acquiring knowledge” to avoid the debate over “a priori” or “a posteriori” knowledge. In any case, with respect to a subject, whether knowledge is a priori or a posteriori, it is acquired at the end of some process. In the case of a priori knowledge, the process can be understood as the birth of the subject itself.
2. We are referring to what knowledge is, but we are aware that this is not its current state, owing to economic interests that have privatized, commodified and exploited it [30,31].
3. The reference is explicitly to Leibniz’s Monadology and the relationship between the monad and the universe: “Thus, although each created monad represents the entire universe [...]. And since this body expresses the whole universe through the connection of all matter in the plenum, the soul also represents the entire universe by representing this body, which belongs to it in a particular manner” (translation by the author).
4. Meaning that is related to the form.
5. These concepts will find a parallel in the following sections, where knowledge is structured symmetrically with the acquisition process we are currently investigating.
6. We refer to Gurevich’s work on the exact formal definition of an algorithm, presented in a conference paper at the International Conference on Current Trends in Theory and Practice of Computer Science [72].
7. The row vector $\rho_i \in P \cup \{0\}^*$, with $0$ the null relation such that for every $i \neq j$, $\{\sigma_i, \rho_j, \omega_j\} = \{\sigma_i, 0, \omega_j\}$.
8. Let $n < \infty$ be the length of the alphabet; then $\#A^* = \#A^{**} = \#A^{***} = \dots = \aleph_0$.
9. $\#G|_{\omega=\bar{\omega},\, \sigma=\bar{\sigma},\, \rho=\bar{\rho}} = 1$ is equivalent to $\{\bar{\sigma}, \bar{\rho}, \bar{\omega}\} \in G$, but we preferred to use the cardinality (a scalar quantity) because it allows us to introduce the definition and measurement of information.
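To make this footnote concrete, the minimal sketch below treats $G$ simply as a finite set of (subject, relation, object) triples and counts how many of them match the components that have been fixed; the example triples and the helper name are illustrative assumptions, not the paper's actual data or notation.

```python
# Minimal sketch: G represented as a finite set of (sigma, rho, omega) triples.
# The example triples and the helper name are illustrative assumptions.
G = {
    ("water", "boils_at", "100C"),
    ("water", "freezes_at", "0C"),
    ("iron", "melts_at", "1538C"),
}

def restricted_cardinality(G, sigma=None, rho=None, omega=None):
    """Count the triples of G that agree with every component that has been fixed."""
    return sum(
        1
        for (s, r, o) in G
        if (sigma is None or s == sigma)
        and (rho is None or r == rho)
        and (omega is None or o == omega)
    )

# Fixing all three components, the count is 1 exactly when the triple belongs to G.
assert restricted_cardinality(G, "water", "boils_at", "100C") == 1
assert restricted_cardinality(G, "water", "boils_at", "50C") == 0
# Leaving components free returns a scalar, which is what makes a measure of information possible.
print(restricted_cardinality(G, sigma="water"))  # -> 2
```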
Figure 1. The relation between elements, from the alphabet to communication.
Figure 2. Sentences and facts have the same form and content but are distinguished by the information they convey, as a fact has a higher level of truth compared to a sentence, which can generally be either true or false.
Figure 3. Knowledge acquisition based on facts that argue for true beliefs.
Figure 5. Venn four-set diagram using rectangles, inspired by [33], showing the different types of OA. The rectangle with the solid line represents the set of journals in which the authors retain copyright, the dotted rectangle the peer-reviewed journals, the dashed rectangle the journals that are free for authors, and the alternating rectangle the journals that are free for readers.
Figure 6. Given a piece of knowledge, it is possible to trace back many lines of research and many different facts.
Figure 7. Structure of subjects, relations and objects of $T_C$ and $T_R$.
Figure 8. Knowledge graph (KG) for four sentences about a plane, its fuselage, tail and wings.
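As a rough companion to Figure 8, the sketch below assembles a tiny knowledge graph from four hypothetical subject–relation–object triples about a plane and its parts and then retraces the graph from the subject "plane"; the triples and relation names are illustrative assumptions, since the figure's exact sentences are not reproduced here.

```python
# Minimal sketch of a four-sentence knowledge graph in the spirit of Figure 8.
# The triples and relation names are illustrative assumptions, not the paper's exact data.
from collections import defaultdict

triples = [
    ("plane", "has_part", "fuselage"),
    ("plane", "has_part", "wings"),
    ("plane", "has_part", "tail"),
    ("fuselage", "connected_to", "tail"),
]

# Adjacency-list view of the graph: subject -> list of (relation, object).
graph = defaultdict(list)
for subject, relation, obj in triples:
    graph[subject].append((relation, obj))

def reachable(graph, start):
    """Return every node reachable from `start` by following relations forward."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for _, target in graph.get(node, []):
            if target not in seen:
                seen.add(target)
                stack.append(target)
    return seen

print(reachable(graph, "plane"))  # -> {'fuselage', 'wings', 'tail'}
```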
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.