1. Introduction
The historical road to digital transformation (DGT) comprises methodologies and technical approaches that tend to remove the virtual line separating the operation of an actual system from that of its digitally empowered version. Although digital transformation opens possibilities for innovative approaches to reengineering products and services, it often acts as a disruptive mechanism that questions existing organizational business models, topology, and well-established operation modes. It dynamically associates a concrete real-world organizational system with its virtual equivalent according to the digital sophistication level of the involved, coarse-grained stakeholders, who appear as subjects (external or internal active participants) or objects (internal, transformation-prone participants) of a digital transformation endeavor. The Digital Engineering (DE) paradigm establishes an enterprise as a socio-cultural-technical system and transforms Systems Engineering (SE) practices toward technology innovation drivers [
1]. It assumes the collaboration/cooperation of all relevant stakeholders over a Digital Engineering Systemic Framework that integrates related resources and activities throughout the entire life cycle of a transformed system. DE goes beyond using specific tools and models and frames a culture of innovation, collaborative problem-solving, and continuous improvement [
2]. The current orientation towards DGT additionally originates from the vision of the so-called Great Reset, which addresses the essential domains of global crises and urgent needs for a systematic and sustainable globally synchronized response [
3]. Unfortunately, recent research shows that the main obstacles come from the inherent complexity of global systems and different approaches to economic and political perspectives [
4].
On the other hand, DGT research has a transdisciplinary nature and relies on the Transdisciplinary Systems Research Methodologies (TSRM). Instead of directly implicating a solution, TSRMs aim to provide the necessary context to understand, evaluate, and argue for or against many possible solutions, narrowing down the solution space to specify the directions for future implementation actions [
5]. The uncertainties related to DGT success correlate less with the existence of a solution than with the risk of implementation failure.
The most influential DGT drivers are closely related to recent industrial revolution frameworks, starting from the three-dimensional Reference Architectural Model Industry 4.0 (RAMI 4.0) and its virtual description of production or service objects along the Architecture (interoperability layers), Life Cycle and Value Stream (differentiating product types and product instances), and Hierarchy (hosting the production process) dimensions [
6] (
Figure 1).
Industry 4.0 frames a digital transformation from traditional to cyber-physical systems (CPS) with economic benefits in mind. It utilizes innovative methods and technological means to upgrade the organizational system resources (organizational structure, business models, human resources, processes, and assets). Therefore, the existence of an intelligent enterprise information system aids the continuous rise of the system’s maturity level. That is why developing an assessment tool for digital transformation progress tracking, aligned with the Industry 4.0 drivers, appears mandatory [
7]. The following essential technology pillars appear as supportive for digital transformation towards Industry 4.0: Big Data and Analytics, Digital Twins and Simulation, Industrial Internet of Things, Augmented Reality, Autonomous Robots, Horizontal and Vertical Software Integration, Cybersecurity, and Cloud-based Additive Manufacturing [
8]. While Industry 4.0 assumes intelligent manufacturing with variable degrees of strategic impacts on business model transformation, the importance of intelligent support in the decision-making process, based on the embedded fuzziness of key Industry 4.0 transformation success factors, emerges [
9]. In [
10], the authors claim that Industry 4.0 concepts are currently adopted mainly in large companies and thereby propose a sustainable methodology suitable for Small and Medium Enterprises (SMEs), based on a specified conceptual framework and a performance measurement structure synchronized with continuous transformation achievements, with particular emphasis on an intelligent system supporting SMEs in their digital transformation courses.
On the other hand, Industry 5.0 returns essential human values and vital social needs to the DGT scene, with the recognized duality of virtual and real-world systems. The transition from Industry 4.0 to Industry 5.0 exhibits a four-dimensional paradigm shift: from Information to Intelligence, from Communication to Connectivity, from Cyber-Physical to Cyber-Physical-Social systems, and from Physical Automation to Knowledge Automation [
11]. The essential novel technologies associated with the Industry 5.0 paradigm include Blockchain, Autonomous Area Vehicles, 5G&6G Wireless Networks, Exoskeleton, Mixed Reality, Additive Manufacturing, Artificial Internet of Things, Motion Capture, and Digital Twin [
12]. Current literature raises the challenging dilemma of whether Industry 5.0 complements, replaces, or transparently integrates with Industry 4.0. A comparative analysis of the evolutionary trajectories of Industry 4.0 and Industry 5.0 from the management perspective argues for the need for revolutionary changes in a relatively short period [
13], while [
14] elaborates on three different stages of Industry 5.0 perspectives. From the assessment point of view, there is a lack of novel enterprise maturity models that augment well-documented Industry 4.0 maturity models with Industry 5.0 novel dimensions [
15]. In [
16], the authors suggest a hybrid solution that leverages the high automation level of decision-making concepts introduced by Industry 4.0 and the leading role of humans, reactivated by Industry 5.0, based on a collective (hybrid) intelligence as the decision-making driver.
Further propositions concerning novel industrial revolutions, like Industry 6.0, represent the symbiotic integration of technological transformation, production-chain innovation, and social sustainability, progressing towards the Smart Factory 6.0 paradigm: a sophisticated automated manufacturing facility that integrates emerging technologies and digital systems. Smart Factory 6.0 reflects the upcoming wave of cognitive manufacturing [
17] and sublimates IoT, AI, ML, robotics, the Blockchain, big data analytics, quantum computing, and cloud computing [
18,
19,
20] technologies.
On the other hand, Industries beyond 6.0, announced by arbitrary predictive visions, introduce significant speculative assumptions reflecting turbulence emerging from contemporary scientific, educational, technological, and social contexts. In
Figure 2, a mind map diagram correlates different inclusive technologies related to discussed paradigms that challenge ongoing and future digital transformation projects.
Regarding organizational systems’ digital sophistication, two main aspects direct their ability to cope with digitalization and digital transformation: the strategy (higher-order capabilities) and the operation (low-order capabilities). They rely on available organizational resources to guide their transformation and achieve the desired outcome of a transformed system [
21].
The development of universal classification and the general typology of organizational systems concerning their DGT abilities has been mainstream in recent scientific research publications. Being business transformation-driven, it has promoted the Business Process Modeling (BPM) foundation for strategic and operational activities. The BPM framework approach to digital transformation, elaborated in [
22], defines six core elements of successful BPM initiatives: Strategic Alignment, Governance, Method, Information technology, People, and Culture, and proposes a set of corresponding recommendations derived through critical analysis of chosen large companies’ DGT projects. The generic and specific practices embedded in the proposed recommendations have strongly influenced the digital transformation hyper-framework requirements model building. Each core element represents a challenging domain-specific digital transformation dimension that deserves full-scale independent support. Configuration-based integration or orchestration of individual domain-specific elements (components or frameworks) enables need-to-have dynamic binding optimized for the particular transformation context.
The existence of an appropriate multi-dimensional strategy that guides the creation and manages the conduct of transformation initiatives represents the foundation for a supportive digital transformation canvas [
23].
Figure 3 shows the 11 P’s (Purpose, Process, Partner, Platform, People, Project, Product, Performance, Planet, Protection, and Privacy) of the digital transformation strategy foundation, grouped into four pillars (Strategy, Operation, Value, and Pitfalls), adapted and modeled as a mind-map diagram. Besides the component-based definition of DGT processes that may constitute an open digital transformation life cycle model (ODGTLCM), embedded, role-profiled stakeholder support through software-empowered solutions is essential for effective collaborative/cooperative operational framework development.
The dynamic nature of DGT justifies the current shift to an agile methodology that better suits iterative or incremental development, continuous delivery, customer-priority-driven collaboration/cooperation, and continuous adaptation to change [
24]. On the other hand, DGT is not possible if all systems-wide required resources (produced, disseminated, stored, retrieved, and consumed in any sense) are not digitally twinned. A digitally transformed system is not a system with embedded IT support (composite pattern). It is a new system (decorator pattern) with a different identity and radically changed structure and behavior. Digital transformation endeavors have shown that exclusively focusing on the information technologies (IT) supporting the transformation is not enough to guarantee an effective, evolutionary transformation and sustainable operation through the entire life cycle of a transformed system. Information technologies, combined with the strategic importance of ethics, introduce a broader sense of the system’s value network and split the uncertainties into two main categories: external and internal. With digital transformation, the impetus moves beyond the organization’s decision-making mechanisms to society and industry trends [
25].
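The composite/decorator distinction drawn above can be sketched in a few lines of Java. All class names here are illustrative assumptions, not part of any framework discussed in this article; the point is only that the decorator view produces a new system with its own identity that wraps, rather than merely contains, the actual one.

```java
// Hypothetical names; a minimal sketch of the two contrasted patterns.
interface OrganizationalSystem {
    String describe();
}

class LegacySystem implements OrganizationalSystem {
    public String describe() { return "legacy operations"; }
}

// Composite-style view: IT support is just one more embedded part,
// and the system keeps its original identity.
class SystemWithEmbeddedIT implements OrganizationalSystem {
    private final OrganizationalSystem core = new LegacySystem();
    public String describe() { return core.describe() + " + embedded IT"; }
}

// Decorator-style view: a new system with a different identity that
// redefines structure and behavior while delegating to the actual system.
class DigitallyTransformedSystem implements OrganizationalSystem {
    private final OrganizationalSystem actual;
    DigitallyTransformedSystem(OrganizationalSystem actual) { this.actual = actual; }
    public String describe() {
        return "digitally twinned [" + actual.describe() + "]";
    }
}
```

In the decorator reading, clients interact with `DigitallyTransformedSystem` as a system in its own right, which matches the article's claim that the transformed system is not the old system plus IT.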
The importance of organizational culture and people emphasizes the social and organizational aspects that deserve careful analysis before defining the strategic and operational foundations for evolutionary transformation activities guided by the appropriate methodology selection. The categorization of different organizational cultures enables the grouping of potential type-related enablers and disablers that may affect the entire life cycle of a digitally transformed system [
26]. The positive synergy between IT and human resource management (HRM) strategy is empirically recognized as a DGT booster [
27] influencing future strategy formulation. The digital human resource management (DHRM) strategy combined with the organizational culture typology results in the integrated typology model [
28] that better reflects the nondeterministic nature of socio-technology systems.
The existing literature has thoroughly examined the influence of DGT on external stakeholders, with a focus on customer relationships, but the crucial importance of internal stakeholders’ (workforce) digital proficiency appears less rigorously addressed. By fostering a culture of collaboration/cooperation, empowerment, and innovation, organizational systems can raise the collective intelligence and creativity of their internal stakeholders, encourage them to seize emerging opportunities, foster individual digital literacy and digital readiness, and aid the certainty of the overall DGT process in organizations [
29].
With human-operated/used systems, success depends on the minimal distance between problem-domain concepts and activities and their digital equivalents. This distance is expressed through two quite different but often interchangeably used terms: User Interface (UI) and User Experience (UX). User Interface addresses the observable (visible) representations of a product, application, system, service, or function serving the purpose of interaction. It is traditionally recognized as a human-computer interaction (HCI), or more generally, human-machine interaction (HMI) engineering domain that has experienced significant changes in the conceptual foundation throughout the entire development history. Initially, general human cognition factors solely directed the design. Later on, user-centric contextual specializations augmented them. With the inclusion of social and cultural impact mitigation, current professional ethics, and fundamental moral principles, the UI design has evolved into a multidisciplinary endeavor [
30]. User Experience (UX) encapsulates UI and relates to the end-user’s subjective perceptions experienced through the entire scenario, directing the use of a product, system, service, or function developed by a particular vendor and disseminated as UI usable objects. The scenario includes pre-use, use, and post-use phases that, joined together, build the subjective feeling of a used object. The proliferation of AI and Machine learning technologies has opened novel challenges to UI design, adding profile-based dynamic configuration abilities [
31,
32], shifting from tangible to intangible interactions [
33], and integrating emotional intelligence and interaction optimization supporting highly diverse and personalized interactions [
34]. Digital Experience Platforms [
35], Generative Artificial Intelligence [
36,
37], Virtual and Augmented reality paradigms additionally foster the UX foundation [
38,
39].
The highest transformation quality of interactive systems emerges when the sequencing of interaction mechanisms appears natural in fulfilling the intended user’s mission, not solely as elements of the application’s Graphical User Interface (GUI). The proper balancing of human-directed and system-directed activities while operating complex cyber-physical and socio-technology systems is of great importance for their acceptance and sustainable operation through the entire system’s lifecycle. On the other hand, with the tight incorporation of contemporary information technologies and the growing tendency to minimize human-related dependencies, total automation appears as the Holy Grail. The level and efficiency of automation depend on the reactive coupling of UX and UI objects. The automatic generation of UI demands extendible GUI library support for UI building blocks and directed UX feedback that synchronizes UI-related meta-models with actual experiences gained through operational usage. The reflective synchronization of the actual twin and its virtual counterpart represents one of the main aspects of the digital twinning paradigm [
40].
With digital transformation, it is essential to rely on life-long integrated support that does not separate systems design, software design, and the Information Technology operating infrastructure. A successful DGT never ends. It evolutionarily transforms the actual system and the corresponding digital equivalent according to the built-in ability to adapt and operate in the context of any future transition. That is why the sustainable engineering/reengineering of cyber-physical and socio-technology systems assumes the existence of powerful, tool-based support applied through the entire life cycle at arbitrary abstraction levels or development stages. Consequently, the Systems Engineering (SyEng), Software Engineering (SwEng), and Operations Engineering (OpEng) domains represent three cornerstones in the DGT process of complex cyber-physical and cyber-social systems. A well-balanced approach to process and product synergy enables the continuous assessment of the engineering process maturity level and quality management, flexible product architecting, and the effective collaboration/cooperation of the real-world (Actual) system and its Computer-supported Virtual reflection. Agile development emerged as a promising approach based on higher flexibility, continuous delivery of small, usable system increments, and better response to change management [
41]. With the emphasis on agile framework development, machine learning (ML), and artificial intelligence (AI) support, the automation of routine tasks appears feasible.
As an additional challenge, digitally transformed systems exhibit a dramatic growth in volumes and diversity of generated, stored, retrieved, and processed data. The proliferation of the Internet of Things (IoT) and the Digital Twin paradigm introduce the real-time dimension of acquired and stored data instances uncommon with conventional enterprise systems. They demand more sophisticated data engineering efforts for storing and retrieving data and the higher involvement of machine learning (ML) and artificial intelligence (AI) powered framework technologies to support valuable data analytics [
42]. The Internet of Things (IoT) and the Digital Twin paradigm face standardization problems due to the many vendors distributing IoT devices. This diversity causes variations in communication protocols, data models, data exchange formats, and security requirements. Therefore, the development of IoT-based systems demands higher effort and, due to the characteristics of individual application domains, results in a lower reusability level and higher extendibility risks [
43]. The Web-of-Things (WoT) paradigm enables interoperability over IoT platforms by abstracting WoT technology building blocks through properties, events, and actions [
44]. The importance of reliable and trustworthy infrastructure supporting the innovative business ecosystem assumes modern technologies used in fifth-generation (5G) networks through slicing, virtualization, and blockchain [
45].
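The WoT abstraction of building blocks through properties, events, and actions can be sketched as a minimal, vendor-neutral interface. This is an illustrative sketch, not the W3C WoT API; all identifiers (`WoTThing`, `exposeProperty`, and so on) are assumptions chosen to mirror the three interaction affordances named above.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;
import java.util.function.Supplier;

// A Thing exposes properties (readable state), actions (invocable behavior),
// and events (asynchronous notifications) behind one interface, regardless of
// the vendor-specific protocol underneath.
class WoTThing {
    private final Map<String, Supplier<Object>> properties = new HashMap<>();
    private final Map<String, Runnable> actions = new HashMap<>();
    private final Map<String, Consumer<Object>> eventListeners = new HashMap<>();

    void exposeProperty(String name, Supplier<Object> reader) { properties.put(name, reader); }
    void exposeAction(String name, Runnable handler) { actions.put(name, handler); }
    void onEvent(String name, Consumer<Object> listener) { eventListeners.put(name, listener); }

    Object readProperty(String name) { return properties.get(name).get(); }
    void invokeAction(String name) { actions.get(name).run(); }
    void emitEvent(String name, Object payload) {
        Consumer<Object> l = eventListeners.get(name);
        if (l != null) l.accept(payload);
    }
}
```

A consumer of such a Thing depends only on the property/action/event vocabulary, which is precisely how WoT achieves interoperability over heterogeneous IoT platforms.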
Measuring the success level of DGT faces the potential paradox that the transformation process of a successful system never ends. Everything is transient except change (transformation), which is eternal and eternally drives the system’s transformations. A successful DGT involves complex engineering/reengineering activities framed by three synergic dimensions: forward-looking (vision, specification, modeling, design, implementation activities, methods, techniques, and platforms), backward-looking (product verification and validation), and downward-looking (process monitoring and control). Coping with this inherent complexity demands the utilization of corresponding Digital Transformation Environments (DGTEs) with dynamic configuring ability and transparent integration of independently developed dedicated frameworks [
46,
47,
48].
The essential motivation factors for our research directions originate from the Editor’s Corner section of [
49], where the Editor in Chief estimates the common-values-motivated drivers that will eliminate previously experienced obstacles and support the future challenges towards a total digital transformation. Consequently, the synergic support for DGT drivers assumes a lined, evolutionary process that requires a radical shift in engineering methodology, organizational culture, and expectations.
All engineering activities exhibit the duality of the engineering process and the engineered product. Model-Based Engineering (MBE) represents an engineering approach resulting in a hyper-model instance that digitally connects different types of engineering artifacts created during various engineering process steps, establishes forward and backward artifacts traceability, and minimizes the risks and effort in change management activities. MBE relies on the formal modeling languages that support specification, modeling, simulation, emulation, design, analysis, testing, validation, and verification of a virtual twin representing the structure and behavior of an engineered system. The underpinned formalisms enable the implementation of tailorable workbenches (software tools) providing generic services such as edition, visualization, transformation, comparison, storage, retrieval, export, import, and additional operations on virtual twin instances.
The primary mission of this research article is to specify a minimal subset of requirements that determine the development of an extendible, hyper-framework, all-in-one, software-empowered DGTE prototype that suits the essential characteristics of a meta-specification-driven digital transformation support tool. In the context of this research, we have formulated the following five hypotheses.
The first hypothesis (H1) is that an interoperable, multi-staged, Digital-Twin-based hyper-framework that interrelates Systems Engineering (SyEng), Software Engineering (SwEng), and Operation Engineering (OpEng) features into a collaborative/cooperative specification, development, and operational environment, based on an extendible library of tailorable life cycle models (TLCMs), is essential.
The second hypothesis (H2) is that the hyper-framework must support the stage-based creation of different configurations, ranging from individual services to systems-of-systems, that form the executive infrastructure reflecting the events and services supported by the particular stage, independent of their digitalization status.
The third hypothesis (H3) is that each configuration instance has to represent an executable virtual system capable of real-time emulation (imitation by a surrogate system), simulation (model-based prediction), operation (traceable running), monitoring (noninvasive and secured), modeling, decision-support mechanisms, access (User Interface), and evaluation (User Experience) of the configured components’ structure and behavior.
The fourth hypothesis (H4) is that each component needs to rely on abstract information resource conceptualization, hiding the persistency layer typology from the dynamic resource management layer and establishing the foundations for Data Engineering (DatEng) and Data Analytics (DatAna) features.
The fifth hypothesis (H5) is that the explicit support for H1, H2, H3, and H4 has to be meta-specification-driven and integrated into the DGT hyper-framework (DGTHyF). Meta-specifications, interpreted by the strategy pattern with extendible Generative-Artificial-Intelligence-supported concrete strategies, incline towards Industry 6.0 and beyond.
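The strategy-pattern interpretation of meta-specifications postulated by H5 can be illustrated with a minimal Java sketch. The registry mechanics and all names are assumptions, and the Generative-AI strategy is a stub standing in for a call to an external model service; the sketch only shows how concrete strategies remain pluggable.

```java
import java.util.HashMap;
import java.util.Map;

// A meta-specification is interpreted by whichever concrete strategy is
// registered for it; new strategies can be added without modifying the interpreter.
interface MetaSpecificationStrategy {
    String interpret(String metaSpecification);
}

class TemplateStrategy implements MetaSpecificationStrategy {
    public String interpret(String spec) { return "template:" + spec; }
}

// Placeholder for a Generative-AI-backed strategy; a real implementation
// would delegate to an external model service instead of a canned result.
class GenerativeAIStrategy implements MetaSpecificationStrategy {
    public String interpret(String spec) { return "generated:" + spec; }
}

class MetaSpecificationInterpreter {
    private final Map<String, MetaSpecificationStrategy> strategies = new HashMap<>();
    void register(String name, MetaSpecificationStrategy s) { strategies.put(name, s); }
    String interpret(String strategyName, String spec) {
        return strategies.get(strategyName).interpret(spec);
    }
}
```

The interpreter stays closed for modification but open for extension: adding an Industry-6.0-oriented strategy means registering one more `MetaSpecificationStrategy` implementation.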
Given that even the longest journey starts with a first step and faces the coexistence of different approaches and solutions, we hope the stated hypotheses suitably frame a promising one.
The rest of the article contains four sections.
Section 2, Materials and Methods, elaborates on the research foundation of the stated hypotheses.
Section 3, Results, introduces the main conceptual characteristics of the proposed hyper-framework, the meta-specifications that enable dynamic handling of different life cycle models, the Architecture Foundation, the dynamic configuration and reconfiguration of participating components, and the generic support for handling extendible heterogeneous information resources. In
Section 4, we discuss and cross-relate the analyzed references to justify the appropriateness of the proposed hyper-framework.
Section 5 contains the concluding remarks and the future research directions.
3. Results
This research combines digital twinning, multidimensional cognition processes, and systems engineering, software engineering, and operational engineering methodology to build an extendable, generic, configurable, and usable hyper-framework prototype that supports the sustainable digital transformation of cyber-physical-socio-technology systems through the entire lifecycle of the transformed system. In our previously published research [
48], we have proposed the conceptual framework for digital twin (DT) verification and validation, based on a quintuple helix model, as a paradigm for DGT hyper-framework (DGTHyF) prototype specification and development.
Due to its inherent complexity, we propose evolutionary prototyping, meta-specification-based generation of hyper-document specification instances (models and code), and continuous integration and deployment support through a staged model of composing component systems. We claim that DGTHyF specification, modeling, and prototyping, compliant with the hypotheses (H1-H5), enable its incremental development, tailoring, verification, deployment, and operational validation in the lined agile DGT process.
Currently, there exists a large family of commercially available Integrated Development Environments (IDEs): interactive, GUI (Graphical User Interface)-oriented, event-driven software systems that commonly support multiple programming languages, syntax-sensitive code editing, compiler or interpreter support, symbolic debugging, extendible libraries, code search, version control, and code integration features. The proposed DGT Hyper-Framework (DGTHyF) is a prototype software system with configuration-based architecture, component, and service orchestration abilities. It comprises an extendible set of component frameworks, each supporting a distinct closed system (closed for modification but open for extension) with internally extendible functionality.
Following the Model-based-engineering approach, we have defined a starting hyper-document represented as the AstahProfessional modeling tool component-based project suite (
Figure 10). It is composed of an extendible collection of underpinned hyper-documents that represent either atomic-grained (non-hyper-linked) or coarse-grained (hyper-linked) large data objects representing the specifications (models) of the corresponding building blocks. The overall project suite represents a hyper-graph (a graph whose nodes are graphs) and is a starting meta-specification (meta-model) of the proposed DGTHyF, which we address as a DGTHyF generic virtual twin. Its highest granulation level is presented in
Figure 10 (a) and augmented by the package-contained example of the DGTHyF generalized architecture model (
Figure 10—(b)) as an object-oriented model (class diagram—
Figure 10—(c)). The initial repertoire of General Purpose Components (
Figure 10—(d)) and the Life Cycle Model Handler as a Domain Specific Component (
Figure 10—(e)) represent the highest granularity specification of the corresponding hyper-document model.
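The hyper-graph notion introduced above (a graph whose nodes are graphs) can be illustrated with a minimal recursive structure. All names are illustrative assumptions, and the sketch assumes acyclic linking between hyper-documents.

```java
import java.util.ArrayList;
import java.util.List;

// A hyper-document is either atomic-grained (no hyper-links) or
// coarse-grained, in which case its node holds a nested graph of
// further hyper-documents.
class HyperDocument {
    private final String name;
    private final List<HyperDocument> linked = new ArrayList<>();

    HyperDocument(String name) { this.name = name; }
    void link(HyperDocument child) { linked.add(child); }
    boolean isAtomicGrained() { return linked.isEmpty(); }

    // Total number of documents in the nested hyper-graph, including
    // this node (assumes no cycles).
    int size() {
        int n = 1;
        for (HyperDocument d : linked) n += d.size();
        return n;
    }
}
```

In this reading, the overall AstahProfessional project suite is the root `HyperDocument`, and each package-contained model is a linked node that may itself contain further models.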
The GenericFramework is a recursively orchestrated abstract HyperFramework, at the highest hierarchy level of the generic architecture meta-model, that specifies the common structural and behavioral characteristics of an arbitrary hyper-framework. The DGTHyFApplicationPrototype is a GenericFramework that aggregates an open set of abstract DGTHyFComponent abstractions that belong either to the General Purpose Components Registry (a configuration of abstract GeneralPurposeComponent specializations) or to the Domain Specific Components Registry (a configuration of abstract DomainSpecificComponent specializations) and forms a dynamic Architecture configuration. Each DomainPackage is a stand-alone container composed of a DSCMetahandler (Domain Specific Component Meta-Handler), supporting the dynamic generation of domain-specific context and configuration, and an arbitrary domain-specific component that encapsulates domain-specific services, in this version represented by a Life Cycle Model Handler (LCMHandler).
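The relationships just described can be summarized in a skeletal Java sketch. Only the class names follow the meta-model; the registry and configuration mechanics shown here are simplified assumptions made for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Common abstraction for all hyper-framework components.
abstract class DGTHyFComponent {
    abstract String name();
}

abstract class GeneralPurposeComponent extends DGTHyFComponent { }
abstract class DomainSpecificComponent extends DGTHyFComponent { }

// Recursively orchestrated abstraction: a hyper-framework aggregates an
// open set of components into a dynamic Architecture configuration.
abstract class GenericFramework {
    protected final List<DGTHyFComponent> architecture = new ArrayList<>();
    void configure(DGTHyFComponent c) { architecture.add(c); }
    List<String> componentNames() {
        List<String> names = new ArrayList<>();
        for (DGTHyFComponent c : architecture) names.add(c.name());
        return names;
    }
}

// The domain-specific component present in this version of the prototype.
class LCMHandler extends DomainSpecificComponent {
    String name() { return "LCMHandler"; }
}

class DGTHyFApplicationPrototype extends GenericFramework { }
```

The open set of components (new `GeneralPurposeComponent` or `DomainSpecificComponent` specializations) can be configured at runtime without modifying the prototype class itself.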
The generic architecture specification (
Figure 11) serves as a referent meta-model for the automatic generation of the initial prototype-shell code (
Figure 12), thereby illustrating the whole spectrum of features announced by the stated hypotheses (H1-H5).
3.1. Multi-Staged Hyper-Framework Requirements Model
The high-level generic requirements model of the proposed DGTHyF defines a universal proto-iterative starting point of an arbitrary evolution prototype as an interactive, event-driven, graphical user interface (GUI)-oriented, component-based, meta-specification-configurable software tool with flexible, profile-based architecting features (
Figure 13), designated as a hyper-document node 00-01-DGRHyF Requirements Model (see
Figure 13—(a)).
In the current stage, the generic requirements (
Figure 13—(a)) are composed of three specification document models: 00_DG_Domain related requirements (specifying the highest-level DGT domain-related (Systems) requirements,
Figure 13—(1)—with detailed representation shown in
Figure 13), 01_GenericAssetRequirements (the highest level software requirements,
Figure 13—(2)—with detailed representation shown in
Figure 14), and further hyper document layer 02_DecisionSupportMechanisms (specifies component’s internal decision making mechanisms,
Figure 13—(3)—with detailed representation shown in
Figure 15).
Specified systems requirements (
Figure 14) correspond to the systems engineering virtual twin at the highest abstraction level and split the overall responsibilities into two hierarchies: Strategy Method and Expertise (01.01) and Technology Operation Sustainability (01.02). They are mainly process-oriented. The DGT Framework requirements model (
Figure 15) is a virtual twin corresponding to the highest abstraction level of a product specification belonging to the software engineering solution domain. On the other hand, the Decision Support requirements model corresponds to the virtual twin that specifies the internal control mechanisms of an arbitrary component’s decision-making core, related to the highest level of the implementation domain.
Figure 16.
The detailed requirements model (
Figure 13—(3)).
3.2. DGTHyF Generic Behavioral Model
Based on hypothesis 3 (H3), through the entire lifecycle, the digitally transformed system contains architectural building blocks (component/service/micro-service) that are in one of the following stages: not transformed (performing in a problem domain without digital support), in transition (digital transformation in progress), transformed (digitally supported, verified, and validated), deployed (installed and usable), operational with refactoring abilities (used, with user-experience feedback abilities), temporarily disabled, or permanently removed (retired). The DGTHyF generic behavioral model, presented as a UML state diagram (AstahProfessional modeling tool), is shown in
Figure 17.
These components are dynamically configured and support the intended services in a technically feasible, time-dependent manner or form. Depending on the particular component service stage, the transformation activities belong to the operational, engineering, or reengineering category.
Consequently, the service discovery mechanism demands the establishment of traceable navigation paths reflecting the overall architecture. It assumes the creation and continuous maintenance of multi-layered hyper-structures of self-contained autonomous systems with arbitrary granularity levels (from atomic-grained to globally-grained), exclusively interrelated through interfacing points that enable the dynamic creation of horizontal (connecting different type ingredients) and vertical (connecting same type ingredients) associations.
On the other hand, the complexity mitigation of the service invocation mechanism assumes the stage-based polymorphism reduced to the meta-invocation defined by the statement:
with(context-bb: BuildingBlock; current-stage: Stage; invoke-se: Service)
that decouples the service invoker from the service discovery mechanism and the service provider Building Block’s context.
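A minimal Java sketch of this meta-invocation idea follows. All class and method names (BuildingBlock, MetaInvoker, bind, resolve) are illustrative assumptions; the article defines only the with(context-bb; current-stage; invoke-se) statement itself. A small Stage enum is re-declared here to keep the sketch self-contained.

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Minimal stage set for this sketch (names assumed from the text).
enum Stage { NOT_TRANSFORMED, IN_TRANSITION, TRANSFORMED, DEPLOYED, OPERATIONAL }

@FunctionalInterface
interface Service { String invoke(); }

final class BuildingBlock {
    // One service binding per (stage, service name): the same logical service may be
    // backed by a manual procedure, a stub, or the fully transformed implementation.
    private final Map<Stage, Map<String, Service>> byStage = new ConcurrentHashMap<>();
    private volatile Stage currentStage = Stage.NOT_TRANSFORMED;

    void bind(Stage stage, String serviceName, Service impl) {
        byStage.computeIfAbsent(stage, s -> new ConcurrentHashMap<>()).put(serviceName, impl);
    }
    void moveTo(Stage stage) { currentStage = stage; }
    Stage stage() { return currentStage; }

    Optional<Service> resolve(Stage stage, String serviceName) {
        return Optional.ofNullable(byStage.getOrDefault(stage, Map.of()).get(serviceName));
    }
}

final class MetaInvoker {
    // with(context-bb; current-stage; invoke-se): the invoker names a building block,
    // a stage, and a service; it never sees how the service is discovered or which
    // concrete provider backs it in that stage.
    static Optional<String> with(BuildingBlock contextBb, Stage currentStage, String invokeSe) {
        if (contextBb.stage() != currentStage) return Optional.empty(); // graceful degradation
        return contextBb.resolve(currentStage, invokeSe).map(Service::invoke);
    }
}
```

The stage check makes the invocation degrade gracefully (an empty result) instead of binding statically to a provider that may no longer be in the expected stage.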
Figure 18 defines a meta-object-specification model corresponding to the specified generic behavioral model (
Figure 17) as a further refinement of the model towards an executable specification (programming code).
The sample Java code, generated from the OOM class model during the further refinement of the behavioral model into Java programming code, is presented in
Figure 19.
Currently, various formalisms rely on transforming behavioral models into corresponding analytical models. While these formalisms offer flexibility due to the range of available analytical languages and notations, they introduce additional complexity and require proper handling. The interoperability of different tools, general-purpose or domain-specific languages, and integrated production environments supporting simulations remains one of the most challenging issues for further research and engineering. If we define software and technical infrastructure as systems, it becomes possible to raise the abstraction level and consolidate their similarities into a higher-level abstract concept whose specializations redefine or add specific state and behavior refinements.
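The generalization step described above can be illustrated with a short Java sketch. The class names (ManagedSystem, SoftwareSystem, TechnicalInfrastructure) are illustrative assumptions, not identifiers from the article's models: the shared state and behavior are lifted into one abstract concept, and each specialization redefines behavior or adds its own state.

```java
// Hedged sketch: software and technical infrastructure treated uniformly as systems.
abstract class ManagedSystem {
    protected final String name;
    protected boolean available;          // shared state lifted into the common concept

    ManagedSystem(String name) { this.name = name; }

    // Shared behavior defined once at the higher abstraction level.
    String describe() { return getClass().getSimpleName() + ":" + name; }
    boolean isAvailable() { return available; }

    abstract void provision();            // specializations redefine this behavior
}

final class SoftwareSystem extends ManagedSystem {
    private final String version = "0.0.1";   // added, specialization-specific state
    SoftwareSystem(String name) { super(name); }
    @Override void provision() { available = true; } // e.g., deploy artifacts
    String version() { return version; }
}

final class TechnicalInfrastructure extends ManagedSystem {
    private final int capacityUnits;          // added, specialization-specific state
    TechnicalInfrastructure(String name, int units) { super(name); capacityUnits = units; }
    @Override void provision() { available = capacityUnits > 0; }
}
```

Clients can then operate on the abstract concept alone, which is exactly what allows heterogeneously staged components to be handled uniformly.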
3.3. DGT Life Cycle Model Handling Component
According to hypothesis 1 (H1), the DGT Life Cycle Model (DGTLCM) Handling Component represents the central domain-specific DGTHyF component for this article’s mission. The DGTLCM is a hyper-document that subsumes conceptual, logical, and physical repository models, serving as a meta-specification canvas for the data-driven generation of arbitrary DGTLCM instances.
The model specification development environment used for data-layer modeling illustrates interoperable collaboration/cooperation (
Figure 20) between the AstahProfessional UML2 modeling tool, which hosts the object-oriented models, and the Power Designer Data Architect Version 16.1 modeling tool, which hosts the data-layer models in platform-independent form (conceptual and logical data layer models,
Figure 21) and a platform-dependent specialization (physical model,
Figure 22), with a MySQL relational database as the hosting database server (
Appendix A—Sample Data Definition Language script).
Conceptually, the presented model covers three of the four Meta Object Facility (MOF) data layer abstraction levels (meta, model, and instance-configuration), each represented by a related ER sub-model whose naming convention follows the data abstraction layer (DAL) specification discussed in
Section 2.
The multidimensional data models rely on the data modeling principle of the deepest possible primary key propagation, which better suits database querying purposes and fosters model-based transformation from relational to arbitrary NoSQL counterparts (document, graph, column-table, or key-value), because multilevel joins are substituted by embedded selections exclusively on the primary key, viewed from the bottom layer upwards. In such a model, all queries start at the lowest hierarchy level, where all related primary keys exist as composite primary identifiers (see
Figure 22, UsedMetaActivityGraph table example).
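The key-propagation principle can be sketched in a few lines of Java. The record names below (ModelKey, ActivityKey, ActivityEdgeKey, GraphStore) are illustrative assumptions echoing, but not reproducing, the repository model: each deeper record embeds the full chain of ancestor keys in its own composite key, so a query over the deepest level is a single selection by key rather than a multilevel join.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative three-level hierarchy with propagated composite keys.
record ModelKey(String lcmType, long lcmId) {}
record ActivityKey(ModelKey model, int activityNo) {}          // propagates the model key
record ActivityEdgeKey(ActivityKey from, ActivityKey to) {}    // propagates both activity keys

final class GraphStore {
    private final Map<ActivityEdgeKey, String> edges = new HashMap<>();

    void addEdge(ActivityEdgeKey key, String label) { edges.put(key, label); }

    // The deepest level already carries every ancestor key inside its composite
    // identifier (compare the UsedMetaActivityGraph primary key), so no join is needed.
    String label(ActivityEdgeKey key) { return edges.get(key); }
}
```

The same embedding is what makes the relational model transferable to key-value or document stores: the composite key becomes the document or entry identifier as-is.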
4. Discussion and Related Work
The DGTHyF initial meta-specifications frame the development of a software-supported development environment (tool) with generic impacts on future DGT endeavors. Such a tool has to support a wide range of problem domains and remain sustainable throughout the entire life cycle of digitally transformed systems. Due to its inherent trinity, the superimposition of the traditional Systems Engineering, Software Engineering, and Operations Engineering Life Cycle models [
118,
119] appears as the ultimate starting point for DGTHyF verification and validation purposes (hypothesis 1). The collaboration/cooperation of heterogeneously staged components (systems) (hypothesis 2), the executable virtualization ability (hypothesis 3), the hiding of data-layer complexity (hypothesis 4), and the overall meta-specification-based generativity (hypothesis 5) are creatively incorporated into the specification and modeling of the extendible, integrated Life Cycle Model management domain-specific DGTHyF component (
Section 3). The DGTHyF is not a replacement for existing or future frameworks but a self-extendible hub for an open set of collaborative frameworks.
In this section, we emphasize the collaborative potential of the mainstream meta-specifications, the related hypotheses, and further perspectives (beyond the current stage).
Software development environments of arbitrary kinds are usually targeted by model-driven software development (MDSD) or model-based systems engineering (MBSE) suites. They are closely related to the general architecture modeling paradigms and constitute the core of various contemporary Enterprise Architecture Frameworks (EAFs).
In the proposed approach, the DGT project is specified by a model-based hyper-document (currently defined as an AstahProfessional project) supporting open, collaborative interoperability with arbitrary external tools (currently illustrated by hyper-linking with the Power Designer Data Architect modeling tool) that may be either containerized or virtualized over the underlying digital technology infrastructure. Consequently, we direct the discussion and related-work analysis to the conceptual or software-empowered hyper-document-based collaborative frameworks framed by the specified five hypotheses in the contemporary DGT context.
The elaborated DGTHyF is modeled and specified as an evolution prototype of heterogeneous collaborative component frameworks belonging to an ontology-based or tool-based group of EAFs (
Figure 23), integrated over a virtual twin hyper-document instance.
The selected EAFs belong to three general groups: Ontology-based, Model-based, and Tool-based EAFs. Although a more detailed discussion of all available and selected frameworks is far beyond the scope of this article, it is necessary to highlight their core foundations.
The ontology-based group has a long development tradition focused on the fundamental principles and practices that form the boundaries of the addressed problem domain and serve as a starting point for concept clarification purposes. These specifications form the data structure skeleton of software tools supporting systems, software, and operations engineering activities. The most referenced Ontology-based EAFs are: the Zachman Framework (ZF) [
120], the Open Group Architecture Framework (TOGAF) [
121], the Federal Enterprise Architecture Framework (FEAF) [
122], the Defense-oriented group of frameworks (the US Department of Defense Architecture Framework—DoDAF [
123], NATO Architecture Framework—NAF [
124]), International Standards Organization on EAF [
125], the Capability Maturity Model Integration (CMMI) Framework [
126], the International Council On Systems Engineering (INCOSE) framework discussed in the introductory section [
1], and the Skills Framework for the Information Age (SFIA) [
127].
The model-based EAF group directly impacts the current stage of DGTHyF, given the framework’s inherent model-based orientation. It is used similarly to the ontology-based group but is extensively augmented by formal modeling languages. We have singled out the Object Management Group (OMG) [
128] due to its relevance, with the most referenced members being the Unified Modeling Language (UML2), the Systems Modeling Language (SysML), Business Process Modeling (BPM+), the Information Exchange Framework (IEF), the Meta-Object Facility (MOF), and the Unified Architecture Framework (UAF).
The selected subset of Tool-based EAFs splits into three main groups: first, the ontology-supported EAF tools (the Sparx Systems EAF [
1,
129] and SAP LeanIX [
130], with a dominantly TOGAF and FEAF ontology foundation); second, the general agile group (the Scaled Agile Framework [
131], a heavy-weight tool); and third, the group of Scrum-based light-weight EAF tools (Large-Scale Scrum (LeSS) [
132], Nexus [
133], Scrum@Scale [
134], and Enterprise Scrum [
135]) (see
Figure 23).
The DGTHyF is a software-empowered heterogeneous hub promoting interoperability as an essential requirement. Although cooperative interoperability is usually assumed when developing tool-based frameworks, we anticipate the collaborative interoperability approach as a further universal direction.
Consequently, the briefly discussed sample context justifies the proposed DGTHyF’s fundamental hypotheses as fostering mechanisms that place collaborative heterogeneity at the core of an arbitrary DGT context.
Appendix A. Sample Data Definition Language script generated from the physical model (Figure 22)
drop table if exists UsedMetaActivityGraph;
/*=========================================================*/
/* Table: UsedMetaActivityGraph */
/*=========================================================*/
create table UsedMetaActivityGraph
(
   MET_LCMTYPE          char(2)       not null,
   LCMMM_ID             numeric(4,0)  not null,
   LCMM_TIMESTAMP_T     char(10)      not null,
   LCMTYPE              char(2)       not null,
   LCM_ID               bigint        not null,
   LCM_MODALITY         char(2)       not null,
   LCM_TOMESTAMP        timestamp     not null,
   MM_LEVEL             numeric(2,0)  not null,
   USE_INSTANCEA        int           not null,
   USE_MET_LCMTYPE      char(2)       not null,
   USE_LCMMM_ID         numeric(4,0)  not null,
   USE_LCMM_TIMESTAMP_T char(10)      not null,
   USE_LCMTYPE          char(2)       not null,
   USE_LCM_ID           bigint        not null,
   USE_LCM_MODALITY     char(2)       not null,
   USE_LCM_TOMESTAMP    timestamp     not null,
   USE_MM_LEVEL         numeric(2,0)  not null,
   INSTANCEA            int           not null,
   primary key (USE_MET_LCMTYPE, MET_LCMTYPE, USE_LCMMM_ID, USE_LCMM_TIMESTAMP_T, LCMMM_ID, LCMM_TIMESTAMP_T, USE_LCMTYPE, USE_LCM_ID, USE_LCM_MODALITY, USE_LCM_TOMESTAMP, LCMTYPE, LCM_ID, LCM_MODALITY, LCM_TOMESTAMP, USE_MM_LEVEL, MM_LEVEL, USE_INSTANCEA, INSTANCEA)
);
alter table UsedMetaActivityGraph add constraint FK_MetaMaster foreign key (MET_LCMTYPE, LCMMM_ID, LCMM_TIMESTAMP_T, LCMTYPE, LCM_ID, LCM_MODALITY, LCM_TOMESTAMP, MM_LEVEL, USE_INSTANCEA)
references UsedMetaActivities (MET_LCMTYPE, LCMMM_ID, LCMM_TIMESTAMP_T, LCMTYPE, LCM_ID, LCM_MODALITY, LCM_TOMESTAMP, MM_LEVEL, INSTANCEA) on delete restrict on update restrict;
alter table UsedMetaActivityGraph add constraint FK_MetaSlave foreign key (USE_MET_LCMTYPE, USE_LCMMM_ID, USE_LCMM_TIMESTAMP_T, USE_LCMTYPE, USE_LCM_ID, USE_LCM_MODALITY, USE_LCM_TOMESTAMP, USE_MM_LEVEL, INSTANCEA)
references UsedMetaActivities (MET_LCMTYPE, LCMMM_ID, LCMM_TIMESTAMP_T, LCMTYPE, LCM_ID, LCM_MODALITY, LCM_TOMESTAMP, MM_LEVEL, INSTANCEA) on delete restrict on update restrict;