Article
Business, Economics and Management
Other

Sajad Ebrahimi, Bahareh Golkar, Jaideep Motwani

Abstract:

Since the start of the Conservation Reserve Program (CRP) in 1985, US farmers have participated in the program by offering a portion of their environmentally sensitive land in exchange for annual rental payments. However, recent declines in enrollment have raised concerns about the program's spatial relevance to environmental needs and to the economic conditions and incentives in participating regions. This study therefore explores CRP participation and its drivers across regions to identify any spatial patterns that may exist. To do so, the research employs a combination of spatial analyses known as exploratory spatial data analysis (ESDA). Incorporating CRP participation rates and three contributing factors (the CRP rental rate, soil erosion on cultivated farmland, and farm income per acre), the approach applies Global Moran's I, Univariate Local Indicators of Spatial Association (LISA), and Bivariate LISA (BiLISA) to answer the research questions. To validate the methodological framework, the study applies it to the counties of the Midwestern US, one of the main contributors to the program. The results reveal significant spatial clustering in the variables and regional heterogeneity in CRP participation, implying that a uniform, nationwide policy design may not adequately address local environmental and economic conditions. The results also reveal spatial mismatches: counties with high soil-erosion risk that are offered strong rental incentives do not consistently achieve higher participation, pointing to inefficiencies in current CRP targeting and offer-selection mechanisms. Overall, the results support a shift toward a more data-driven, spatially informed decision-making process for CRP implementation.
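The Global Moran's I statistic applied in the abstract is a standard measure of spatial autocorrelation. A minimal sketch of its computation follows, using a hypothetical rook-contiguity weights matrix and made-up participation rates, not the study's data:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for a 1-D array of regional values and an
    n x n spatial-weights matrix whose row order matches `values`."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = x.size
    z = x - x.mean()                      # deviations from the mean
    num = n * (w * np.outer(z, z)).sum()  # weighted cross-products
    den = w.sum() * (z ** 2).sum()
    return num / den

# Four regions on a line; rook contiguity (neighbors share an edge)
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rates = [10.0, 12.0, 30.0, 33.0]  # hypothetical CRP participation rates
print(round(morans_i(rates, w), 3))  # 0.393: positive => clustering
```

A value near +1 indicates clustering of similar values among neighbors, near -1 dispersion, and near 0 spatial randomness; the LISA statistics in the study localize this same idea to individual counties.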

Short Note
Physical Sciences
Thermodynamics

Jordan Barton

Abstract:

This paper advances Coherence Thermodynamics for understanding systems composed purely of information and coherence. It derives five laws of coherence thermodynamics and applies them to two case studies. Three canonical modes of coherent informational systems are developed: Standing State, Computation Crucible, and Holographic Projection. Each mode has its own dynamics and natural units, with thermodynamic coherence defined as the reciprocal of the entropy–temperature product. Within this theory, reasoning is proposed to emerge as an ordered, work‑performing process that locally resists entropy and generates coherent structure across universal features.

Article
Biology and Life Sciences
Biochemistry and Molecular Biology

Tianyi Zhou, Kevin Song, Hui Huang, Ning Lyu, Qin Feng

Abstract: Traditional ChIP-seq analysis is essential for identifying transcription factor (TF) binding sites, but it is constrained by its linear view of the genome. How TF-bound regions interact with distant genomic loci within the three-dimensional (3D) chromatin architecture often remains unclear, limiting our ability to interpret enhancer-promoter communication and long-range gene regulation. To address these limitations, we developed ChIP-SP, an R package that integrates ChIP-seq data with Hi-C chromatin loop interactions, enabling the study of TF-mediated regulatory regions within a 3D genomic context. In this study, we evaluated ChIP-SP using the androgen receptor (AR) as a model TF in LNCaP prostate cancer cells. By focusing on AR ChIP-seq peaks that participate in chromatin looping and examining a 25 kb radius around each peak in 3D genomic space, ChIP-SP identified 1,499 AR-spatially regulated genes, and many of them were confirmed to be androgen-responsive. We similarly applied ChIP-SP to glucocorticoid receptor (GR) ChIP-seq data in A549 lung cancer cells and successfully identified GR-spatially regulated genes. These results demonstrate that ChIP-SP extends traditional ChIP-seq annotation into a multidimensional framework and enables the construction of a spatial cistrome for transcription factors. The tool is flexible, customizable, and holds strong potential for uncovering novel regulatory target genes, particularly in cancer biology.
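ChIP-SP's actual interface is not shown in the abstract. Purely as an illustration of the described peak-to-loop-to-gene logic (all coordinates, gene names, and the helper function below are hypothetical), the core idea can be sketched as:

```python
def genes_near_looped_peaks(peaks, loops, genes, radius=25_000):
    """Toy version of the peak->loop->gene logic: keep peaks that fall
    inside a chromatin-loop anchor, then report genes whose TSS lies
    within `radius` bp of the loop's *other* anchor (same chromosome)."""
    hits = set()
    for chrom, peak in peaks:
        for c, a_start, a_end, b_start, b_end in loops:
            if c != chrom:
                continue
            in_a = a_start <= peak <= a_end
            in_b = b_start <= peak <= b_end
            if not (in_a or in_b):
                continue
            # the peak contacts the opposite anchor through the loop
            other_mid = (b_start + b_end) // 2 if in_a else (a_start + a_end) // 2
            for g_chrom, tss, name in genes:
                if g_chrom == chrom and abs(tss - other_mid) <= radius:
                    hits.add(name)
    return hits

peaks = [("chr21", 1_005_000)]  # hypothetical AR peak
loops = [("chr21", 1_000_000, 1_010_000, 2_000_000, 2_010_000)]
genes = [("chr21", 2_012_000, "GENE_A"), ("chr21", 3_000_000, "GENE_B")]
print(genes_near_looped_peaks(peaks, loops, genes))  # {'GENE_A'}
```

The 25 kb default mirrors the radius reported in the abstract; a real analysis would operate on full BED/loop files rather than tuples.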
Article
Medicine and Pharmacology
Cardiac and Cardiovascular Systems

Stanisław Wawrzyniak, Ewa Wołoszyn-Horák, Julia Cieśla, Marcin Schulz, Michał Krawiec, Michał Janik, Paweł Wojciechowski, Iga Dajnowska, Dominika Szablewska, Jakub Bartoszek, +3 authors

Abstract:

Background: Evidence on the relationship between altered cardiac morphology and function and frailty is inconsistent. This study therefore aimed to assess the associations among frailty, lean body mass, central arterial stiffness, and cardiac structure and geometry in older people with a normal ejection fraction. Methods: A total of 205 patients aged >65 years were enrolled in this ancillary analysis of the FRAPICA study and assessed for frailty with the Fried phenotype scale. Left ventricular dimensions and geometry were assessed with two-dimensional echocardiography. Fat-free mass was measured using the three-site skinfold method. Parametric and non-parametric statistics and analysis of covariance were used for the statistical calculations. Results: Frail patients were older, and women comprised the majority of the frail group. Frail men and women had weight, height, fat-free mass, blood pressure, central blood pressure, and carotid-femoral pulse wave velocity comparable to their non-frail counterparts. The sum of frailty criteria correlated linearly with left ventricular end-diastolic diameter (negatively) and relative wall thickness (positively). In the analysis of covariance, frailty and gender were independently associated with left ventricular mass, indexed left ventricular mass, and relative wall thickness. Frailty shifts heart remodeling toward concentric remodeling/hypertrophy. Conclusions: Frailty is independently associated with thickening of the left ventricular walls and a diminished left ventricular end-diastolic diameter, leading to concentric remodeling or hypertrophy. This phenomenon is more pronounced in women. This adverse cardiac remodeling may serve as another feature of the frailty phenotype under the phenotype-based frailty criteria.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Yingxin Ou, Sumeng Huang, Feiyang Wang, Kan Zhou, Yingyi Shu

Abstract:

Non-stationary time-series data poses significant challenges for anomaly detection systems due to evolving patterns and distribution shifts that render traditional static models ineffective. This paper presents a novel continual learning framework that integrates dynamic distribution monitoring mechanisms to enable adaptive anomaly detection in non-stationary environments. The proposed framework employs a dual-module architecture consisting of a distribution drift detector and an adaptive learning component. The distribution drift detector utilizes statistical hypothesis testing to identify temporal shifts in data distributions, while the adaptive learning module employs rehearsal-based continual learning strategies with dynamic memory management to maintain model performance across evolving patterns. We introduce a hybrid loss function that balances stability and plasticity, preventing catastrophic forgetting while enabling rapid adaptation to new distributions. Experimental results demonstrate an average F1-score improvement of 11.3% over the best-performing baseline, highlighting the robustness and adaptability of the proposed framework under non-stationary conditions while maintaining computational efficiency suitable for real-time applications.
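The distribution drift detector is described as using statistical hypothesis testing. One common choice for this is a two-sample Kolmogorov-Smirnov statistic; the sketch below is a generic illustration of that idea (the threshold and windows are hypothetical, not the paper's configuration):

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap between
    the empirical CDFs of the two samples."""
    a, b = sorted(sample_a), sorted(sample_b)
    d = 0.0
    for v in sorted(set(a) | set(b)):
        cdf_a = bisect.bisect_right(a, v) / len(a)
        cdf_b = bisect.bisect_right(b, v) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d

reference = [0.1 * i for i in range(100)]         # stable window
drifted   = [0.1 * i + 5.0 for i in range(100)]   # mean-shifted window
THRESHOLD = 0.3  # illustrative cutoff; a real detector would calibrate it
print(ks_statistic(reference, drifted) > THRESHOLD)  # True: drift flagged
```

When the statistic exceeds the calibrated threshold, a framework like the one described would trigger the adaptive learning module to rehearse stored examples and adapt to the new distribution.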

Review
Physical Sciences
Space Science

Simon Evetts, Beth Healey, Tessa Morris-Paterson, Vladimir Pletser

Abstract: The rapid expansion of commercial human spaceflight is forcing a re-examination of how we decide who is “fit to fly” in space. For six decades, astronaut selection has been dominated by national space agencies using stringent, mission-driven criteria grounded in risk minimisation and long-duration operational demands. Contemporary standards such as NASA-STD-3001 and agency-specific medical regulations embed a philosophy in which astronauts are rare, heavily trained national assets expected to tolerate extreme environments with minimal performance degradation. In contrast, commercial operators aim to fly large numbers of spaceflight participants (SFPs) with highly heterogeneous medical and psychological profiles, under a US regulatory regime that emphasises informed consent and currently imposes very limited prescriptive health requirements on passengers. This article reviews the evolution and structure of traditional astronaut selection, outlines emerging approaches to screening and certifying commercial spaceflight customers, and explores the conceptual and practical gap between “selection” and “screening”. Drawing on agency standards, psychological selection research, and recent proposals for commercial medical guidelines, it proposes a risk-informed, mission-specific framework that adapts lessons from government astronaut corps to the needs of commercial spaceflight. We argue that future practice must balance inclusion and market growth with transparent, evidence-based risk management, supported by systematic data collection across government and commercial flights.
Article
Medicine and Pharmacology
Medicine and Pharmacology

José Manuel García-Álvarez, Alfonso García-Sánchez, José Luis Díaz-Agea

Abstract: (1) Background: The increasing complexity of today's healthcare system requires the formation of highly cohesive work teams that guarantee safe and high-quality care. Clinical simulation has become established as a pedagogical strategy capable of promoting the collaborative skills of teams of students and healthcare professionals. The objective of this study was to analyze the influence of learning through clinical simulation on promoting group cohesion in nursing student teams; (2) Methods: A quasi-experimental study with a pre-post design without a control group was conducted with final-year nursing students using the short Spanish version of the Group Environment Questionnaire, validated for nursing students. This questionnaire was completed twice by the participating students, before and after clinical simulation practices; (3) Results: Clinical simulation sessions significantly increased group cohesion in most items and in all dimensions with a large effect size greater than 0.5. The dimension Group Integration-Task (GI-T) showed the greatest improvement after clinical simulation practices; (4) Conclusions: Clinical simulation has significantly increased all dimensions of group cohesion among nursing students. Clinical simulation primarily enhances collaboration and commitment among nursing students to achieve common goals. Due to its impact on group cohesion, clinical simulation should be used systematically to improve the efficiency and quality of student and healthcare professional teams.
Article
Environmental and Earth Sciences
Ecology

Bernhard Wessling Jersbek

Abstract: The principles of nonequilibrium thermodynamics are briefly discussed, with a focus on entropy. For the first time, the energy consumption and entropy production of CO2 final storage and utilization (CCS and CCU) are quantitatively analyzed and interpreted. This shows that the final storage and chemical utilization of CO2 are not sustainable processes for solving the climate crisis. Building on this, a new proposal for a quantitative criterion for sustainability is presented: entropy. In addition, a relatively simple indicator is presented that is a helpful (and easier to calculate) indicator for the entropy production of various processes or products.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

T. Marques, J.B. Melo, A.J. Pontes, A. Gaspar-Cunha

Abstract:

In injection molding, advanced numerical modeling tools such as Moldex3D can significantly improve product development by optimizing part functionality, structural integrity, and material efficiency. However, the complex and nonlinear interdependencies between the many decision variables and objectives, across the various operational phases, reflect the inherent complexity of injection molding processes and pose a substantial challenge. This complexity often exceeds the capacity of conventional optimization methods, necessitating more sophisticated analytical approaches. Consequently, this research evaluates the potential of integrating intelligent algorithms, specifically objective selection using Principal Component Analysis and Mutual Information/Clustering, metamodels using Artificial Neural Networks, and optimization using Multi-Objective Evolutionary Algorithms, to manage and solve complex, real-world injection molding problems effectively. Using surrogate modeling to reduce computational costs, the study systematically investigates multiple methodological approaches, algorithmic configurations, and parameter-tuning strategies to enhance the robustness and reliability of predictive and optimization outcomes. The results highlight the significant potential of data-mining methodologies, demonstrating their ability to capture and model complex relationships among variables accurately and to optimize conflicting objectives efficiently. Ultimately, the enhanced capabilities provided by these integrated data-mining techniques result in substantial improvements in mold design, process efficiency, product quality, and overall economic viability within the injection molding industry.
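The objective-selection step via Principal Component Analysis can be illustrated independently of the paper's pipeline. A minimal sketch on synthetic objective vectors (the function and data are illustrative, not the authors' code): when one objective is nearly a copy of another, almost all variance is explained by fewer components, signalling that the redundant objective can be dropped.

```python
import numpy as np

def explained_variance(objectives):
    """PCA on a (runs x objectives) matrix: fraction of total variance
    captured by each principal component, largest first."""
    X = np.asarray(objectives, dtype=float)
    Xc = X - X.mean(axis=0)                    # center each objective
    cov = np.cov(Xc, rowvar=False)
    eigvals = np.linalg.eigvalsh(cov)[::-1]    # descending eigenvalues
    return eigvals / eigvals.sum()

# Hypothetical runs with 3 objectives; obj3 = obj1 + noise (redundant)
rng = np.random.default_rng(0)
o1 = rng.normal(size=200)
o2 = rng.normal(size=200)
o3 = o1 + 0.05 * rng.normal(size=200)
ratios = explained_variance(np.column_stack([o1, o2, o3]))
print(ratios[:2].sum() > 0.95)  # True: two components suffice
```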

Article
Computer Science and Mathematics
Computer Networks and Communications

Krishna Bajpai

Abstract: The evolution of high-performance computing (HPC) interconnects has produced specialized fabrics such as InfiniBand, Intel Omni-Path, and NVIDIA NVLink, each optimized for distinct workloads. However, the increasing convergence of HPC, AI/ML, quantum, and neuromorphic computing requires a unified communication substrate capable of supporting diverse requirements including ultra-low latency, high bandwidth, collective operations, and adaptive routing. We present HyperFabric Interconnect (HFI), a novel design that combines the strengths of existing interconnects while addressing their scalability and workload-fragmentation limitations. Our evaluation on simulated clusters demonstrates HFI's ability to reduce job completion time (JCT) by up to 30%, improve tail-latency consistency by 45% under mixed loads with 4× better jitter control in latency-sensitive applications, and sustain efficient scaling across heterogeneous workloads. Beyond simulation, we provide an analytical model and deployment roadmap that highlight HFI's role as a converged interconnect for the exascale and post-exascale era.
Article
Physical Sciences
Condensed Matter Physics

Jian-Hua Wang

Abstract: The conventional framework for quantum statistics is built upon gauge theory, where particle exchanges generate path-dependent phases. However, the apparent consistency of this approach masks a deeper question: is gauge invariance truly sufficient to satisfy the physical requirement of indistinguishability? We demonstrate that gauge transformations, while preserving probabilities in a formal sense, are inadequate to capture the full constraints of identical particles, thereby allowing for unphysical statistical outcomes. This critical limitation necessitates a reconstruction of the theory by strictly enforcing indistinguishability as the foundational principle, thus moving beyond the conventional topological paradigm. This shift yields a radically simplified framework in which the statistical phase emerges as a path-independent quantity, \( \alpha = e^{\pm i\theta} \), unifying bosons, fermions, and anyons within a single consistent description. Building upon the operator-based formalism of Series I and the dual-phase theory of Series II, we further present an exact and computationally tractable approach for solving N-anyon systems.
Article
Physical Sciences
Theoretical Physics

Vladlen Shvedov

Abstract: We propose a geometrically motivated framework in which the large-scale evolution of the Universe is described by a coherent multidimensional wavefunction possessing a preferred direction of propagation. Within this formulation, the scalar envelope of the wavefunction defines a critical hypersurface whose temporal evolution provides an effective geometric description of cosmic expansion. The resulting picture naturally incorporates an arrow of time, large-scale homogeneity, and a nonsingular expansion history, without invoking an inflationary phase, a cosmological constant, or an initial singularity. The critical hypersurface takes the form of a three-dimensional sphere whose radius plays the role of a cosmological scale factor. Its evolution leads to a time-dependent expansion rate with a positive but gradually decreasing acceleration. The associated density evolution follows a well-defined scaling law that is consistent with the standard stress–energy continuity equation and corresponds to an effective equation-of-state parameter w = -1/3. As a consequence, the total mass–energy contained within the expanding hypersurface increases with time in a manner that remains fully compatible with the continuity relation. Analytical estimates derived from the model yield values for the present expansion rate and mean density that are in close agreement with current observational constraints. Within this geometric interpretation, the gravitational constant emerges as an invariant global potential associated with the critical hypersurface, linking the conserved properties of the wavefunction to observable gravitational coupling. The framework therefore provides a self-consistent, effective description in which cosmic expansion and gravitational dynamics arise from the geometry of a universal wavefunction, suggesting a deep connection between quantum structure, spacetime geometry, and cosmological evolution.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Towhidul Islam, Safa Asgar, Sajjad Mahmood

Abstract: Lung cancer remains one of the leading causes of cancer-related mortality worldwide, highlighting the importance of early detection for improving patient survival rates. However, current machine learning approaches for lung cancer prediction often depend on suboptimal model configurations, limited systematic ensemble comparisons, and insufficient interpretability. This study introduces a novel framework, called Lung Explainable Ensemble Optimizer (LungEEO), that integrates three methodological advances: (1) comprehensive hyperparameter optimization across 50 configurations of nine machine learning algorithms for base model selection, (2) a systematic comparison of Hybrid Majority Voting strategies, including unweighted hard voting, weighted hard voting, and soft voting with an ensemble stacking approach, and (3) a dual explainable AI (XAI) layer based on SHAP and LIME to provide parallel global and local explanations. Experiments conducted on two heterogeneous lung cancer datasets indicate that ensemble approaches consistently outperform individual models. Weighted hard voting achieved the best performance on Dataset 1 (Accuracy: 89.04%, F1-Score: 89.04%), whereas ensemble stacking produced superior outcomes on Dataset 2 (Accuracy: 87.95%, F1-Score: 87.95%). Following extensive hyperparameter tuning, Random Forest and Multi-Layer Perceptron performed consistently well as base learners on both datasets. In addition, integrating SHAP with LIME offers additional insights into model behavior, boosting the interpretability of ensemble predictions, and strengthening their potential clinical applicability. To the best of our knowledge, the combined use of these interpretability techniques within an ensemble framework has received limited attention in existing lung cancer prediction studies. 
Overall, the proposed LungEEO framework offers a promising balance between predictive performance and interpretability, supporting its potential use in clinical decision support.
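Weighted hard voting, the best performer on Dataset 1, is straightforward to sketch. The base models, labels, and weights below are hypothetical, not the paper's tuned configuration:

```python
from collections import defaultdict

def weighted_hard_vote(predictions, weights):
    """Weighted hard voting: each base model casts its predicted label,
    weighted e.g. by its validation accuracy; highest total wins."""
    tally = defaultdict(float)
    for label, w in zip(predictions, weights):
        tally[label] += w
    return max(tally, key=tally.get)

# Hypothetical base-model outputs for one patient (1 = lung cancer risk)
preds   = [1, 0, 1]             # e.g. RF, SVM, MLP
weights = [0.89, 0.84, 0.87]    # e.g. per-model validation accuracies
print(weighted_hard_vote(preds, weights))  # 1  (0.89 + 0.87 > 0.84)
```

Unweighted hard voting is the special case where all weights are equal; soft voting instead averages the models' predicted probabilities before picking the winning class.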
Article
Biology and Life Sciences
Neuroscience and Neurology

Gianluca Coppola, Antonio Di Renzo, Gabriele Sebastianelli, Irene Giardina, Davide Chiffi, Giada Giuliani, Francesco Casillo, Chiara Abagnale, Lucia Ziccardi, Andrea Pucci, +4 authors

Abstract: Background/Objectives: Photophobia is one of the most prevalent migraine symptoms, both during and outside of attacks, but its pathogenesis is unknown. The posterior thalamic nuclei may directly affect ambient light discomfort. This study examined the link between photophobia and the structure and morphometry of the thalamus and its subregions, including the lateral geniculate nuclei and pulvinar subnuclei. Methods: Twenty patients with episodic migraine without aura (MO) and 20 healthy controls (HCs) underwent high-resolution T1-weighted magnetic resonance imaging and comprehensive ophthalmological assessment. Patients were scanned interictally, and none were under preventive therapy. Volumetric segmentation encompassed the whole thalamus and the lateral geniculate nuclei and pulvinar subregions. Interictal photophobia was evaluated using a visual analogue scale ranging from 0 to 10. All thalamic and subregion volumes were used as independent variables and photophobia levels as dependent variables in general linear models. The model considered gender as a factor and total intracranial volume as a covariate. Results: No statistically significant differences were observed in the overall thalamic volume or in any of its subregions between MO patients and healthy controls (p_unc > 0.05). No relationships emerged between thalamic volumes and interictal subjective photophobia levels (p > 0.05). Conclusions: Our results suggest that photophobia is not linked to macroscopic thalamic volume alterations during the interictal phase in MO patients. Further research is needed to determine whether these results extend to patients with migraine with aura or to other phases of the migraine cycle.
Article
Physical Sciences
Quantum Science and Technology

Jussi Lindgren

Abstract: The Stueckelberg wave equation is solved for unitary solutions, which links the eigenvalues of the Hamiltonian directly to the oscillation frequency. As it has been shown previously that this PDE relates to the Dirac operator and, on the other hand, is a linearized Hamilton-Jacobi-Bellman PDE from which the Schrödinger equation can be deduced in the nonrelativistic limit, it is clear that it is the key equation in relativistic quantum mechanics. We give a stationary solution for the quantum telegraph equation and a Bayesian interpretation of the measurement problem. The stationary solution is understood as a maximum-entropy prior distribution, and measurement is understood as a Bayesian update. We discuss the interpretation of single-electron experiments in light of the finite-speed propagation of the transition probability field and how it relates to the interpretation of quantum mechanics more broadly.
Article
Social Sciences
Urban Studies and Planning

Zlata Vuksanović–Macura, Stefan Denda, Edna Ledesma, Marija Milinković, Milan M. Radovanović, Jasmina Gačić, Veronika N. Kholina, Marko D. Petrović

Abstract: Open-air food markets have long functioned as key sites of food provision, social interaction, and local economic exchange in European cities. In recent decades, many of these markets have undergone significant transformation due to modernization-oriented urban regeneration. This study examines the transformation of Palilula Market in Belgrade, Serbia’s capital, from a traditional open-air market to a large, enclosed market complex, situating the analysis within the post-socialist urban context. Utilizing historical analysis, semi-structured interviews with vendors, and on-site observations, the research examines the impact of spatial reconfiguration on vendor livelihoods, economic practices, and social relations. The results demonstrate that, although the new indoor market has enhanced infrastructure, hygiene, and year-round usability, it has also led to higher rents, reduced stall capacity, increased competition, and stricter regulations. These developments have constrained small-scale vendors and diminished informal social interactions. This study expands the understanding of urban regeneration processes in post-socialist, neoliberal contexts by showing how market modernization shapes the inclusivity and socio-cultural significance of traditional urban markets.
Article
Computer Science and Mathematics
Other

Felipe Oliveira Souto

Abstract: This work presents a series of interconnected mathematical constructions that take the zeros of the Riemann zeta function as primordial elements. Rather than seeking a conventional proof of the Riemann Hypothesis, we investigate: what kind of mathematical reality emerges when we postulate that these zeros form the spectrum of an operator within a specific geometric arena? Our constructions reveal a remarkable chain of coherence, linking geometry (minimal surfaces), topology (Möbius bands), statistics (GUE), and fundamental physical constants. Within the constructed framework, the critical line Re(s) = 1/2 appears as a necessary condition, GUE statistics as an intrinsic geometric property, and relations between the first four zeros encode the fine-structure constant α⁻¹ = 137.035999084... to experimental precision [CODATA 2018]. We present these constructions not as final theorems, but as substantive insights from a perspective that treats the zeta function not merely as an object of analysis, but as a potential organizational principle of mathematical reality.
Article
Business, Economics and Management
Business and Management

Omotayo Olaleye Feyisetan, Fadi Alkaraan

Abstract:

We empirically examine the combined influence of innovation-intensity strategies and boardroom gender diversity on ESG performance. The theoretical lenses underpinning this study are rooted in the Resource-Based View (RBV) and Upper Echelons Theory (UET). The empirical analysis is based on a sample of financial and non-financial firms drawn from the FTSE 350 companies publicly listed on the London Stock Exchange (LSE) over the period 2012-2023. The findings reveal that innovation-intensity strategies have a positive and significant relationship with ESG performance for both financial and non-financial firms. Further, the percentage of women on the board has a positive and significant relationship with ESG performance for both groups, although the magnitude of the coefficient for financial firms is very small, and the effect is not significant for non-financial firms. The percentage of women employees has a negative and significant relationship with ESG performance in financial firms but, unlike in financial firms, a positive and significant relationship in non-financial firms. For both financial and non-financial firms, the percentage of women in management has a positive and significant relationship with ESG performance in the nested models; these relationships become insignificant in the full model, suggesting that other factors may overshadow the impact of women in management roles. In both financial and non-financial firms, the number of female executives has a positive and significant relationship with ESG performance across models, underscoring the importance of gender diversity in leadership roles for driving ESG initiatives. The results suggest that companies with a high level of boardroom diversity strengthen innovation-strategy intensity and leverage external resources for sustainability initiatives.
The lack of a significant relationship between innovation strategies and ESG performance in some specifications challenges innovation-driven sustainability theory, which posits that innovation is a key driver of environmental and social sustainability. This suggests that traditional innovation measures, such as R&D metrics, may not adequately capture sustainability-focused innovation, particularly in financial firms. Additional analyses yielded results consistent with the baseline findings, reinforcing the conclusion that the results are robust and mitigating endogeneity concerns. The findings have theoretical and managerial implications.

Article
Medicine and Pharmacology
Epidemiology and Infectious Diseases

Sohel Modan, James Gunton, Kedar Madan, Teddy Teo, Michael Hii, Augustine Mugwagwa, Majo Joseph

Abstract: Background: Infective endocarditis (IE) is a rare yet serious condition affecting the heart valves and the endocardium. Notably, there has been a shift in the risk-factor profile from traditional factors such as rheumatic heart disease and poor dental health to more iatrogenic causes such as prosthetic heart valves, cardiac devices, foreign body implants, haemodialysis, and immunosuppression. Purpose: While IE has been extensively studied in the past, the evolving landscape prompts an epidemiological re-evaluation of vulnerable patient populations. Our primary objective is to examine the trends in microbiological and echocardiographic diagnostics over two decades and to compare them across native valve endocarditis (NVE), prosthetic valve endocarditis (PVE), and cardiac implantable electronic device-related IE (CIED-IE). Methods: We conducted a retrospective analysis of longitudinal data encompassing 912 patients admitted with either a possible or definite IE diagnosis between 2001 and 2023. Results: The incidence of IE increased over the study duration (p<0.01), with octogenarians most affected (21%). Iatrogenic risk factors were associated with nearly two-thirds (63%) of patients diagnosed with IE, while traditional risk factors were evident in almost one-eighth (13%). Blood culture-negative endocarditis increased over the study duration (19% versus 27%, p<0.01), and Staphylococcus aureus (29%, p<0.01) became the dominant pathogen over Viridans group streptococci (14%, p=0.001). Imaging with transthoracic (56% in 2004 versus 75% in 2023) and transoesophageal echocardiography (57% in 2004 versus 79% in 2023) made an increasing contribution to the diagnosis of IE over the two decades. The subgroup analysis suggested that PVE and CIED-IE were more likely to have negative blood cultures (OR=3.7, CI [1.2-6.8] & OR=4.9, CI [1.3-8]) compared to NVE (OR=0.04, CI [0.02-0.8]).
PVE and CIED-IE were more likely to have inconclusive echocardiographic imaging (OR=3.7, CI [1.2-6.6] & OR=2.2, CI [0.07-7.8]) compared to NVE (OR=0.63, CI [0.3-3.2]). Conclusion: Our study underscores the evolving nature of IE, now predominantly a healthcare-related disease. Diagnostic challenges persist due to the heterogeneity of the disease, with the emergence of distinct entities such as PVE and CIED-IE.
Article
Computer Science and Mathematics
Computer Networks and Communications

Galia Novakova Nedeltcheva, Denis Chikurtev, Eugenia Kovatcheva

Abstract: While smart campuses continue to evolve alongside technological advancements, existing data models often fail to comprehensively integrate the diverse array of Internet of Things (IoT) devices and platforms. This study presents a unified data model tailored to the operational requirements of campus decision-makers, facilitating seamless interconnection across heterogeneous systems. By integrating IoT, cloud computing, big data analytics, and artificial intelligence, the proposed model seeks to advance campus operations, sustainability, and educational outcomes by fostering cross-system harmonization and interoperability. The analysis demonstrates that a layered architecture—comprising data acquisition, processing and storage, analytics and decision support, application presentation, and security and privacy—constitutes the foundation of robust smart campus data models. The system is structured to collect, refine, process, and archive raw data for future reference. Analytics and decision support mechanisms generate actionable insights; application presentation delivers results to end users, and security and privacy measures safeguard information. The study further contends that artificial intelligence techniques, including predictive analytics (which forecasts outcomes using historical data), personalized learning (which customizes content to individual needs), and edge intelligence (which processes data at its source), are essential for advancing these models. These enhancements yield measurable benefits, including a 15% increase in student retention through personalized learning and a 20% reduction in energy consumption through predictive energy management [1]. Emerging technologies such as 5G networks, edge and fog computing, blockchain, and three-dimensional geographic information systems (3D GIS) are instrumental in enhancing campus intelligence.
For example, the adoption of 5G has led to a 30% increase in data transmission speeds, thereby enabling real-time analytics and reliable connectivity (5G and IoT: How 5G is Transforming the Internet of Things, 2024). Building upon these technological advancements, innovative data models are shown to facilitate predictive energy management, resource optimization, and performance analytics within smart campuses. Nevertheless, ongoing challenges persist, including those related to system interoperability, scalability, and data governance. This study provides actionable design guidelines and offers a balanced evaluation of the achievements and challenges of smart campus implementations.



Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2025 MDPI (Basel, Switzerland) unless otherwise stated