Computer Science and Mathematics

Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Satyadhar Joshi

Abstract: Financial risk management has increasingly adopted machine learning (ML) techniques, particularly Gradient Boosting Machines (GBMs), due to their high predictive accuracy. However, their "black-box" nature poses challenges for interpretability and regulatory compliance. This paper reviews the integration of Explainable AI (XAI) methods, such as SHAP and LIME, with GBMs to enhance transparency in financial risk assessment. We synthesize findings from recent studies, highlighting the trade-offs between accuracy and interpretability in applications like credit scoring, fraud detection, and crisis prediction. The review demonstrates how XAI techniques enable financial institutions to comply with regulations like the EU's AI Act while maintaining model performance. Key results from the reviewed literature show GBMs achieving over 95% accuracy when combined with SHAP explanations, with feature importance analysis revealing critical risk factors like credit utilization and macroeconomic indicators. The paper also addresses challenges including computational costs, dynamic market adaptations, and regulatory heterogeneity. By examining case studies and quantitative metrics, we propose that hybrid approaches combining GBMs with XAI offer a balanced solution for trustworthy AI in finance. The conclusions emphasize the need for standardized benchmarks and real-time interpretability methods to support wider adoption of transparent AI systems in financial risk management. The paper also addresses regulatory considerations and proposes future directions for scalable, trustworthy AI solutions in finance. By leveraging Gradient Boosting Machines (GBMs) and ensemble methods, this study demonstrates how XAI can bridge the gap between sophisticated machine learning models and regulatory compliance. Empirical results highlight improved transparency, stakeholder trust, and regulatory adherence in financial institutions. The increasing adoption of Artificial Intelligence (AI) and Machine Learning (ML) in financial risk management, particularly in credit scoring and default prediction, offers enhanced predictive power. This paper reviews current trends in applying Explainable AI (XAI) techniques to credit risk management. We examine various XAI methods discussed in recent literature, focusing on their ability to provide insights into model decisions, enhance trust, and address fairness concerns.
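
As a concrete orientation to the GBM-plus-SHAP workflow that recurs across the reviewed studies, here is a minimal sketch assuming scikit-learn and the shap package; the synthetic data and the feature roles named in the comments are illustrative, not taken from any reviewed paper.

```python
# Minimal GBM + SHAP sketch; synthetic data, illustrative feature roles.
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))        # e.g. utilization, income, macro index
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)

explainer = shap.TreeExplainer(model)  # exact SHAP values for tree ensembles
shap_values = explainer.shap_values(X_te)

print("accuracy:", model.score(X_te, y_te))
# Global feature importance: mean absolute SHAP value per feature.
print("mean |SHAP| per feature:", np.abs(shap_values).mean(axis=0))
```
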
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Yifan Feng,

Hao Hu,

Xingliang Hou,

Shiquan Liu,

Shihui Ying,

Shaoyi Du,

Han Hu,

Yue Gao

Abstract: Large language models (LLMs) have transformed various sectors, including education, finance, and medicine, by enhancing content generation and decision-making processes. However, their integration into the medical field is cautious due to hallucinations, instances where generated content deviates from factual accuracy, potentially leading to adverse outcomes. To address this, we introduce Hyper-RAG, a hypergraph-driven Retrieval-Augmented Generation method that comprehensively captures both pairwise and beyond-pairwise correlations in domain-specific knowledge, thereby mitigating hallucinations. Experiments on the NeurologyCrop dataset with six prominent LLMs demonstrated that Hyper-RAG improves accuracy by an average of 12.3% over direct LLM use and outperforms Graph RAG and Light RAG by 6.3% and 6.0%, respectively. Additionally, Hyper-RAG maintained stable performance with increasing query complexity, unlike existing methods which declined. Further validation across nine diverse datasets showed a 35.5% performance improvement over Light RAG using a selection-based assessment. The lightweight variant, Hyper-RAG-Lite, achieved twice the retrieval speed and a 3.3% performance boost compared with Light RAG. These results confirm Hyper-RAG's effectiveness in enhancing LLM reliability and reducing hallucinations, making it a robust solution for high-stakes applications like medical diagnostics.
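
The abstract does not spell out Hyper-RAG's internals, so the toy sketch below illustrates only the "beyond-pairwise" idea it builds on: storing facts as hyperedges that may link more than two entities and retrieving those that overlap the query's entities. This is an assumed simplification for illustration, not the authors' implementation.

```python
# Toy illustration of "beyond-pairwise" retrieval: unlike a graph edge,
# a hyperedge can link any number of entities, so one retrieved unit can
# carry a multi-entity fact intact. An assumed simplification, not the
# authors' Hyper-RAG implementation.
from dataclasses import dataclass

@dataclass
class Hyperedge:
    entities: frozenset      # the entities this fact connects
    text: str                # the passage handed to the LLM as context

KB = [
    Hyperedge(frozenset({"drugA", "doseX", "conditionY"}),
              "drugA at doseX is indicated for conditionY."),
    Hyperedge(frozenset({"drugA", "drugB"}),
              "drugA interacts with drugB."),
]

def retrieve(query_entities: set, k: int = 2) -> list[str]:
    # Rank hyperedges by overlap with the entities mentioned in the query.
    scored = sorted(KB, key=lambda e: len(e.entities & query_entities),
                    reverse=True)
    return [e.text for e in scored[:k] if e.entities & query_entities]

print(retrieve({"drugA", "conditionY"}))
```
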
Article
Computer Science and Mathematics
Data Structures, Algorithms and Complexity

Guilherme Vilar Balduino,

Fredy João Valente

Abstract: This article explores the implementation of data fusion architectures in IoT systems for precipitation forecasting. IoT networks enable the collection of large volumes of real-time environmental data, which, when combined through data fusion techniques, provide a cohesive and comprehensive dataset for analysis. Using MongoDB as a storage and processing platform, a temporal fusion approach was implemented to analyze seasonal trends and recurring patterns in environmental data. The study highlights the challenges of integrating heterogeneous data, including the presence of outliers, and proposes solutions based on advanced data analysis and machine learning techniques. Data preprocessing techniques, such as outlier detection and normalization, were applied to enhance data quality before fusion. Results demonstrate that temporal fusion, combined with machine learning, significantly improves the accuracy and efficiency of precipitation forecasting systems. Key techniques such as Random Forest were employed, and performance was evaluated using metrics like MAE, MSE, R², and Cross-Validation MAE (CV_MAE). The findings indicate that temporal fusion, especially when combined with exponential smoothing, surpasses other methods, providing a robust approach to precipitation forecasting.
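
A minimal sketch of the evaluation loop the abstract names, assuming scikit-learn; synthetic features stand in for the fused IoT sensor data, and CV_MAE is obtained by negating scikit-learn's neg_mean_absolute_error scorer.

```python
# Random Forest regressor scored with MAE, MSE, R², and cross-validated
# MAE; the synthetic data stands in for the fused IoT sensor features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import cross_val_score, train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))              # e.g. temperature, humidity, ...
y = 2 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.1, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)

print("MAE:", mean_absolute_error(y_te, pred))
print("MSE:", mean_squared_error(y_te, pred))
print("R2 :", r2_score(y_te, pred))
# CV_MAE: sklearn scorers are maximized, hence the negated MAE.
cv_mae = -cross_val_score(model, X, y, cv=5,
                          scoring="neg_mean_absolute_error").mean()
print("CV_MAE:", cv_mae)
```
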
Article
Computer Science and Mathematics
Software

Muhammad Haris Firdaus bin Zainol Mahariq,

Waleed Arnaout,

Muhammad Yamin Muiz,

Kao Cheng Yuan (Isaac),

Kao Cheng Buo (Caleb),

Tazreed Sabeel,

Noor Ul Amin

Abstract: The retail industry is undergoing a paradigm shift with the increasing adoption of self-checkout systems aimed at improving customer experience and operational efficiency. The present study explores the design, development, and testing of a multi-functional self-checkout system incorporating features such as barcode scanning, AI-based product identification, multiple payment options, and security features. Using an Agile project management approach, the system was developed with a focus on user experience, security, and integration with existing retail infrastructure. The study proposes various testing approaches, including black-box, white-box, integration, and security testing, to ensure the reliability and usability of the system. A comprehensive risk management and maintenance plan is also proposed to ensure system longevity and flexibility. The findings show that well-designed self-checkout systems can significantly reduce checkout time, improve customer satisfaction, and optimize retail operations without exposing the business to security and user-error risks.
Article
Computer Science and Mathematics
Computer Networks and Communications

Il-Kyu Ha

Abstract: Drones are widely used in urban air pollution monitoring. Although studies have focused on single-drone applications, collaborative approaches to air pollution detection remain relatively underexplored. This paper presents a 3D cube-based adaptive cooperative search algorithm that allows two drones to collaborate in exploring air pollution. The search space is divided into cubic regions; each drone explores the upper or lower half of a cube and collects data at its vertices. The vertex with the highest measurement is selected by comparing the collected data, and an adjacent cube-shaped search area is generated for the next round of exploration. The search continues iteratively until some vertex measurement reaches a predefined threshold. An improved algorithm is also proposed to address the divergence and oscillation that can occur during the search. In simulations, the proposed method outperformed linear search in terms of CPU time and search distance. Additionally, the cooperative search method using multiple drones was more efficient than single-drone exploration on the same measures. In real-world experiments, multiple drones running the proposed algorithm successfully detected cubes containing air pollution above the threshold level. The findings serve as an important reference for research on drone-assisted target exploration, including air pollution detection.
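
A schematic, single-agent reading of the search loop, with a placeholder measurement function and a re-centering rule of our own choosing; the paper's method additionally splits each cube's vertices between two cooperating drones.

```python
# Schematic single-agent version of the cube-based search loop: measure
# pollution at the 8 vertices of the current cube, move to a cube around
# the best vertex, stop at a threshold. measure(), cube size, and the
# threshold are placeholders, not values from the paper.
import itertools

def vertices(corner, size):
    x, y, z = corner
    return [(x + dx, y + dy, z + dz)
            for dx, dy, dz in itertools.product((0, size), repeat=3)]

def search(measure, corner, size, threshold, max_steps=100):
    for _ in range(max_steps):
        best = max(vertices(corner, size), key=measure)
        if measure(best) >= threshold:
            return best                        # pollution source located
        # Re-center the next cube on the most promising vertex.
        corner = tuple(c - size / 2 for c in best)
    return None

# Toy pollution field peaking at (10, 10, 10):
field = lambda p: 100 - sum((c - 10) ** 2 for c in p)
print(search(field, corner=(0, 0, 0), size=4, threshold=99))
```
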
Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Zhelin Liu

Abstract: The electrocardiogram (ECG) is a fundamental tool for diagnosing a wide range of cardiac conditions. The application of artificial intelligence (AI) to ECG analysis has shown significant potential in improving diagnostic accuracy and efficiency. This review provides a comprehensive overview of the current state of AI in ECG recognition, exploring the methodologies, applications, challenges, and future directions of this rapidly evolving field. We delve into the seminal research papers that have shaped the landscape of AI-enhanced ECG, discuss the various machine learning and deep learning techniques employed, and highlight the diverse applications of AI in diagnosing different cardiac diseases. Furthermore, we examine the performance evaluation metrics used, the current challenges and limitations, the crucial role of data preprocessing and feature engineering, and the perspectives of clinicians who utilize and evaluate AI in ECG recognition. This review aims to offer valuable insights for researchers, clinicians, and healthcare professionals interested in the intersection of AI and cardiology.
Short Note
Computer Science and Mathematics
Probability and Statistics

Yudong Tang

Abstract: This article studies the terminal distribution of multivariate Brownian motion with non-constant correlations. In particular, under the assumption that the correlation function is driven by one factor, we develop PDEs that quantify the moments of the conditional distribution of the other factors. By using the normal distribution and moment matching, we find a good approximation to the true Fokker-Planck solution; the method offers good analytic tractability and fast performance owing to the low dimension of the PDEs to be solved. The method can be applied to model the correlation-skew effect in quantitative finance, or to other cases where a non-constant correlation is desired when modelling a multivariate distribution.
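
One possible formalization of the two-factor case, in our own notation rather than necessarily the author's:

```latex
% A sketch in our own notation (an assumption about the setup, not a
% reproduction of the note's equations). X_1 drives the correlation:
\[
  dX_i = \sigma_i\, dW_i \quad (i = 1, 2), \qquad
  d\langle W_1, W_2 \rangle_t = \rho\!\left(X_1(t)\right) dt .
\]
% The joint density p(t, x_1, x_2) then satisfies the Fokker--Planck
% equation with a state-dependent diffusion matrix:
\[
  \partial_t p
  = \tfrac{1}{2}\, \partial_{x_1 x_1}\!\bigl(\sigma_1^2\, p\bigr)
  + \partial_{x_1 x_2}\!\bigl(\rho(x_1)\, \sigma_1 \sigma_2\, p\bigr)
  + \tfrac{1}{2}\, \partial_{x_2 x_2}\!\bigl(\sigma_2^2\, p\bigr).
\]
% Matching the conditional moments E[X_2^k | X_1(T) = x_1] to a normal
% family replaces this 2D problem with low-dimensional PDEs, which is
% where the analytic tractability comes from.
```
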

Article
Computer Science and Mathematics
Algebra and Number Theory

Triston Miller

Abstract: This paper proposes the Law of Symbolic Dynamics, a new theoretical framework within Symbolic Field Theory (SFT) that explains the emergence of irreducible structures, such as prime numbers, Fibonacci numbers, and square-free integers, through symbolic interference in a dynamic compression field. Unlike traditional methods of prime number identification, which rely on sieving or divisibility testing, this model interprets primes as emergent points in a symbolic field, driven by the dynamics of symbolic curvature, force, mass, momentum, and energy. Using computational models of symbolic curvature based on Euler's totient function, we show that symbolic collapse zones align with prime numbers with over 98.6% accuracy, offering a deterministic approach to prime prediction. This work introduces the Orbital Collapse Law, which allows for the stepwise prediction of primes without external verification or sieving, marking a transition from probabilistic number theory to a generative model based on geometric principles. Finally, we discuss the broader applicability of this framework to other irreducible structures and propose future directions for extending Symbolic Field Theory beyond number theory to domains such as language, perception, and music.
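
The abstract does not give the curvature definition, so the sketch below uses a hypothetical stand-in, the discrete second difference of Euler's totient, purely to show the shape of such an experiment: compute a totient-based signal, flag "collapse" points, and compare them with the primes. The stand-in should not be read as the paper's actual construction.

```python
# Purely illustrative: "curvature" here is a hypothetical proxy (the
# discrete second difference of Euler's totient), not the paper's
# definition. Since phi(p) = p - 1 spikes at primes, the proxy tends to
# be negative there, which is what the comparison below checks.
from sympy import totient, isprime

def curvature(n):
    # Hypothetical proxy: second difference of phi at n.
    return totient(n + 1) - 2 * totient(n) + totient(n - 1)

N = 200
collapse = {n for n in range(3, N) if curvature(n) < 0}
primes = {n for n in range(3, N) if isprime(n)}
print(f"{len(collapse & primes)}/{len(primes)} primes fall in the "
      "proxy's collapse zones")
```
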
Article
Computer Science and Mathematics
Probability and Statistics

Moritz Sohns

Abstract: The development of stochastic integration theory throughout the 20th century has led to several definitions of, and approaches to, the stochastic integral, particularly for predictable integrands and semimartingale integrators. This survey provides an overview of the two prominent approaches to defining the stochastic integral: the classical approach attributed to Itô, Meyer, and Jacod, and the more contemporary functional-analytic approach mainly developed by Bichteler and Protter. It also delves into the historical milestones and achievements in this area and analyzes them from a modern perspective. Drawing inspiration from the similarities between the existing approaches, this survey introduces a new topology-based approach to the general vector-valued stochastic integral for predictable integrands and semimartingale integrators. This new approach provides a faster, simpler way to define the general integral and offers a self-contained derivation of its key properties without depending on semimartingale decomposition theorems.
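
For orientation, every approach the survey compares starts from the same elementary object and differs only in how it is extended:

```latex
% The common starting point: for a simple predictable integrand
% H = h_0 1_{\{0\}} + \sum_i h_i 1_{(t_i, t_{i+1}]}, with each h_i
% F_{t_i}-measurable and bounded, the integral against a semimartingale
% X is the Riemann-type sum
\[
  \int_0^t H\, dX
  \;=\; \sum_{i} h_i \bigl(X_{t_{i+1} \wedge t} - X_{t_i \wedge t}\bigr).
\]
% The approaches then differ in how this map is extended to general
% predictable integrands: via decomposition of X into a local martingale
% plus finite-variation part (Ito, Meyer, Jacod), or via continuity of
% the map in a suitable topology (Bichteler, Protter).
```
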
Article
Computer Science and Mathematics
Analysis

Anant Chebiam

Abstract: This paper presents a novel mathematical framework for addressing the Navier-Stokes existence and smoothness problem, one of the seven Millennium Prize Problems. We introduce new mathematical tools that extend beyond traditional partial differential equation theory to establish the global existence and uniqueness of smooth solutions to the three-dimensional Navier-Stokes equations. Our approach combines advanced functional analysis with innovative harmonic analysis techniques to overcome the longstanding difficulties associated with the nonlinear term and pressure. The theoretical results are validated through numerical simulations and demonstrate practical applications in turbulence prediction for aircraft design, weather forecasting, and blood flow modeling.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Lianghao Tan,

Zhuo Peng,

Xiaoyi Liu,

Weixi Wu,

Dong Liu,

Ruobing Zhao,

Huangqi Jiang

Abstract: This paper presents an efficient Grey Wolf Optimizer (EGWO) designed to address the limitations of the standard Grey Wolf Optimizer (GWO), focusing on reducing memory usage and accelerating convergence. The proposed method integrates Sinusoidal Mapping for enhanced population diversity and a Transverse-Longitudinal Crossover strategy to balance global exploration and local exploitation. These innovations improve search efficiency and optimization precision while maintaining a lightweight computational footprint. Experimental evaluations on 10 benchmark functions demonstrate EGWO's superior performance in convergence speed, solution accuracy, and robustness. Its application to hyperparameter tuning of a Random Forest model for a housing price dataset confirms its practical utility, further supported by SHAP-based interpretability analysis.
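
A compact sketch of the two named ingredients in their common textbook forms: the sinusoidal chaotic map x_{k+1} = a x_k^2 sin(pi x_k) with a = 2.3 for population initialization, and the standard GWO position update. The paper's transverse-longitudinal crossover and any EGWO-specific constants are not reproduced here.

```python
# Textbook-form sketch; the paper's crossover strategy is not shown.
import numpy as np

def sinusoidal_init(n_wolves, dim, lo, hi, a=2.3, x0=0.7):
    # Sinusoidal chaotic map x_{k+1} = a * x_k^2 * sin(pi * x_k); from
    # x0 = 0.7 the orbit stays in a chaotic band inside (0, 1).
    x, pop = x0, np.empty((n_wolves, dim))
    for i in range(n_wolves):
        for j in range(dim):
            x = a * x * x * np.sin(np.pi * x)
            pop[i, j] = lo + (hi - lo) * x
    return pop

def gwo(f, dim=5, n=20, iters=200, lo=-5.0, hi=5.0):
    X = sinusoidal_init(n, dim, lo, hi)
    for t in range(iters):
        a = 2 - 2 * t / iters                  # control parameter decays 2 -> 0
        alpha, beta, delta = X[np.argsort([f(x) for x in X])[:3]]
        for i in range(n):
            steps = []
            for leader in (alpha, beta, delta):
                r1, r2 = np.random.rand(dim), np.random.rand(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                steps.append(leader - A * np.abs(C * leader - X[i]))
            X[i] = np.clip(np.mean(steps, axis=0), lo, hi)  # follow 3 leaders
    return min(X, key=f)

best = gwo(lambda x: float(np.sum(x * x)))     # sphere benchmark
print("best sphere value:", float(np.sum(best * best)))
```
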
Article
Computer Science and Mathematics
Algebra and Number Theory

Triston Miller

Abstract: This study introduces a comprehensive framework—Symbolic Field Theory (SFT)—for modeling the emergence and recurrence of irreducible mathematical structures, including prime numbers, square-free integers, Fibonacci and Lucas sequences, Mersenne primes, and other symbolic attractors. Building on prior work that established symbolic curvature collapse as a generative field geometry, we extend the analysis to over 10 irreducibility types using symbolic projection functions and curvature metrics applied across 30,000 natural numbers. Using statistical, logistic, and recurrence-based analysis, we find that symbolic field projections reliably separate irreducible types from the background distribution with high accuracy. Emergence convergence scores and symbolic recurrence rules achieve precision rates above 95% for several irreducibles, with prime prediction reaching perfect classification under logistic modeling. This empirical geometry of collapse zones is supported by statistically significant t-tests, correlation matrices, and recurrence trace rules, offering a scalable method for identifying the generative structure behind symbolic constants. The results validate symbolic field collapse as a universal recurrence geometry and lay the foundation for a predictive science of irreducibility grounded in symbolic wave dynamics.
Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Shufen Lei,

Yin Hua,

Shufen Zhihao

Abstract: Foundation models have revolutionized artificial intelligence by achieving state-of-the-art performance across a wide range of tasks. However, fine-tuning these massive models for specific applications remains computationally expensive and memory-intensive. Parameter-Efficient Fine-Tuning (PEFT) techniques have emerged as an effective alternative, allowing adaptation with significantly fewer trainable parameters while maintaining competitive performance. This survey provides a comprehensive overview of PEFT, covering its theoretical foundations, major methodologies, empirical performance across various domains, and emerging trends. We begin by exploring the motivation behind PEFT, emphasizing the prohibitive cost of full fine-tuning and the necessity for more efficient adaptation strategies. We then categorize and discuss key PEFT techniques, including adapters, Low-Rank Adaptation (LoRA), prefix tuning, and prompt tuning. Each method is analyzed in terms of its architectural modifications, computational efficiency, and effectiveness across different tasks. Additionally, we present the theoretical underpinnings of PEFT, such as low-rank reparameterization and the role of sparsity in fine-tuning. Empirical evaluations are examined through large-scale benchmarking studies across natural language processing, vision, and speech tasks. We highlight trade-offs between efficiency and performance, demonstrating that PEFT methods can achieve near full fine-tuning accuracy with significantly reduced resource requirements. Furthermore, we discuss recent advancements in hybrid PEFT approaches, continual learning, hardware-aware optimization, and PEFT applications beyond traditional machine learning, including edge AI and scientific computing. Despite its advantages, several open challenges remain, including scalability to ultra-large models, robustness against adversarial attacks, and improved generalization across diverse tasks. We outline future research directions that aim to address these challenges and enhance the efficiency, adaptability, and security of PEFT methods. By summarizing key findings and identifying critical research gaps, this survey serves as a comprehensive resource for researchers and practitioners interested in optimizing the fine-tuning of foundation models. As PEFT continues to evolve, it holds the potential to make large-scale AI models more accessible, efficient, and widely deployable across real-world applications.
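
As a concrete instance of the low-rank reparameterization discussed above, here is a minimal LoRA-style layer in PyTorch; the rank, scaling, and layer sizes are illustrative choices, not values from the survey.

```python
# Minimal sketch of LoRA's low-rank reparameterization: the frozen base
# weight W is augmented by a trainable rank-r update B @ A, scaled by
# alpha / r, so the update starts at zero (B is zero-initialized).
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():       # freeze the pretrained layer
            p.requires_grad_(False)
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scale = alpha / r                 # common LoRA scaling choice

    def forward(self, x):
        # y = base(x) + scale * x A^T B^T
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(nn.Linear(768, 768))
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
total = sum(p.numel() for p in layer.parameters())
print(f"trainable: {trainable} of {total} parameters")
```
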
Article
Computer Science and Mathematics
Discrete Mathematics and Combinatorics

Anant Chebiam

Abstract: We introduce a new axiom, the Chebiam Continuum Axiom (CCA), which provides a novel perspective on the Continuum Hypothesis (CH). By integrating computational methods with classical set theory, we develop forcing techniques that construct models where the CCA holds and demonstrate its consistency with ZFC. Our approach leverages large cardinal properties to establish a hierarchical structure on the power set of ℵ₀, revealing an intricate stratification between ℵ₀ and 2^ℵ₀. This stratification suggests that the classical formulation of CH as a binary question may be inadequate. We prove that the CCA is independent of ZFC but compatible with large cardinal axioms, offering a new framework that reconciles seemingly contradictory intuitions about the continuum. Our computational simulations provide empirical support for the theoretical results, suggesting that CCA captures essential properties of the continuum that extend beyond the traditional scope of CH.
Review
Computer Science and Mathematics
Other

Asset Durmagambetov

Abstract: Artificial Intelligence (AI) faces a range of mathematical challenges, such as optimization, generalization, model interpretability, and phase transitions. These issues significantly limit the application of AI in critical domains such as medicine, autonomous systems, and finance. This article examines the primary mathematical problems of AI and proposes solutions based on the universality of the Riemann zeta function. Furthermore, AI, as a major trend attracting hundreds of billions of dollars, is now tasked with addressing humanity's most complex challenges, including nuclear fusion, turbulence, the functioning of consciousness, the creation of new materials and medicines, genetic issues, and catastrophes such as earthquakes, volcanoes, and tsunamis, as well as climatic and social upheavals, ultimately aiming to elevate civilization to a galactic level. All these problems, both listed and unlisted, are interconnected by the issue of prediction and the problem of "black swans" within existing challenges. This work offers an analysis of AI's problems and potential pathways to overcome them, which, in our view, will strengthen the trends established by our great predecessors and become foundational in mastering AI.
Article
Computer Science and Mathematics
Algebra and Number Theory

Triston Miller

Abstract: This paper proposes the Law of Emergence, a theoretical principle within Symbolic Field Theory that aims to explain the appearance of irreducible structures, such as prime numbers, Fibonacci terms, and square-free integers, through symbolic interference. Rather than assuming irreducibles arise from isolated axioms or randomness, we explore the hypothesis that these structures emerge from the interaction of multiple symbolic fields operating over a shared domain. Using computational models of symbolic curvature (via Miller's Law) and collapse zone detection, we observe consistent patterns of enrichment, wherein irreducibles tend to cluster at points of constructive symbolic interaction. While not definitive, these findings provide empirical support for the Law of Emergence as a generative framework, suggesting new directions for modeling pattern formation across mathematical and cognitive domains. We conclude by outlining the theoretical and experimental limitations, emphasizing that this work represents an early-stage contribution toward a unifying model of symbolic emergence. In a further extension of this work, we derive and empirically validate a symbolic recurrence rule, termed the Orbital Collapse Law, which predicts the next prime from the previous one with over 98.6% accuracy using only symbolic curvature collapse. This recurrence operates without sieving or verification, offering the first curvature-based generative law for prime emergence and marking a critical transition from descriptive alignment to predictive irreducibility within the Symbolic Field framework.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Farouk Fayez Faky,

Maria Hilal Baswaid,

Nur Aliyah Zafirah Binti M Yusri,

Noor Ul Amin

Abstract: Fire alarm systems play a crucial role in preventing deaths and reducing property loss during emergencies. However, traditional smoke detectors have limitations such as response delay and a high probability of failure. This study examines the application of machine learning algorithms to enhance the efficiency and accuracy of fire detection. Using a dataset that includes various environmental inputs, such as temperature, humidity, atmospheric pressure, gaseous emissions, and particle concentrations, we create predictive models to detect fire risks in real time. The study applies data preprocessing, feature engineering, and a set of classification models, namely Decision Trees, Logistic Regression, and Random Forest. The findings suggest that the Random Forest classifier outperforms the other models in accuracy and reliability, making it the most suitable method for fire detection. Furthermore, correlation analysis and exploratory data analysis (EDA) are used to identify the key feature correlations influencing fire incidence. The outcomes demonstrate that machine learning-based fire detection systems can improve response times and reduce false alarms, leading to improved safety.
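
A hedged sketch of the model comparison the abstract describes, assuming scikit-learn; the synthetic data and the toy label rule stand in for the real sensor dataset.

```python
# Three classifiers on environmental features, scored by accuracy;
# synthetic data and a toy "fire" label rule for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))   # temperature, humidity, pressure, gas, PM
y = ((X[:, 0] > 0.5) & (X[:, 3] > 0)).astype(int)   # toy label rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
models = {
    "DecisionTree": DecisionTreeClassifier(random_state=0),
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "RandomForest": RandomForestClassifier(random_state=0),
}
for name, m in models.items():
    print(name, m.fit(X_tr, y_tr).score(X_te, y_te))
```
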
Article
Computer Science and Mathematics
Software

Mridul Bhattacharjee,

Mohamed Muzni Mohamed Ziham,

Rozin Khan,

Syed Athif Usman,

Mohamed Zeedhan,

Abdelrahman Mahmoud Mohamed Afifi Mohamed,

Noor Ul Amin

Abstract: High-Performance Computing (HPC) has revolutionized computational science by enabling the processing of vast amounts of data and the execution of complex simulations at remarkable speed and efficiency. This paper describes various computing paradigms, namely client-server computing, distributed computing, cloud computing, and edge computing, and elaborates on their technology drivers. Particular focus is given to HPC: its architecture, programming models, performance metrics, scalability, and applications. The article highlights the significant impact of HPC on science in the fields of medicine, biophysics, business, and engineering, where it facilitates paradigm-altering scientific discoveries, economic forecasting, and simulations of complex engineering systems. In addition, the article discusses challenges in deploying HPC, such as scalability, resource management, and the integration of multi-core processors. Through comparative analysis and benchmarking techniques, the research points to the necessity of continuous hardware and software innovation to keep HPC systems efficient and sustainable. The study brings out the revolutionary potential of HPC in modern computing and its central role in solving the most complex computational problems of the modern age.
Article
Computer Science and Mathematics
Mathematical and Computational Biology

Lei Guo,

Xin Guo,

Feiya Lv

Abstract: As Virtual Reality technology is applied ever more deeply in medical domains, surgical simulators are receiving wide attention. As safe and efficient surgical training equipment, a surgical simulator provides surgeons with a safe environment in which to practice surgical skills. The aim of a surgical simulator is to calculate the responses of soft tissue in the surgical scene caused by surgical tools; the basic task is to simulate the deformation of soft tissue and the cutting caused by the scalpel. The nonlinearity of soft tissue, together with the failure of pre-computed quantities caused by topology modification, poses great challenges for simulating deformation and cutting in real time. In the proposed Dual-Mode Hybrid Dynamic Finite Element Algorithm (DHD-FEA), the Finite Element Method (FEM) is adopted to simulate the deformation of soft tissue: a nonlinear FEM model is applied to the operational area for accuracy, and a linear FEM model is applied to the non-operational area for efficiency. A dynamic time-integration scheme is applied to solve the finite element equations. Experiments show a good balance between the accuracy and efficiency of the deformation simulation.
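
In standard FEM notation (ours, not necessarily the paper's), the coupled regimes can be summarized as follows.

```latex
% Standard FEM notation (ours, not necessarily the paper's) for the two
% regimes the abstract couples. Semi-discrete equation of motion:
\[
  M \ddot{u} + C \dot{u} + K(u)\, u = f_{\text{ext}},
\]
% with the stiffness K constant in the linear (non-operational) region
% and state-dependent, K = K(u), in the nonlinear (operational) region
% near the surgical tool; a dynamic time integrator advances u each
% frame fast enough for real-time visual and haptic feedback.
```
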
Article
Computer Science and Mathematics
Discrete Mathematics and Combinatorics

Kunle Adegoke

Abstract: Using an elementary approach involving the Euler Beta function and the binomial theorem, we derive two polynomial identities; one of which is a generalization of a known polynomial identity. Two well-known combinatorial identities, namely Frisch's identity and Klamkin's identity, appear as immediate consequences of these polynomial identities. We subsequently establish several combinatorial identities, including a generalization of each of Frisch's identity and Klamkin's identity. Finally, we develop a scheme for deriving combinatorial identities associated with polynomial identities of a certain type.
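
For reference, the two elementary ingredients the derivation rests on are the Euler Beta function and the binomial theorem:

```latex
% The two elementary ingredients named in the abstract:
\[
  B(a, b) = \int_0^1 t^{a-1} (1 - t)^{b-1}\, dt
          = \frac{\Gamma(a)\, \Gamma(b)}{\Gamma(a + b)},
  \qquad
  (x + y)^n = \sum_{k=0}^{n} \binom{n}{k} x^k y^{n-k}.
\]
% Roughly, expanding a power inside the Beta integral via the binomial
% theorem and integrating term by term is the kind of elementary step
% that produces polynomial identities of the type the paper derives.
```
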
