
Article
Social Sciences
Education

Heeseong Ahn

,

Chungwan Lim

Abstract: In second language learning, research on multimodal input often assumes that when learners receive both audio and written text, comprehension becomes easier because cognitive load is reduced. This assumption, however, may not fully explain how multimodal input works in digital reading. The present study reexamines this assumption by investigating whether listening-reading input regulates learner engagement rather than simply lowering cognitive demand during digital EFL reading. Forty Korean university EFL learners were randomly assigned to either a text-only reading condition or a listening-reading condition in which audio accompanied the written text. Vocabulary knowledge was measured using a 20-item test administered before and after the intervention, and delayed retention was examined descriptively. Heart rate data were continuously recorded in order to examine physiological responses during task performance and the recovery period. The results showed that the listening-reading group made clear gains in vocabulary scores (M = 67.00 → 80.25; t(19) = −11.395, p < .001, η² = .872, d = 2.47). In contrast, the text-only group did not show statistically significant improvement (p = .096). Contrary to the expectation that multimodal input would reduce physiological load, participants in the listening-reading condition exhibited a higher task heart rate (d = .59) and greater elevation relative to baseline (d = .63). These results indicate that listening-reading may facilitate lexical acquisition through heightened engagement and attentional activation rather than simple cognitive load reduction. The findings provide additional insight for research on multimodal input in SLA and suggest that an appropriate level of cognitive engagement may play an important role in digital reading environments.
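As a quick consistency check on the statistics reported in this abstract, eta squared for a paired t-test can be recovered from the t value and its degrees of freedom via the standard identity η² = t² / (t² + df). The sketch below applies that identity to the reported values; the function name is a placeholder, and the formula is a textbook identity rather than anything taken from the study itself.

```python
# Recover eta-squared for a t-test from the t statistic and degrees of freedom.
# Identity: eta^2 = t^2 / (t^2 + df)
def eta_squared(t: float, df: int) -> float:
    return t * t / (t * t + df)

# Values reported for the listening-reading group: t(19) = -11.395
eta = eta_squared(-11.395, 19)
print(round(eta, 3))  # 0.872, matching the reported eta^2 = .872
```

The reported Cohen's d depends on the raw score standard deviations, so it cannot be re-derived from t and df alone in the same way.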

Article
Medicine and Pharmacology

Sabri El-Saied

,

Ruth Smadar-Shneyour

,

Junah Istaitih

,

Walid Shalata

,

Muhammad Abu Tailakh

,

Muhammad Abu-Arar

,

Orian Simhon

,

Naim Abu-Freha

Abstract: Background: Eosinophilic esophagitis (EoE) and gastroesophageal reflux disease (GERD) are distinct conditions, yet they often exhibit overlapping clinical, endoscopic, and histological features. Aim: This study aims to assess the prevalence of GERD among patients with EoE and to identify risk factors for both typical and atypical GERD symptoms. Methods: We conducted a population-based retrospective analysis of patients diagnosed with EoE and collected data on demographics, GERD diagnosis, symptoms, comorbidities, and laboratory results. GERD symptoms were classified as typical (e.g., heartburn and regurgitation) or atypical (e.g., hoarseness and cough). Results: Of the 2,496 patients with EoE (73.2% male), 48.5% exhibited GERD symptoms: 26.2% prior to and 30.5% following the diagnosis of EoE. Typical symptoms predominated before EoE diagnosis (26% vs. 4.3% after diagnosis), whereas atypical symptoms were more frequent after diagnosis (28% vs. 0.4% before diagnosis), with cough increasing from 0.4% to 27.4%. In the multivariate analysis, allergic rhinitis (odds ratio [OR] 1.40, 95% confidence interval [CI] 1.15–1.75, p < 0.001) and hiatal hernia (OR 2.47, 95% CI 1.75–3.47, p < 0.001) predicted typical GERD symptoms, whereas asthma (OR 1.50, 95% CI 1.28–1.85), food allergy (OR 1.36, 95% CI 1.08–1.70), and elevated eosinophil counts (OR 2.00, 95% CI 1.44–2.73) were associated with atypical GERD symptoms. Conclusion: Approximately half of patients with EoE present with GERD symptoms, with a tendency toward more atypical symptoms after EoE diagnosis. Identifying key predictors and risk factors may improve diagnostic accuracy and facilitate the development of targeted treatment strategies.

Article
Biology and Life Sciences
Behavioral Sciences

Douglas Roy

Abstract: Traditional ideas about cognitive evolution often treat learning and instinct as zero-sum substitutes, where increased plasticity necessarily displaces hard-wired routines. We challenge this "substitution-only" view using economic concepts and by considering the evolutionary implications of "behavioural hypercycles", i.e., teams of modular component behaviours coordinated toward a functional task. The analysis shows that, while intermediate brain sizes can indeed favour substituting flexible learning for instinct, larger nervous systems trigger a sort of "Income Effect" which changes how growing neural resources are allocated between learning and instinct. Rather than displacing one another, sophisticated learning and extensive instinctive repertoires can evolve as adaptive complements under identifiable conditions. We further show that this trade-off is likely level-specific: evolution preferentially canalizes reusable, high-burden primitives (i.e., genetically assimilating behavioural components that are especially costly or difficult to learn) while leaving task-specific links plastic (e.g., subject to goal-directed control). Our analysis suggests that hypercyclic organization is a fundamental principle of complex agency, in which instinct provides the reliable scaffold that makes sophisticated learning affordable. Our model is consistent with several lines of evidence (behavioural, genetic, neurological), likely applies broadly to any animals capable of complex behaviour, and points to a range of empirically testable predictions.

Review
Medicine and Pharmacology
Reproductive Medicine

Venkatkiran Kanchustambham

Abstract: The evaluation of peripheral pulmonary lesions (PPLs) represents a persistent clinical challenge, particularly in the context of rapidly expanding lung cancer screening programs generating unprecedented volumes of screen-detected nodules requiring tissue evaluation. Conventional bronchoscopic techniques and earlier navigation platforms, including electromagnetic navigation bronchoscopy (ENB), improved access to peripheral airways but demonstrated variable and generally suboptimal diagnostic performance—historically plateauing at 60–70%—attributable principally to CT-to-body divergence and the diagnostic drop-off phenomenon. Robotic-assisted bronchoscopy (RAB) has emerged as a significant advancement integrating robotic catheter control, shape-sensing or electromagnetic navigation, and adjunctive intraprocedural imaging—principally cone-beam computed tomography (CBCT)—to improve lesion localization and procedural stability. Available evidence suggests that RAB achieves improved diagnostic yields compared with conventional bronchoscopic techniques, particularly when combined with real-time CBCT-guided tool-in-lesion confirmation and multimodal biopsy strategies including transbronchial cryobiopsy. Reported diagnostic yields generally range from 70% to 87% across pooled meta-analytic datasets, with higher yields described in centers employing CBCT and cryobiopsy. A favorable safety profile—with pooled pneumothorax rates of approximately 2%, substantially below the 20–25% associated with CT-guided transthoracic needle biopsy—represents a clinically important differentiator, particularly in patients with emphysema, coagulopathy, or contralateral lung compromise [1,3]. RAB additionally enables procedural capabilities beyond peripheral lesion biopsy, including simultaneous mediastinal lymph node staging, bilateral same-session lesion sampling, fiducial marker placement for stereotactic body radiotherapy, and investigational therapeutic applications. 
However, current evidence is largely derived from observational studies and single-arm prospective cohorts, with limited randomized data directly comparing RAB with transthoracic biopsy approaches. Outcomes are meaningfully influenced by lesion characteristics, operator experience, institutional volume, and availability of advanced imaging infrastructure. This review summarizes current evidence on RAB platforms, procedural optimization, diagnostic performance, safety, molecular tissue adequacy, and emerging applications, while explicitly addressing limitations, controversies, and areas requiring further investigation. This review was conducted and reported in accordance with PRISMA 2020 guidelines adapted for narrative reviews (SANRA framework).

Article
Business, Economics and Management
Business and Management

Stanley Mukasa

,

Sixbert Sangwa

Abstract: Background: Why promising health technology ventures stall between product development and real-world deployment remains insufficiently explained in entrepreneurship research. Market-centric models emphasize experimentation, customer discovery, and product-market fit, but these mechanisms are less explanatory where access to users, clinical settings, and procurement channels is institutionally mediated. Methods: This study develops a process explanation of venture progression in regulated health innovation settings. Drawing on institutional theory, innovation systems research, and process views of entrepreneurship, it uses a longitudinal comparative case design to analyze early-stage health technology ventures in Rwanda over an 18-month observation window. The empirical material comprises venture milestone plans, periodic progress reports, observational notes, and documented interactions with regulatory, clinical, and institutional actors. Results: Venture progression was not explained primarily by technical capability, entrepreneurial effort, or early market interest. Rather, outcomes varied according to whether ventures achieved coordinated validation across interdependent regulatory, clinical, and institutional domains. Three recurrent pathway configurations were identified: sequential alignment, associated with forward progression; temporally constrained alignment, associated with delayed progression despite coherent sequencing; and fragmented progression, associated with stagnation. The findings show that legitimacy in regulated health innovation settings is multi-domain, threshold-based, and time-dependent. Conclusion: The study advances the concept of coordinated validation under temporal constraint to explain how ventures move, or fail to move, from development to deployment when market entry depends on synchronized institutional approval. 
It contributes to entrepreneurship and innovation theory by reframing venture progression in regulated environments as a coordination problem rather than a pure capability problem. Rwanda serves as a revealing case because its comparatively coordinated health system makes the underlying synchronization problem especially visible.

Article
Computer Science and Mathematics
Algebra and Number Theory

Jan Feliksiak

,

Monica U. Feliksiak

Abstract: For over a century, the distribution of prime numbers has been modeled as a stochastic process. This study presents results from a multi-year computational census that challenges this paradigm. Using a deterministic Sequential Reflection Filter implemented on a decentralized architecture, we analyzed a specific four-prime configuration, "The Southern Cross Constellation", across the range 10^1 to 2.24 × 10^14. The method targets twin-prime seeds and applies the symmetric reflection operator to generate the structure. We identified 6,175,562 unique prime quadruples exhibiting a consistent trailing-digit signature [9,1,9,1] with zero observed deviation. Additionally, we observe an "ironing effect," characterized by a systematic reduction in relative variance η with increasing magnitude. At 10^14, the relative variance η is reduced by a factor of 40 relative to 10^9, indicating a transition into a highly regular, symmetric topological structure. These findings indicate the existence of a scale-invariant, deterministic lattice governing prime distribution. This challenges the assumption of high-entropy randomness in prime-based lattices. The study identified the Golden Gamma constant as the foundational principle governing the Southern Cross Constellation.
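The abstract does not specify the Sequential Reflection Filter or the reflection operator, so they cannot be reproduced here. Purely to illustrate what a [9,1,9,1] trailing-digit signature looks like, the hypothetical sketch below finds twin-prime seeds (p, p+2) whose digits end in (9, 1) and pairs consecutive seeds into four-prime configurations; the pairing rule is this sketch's own assumption, not the authors' construction.

```python
def is_prime(n: int) -> bool:
    """Trial-division primality test; adequate for a small illustration."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    i = 3
    while i * i <= n:
        if n % i == 0:
            return False
        i += 2
    return True

def twin_seeds_ending_9(limit: int) -> list:
    """Twin-prime seeds p with (p, p+2) prime and trailing digits (9, 1)."""
    return [p for p in range(9, limit, 10) if is_prime(p) and is_prime(p + 2)]

def quadruples(limit: int) -> list:
    """Pair consecutive twin seeds into [9,1,9,1]-signature quadruples.
    NOTE: an illustrative pairing, not the paper's reflection operator."""
    seeds = twin_seeds_ending_9(limit)
    return [(p, p + 2, q, q + 2) for p, q in zip(seeds, seeds[1:])]

print(quadruples(200)[0])  # (29, 31, 59, 61)
```

Every quadruple produced this way consists of four primes whose last digits read 9, 1, 9, 1, which is the signature the census reports; the census's actual selection rule may differ.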

Concept Paper
Business, Economics and Management
Business and Management

Abdulmohsen H. Alrohaimi

Abstract: Contemporary intelligent systems increasingly participate in shaping, rather than merely supporting, human decision-making processes. While these systems enhance efficiency and predictive performance, they also introduce a critical but underexamined challenge: decisions may remain technically valid while becoming difficult for human actors to interpret and internalize. This misalignment is conceptualized in this study as the meaning gap, defined as the divergence between system output and human interpretive understanding. This paper proposes a human-centered governance framework that positions cognitive sovereignty (the capacity of individuals and institutions to interpret, contextualize, and assume responsibility for decisions) as a necessary condition for sustainable sociotechnical systems. The framework is structured around ten interdependent principles that collectively redefine governance as a cognitive architecture embedded within system design. Drawing on interdisciplinary literature in human–AI interaction, interpretability, and decision science, the study introduces latency as a conceptual construct describing temporal misalignment between system output and human interpretive readiness. This construct provides an integrative lens for understanding phenomena such as delayed comprehension, reduced accountability, and unstable trust in algorithmically mediated environments. Rather than treating interpretability as an auxiliary feature, the proposed framework positions it as a core system function. While existing frameworks have called for interpretability, few have proposed measurable indicators of cognitive alignment; this paper contributes preliminary metrics (time-to-comprehension, decision override frequency, and confidence misalignment) that operationalize the meaning gap for empirical testing.

Brief Report
Engineering
Civil Engineering

F. Pacheco-Torgal

Abstract: The construction industry faces a dual imperative: continued growth to meet the demands of a rapidly expanding global population, and deep decarbonisation to align with planetary boundaries and climate commitments embedded in frameworks such as the European Green Deal and the EU Bioeconomy Strategy. This paper examines the potential of bio-based construction materials to bridge these competing demands, reviewing evidence across a broad spectrum of material categories — including fast-growing plant-based materials, bio-based admixtures and polymer composites for concrete, bio-based polyurethanes, nanocellulose and cellulose aerogels, plant-based biocomposites, and mycelium-based composites. The review demonstrates that bio-based materials offer compelling environmental advantages over conventional petrochemical-derived alternatives, including superior carbon sequestration potential, reduced embodied carbon, improved indoor environmental quality, and compatibility with circular economy principles. The strategic urgency of this transition has been rendered concrete by the 2026 Strait of Hormuz crisis, which triggered severe disruptions to global petrochemical supply chains and exposed the structural vulnerability of European construction to fossil-derived material inputs — reframing bio-based alternatives as a supply security imperative alongside an environmental one. However, the transition from demonstrator projects to mainstream specification practice remains constrained by persistent technical, economic, and regulatory barriers, including inconsistencies in life cycle assessment methodologies, the absence of harmonised performance standards, certification gaps, high initial costs, and fragmented supply chains. 
Crucially, the review finds that resolving these barriers depends not only on continued material innovation but equally on governance configurations, policy stability, and actor coalitions: the conditions under which green finance, circular procurement, and regulatory instruments successfully accelerate material adoption vary substantially depending on who orchestrates systemic coordination.

Communication
Chemistry and Materials Science
Organic Chemistry

Alan Aguilar-Aguilar

,

Ángel Palillero-Cisneros

,

Félix May-Moreno

,

Jorge R. Juarez-Posadas

,

Joel L. Terán

,

David M. Aparicio

Abstract: Herein, starting from (R)-(+)-α-methylbenzylamine, we report an efficient synthesis and full characterization of (R)-3-(1-hydroxyethylidene)-1-(1-phenylethyl)piperidine-2,4-dione, a new tetramic acid analogue. The key steps involved a non-classical Corey-Chaykovsky intramolecular cyclization reaction to access the corresponding zwitterion, followed by a sequential desulfurization/reduction and condensation procedure. The key intermediate was obtained in 5 steps, and the desired product 7 was obtained with an overall yield of 58%.

Article
Medicine and Pharmacology
Dentistry and Oral Surgery

Lucija Koturić Čabraja

,

Walter Dukić

,

Matea Lapas Barisic

Abstract: Background/Objectives: Children who experience high levels of dental anxiety often show poor cooperation during dental visits, which compromises treatment outcomes and leads to a vicious cycle of poor oral health and inadequate caries prevention. The purpose of this study is to analyse the prevalence of dental anxiety in a clinical sample of children, and to examine possible correlations with other factors as well as the longitudinal nature of anxiety. Methods: This study, with cross-sectional and longitudinal components, was conducted on a sample of 150 children aged 12 to 18 years in the city of Zagreb and Zagreb County. The Modified Dental Anxiety Scale (MDAS) was used to collect data related to dental anxiety at three time intervals: before the procedure (T1), after the procedure (T2), and after a period of 3 months (T3). Results: Significant differences between the MDAS before and after the procedure were found, in the sense that dental anxiety decreased in the majority of children (p < 0.001). The MDAS result was 9.79 for the T1 period and 8.03 for the T2 period, both of which fall in the mild anxiety group. There are statistically significant differences between the individual time points T1, T2, and T3 (p < 0.001); the differences are significant between T1 and T2 (p = 0.045) and between T1 and T3 (p = 0.012), while between T2 and T3 the differences are not statistically significant (p = 0.616). At T1, most children had mild dental anxiety (55.3%), followed by moderate (41.3%) and severe dental anxiety (3.3%). At T2, most children had mild dental anxiety (77.2%), followed by moderate (22.8%), with no severe dental anxiety. Financial impact on dental service use was statistically associated with dental anxiety at T1 (p = 0.015) and over the T1-T2 period (p = 0.032). A long period since the last visit to the dentist also showed significance for T1 (p = 0.003) and T2 (p = 0.014). "Urgent pain" showed a statistical correlation with the period T2-T1 (p = 0.023).
The greatest decrease on the dental anxiety scale from T1 to T2 was in subjects who had a doctorate/master's degree in the family, high family income, regular dental check-ups within 3 months, brushing their teeth several times a day for over 2 minutes with a horizontal brushing technique, use of dental floss and fluoridated toothpaste, no active caries lesions, no bad habits, use of drinking water, and rarely eating sweets. Conclusions: Most of the children in this study have mild to moderate anxiety, and it decreases after the therapeutic procedure. Increased dental anxiety is associated with urgent dental procedures/urgent pain and irregular check-ups at intervals longer than 3 months. Better oral hygiene and oral status, higher socioeconomic status, and a low cariogenic diet influence the level of dental anxiety among children.

Article
Arts and Humanities
Other

Yohanna Joseph Waliya

,

Margaret Mary Okon

Abstract: In the contemporary landscape, natural language processing (NLP) stands as a vital force, empowering computers to comprehend and engage with human languages, thereby enhancing the realm of human-computer interaction (HCI) through the utilisation of large language models (LLMs) and multilingual pre-trained language models (mPLMs). The widespread adoption of these LLMs on a global scale is evident. However, a critical observation reveals a significant gap in their capacity to effectively recognize some low-resource African languages, a concern raised by numerous researchers. This paper endeavours to contribute to the discourse by conducting a comprehensive metadata analysis of existing African language models. Through this investigation, the aim is to outline the importance, strengths, and weaknesses inherent in these models. By shedding light on these aspects, the paper seeks not only to underscore the current limitations but also to provide valuable insights and recommendations for future research endeavours in the domain of language recognition, particularly focusing on African languages. In doing so, the paper aspires to catalyse advancements that promote inclusivity and a more nuanced understanding of linguistic diversity within the realm of natural language processing. Multilingual testing will be used on Cheetah to evaluate the model's proficiency in multiple languages, including less widely spoken ones such as Margi and Ibibio, as well as to identify any language-specific weaknesses or limitations of the LLMs, especially in recognizing and understanding languages like Margi, spoken in the North-East geo-political zone of Nigeria, and Ibibio, spoken in the South-South geo-political zone of Nigeria.

Review
Medicine and Pharmacology
Anesthesiology and Pain Medicine

Claire Yuan

,

Ashu K. Goyle

,

Maged Guirguis

,

Alan D. Kaye

,

Vahid Grami

,

Karan Dave

,

Ronald J. Kulich

,

Timothy E. Deer

,

David Rosenblum

,

Vwaire Orhurhu

+2 authors

Abstract: Micro-fragmented adipose tissue (mFAT) is a promising autologous biologic in regenerative medicine because it provides a mechanically processed adipose-derived product that preserves native extracellular matrix architecture and a cellular milieu rich in mesenchymal stem cells, pericytes, growth factors, cytokines, and extracellular vesicles. Mechanistically, mFAT is hypothesized to act largely through paracrine signaling that dampens inflammation, supports vascular stabilization, and promotes cartilage and soft-tissue repair; in vitro data suggest modulation of osteoarthritic synovial macrophage signaling, including reductions in chemokines such as CCL2 and CCL3. Preparation involves liposuction harvest followed by closed, sterile mechanical processing without enzymatic digestion or cell expansion, aligning with “minimal manipulation” concepts relevant to regulatory frameworks. Preclinical animal studies generally demonstrate favorable effects on synovial inflammation and cartilage matrix markers (e.g., glycosaminoglycan content) with limited adverse events. Clinically, the strongest body of evidence is in knee osteoarthritis, where multiple prospective and retrospective studies report improvements in pain and function from months to several years after single injections, though response rates vary and study designs are heterogeneous. Evolving data support potential benefit in hip osteoarthritis and select tendon conditions, but cohorts remain small. Overall, mFAT appears safe and potentially effective, yet larger, standardized, long-term randomized controlled trials and comparative studies versus platelet-rich plasma and bone marrow aspirate concentrates are needed to clarify indications, dosing, durability, and mechanisms in vivo.

Review
Medicine and Pharmacology
Epidemiology and Infectious Diseases

Rachana Mehta

,

Amogh Verma

,

Sonam Yadav

,

Sourav Sundan

,

Sanjit Sah

,

Ranjana Sah

,

Amrendra Prasad Kushwaha

,

Roshan Kumar Mahat

,

Shriyansh Srivastava

,

Aroop Mohanty

+4 authors

Abstract: Nipah Virus (NiV) is a highly virulent zoonotic henipavirus responsible for recurrent outbreaks across South and Southeast Asia and remains a priority global health threat. Fruit bats of the genus Pteropus serve as the natural reservoir, with transmission to humans occurring through contaminated food products, infected intermediate hosts, or person-to-person spread. Clinical illness ranges from nonspecific febrile disease to rapidly progressive encephalitis and severe respiratory failure, with reported case fatality rates often exceeding 40 percent. This narrative synthesis reviews the evolving epidemiology, phylogenetic diversity, transmission pathways, clinical spectrum, diagnostic approaches, and current management strategies of NiV infection. Persistent seasonal spillovers, limited therapeutic options, the absence of licensed human vaccines, and challenges in early detection continue to hinder containment efforts. Strengthening One Health surveillance, expanding laboratory capacity, and accelerating vaccine and antiviral development remain critical priorities to mitigate future outbreaks and reduce associated morbidity and mortality.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Mehdi Chrifi Alaoui

,

Nour-eddine Joudar

,

Mohamed Ettaouil

Abstract: The self-attention mechanism has revolutionized sequence modeling but suffers from quadratic computational complexity with respect to sequence length, limiting its applicability to long sequences. We propose Sparse Projection Attention (SPA), a novel attention variant that leverages learnable sparse projections to reduce the effective dimensionality of queries and keys while maintaining expressive power. Our method is grounded in the Johnson-Lindenstrauss lemma and provides theoretical guarantees on distance preservation. We introduce a comprehensive mathematical framework including error bounds, convergence analysis, and gradient dynamics. Experimental results on language modeling, machine translation, and long-range sequence classification demonstrate that SPA achieves up to 8× computational speedup while maintaining competitive performance compared to standard attention and other sparse variants. The proposed approach offers an effective trade-off between computational efficiency and model expressivity for long-sequence tasks, making transformers more accessible for resource-constrained environments and real-time applications.
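The abstract does not give SPA's equations, but the core idea it describes (projecting queries and keys to a lower dimension r < d before the attention product, so that the n × n score computation costs O(n²r) rather than O(n²d)) can be sketched in a few lines of NumPy. In this illustration the projection P is random and dense, a stand-in for the learnable sparse projection of the paper, and all names are placeholders; Johnson-Lindenstrauss-style arguments justify using such a random P to approximately preserve pairwise distances.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def projected_attention(Q, K, V, P):
    """Scaled dot-product attention with queries/keys projected by P (d x r)."""
    Qp, Kp = Q @ P, K @ P                       # (n, r) instead of (n, d)
    scores = Qp @ Kp.T / np.sqrt(P.shape[1])    # dot products in the reduced space
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
n, d, r = 6, 16, 4
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
P = rng.normal(size=(d, r)) / np.sqrt(r)        # stand-in for the learned sparse projection
out = projected_attention(Q, K, V, P)
print(out.shape)  # (6, 16)
```

The speedup comes from replacing d-dimensional dot products with r-dimensional ones inside the quadratic score matrix; the quadratic dependence on sequence length itself is unchanged in this simplified sketch.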

Article
Computer Science and Mathematics
Applied Mathematics

Guorui Chen

Abstract: We study the problem of constructing group-invariant embeddings that faithfully represent data modulo group symmetries, a task that arises naturally in signal processing, physics, and machine learning. A central challenge is to design embeddings that are simultaneously orbit-separating, stable, and computationally tractable. To address this, we develop a general lifting framework for constructing such embeddings. The key idea is to start from a group-invariant embedding defined on a low-dimensional reduced space, and lift it to the ambient space by composing it with a finite family of parameterized linear maps, followed by an aggregation step that produces a global embedding. This framework provides a unified perspective that connects classical problems such as phase retrieval and permutation-invariant embeddings. We demonstrate the effectiveness of this framework in the finite group setting. In this setting, we establish general sufficient conditions for orbit separation and prove that any orbit-separating lifting embedding is automatically bi-Lipschitz. We further extend the bi-Lipschitz result to sparse regimes, and show that, when applied to phase retrieval, it yields an equivalence between uniqueness and stability for real sparse phase retrieval.

Brief Report
Medicine and Pharmacology
Cardiac and Cardiovascular Systems

Ziyad Gunga

,

Augustin Rigollot

,

Elsa Hoti

,

Zied Eltaief

,

Gabriel Saiydoun

,

Anna Nowacka

,

Valentina Rancati

,

Florine Valliet

,

Matthias Kirsch

Abstract: Background: Aorto-right ventricular fistula (ARVF) secondary to membranous septum rupture is an exceptionally rare complication after surgical aortic valve replacement (SAVR). While sutureless prostheses such as the Perceval valve have gained wide acceptance due to reduced cross-clamp times and procedural simplification, reported adverse events predominantly include conduction disturbances and paravalvular leaks. Structural septal disruption remains sparsely described. We report a case of early ARVF after Perceval implantation and review the pathophysiological and procedural mechanisms implicated in septal injury following sutureless and transcatheter aortic valve interventions. Case Description: A 66-year-old woman with severe bicuspid aortic valve stenosis underwent SAVR via median sternotomy using a Perceval XL prosthesis after meticulous annular decalcification and sizing. Immediate intraoperative transesophageal echocardiography (TEE) confirmed optimal seating without paravalvular regurgitation. Within 24 hours, the patient developed complete atrioventricular block followed by cardiogenic shock. Repeat TEE revealed a large ARVF with a significant left-to-right shunt. Emergent re-exploration identified a membranous septum tear. The Perceval prosthesis was explanted, the defect closed with reinforced patch repair, and a 27 mm Inspiris Resilia bioprosthesis was implanted. Peripheral veno-arterial ECMO support was required temporarily. The patient recovered and remained free of prosthetic dysfunction at two-year follow-up. Discussion: Membranous septum rupture after AVR has an estimated incidence of 0.4–1.5% in TAVR cohorts but is virtually unreported with Perceval valves. Mechanisms include chronic radial stress from oversized or malpositioned prostheses. Case reports with TAVR devices emphasize oversizing as a risk factor.
Predictive factors for septal injury in sutureless AVR mirror those for conduction disturbances: valve oversizing, shallow infra-annular septal length, heavy calcification, and prior valve surgery. Preventive measures (strict sizing protocols, avoidance of balloon dilation, and optimized implantation depth) have reduced conduction complications and may mitigate septal trauma. The choice between percutaneous and surgical closure depends on hemodynamic stability, defect size and anatomy, and operative risk. Conclusion: Early ARVF after Perceval implantation is exceedingly rare but potentially catastrophic. Strict adherence to sizing principles, awareness of septal anatomy, and prompt management (percutaneous in selected stable cases, surgical in acute large defects) are essential to optimize outcomes in sutureless AVR.

Article
Business, Economics and Management
Business and Management

Daniel Yi-Fong Lin

Abstract: Single-machine, single-product inventory models with generalized interarrival times have long lacked a rigorously justified and computationally tractable optimization framework, owing to a sign mis-specification in prior derivations that spawned ad hoc feasibility restrictions and expansive search domains. A corrected, succinct derivation establishes the strict convexity of the minimum-cost objective and proves the existence and uniqueness of an interior optimum without auxiliary conditions, unifying previously fragmented results into a general theorem. Leveraging these structural properties, the maximum-profit formulation is reduced to a single-variable program over natural, finite bounds that tightly bracket the optimizer, supplanting earlier paired bounds defined on an unbounded domain. Numerical evidence on a canonical benchmark shows that the admissible interval is markedly tighter yet attains the same optimum with fewer evaluations, thereby improving numerical efficiency and robustness of implementation. The analysis clarifies the correspondence between cost-minimization and profit-maximization formulations and provides an operationally simple, reproducible solution path for capacity-constrained single-machine systems under generalized interarrival times. Principal contributions are: (i) a corrected optimality theory establishing strict convexity and a unique interior optimum without auxiliary conditions; (ii) a dimensionality reduction of the profit model to a single-variable program with natural finite bounds; and (iii) demonstrably tighter admissible intervals that cut evaluations while preserving optimality.

Article
Computer Science and Mathematics
Security Systems

Waqas Aman

,

Ammar Hassan

,

Aqdas Malik

,

Waseem Iqbal

,

Firdous Kausar

Abstract: Cryptocurrencies are increasingly gaining traction in the digital realm, promising a decentralized future free from the grip of centralized authorities. This appeal has led to a surge in the integration of cryptocurrencies within various games and applications, built on the robust security provided by blockchain technology. As the world embraces this digital revolution, everyday users are navigating a landscape filled with questions and concerns about the safety, privacy, and reliability of these currencies. Some nations have chosen to ban or heavily regulate cryptocurrencies, further fanning the flames of uncertainty among regular users. While extensive research has addressed the technical dimensions of enhancing cryptocurrency security, there is little work on the parameters influencing users' perceptions of the security, privacy, and trust (SPT) offered by cryptocurrencies. This paper addresses this gap by investigating the complex relationship between users' perceptions of SPT in cryptocurrencies and the potential advantages of decentralized blockchains reported in the current literature. The PRISMA methodology was followed to systematically review the existing literature on the SPT parameters of cryptocurrencies, with a detailed discussion of the methodologies adopted by researchers on the subject. After careful construction of a search query, 64 papers were reviewed in detail from a list of 350 papers obtained from Scopus, WoS, IEEE Xplore, and ACM. The reviewed literature is dominated by surveys, the Technology Acceptance Model, and Structural Equation Modelling, which may not cover the complete range of parameters affecting users' SPT concerns about cryptocurrencies. By exploring the existing literature, we highlight the obstacles that may impede the widespread adoption of cryptocurrencies and the limitations of the research methodologies being used to measure these parameters.

Article
Medicine and Pharmacology
Cardiac and Cardiovascular Systems

Yingying Liao

,

Jie Xu

,

Yuheng Jiao

,

Xinxin Sun

,

Mingkui Gao

,

Yagang Ding

,

Dihui Cai

,

Yinyin Shen

,

Xiaohui Zhou

,

Wei Han


Article
Physical Sciences
Particle and Field Physics

Jiazheng Liu

Abstract: We construct quantum Yang-Mills theory on four-dimensional Minkowski spacetime within the Epstein-Glaser causal perturbation theory framework, rigorously establishing the Wightman axioms and proving the existence of a positive mass gap $\Delta > 0$ together with asymptotic freedom. The construction proceeds from two postulates, the massless wave equation $\square \phi = 0$ and Poincaré invariance, through the angular momentum decomposition of the retarded Green's function on the null cone. The equal-weight condition $P_{\ell}(1) = 1$, a consequence of the Peter-Weyl theorem for the unit element in every irreducible representation of $\mathrm{SO}(3)$, ensures that all angular momentum modes contribute identically at the light-cone vertex. The spectral sum $\Sigma^{(4)}(t) = \sum_{\ell = 0}^{\infty}(2\ell + 1)e^{-(2\ell + 1)t/2}$ admits the closed form $\cosh(t/2)/[2\sinh^{2}(t/2)]$, whose small-$t$ expansion encodes the Riemann zeta function at negative odd integers via $\zeta(-1) = -1/12$, $\zeta(-3) = 1/120$, etc. From the constant term $1/12$ and the group-theoretic factor $C_{2}(G)$, we derive the one-loop $\beta$-function coefficient $b_{1} = 11C_{2}(G)/(12\pi)$ analytically without Feynman diagrams, establishing asymptotic freedom as a geometric consequence of null-cone causality. The mass gap is proven through two independent arguments: a distributional proof that non-abelian vertices extend propagator support to the timelike region, and a Carleman-Fredholm determinant argument excluding zero-mass poles. We verify all Wightman axioms via the reconstruction theorem. Furthermore, the framework reveals deep structural connections to random matrix theory (Dyson's threefold classification and Migdal's large-$N$ reduction) and to number theory through the $\mathrm{SL}(2,\mathbb{C})$ holonomy $R(2\pi) = -\mathbb{I}$ and the Selberg trace formula, providing a construction of the Hilbert-Pólya operator whose spectrum corresponds to the non-trivial zeros of the Riemann zeta function.
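The stated closed form of the spectral sum, and its small-$t$ constant term $1/12$, can be checked numerically; this short Python sketch is an independent sanity check, not part of the paper:

```python
import math

def spectral_sum(t, terms=2000):
    """Partial sum of Sigma^(4)(t) = sum_{l>=0} (2l+1) * exp(-(2l+1) t / 2)."""
    return sum((2 * l + 1) * math.exp(-(2 * l + 1) * t / 2) for l in range(terms))

def closed_form(t):
    """The claimed closed form cosh(t/2) / (2 sinh^2(t/2))."""
    return math.cosh(t / 2) / (2 * math.sinh(t / 2) ** 2)

# The truncated series matches the closed form to floating-point precision ...
err = abs(spectral_sum(0.3) - closed_form(0.3))

# ... and subtracting the leading 2/t^2 singularity at small t leaves the
# constant term, which should approach 1/12.
const = closed_form(0.01) - 2 / 0.01**2
```

The check follows from summing the series with $x = e^{-t/2}$: $\sum_{k \ge 0}(2k+1)x^{2k+1} = x(1+x^{2})/(1-x^{2})^{2}$, which simplifies to the hyperbolic form above.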

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2026 MDPI (Basel, Switzerland) unless otherwise stated