Review
Medicine and Pharmacology
Obstetrics and Gynaecology

George A. Vilos, Angelos G. Vilos, Meryl Hodge, Aym Oraif, Faisal Khalid Idris, Jacob McGee

Abstract: Persistent uterine bleeding after endometrial ablation indicates that no method of endometrial ablation (EA) eliminates the entire endometrium, and hysteroscopy shows a distorted and scarred uterine cavity in the majority of women. These observations raise concerns regarding the presentation, assessment, and stage of potential post-ablation endometrial cancer (PAEC) developing in residual endometrium. To address these concerns, we conducted a systematic search for reports of endometrial cancer (EC) associated with or occurring after EA, using multiple databases with keywords combining EC with first- and second-generation EA techniques, from the inception of EA in the 1980s through 2025. After excluding irrelevant publications, we identified 86 ECs associated with EA, described in 20 case reports (N=20), four case series (N=18), eleven cohort studies (N=21), one registry (N=27), and five reviews. Based on 12 relevant studies, at follow-up of 1.9–25 years, 43 ECs were identified in 39,795 women with a history of EA, a summary incidence of 0.11% (range 0.0–1.59%). Based on the remaining 43 evaluable cases of PAEC, the mode and time to presentation, investigation, diagnosis, and stage of PAEC were not altered by EA. We conclude that EA has a protective effect, significantly reducing the risk of EC, likely due to a quantitative reduction in endometrium that can potentially become malignant and to the EA process eliminating occult pre-malignant or malignant endometrial tissues, which are vulnerable to ablation techniques. The mode and time to presentation, the diagnostic work-up, including endometrial biopsy and hysteroscopy, and the stage of PAEC are not altered by EA.

Review
Public Health and Healthcare
Health Policy and Services

Yangzihan Wang, Colin Millard

Abstract: The use of Traditional Chinese Medicine (TCM) is expanding worldwide. In the UK, TCM has developed rapidly since the 1990s, but limited scientific evidence supports its safety, quality, or efficacy. This creates challenges for regulatory governance and public health protection. Objective: To review the development of TCM regulations in the UK and examine how existing regulations address safety, quality, and efficacy through different regulatory instruments, objectives, targets, and enforcement mechanisms. A narrative literature review was conducted, supplemented by grey literature searches of government reports and legislative documents published between 1970 and 2020. Thematic and chronological analyses were applied to map regulatory transitions and classify instruments and objectives. Ten key regulations and policy documents were identified, forming a hierarchical yet fragmented framework dominated by product-focused oversight. While the system ensures basic safety and quality standards, it lacks consistent mechanisms for enforcement, practitioner regulation, and efficacy assessment. UK TCM regulation has evolved through a mix of EU and domestic legislation, but gaps in enforcement and practitioner oversight persist. Policymakers should develop proportionate efficacy evaluation methods, enhance enforcement, and establish clearer practitioner standards to ensure safe, evidence-informed practice in post-Brexit UK health policy.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Rao Mikkilineni, W. Patrick Kelly

Abstract: Contemporary enterprise IT operations are implemented largely atop Shannon–Turing computing: programs execute read–compute–write cycles over data structures, while governance (fault handling, configuration control, auditability, and continuity) is applied externally through infrastructure platforms, observability stacks, and human processes. This separation scales analytic throughput but accumulates coherence debt: locally expedient commitments whose provenance and revisability degrade until exposed by shocks (failures, security incidents, regulatory demands, or architectural transitions). We synthesize a model evolution that integrates computation with regulation at two distinct levels: (i) Distributed Intelligent Managed Elements (DIME), which modifies the Turing cycle to read–check-with-oracle–compute–write by infusing a signaling overlay and FCAPS (Fault, Configuration, Accountability, Performance, Security) supervision into computation in progress; and (ii) AMOS, which fully decouples the process executor from governance by treating any Turing-equivalent engine as a replaceable execution substrate while elevating knowledge structures—encoded as local and global Digital Genomes—to first-class operational state in a governed Knowledge Network. We further present implementation evidence via a microservice transaction testbed that operationalizes dynamic topology as data, a capability-oriented control plane, decoupled application-layer FCAPS from IaaS/PaaS FCAPS, and policy-selectable consistency/availability semantics. We argue that the principal benefit of AMOS is not “circumventing” impossibility results such as CAP, but governing their trade-offs as explicit commitments with auditable lineage and controlled convergence back to coherent state.

Article
Computer Science and Mathematics
Probability and Statistics

Gonçalo Melo de Magalhães

Abstract: Machine learning's dominant paradigm—whether model-centric or data-centric—treats intelligence as the extraction of statistical patterns from behavioral records. This approach has delivered remarkable engineering feats. Yet something foundational is missing. Data is not reality: it is a finite record of trajectories through reality. A photograph of a river is not the river's law. This paper argues that the data paradigm conflates measurement with mechanism, capturing where systems have been rather than why they go there. We propose an alternative grounded in the Architecture of Freedom Intelligence (AFI), which identifies navigability—the structural availability of paths—as the primary organizing principle of all complex systems. The Law of Freedom, F = P/D, states that navigational capacity equals differentiation capacity (Perception, P) divided by structural resistance (Distortion, D). Under this framework, intelligence is not pattern memorization but distortion navigation: all systems move according to dx/dt = −P(x)·∇D(x), following gradients of resistance scaled by perceptual capacity. We demonstrate that this gradient law is structurally identical to Fick's diffusion, Berg–Brown chemotaxis, Ohm's law, and gradient descent—revealing a deep structural unity that the data paradigm treats as coincidental analogy. Nature does not train on labeled datasets: ants, neurons, immune cells, and ecological populations navigate through calibrated heuristics on Perception and Distortion fields, not through backpropagation over historical trajectories. This observation motivates a fundamental reconceptualization of what training should accomplish. We propose Freedom Intelligence Training (FIT): a learning paradigm oriented toward learning P and D fields directly, rather than fitting statistical correlations over behavioral snapshots. 
FIT rests on five predictions: (i) models trained on P–D fields require exponentially less data than pattern-extraction models; (ii) generalization improves because P–D fields encode causal structure; (iii) out-of-distribution performance improves because navigability laws transfer across domains; (iv) interpretability is natural since every prediction decomposes into ΔP and ΔD contributions; (v) the exploration–exploitation transition is quantifiable as the coefficient of variation of the Freedom field crossing 1.0. We provide ten falsification criteria and position FIT within the emerging landscape of world models, physics-informed learning, and causal inference. This is a theoretical proposal; a complete experimental roadmap is provided.
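A minimal numerical sketch of the gradient law dx/dt = −P(x)·∇D(x) stated above, assuming illustrative one-dimensional fields: the quadratic Distortion D and constant Perception P below are not from the paper, only stand-ins to show the navigation dynamics.

```python
import numpy as np

def navigate(x0, P, grad_D, dt=0.01, steps=1000):
    """Integrate dx/dt = -P(x) * grad_D(x) with forward Euler steps."""
    x = x0
    for _ in range(steps):
        x = x - dt * P(x) * grad_D(x)
    return x

# Assumed toy fields: D(x) = (x - 2)^2, so grad_D(x) = 2(x - 2);
# constant perceptual capacity P(x) = 1.
x_final = navigate(x0=0.0, P=lambda x: 1.0, grad_D=lambda x: 2.0 * (x - 2.0))
# The trajectory descends the distortion gradient toward the minimum at x = 2.
```

Under these assumptions the system settles at the distortion minimum, the discrete analogue of "following gradients of resistance scaled by perceptual capacity."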

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Arda Yunianta

Abstract: Achieving high performance in automated pneumonia diagnosis remains challenging. Convolutional neural networks (CNNs) have successfully automated pneumonia diagnosis through the analysis of chest X-ray images and can be combined with other methods to improve prediction and classification accuracy. This research proposes an innovative framework for pediatric pneumonia diagnosis that unites three fine-tuned pre-trained CNN models (EfficientNetB0, ResNet50, and MobileNetV2) through feature fusion to achieve better performance. The mixed-model architecture provides an ideal solution for time-sensitive clinical applications operating in resource-constrained environments. The proposed framework maintains excellent sensitivity and specificity, which clinical use requires to minimize false-negative and false-positive results. Furthermore, the proposed framework outperformed the individual models and compared favorably with previous studies on pneumonia classification, achieving an accuracy of 96.14%, a precision of 94.10%, a recall of 96.92%, and an F1-score of 94.97%.
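The feature-fusion step can be sketched as follows, assuming each backbone has already produced a fixed-length embedding per image. The embedding sizes (1280/2048/1280, typical of EfficientNetB0, ResNet50, and MobileNetV2 with global pooling) and the single logistic head are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n_images = 4
feat_a = rng.normal(size=(n_images, 1280))   # EfficientNetB0-style embedding
feat_b = rng.normal(size=(n_images, 2048))   # ResNet50-style embedding
feat_c = rng.normal(size=(n_images, 1280))   # MobileNetV2-style embedding

# Fusion step: concatenate the three embeddings along the feature axis.
fused = np.concatenate([feat_a, feat_b, feat_c], axis=1)

# An assumed dense layer with sigmoid gives a pneumonia/normal score per image.
w = rng.normal(scale=0.01, size=fused.shape[1])
scores = 1.0 / (1.0 + np.exp(-(fused @ w)))
```

In a real pipeline the fused vector would feed a trained classification head; here random weights only demonstrate the shapes involved.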

Article
Environmental and Earth Sciences
Remote Sensing

Umberto Rizza, Simone Virgili, Alessandra Chiappini, Silvia Di Nisio, Giorgio Passerini, Martina Tommasi

Abstract: Forest fires in the Amazon rainforest pose a critical environmental challenge, with impacts on biodiversity, atmospheric composition, and climate regulation. Fire activity has intensified in recent decades due to climate variability and increasing anthropogenic pressure, raising concerns about a potential shift of the Amazon from a carbon sink to a carbon source. This study analyzes the spatial and temporal variability of fire activity across the Amazon basin, with a focus on the Brazilian region, over the period 2001–2022. The analysis is based on satellite-derived active fire data from NASA’s Fire Information for Resource Management System (FIRMS), obtained from the Moderate Resolution Imaging Spectroradiometer (MODIS) and the Visible Infrared Imaging Radiometer Suite (VIIRS). Fire Radiative Power (FRP) is used as a proxy for fire intensity and combustion processes. The observed increase in fire activity post-2012 is primarily attributed to the deployment of the VIIRS sensor, which offers superior sensitivity for detecting small-scale and low-intensity fires. Pronounced peaks of fire activity are observed in 2004, 2005, 2007, and after 2019. Statistical analyses reveal strong interannual variability and cyclical behavior in FRP, linked to variations in drought conditions, precipitation, land-use change, and environmental policy. Overall, the study highlights the value of multi-sensor satellite observations for long-term fire monitoring in the Amazon.

Article
Business, Economics and Management
Human Resources and Organizations

Ying Zhao, Zhengyang Qin, Zhaoyu Wang, Wenbin Wu

Abstract: In volatile environments, work teams operate as complex adaptive systems that reconfigure internal processes in response to internal and external tensions. Team adaptability—a systemic outcome—is influenced by paradoxical leadership (PL), but the motivational pathways translating PL into adaptive behavior remain underexplored. Grounded in Conservation of Resources theory, this multi‑wave, supervisor–subordinate dyadic study of 114 high‑tech teams adopts a systems perspective and treats goal orientations as collective resource‑allocation rules. PL most strongly fosters systemic adaptability by cultivating a team performance‑approach orientation—an agentic, short‑term resource‑mobilization strategy that drives visible competence demonstration. Although team learning orientation predicts adaptability when tested alone, its mediating effect is suppressed once performance‑approach orientation is included, consistent with competitive resource‑allocation dynamics in specialist teams. PL also reduces performance‑avoidance orientation, but this reduction does not yield a significant indirect effect on adaptability, indicating that removing dysfunction is not equivalent to activating adaptive capacity. By comparing three competing motivational pathways, the study identifies a dominant leadership leverage point for configuring resource flows to produce emergent adaptation and offers implications for designing systemic interventions and models to enhance team resilience.

Article
Chemistry and Materials Science
Materials Science and Technology

Vera La Ferrara, Marco Martino, Antonio Marino, Giovanni Landi, Silvano Del Gobbo, Nicola Lisi, Rosanna Viscardi, Alberto Giaconia, Giulia Monteleone

Abstract: Mixed-halide perovskite solar cells with the composition Cs0.1(MA0.17FA0.83)0.9Pb(I0.83Br0.17)3 were fabricated in the glass/ITO/SnO2/triple-cation perovskite/HTL/Au architecture and subsequently used as photoanodes for efficient solar-driven water splitting by applying commercial catalytic nickel foils onto the Au back-contact pads of the devices. To enable operation in alkaline media, the devices were encapsulated with commercial PET–EVA multilayer films, providing a robust barrier while leaving the Ni foils exposed as the electrochemically active interface. Two operating configurations were investigated and compared: (i) an outside configuration, where the perovskite solar cell powered an external electrochemical cell, and (ii) an immersed configuration, in which the encapsulated device was directly integrated into the electrolyte. In particular, the oxygen evolution reaction onset shifted from ~1.32 V vs RHE, when the Ni electrode was not powered by the perovskite absorber, to ~0.34 V vs RHE when the perovskite device powered the nickel foil, in both immersed and outside configurations. The IS device achieved a maximum Applied Bias Photon-to-Current Efficiency of ~20% under AM 1.5G illumination (100 mW cm⁻²), among the highest reported for perovskite-based photoanodes.

Article
Public Health and Healthcare
Public Health and Health Services

Andrej Minich, Peter Sabaka, Vladimír Heger, Rudolf Kubička, Peter Mihalov, Ján Jurenka, Ľubomír Soják, Juliana Pašková, Ľubica Slimáková, Romana Kalianková Chovanová, +1 authors

Abstract: Background/Objectives: Staphylococcus aureus is one of the leading causes of bacterial infection–related mortality worldwide, with outcomes complicated by antimicrobial resistance; asymptomatic colonization (~30% of the population) increases the risk of subsequent infection, often with the colonizing strain. While high-income countries provide surveillance data, comprehensive data from LMICs are lacking, and the COVID-19 pandemic has significantly influenced the incidence and epidemiology of S. aureus infections, highlighting a critical data gap in Slovakia. Methods: We conducted data analysis using the KNIME Analytics Platform, an open-source, visual workflow environment that facilitates integration, preprocessing, and advanced analysis of complex biomedical datasets. This study analyzed 5 years of data from routine laboratory diagnostics extracted from the laboratory information system (LIS). Results: Our data reveal that the incidence of multidrug-resistant S. aureus, including MRSA, increased during 2020–2022, particularly in surgical departments, and remained elevated into the post-pandemic period, while MSSA incidence was consistently higher overall and predominantly driven by colonization rather than infection. Conclusions: This study provides essential insights into the use of big data analytics platforms. The identified gaps, such as information distinguishing colonization from infection, together with their future remediation and whole-genome sequencing, set a foundation for epidemiological research in Slovakia.

Article
Physical Sciences
Theoretical Physics

Raoul Bianchetti

Abstract: The Dzhanibekov effect—also known as the tennis racket theorem in its classical formulation—remains one of the most visually striking and widely circulated demonstrations of rotational instability in torque-free motion. A rigid body spinning about its intermediate principal axis undergoes abrupt, repeated flips that appear paradoxical to non-specialists and counterintuitive even to many trained physicists when encountered as a real-world phenomenon rather than as a textbook theorem. Conventional mechanics accounts for this behavior through the instability of the intermediate axis in Euler’s equations; however, this explanation is typically framed as a binary statement (“stable” versus “unstable”) and rarely develops a deeper dynamical interpretation of three experimentally salient features: (i) the emergence of highly organized, quasi-periodic flips rather than unstructured chaos; (ii) the strong dependence of the observed flip dynamics on preparation, perturbations, and real-world imperfections; and (iii) the apparent “memory” of the system, which repeatedly returns to similar macroscopic configurations despite inevitable dissipation and microstructural coupling. In this paper, we propose a rigorous reinterpretation of the Dzhanibekov effect within the framework of Viscous Time Theory (VTT), viewing the flip not merely as a consequence of intermediate-axis instability, but as a coherence-regime transition in an anisotropic informational geometry. We introduce an informational manifold for rigid-body rotation, in which rotational states evolve along constrained trajectories shaped by anisotropic reconfiguration costs associated with the principal inertia structure and by an effective informational viscosity arising from internal mode coupling and finite-time redistribution.
In this picture, the flip is not an instantaneous kinematic accident but a finite-time transition between metastable corridors of rotational coherence, with the observed quasi-periodicity emerging as a geometric consequence of navigation along preferred informational pathways. This formulation yields a set of quantitative, testable predictions absent from the standard narrative, including hysteresis under cyclic control of initial conditions, direction-dependent flip thresholds and transition times, metastable latency regimes, and scaling relations linking flip timing to an informational viscosity parameter. To assess these predictions, we perform a comprehensive numerical validation combining high-resolution integration of the Euler equations, ensemble statistics over large sets of perturbations, spectral and structural diagnostics, multi-precision convergence testing, and comparative model analysis. Quantitative analysis demonstrates that informational viscosity produces a bounded suppression of instability growth (approximately 2–15%) without altering phase-space topology, integrability, or scaling structure. No chaotic attractors, bifurcation cascades, or nonphysical divergences emerge. The Dzhanibekov instability remains fundamentally geometric, with VTT operating as a coherent rate-level regulator rather than a replacement mechanism. This bounded rate modification becomes logarithmically amplified in flip-time statistics, providing a clear and measurable experimental signature. By reframing one of the most iconic “video-paradox” phenomena of classical mechanics as an informational hysteresis and regime-transition process, this work provides a mathematically grounded bridge between visible macroscopic dynamics and a general theory of anisotropic, viscous informational evolution.
The Dzhanibekov effect thus emerges not only as a pedagogical curiosity, but as a quantitative macroscopic laboratory for probing informational geometry, coherence regimes, and finite-time reconfiguration in real physical systems.
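The textbook core of the effect, the intermediate-axis instability of the torque-free Euler equations, can be reproduced numerically. The moments of inertia, step size, and initial perturbation below are illustrative assumptions, and the paper's VTT viscosity term is not modeled here.

```python
import numpy as np

# Assumed principal moments with I1 < I2 < I3; spin mostly about axis 2.
I1, I2, I3 = 1.0, 2.0, 3.0

def deriv(w):
    """Torque-free Euler equations for the body-frame angular velocity."""
    w1, w2, w3 = w
    return np.array([(I2 - I3) / I1 * w2 * w3,
                     (I3 - I1) / I2 * w3 * w1,
                     (I1 - I2) / I3 * w1 * w2])

def rk4_step(w, dt):
    """One classical Runge-Kutta 4 step."""
    k1 = deriv(w)
    k2 = deriv(w + 0.5 * dt * k1)
    k3 = deriv(w + 0.5 * dt * k2)
    k4 = deriv(w + dt * k3)
    return w + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

w = np.array([1e-3, 1.0, 1e-3])   # small perturbation off the unstable axis
history = [w[1]]
for _ in range(40000):             # dt = 1e-3, total time t = 40
    w = rk4_step(w, 1e-3)
    history.append(w[1])

# The intermediate-axis component w2 periodically reverses sign: the flip.
flipped = min(history) < -0.5
```

The quasi-periodic sign reversals of w2 are the flips the abstract discusses; the VTT claim concerns a bounded modification of how fast the instability grows, not this underlying geometry.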

Brief Report
Social Sciences
Government

Satyadhar Joshi

Abstract: The rapid advancement of artificial intelligence (AI) presents unprecedented challenges for labor market forecasting, requiring fundamental methodological innovations that move beyond traditional extrapolation techniques. This policy paper proposes comprehensive enhancements to the U.S. Bureau of Labor Statistics (BLS) employment projection systems to better capture and forecast AI's impact on employment structures, job roles, and workforce skill requirements. Drawing on recent empirical research and the bureau's existing methodological frameworks, we present an integrated architectural framework that combines task-based exposure modeling, real-time data analytics, causal inference methods, and enhanced gross flows estimation. Our recommendations address critical gaps in current BLS methodologies identified through systematic literature review and analysis of emerging AI adoption patterns, including the distinction between automation and augmentation effects, the nonlinear dynamics of AI adoption, and differential impacts across worker demographics. We propose a dynamic Occupational AI Exposure Score (OAIES) framework that leverages large language models and occupational task data, alongside enhanced data collection strategies and modernized estimation techniques. The architectural framework, illustrated through five interconnected diagrams, demonstrates how these methodological innovations integrate into a coherent system for measuring labor market transformation. These enhancements would enable more accurate projections of job displacement, skill evolution, and employment transformation across industries and geographic regions, supporting evidence-based policymaking for workforce development in an AI-driven economy. The paper concludes with a phased implementation strategy and validation protocol to ensure methodological rigor and operational feasibility.

Article
Business, Economics and Management
Economics

Songyuan Liu, Shuaiqi Hu, Mei Wang, Yue Song, Yichuan Jin, Lingfeng Tan

Abstract: This study develops a hybrid analytical framework that bridges data-driven K2 structural learning with expert-informed Bayesian Networks to decipher the intricate interdependencies among policy instruments, resource endowments, and socio-economic variables across China’s hydropower, wind, and solar power sectors. The results demonstrate a fundamental paradigm shift from resource-bound growth to institution-steered expansion, notably in the solar sector, where the Renewable Portfolio Standard (RPS) has superseded natural radiation as the primary determinant of capacity scaling. Forward sensitivity analysis and backward diagnostic attribution reveal that achieving high-growth milestones requires a synergistic convergence of technology-cost reductions and mandatory consumption quotas, whereas the absence of RPS leads to a catastrophic 64% degradation in systemic causal connectivity. These findings underscore the necessity of transitioning from price-side stimuli to structural consumption-side mandates to ensure a resilient and certain energy transition under stringent carbon constraints.
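Forward sensitivity and backward diagnostic attribution on a Bayesian network can be illustrated with a toy two-node model (RPS policy driving capacity growth). All probabilities below are assumed for illustration, not estimates from the study.

```python
def forward(p_rps, cpt):
    """Forward query: P(high growth), marginalizing over the RPS node."""
    return p_rps * cpt[True] + (1.0 - p_rps) * cpt[False]

def backward(p_rps, cpt):
    """Backward diagnosis: P(RPS active | high growth) via Bayes' rule."""
    return p_rps * cpt[True] / forward(p_rps, cpt)

# Assumed conditional probability table: P(high growth | RPS on / off).
cpt = {True: 0.8, False: 0.3}

p_growth = forward(0.5, cpt)       # forward sensitivity at P(RPS) = 0.5
posterior = backward(0.5, cpt)     # attribution given high growth observed
```

Varying `p_rps` in `forward` traces the sensitivity curve; `backward` shows how observing a high-growth milestone raises the posterior belief that the policy node was active.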

Article
Physical Sciences
Astronomy and Astrophysics

Raheb Ali Mohammed Saleh Aoudh

Abstract: We report a direct empirical discovery from the analysis of 149 galaxies in the SPARC database. Through a purely data-driven computational approach, we find that galactic rotation curves naturally organize themselves into four statistically distinct dynamical families, with hierarchical substructure revealing seven finer-grained families. Two families exhibit exceptional regularity, with 100% success in basic kinematic modeling. This classification emerges objectively from the data structure itself, without any theoretical assumptions about dark matter or galaxy formation. Extensive validation, including PCA analysis (shape parameters dominate over scale), cross-validation (85.2% agreement), and bootstrap uncertainty estimation (mean probability 0.654), supports the scheme, while comparison with previous morphological classifications shows only 16.7% agreement, confirming that this is a fundamentally new classification scheme based purely on kinematics. Physical properties reveal systematic differences across families: Family 3 (Rising) has the highest mass (log M = 9.75), largest radius (14.5 kpc), and highest baryonic fraction (31.2), while Family 0 (Flat) has the lowest mass and smallest radius. We present these families as a new phenomenological framework for understanding galactic dynamics, independent of morphological considerations.
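A shape-based grouping of rotation curves can be sketched with a toy clustering pipeline. The synthetic curves, the unit-peak normalization, and the two-center k-means below are illustrative assumptions; they do not reproduce the paper's actual method or the SPARC data.

```python
import numpy as np

rng = np.random.default_rng(1)
r = np.linspace(0.5, 15.0, 30)                       # sample radii (kpc)

# Two toy families: curves that flatten, and curves that keep rising.
flat = [v0 * r / (r + 1.0) for v0 in rng.uniform(150, 250, 20)]
rising = [a * r for a in rng.uniform(8, 15, 20)]
curves = np.array(flat + rising) * (1 + rng.normal(scale=0.02, size=(40, 30)))

# Normalize each curve to unit peak so clustering sees shape, not amplitude.
shapes = curves / curves.max(axis=1, keepdims=True)

def two_means(X, i0, i1, iters=20):
    """Plain k-means with two centers, seeded at rows i0 and i1."""
    centers = X[[i0, i1]]
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in (0, 1)])
    return labels

labels = two_means(shapes, 0, 39)    # seed one center in each toy family
```

Normalizing to unit peak is one crude way to make "shape parameters dominate over scale"; the synthetic flat and rising families then separate cleanly.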

Article
Physical Sciences
Radiation and Radiography

Viviana Sîrbu, Eugenia Paulescu

Abstract: Balancing accuracy and accessibility in solar energy flux estimation models remains a key challenge in atmospheric radiative transfer research. Since spectral models require computationally intensive spectral calculations, a widely adopted simplification strategy is to parameterize atmospheric spectral transmittances using various wavelength-averaging formulations. This work introduces a broadband parametric model derived from a spectral model that accurately estimates the three components of solar irradiance (direct normal, diffuse, and global) under clear-sky conditions. The procedure used to develop the model is structured in two stages. Initially, discrete broadband transmittances are obtained by applying an independent integration scheme to the spectral transmittances provided by the source spectral model. The second stage involves fitting these results to obtain continuous broadband atmospheric transmittances, expressed as analytical functions depending solely on atmospheric state parameters and remaining independent of wavelength. The model development procedure is relatively classical; however, the calculation of the diffuse component introduces a new approach for estimating the fraction of aerosol scattering directed toward the ground. The model was tested against data collected from eight radiometric stations distributed across six continents and benchmarked against two well-established reference models. Overall, the results indicate a high level of accuracy and demonstrate the practical applicability of the model.
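The two-stage procedure can be sketched numerically: (1) wavelength-average a spectral transmittance to get discrete broadband values for several atmospheric states, then (2) fit a continuous analytical function of the state parameter alone. The Bouguer-style toy transmittance and the exponential fit form are illustrative assumptions, not the paper's parameterization.

```python
import numpy as np

wl = np.linspace(0.3, 2.5, 500)          # wavelength grid (micrometers)
k = 0.1 / wl**2                          # toy spectral extinction coefficient

def broadband_T(m):
    """Stage 1: spectral transmittance exp(-k*m), averaged over wavelength."""
    return np.exp(-k * m).mean()

air_masses = np.linspace(1.0, 5.0, 9)
T_disc = np.array([broadband_T(m) for m in air_masses])

# Stage 2: fit T(m) ~ exp(-(a + b*m)) by linear regression on -ln(T),
# yielding a broadband transmittance that no longer references wavelength.
A = np.vstack([np.ones_like(air_masses), air_masses]).T
a, b = np.linalg.lstsq(A, -np.log(T_disc), rcond=None)[0]
T_fit = np.exp(-(a + b * air_masses))
```

The fitted `T_fit` depends only on the atmospheric state parameter (here air mass), which is the point of the broadband parameterization: no spectral integral at run time.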

Article
Engineering
Control and Systems Engineering

Yutian Gai, Haoyu Cen

Abstract: The rapid evolution of Embodied AI and Large Language Models presents significant opportunities for home robotics, yet challenges persist in enabling robots to execute long-term, high-level natural language instructions. Current LLM-driven embodied agents often suffer from sub-optimal task planning, limited memory systems struggling with multi-hop queries, and inflexible agent routing mechanisms. To address these limitations, we propose the Context-Rich Adaptive Embodied Agent (CRAEA) framework, designed to significantly enhance task planning and memory-augmented question answering in household robots. CRAEA integrates core components: Semantic-Enhanced Task Planning (SETP), which enriches LLM-driven planning with object relationship graphs, hierarchical strategies, and implicit physical constraints; Multi-Modal Contextual Memory (MMCM), which stores comprehensive contextual memory units in a relational graph for sophisticated multi-hop reasoning and employs an advanced retrieval mechanism with temporal decay; and Adaptive Agent Routing and Coordination (AARC), featuring intent recognition with confidence evaluation, proactive clarification, and a planning feedback loop. Evaluated in an artificial home environment across complex tidying scenarios, CRAEA consistently demonstrates superior performance. Empirical results show that CRAEA achieves notable improvements in Task Planning Accuracy, Knowledge Base Response Total Validity, and Agent Routing Success Rate compared to baseline methods. A human evaluation further confirms enhanced coherence, naturalness, and user satisfaction, while an ablation study validates the critical contribution of each proposed module. CRAEA represents a significant step towards more intelligent, robust, and user-adaptive home robots.
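Retrieval with temporal decay, as attributed above to the MMCM component, can be sketched as semantic similarity down-weighted by an exponential recency factor. The scoring form and the half-life value are illustrative assumptions, not CRAEA's actual design.

```python
def retrieval_score(similarity, age_seconds, half_life=3600.0):
    """Similarity discounted by memory age, with an assumed fixed half-life."""
    return similarity * 0.5 ** (age_seconds / half_life)

# Hypothetical memory units: (text, similarity to the query, age in seconds).
memories = [("keys on kitchen table", 0.90, 7200),
            ("keys in coat pocket", 0.80, 600)]
best = max(memories, key=lambda m: retrieval_score(m[1], m[2]))
# The fresher observation wins despite slightly lower raw similarity.
```

A decay-weighted score lets a household agent prefer recent observations over stale ones when both match the query, which is the behavior temporal decay is meant to produce.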

Review
Medicine and Pharmacology
Psychiatry and Mental Health

Danilo Pešić, Dušica Lečić Toševski, Bojana Pejušković, Ana Munjiza Jovanović, Olivera Vuković

Abstract: Recent revisions of personality disorder (PD) classifications have moved from categorical diagnoses toward dimensional models, raising renewed questions about the nosological status and clinical utility of borderline personality disorder (BPD). This narrative review traces the development of the borderline construct from early descriptions of patients positioned between neurosis and psychosis, through its theoretical consolidation within the concept of borderline personality organization, to the operationalization of BPD in DSM-III and subsequent diagnostic revisions. A central section summarizes contemporary controversies regarding the validity and utility of BPD features. Arguments for abandoning the diagnosis emphasize the absence of a distinct borderline factor in factor-analytic studies, the tendency of the construct to capture fluctuating symptoms and patterns of behaviour rather than stable maladaptive personality traits, the stigmatizing and non-selective use of the label, and the lack of disorder-specific treatment approaches. In contrast, converging evidence supports the view that core borderline symptoms frequently function as markers of general PD pathology and of the severity of impairments in self and interpersonal functioning. The paper integrates the regional tradition of the borderline level of personality functioning, conceptualizing borderline pathology as a dynamic dimension of dysfunction with potential transient regressions, and links this concept to the Level of Personality Functioning (LPF, Criterion A) within the DSM-5 Alternative Model for Personality Disorders (AMPD). Retaining borderline pathology as a dimension may support contemporary PD assessment by offering a clinically recognizable marker of overall dysfunction, a guide for rating severity, and an indicator of personality structure and need for psychotherapy, without disrupting continuity with an extensive clinical and research tradition.

Article
Medicine and Pharmacology
Clinical Medicine

Jessica Archer, Sheridan O’Donnell, Melissa Buckman, Nicole Bain, Himanshu Goel

Abstract: Background: TNRC6B encodes a core effector of the RNA-induced silencing complex and is essential for miRNA-mediated gene silencing. Pathogenic variants in TNRC6B have recently been associated with a neurodevelopmental disorder characterised by developmental delay, intellectual disability, and behavioural difficulties. Methods: We report a three-generation family with a 22q13.1 deletion encompassing TNRC6B. Clinical data were collected from medical records and family interviews, and the findings were compared with those of published cohorts. Results: Affected individuals presented with developmental delay, speech and language impairment, autism spectrum disorder, ADHD, oppositional defiant disorder, craniosynostosis, joint laxity, clinodactyly, and cardiac valve anomalies. The father and paternal grandmother had learning difficulties and neurobehavioral features, while the proband exhibited a more severe phenotype. Conclusion: This report expands the phenotypic spectrum of TNRC6B-related neurodevelopmental disorder, highlighting craniosynostosis, joint and connective tissue features, and cardiac involvement. Our findings also underscore variable expressivity across generations and emphasise the relevance of both copy-number and sequence variants in TNRC6B in patients with neurodevelopmental disorders.

Hypothesis
Medicine and Pharmacology
Gastroenterology and Hepatology

Paul S. Mueller

Abstract: An empirical pattern recurs across the dietary intervention literature: committed dietary patterns—sustained ketosis (<35 g carbohydrate/day with verified β-hydroxybutyrate ≥0.5 mM) and Mediterranean diet—each improve inflammatory markers and, under verified conditions, produce favorable or non-atherogenic lipid profiles. Intermediate carbohydrate restriction (50–150 g/day, or vacillating compliance without sustained ketosis) may not achieve either. Simultaneously, strict ketogenic diets produce dramatic gut microbiome restructuring, including near-elimination of Bifidobacterium adolescentis and expansion of Akkermansia muciniphila. This paper proposes that microbiome-mediated bile acid signaling is the mechanistic link connecting these observations. The microbiome generates the majority of bile acid chemical diversity through deconjugation, dehydroxylation, and epimerization of host-synthesized primary species, while the host simultaneously produces counter-regulatory bile acid conjugates. Dietary patterns that produce stable microbiome configurations therefore also produce stable bile acid signaling environments that coordinate, through multiple receptors including FXR, TGR5, S1PR2, VDR, and RORγt, both lipid metabolic and immune outcomes across organ compartments. This coordination is distributed across tissues and receptors with sometimes opposing outputs, not tightly coupled through a single molecular effector. The hypothesis must account for established findings that constrain it: FXR’s metabolic and anti-inflammatory programs use mutually exclusive post-translational modifications within single cells; the FXR agonist obeticholic acid improved hepatic inflammation while worsening atherogenic lipid profiles in Phase III trials; individual bile acid species exert cell-type-dependent effects on the same receptor; and the most potent bile acid immune tolerance pathways bypass FXR and TGR5 entirely.
Moreover, bile acid–mediated immune tolerance may simultaneously suppress beneficial anti-tumor immunity in certain tissue contexts. Despite these constraints, the framework generates testable predictions and a staged, affordable experimental program is proposed. Take-home message: Bile acids are not passive fat-absorption facilitators but a multi-receptor signaling network through which committed dietary patterns may simultaneously coordinate lipid metabolism and immune tolerance—explaining why these outcomes co-vary under dietary intervention and why intermediate restriction fails at both.

Article
Medicine and Pharmacology
Oncology and Oncogenics

Ludvig Letica

,

Ivana Šutić Lubina

,

Zdrinko Brekalo

,

Đordano Bačić

,

Jelena Roganović

,

Ana Đorđević

,

Ingrid Šutić Udović

,

Ivona Letica

,

Ivana Kotri

,

Ines Mrakovčić Šutić

Abstract: Background and Objectives: The incidence of colorectal cancer (CRC) in developed Western countries is steadily increasing. CRC is the third most common cancer and the second leading cause of cancer-related death worldwide. Innate and adaptive immunity play a pivotal role in the tumor response, but many of these interactions remain poorly understood. Granulysin (GNLY) is an effector cytolytic molecule present in the cytotoxic granules of several human lymphocyte subpopulations, mainly cytotoxic T cells and NK cells. The pore-forming proteins GNLY, perforin, and granzymes play a key role in cell-mediated immune responses against tumors and infections. Materials and Methods: We analyzed perforin- and GNLY-mediated cytotoxicity in the peripheral blood of patients with CRC by flow cytometry. Cells were labelled with monoclonal antibodies against perforin, GNLY, and different surface antigens (CD3, CD4, CD8, and CD56). Lymphocyte subpopulation phenotypes and perforin and GNLY expression were analyzed using intracellular and surface immunofluorescence. Results: Total perforin and GNLY expression in peripheral blood mononuclear cells (PBMC) was significantly lower than in the control group. Statistically significant differences were observed in the distribution of perforin and GNLY expression across tumor stages classified according to Dukes, indicating that the percentages of total perforin and GNLY diminished significantly with tumor progression. Perforin and GNLY expression was significantly reduced in NK and NKT cells, accompanied by reduced cytolytic potential in patients with CRC and a consequent reduction in their ability to eliminate tumor and infected cells. Conclusions: The determination of cytotoxic potential may provide a valuable assessment of a patient's immune status and represent a novel therapeutic target.
Patients with CRC exhibit markedly impaired perforin- and GNLY-mediated cytotoxicity that correlates with disease progression. Assessment and restoration of cytolytic potential may therefore serve as indicators of immune competence and as promising therapeutic strategies to improve perioperative and oncologic outcomes.

Article
Environmental and Earth Sciences
Remote Sensing

Rajesh Silwal

,

Guoquan Wang

,

Sabal KC

,

Rabin Rimal

,

Sagar Rawal

Abstract: Earthquake-induced landslides in active orogens such as the Nepal Himalaya pose major threats to life, infrastructure, and post-disaster recovery. Although coseismic landslide susceptibility mapping increasingly uses machine learning (ML) and deep learning (DL), explicit integration of spaceborne interferometric synthetic aperture radar (InSAR) products, particularly line-of-sight (LOS) displacement and coherence-based damage proxy maps (DPM), remains limited in event-based frameworks. This study develops and evaluates a multi-factor coseismic landslide probability model that incorporates InSAR-derived deformation metrics with key geomorphic and hydrologic predictors to improve rapid post-earthquake hazard assessment. Using the 25 April 2015 Mw 7.8 Gorkha earthquake as a case study, LOS displacement was derived from ALOS-2 PALSAR-2 ScanSAR interferometry, and the normalized channel steepness index (Kₛₙ) was computed from a digital elevation model. Additional predictors included slope, aspect, curvature, elevation, drainage density, distance to river, log-transformed stream power index (logSPI), peak ground acceleration (PGA), rainfall, and land use/land cover. Five models were evaluated: Random Forest, Extreme Gradient Boosting (XGBoost), a lightweight convolutional neural network, U-Net, and DeepLabV3. All were trained using fourteen conditioning factors and a landslide inventory, with class imbalance addressed through majority undersampling for ML and weighted loss with patch oversampling for DL. Incorporating LOS and DPM improved model discrimination and calibration: XGBoost and Random Forest achieved the highest AUC-ROC values (0.972 and 0.969) and lowest Brier scores, while DeepLabV3 produced the highest AUC-PR (0.768) and CSI (0.49). Feature importance analysis identified Kₛₙ as the dominant predictor, and ablation tests confirmed the added value of InSAR metrics.
These findings demonstrate the effectiveness of integrating InSAR products for rapid coseismic landslide hazard assessment in the Nepal Himalaya.
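For readers unfamiliar with the calibration and detection metrics this abstract reports, the following is a minimal Python sketch, using illustrative data only (not from the study), of two of them: the Brier score, which measures probability calibration, and the Critical Success Index (CSI), which measures detection skill while ignoring the true negatives that dominate imbalanced landslide inventories.

```python
def brier_score(y_true, p_pred):
    """Mean squared error between predicted probabilities and 0/1 labels;
    lower is better-calibrated."""
    return sum((p - y) ** 2 for y, p in zip(y_true, p_pred)) / len(y_true)

def critical_success_index(y_true, y_pred):
    """CSI = TP / (TP + FP + FN); true negatives are excluded, so the score
    is not inflated by the abundant non-landslide pixels."""
    tp = sum(1 for y, p in zip(y_true, y_pred) if y == 1 and p == 1)
    fp = sum(1 for y, p in zip(y_true, y_pred) if y == 0 and p == 1)
    fn = sum(1 for y, p in zip(y_true, y_pred) if y == 1 and p == 0)
    denom = tp + fp + fn
    return tp / denom if denom else 0.0

# Illustrative labels and model probabilities; predictions thresholded at 0.5.
labels = [1, 0, 1, 1, 0, 0]
probs = [0.9, 0.2, 0.6, 0.4, 0.1, 0.7]
preds = [1 if p >= 0.5 else 0 for p in probs]
print(round(brier_score(labels, probs), 3))           # calibration error
print(round(critical_success_index(labels, preds), 3))  # detection skill
```

The 0.5 threshold here is an assumption for illustration; in susceptibility mapping the operating threshold is typically tuned on a validation set.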



Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2026 MDPI (Basel, Switzerland) unless otherwise stated