Preprint Article, Version 1 (this version is not peer-reviewed)

Compressible Diagnosis of the Membrane Fouling Based on the Transfer Entropy

Version 1 : Received: 8 August 2024 / Approved: 9 August 2024 / Online: 9 August 2024 (12:19:35 CEST)

How to cite: Wu, X.; Hou, D.; Yang, H.; Han, H. Compressible Diagnosis of the Membrane Fouling Based on the Transfer Entropy. Preprints 2024, 2024080707. https://doi.org/10.20944/preprints202408.0707.v1

Abstract

Membrane fouling, caused by many direct and indirect triggering factors, has become an obstacle to the application of membrane bioreactors (MBRs). The nonlinear relationships among these factors involve complex causality and affiliation that are difficult to disentangle when diagnosing membrane fouling. To solve this problem, this paper proposes a compressible diagnosis model (CDM) based on transfer entropy to support root-cause diagnosis of membrane fouling. First, a CDM framework linking membrane fouling to its causal variables is built on a feature extraction algorithm and mechanism analysis; this framework identifies fault transfer scenarios as operating conditions change. Second, a fault transfer topology based on transfer entropy is constructed to describe the causal relationships between variables dynamically. Third, an information compression strategy is designed to simplify the fault transfer topology by eliminating repetitious affiliation relationships, which helps locate the root causal variables quickly and accurately. Finally, the effectiveness of the proposed CDM is verified on measured data from an actual MBR.
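The core quantity behind the fault transfer topology is transfer entropy, which measures directed information flow from a candidate causal variable X to a target Y (for example, a fouling indicator such as transmembrane pressure). The sketch below is a minimal, illustrative histogram (plug-in) estimator of first-order transfer entropy, T(X -> Y) = sum p(y_{t+1}, y_t, x_t) log2[p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t)]. The function name, binning scheme, and toy data are assumptions for illustration; the paper's abstract does not specify which estimator the CDM uses.

import numpy as np

def transfer_entropy(x, y, bins=8):
    """Plug-in histogram estimate of transfer entropy T(X -> Y) in bits,
    using first-order histories for both series (hypothetical helper)."""
    # Discretize each series into `bins` equal-width bins (indices 0..bins-1).
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    y_next, y_now, x_now = yd[1:], yd[:-1], xd[:-1]

    # Empirical joint distribution over (y_{t+1}, y_t, x_t).
    p_xyz = np.zeros((bins, bins, bins))
    for i, j, k in zip(y_next, y_now, x_now):
        p_xyz[i, j, k] += 1.0
    p_xyz /= p_xyz.sum()

    p_yy = p_xyz.sum(axis=2)      # p(y_{t+1}, y_t)
    p_yx = p_xyz.sum(axis=0)      # p(y_t, x_t)
    p_y = p_xyz.sum(axis=(0, 2))  # p(y_t)

    te = 0.0
    for i in range(bins):
        for j in range(bins):
            for k in range(bins):
                p = p_xyz[i, j, k]
                if p > 0.0:  # marginals are positive wherever the joint is
                    te += p * np.log2(p * p_y[j] / (p_yx[j, k] * p_yy[i, j]))
    return te

# Toy check: y is driven by the previous value of x, so the estimate of
# T(X -> Y) should clearly exceed that of T(Y -> X).
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(5000)
print(transfer_entropy(x, y), transfer_entropy(y, x))

In a scheme like the CDM, pairwise estimates of this kind between process variables would supply the directed edges of the fault transfer topology, with weak or redundant links pruned by the compression strategy.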

Keywords

Membrane fouling; diagnosis; causal relationship; root causal variables; transfer entropy

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
