Preprint Article Version 1 This version is not peer-reviewed

Enhancement of Underwater Images through Parallel Fusion of Transformer and CNN

Version 1 : Received: 18 July 2024 / Approved: 19 July 2024 / Online: 19 July 2024 (06:40:56 CEST)

How to cite: Liu, X.; Chen, Z.; Xu, Z.; Zheng, Z.; Ma, F.; Wang, Y. Enhancement of Underwater Images through Parallel Fusion of Transformer and CNN. Preprints 2024, 2024071575. https://doi.org/10.20944/preprints202407.1575.v1

Abstract

Ocean exploration is crucial for utilizing the ocean's extensive resources, yet images captured by underwater robots suffer from color distortion and reduced contrast. To address these issues, we propose an enhancement algorithm that integrates a Transformer and a Convolutional Neural Network (CNN) in a parallel fusion manner. First, a novel Transformer model employing peak signal-to-noise ratio (PSNR) attention and linear operations is introduced to capture local features. Second, to extract global features, both temporal- and frequency-domain features are incorporated to construct a convolutional neural network. Finally, the high- and low-frequency Fourier information of the original image is utilized to fuse the different features. To demonstrate the algorithm's effectiveness, underwater images with various levels of color distortion are selected for both qualitative and quantitative analyses. The experimental results show that our approach surpasses other mainstream methods, achieving superior PSNR and structural similarity index measure (SSIM) scores and improving detection performance by over ten percent.
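The final fusion step described in the abstract can be sketched as follows. The abstract does not specify the exact weighting scheme, so the circular low-pass mask, the `cutoff` radius, and the use of normalized frequency components as spatial weights are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fourier_fusion(local_feat, global_feat, original, cutoff=0.1):
    """Illustrative sketch: fuse two feature maps using the high- and
    low-frequency Fourier content of the original image as spatial
    weights. All inputs are 2-D float arrays of the same shape."""
    h, w = original.shape
    # Spectrum of the original image, zero frequency shifted to the center.
    spectrum = np.fft.fftshift(np.fft.fft2(original))

    # Circular low-pass mask around the spectrum center (assumed shape).
    yy, xx = np.mgrid[0:h, 0:w]
    radius = np.hypot(yy - h / 2, xx - w / 2)
    low_mask = radius <= cutoff * min(h, w)

    # Split the spectrum and return each part to the spatial domain.
    low = np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * low_mask)))
    high = np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * ~low_mask)))

    def norm(x):
        # Rescale to [0, 1] so the components act as blending weights.
        rng = x.max() - x.min()
        return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

    # Low-frequency content (smooth structure) weights the global branch;
    # high-frequency content (edges, detail) weights the local branch.
    return norm(low) * global_feat + norm(high) * local_feat
```

The choice of which branch each frequency band weights follows the intuition that global features describe smooth illumination and color cast, while local features carry edge detail.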

Keywords

image enhancement; local features; global features; parallel fusion

Subject

Computer Science and Mathematics, Robotics
