Preprint
Article

Generative Adversarial Unsupervised Image Restoration in Hybrid Degradation Scenes

Submitted: 10 February 2022
Posted: 11 February 2022
Abstract
In this paper, we propose an unsupervised blind restoration model for images in hybrid degradation scenes. The proposed model encodes the content information and the degradation information of an image and then uses an attention module to disentangle the two kinds of information. This improves the disentangled representation learning ability of the generative adversarial network (GAN) for restoring images in hybrid degradation scenes and, by combining the adversarial loss, cycle-consistency loss, and perceptual loss, enhances the detailed features of the restored image and removes artifacts. Experimental results on the DIV2K dataset and on medical images show that the proposed method outperforms existing unsupervised image restoration algorithms in terms of peak signal-to-noise ratio (PSNR), structural similarity (SSIM), and subjective visual evaluation.
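The full paper is not reproduced on this page, but the architecture summarized in the abstract lends itself to a brief sketch. The following is a minimal, hypothetical PyTorch illustration of that idea: one encoder for content information, one for degradation information, an attention module that uses the degradation code to suppress degradation features in the content code, and a training objective combining adversarial, cycle-consistency, and perceptual terms. Every layer size, module shape, and loss weight below is an assumption for illustration only, not the authors' actual network.

import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    # 3x3 convolution followed by LeakyReLU; layer sizes are illustrative.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1),
        nn.LeakyReLU(0.2, inplace=True),
    )


class Encoder(nn.Module):
    """Shared structure used for both the content and degradation encoders."""
    def __init__(self, ch=64):
        super().__init__()
        self.net = nn.Sequential(conv_block(3, ch), conv_block(ch, ch))

    def forward(self, x):
        return self.net(x)


class AttentionDisentangle(nn.Module):
    """Channel attention: the degradation code gates degradation-related
    channels out of the content code (one possible reading of the abstract)."""
    def __init__(self, ch=64):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(ch, ch, 1),
            nn.Sigmoid(),
        )

    def forward(self, content_feat, degrad_feat):
        # Attention weights derived from the degradation code re-weight the content code.
        return content_feat * (1.0 - self.gate(degrad_feat))


class Restorer(nn.Module):
    """Generator: encode content and degradation, disentangle, decode a restored image."""
    def __init__(self, ch=64):
        super().__init__()
        self.enc_content = Encoder(ch)
        self.enc_degrad = Encoder(ch)
        self.attn = AttentionDisentangle(ch)
        self.dec = nn.Sequential(conv_block(ch, ch), nn.Conv2d(ch, 3, 3, padding=1))

    def forward(self, x):
        c = self.enc_content(x)
        d = self.enc_degrad(x)
        return self.dec(self.attn(c, d))


def total_loss(d_fake_logits, restored, recycled, degraded, feat_fn,
               lam_cyc=10.0, lam_perc=1.0):
    """Adversarial + cycle-consistency + perceptual objective.
    d_fake_logits: discriminator output on the restored image.
    recycled: the restored image passed back through a re-degradation branch.
    feat_fn: a fixed feature extractor (e.g. VGG features) for the perceptual term.
    The weights lam_cyc and lam_perc are placeholders, not the paper's values."""
    adv = F.binary_cross_entropy_with_logits(d_fake_logits,
                                             torch.ones_like(d_fake_logits))
    cyc = F.l1_loss(recycled, degraded)                      # cycle-consistency
    perc = F.l1_loss(feat_fn(restored), feat_fn(degraded))   # perceptual / content
    return adv + lam_cyc * cyc + lam_perc * perc

As a quick shape check, Restorer()(torch.rand(1, 3, 64, 64)) returns a 1x3x64x64 tensor, i.e. a restored image at the input resolution; the discriminator, the re-degradation generator, and the feature extractor needed by total_loss are left out for brevity.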
Keywords: 
Subject: Computer Science and Mathematics - Applied Mathematics
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.