SubmergeStyleGAN: Synthetic Underwater Data Generation with Style Transfer for Domain Adaptation

Fathy, Mohamed E.; Mohamed, Samer A.; Awad, Mohammed I.; Abdelmunim, Hossam El Din Hassan

Abstract


Underwater computer vision applications are challenged by limited access to annotated underwater datasets. Additionally, convolutional neural networks (CNNs) trained on in-air datasets do not perform well underwater due to the high domain variance caused by the degrading effect of the water column. This paper proposes an air-to-water dataset generator that creates visually plausible underwater scenes from existing in-air datasets. SubmergeStyleGAN, a generative adversarial network (GAN) designed to model attenuation, backscattering, and absorption, utilizes depth maps to apply range-dependent attenuation style transfer. In this work, the generated attenuated images and their corresponding original pairs are used to train an underwater image enhancement CNN. Real underwater datasets were used to validate the proposed approach by assessing various image quality metrics, including UCIQE, UIQM, and CCF, as well as disparity estimation accuracy before and after enhancement. SubmergeStyleGAN exhibits a faster and more robust training procedure compared to existing methods in the literature.
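The abstract describes modelling attenuation and backscatter as functions of range, using a depth map. The paper's own GAN architecture is not reproduced here, but the underlying physical image formation it learns to emulate follows the well-known underwater formation model, in which the direct signal decays exponentially with range and veiling light fills in as it fades. A minimal NumPy sketch of that model is shown below; the per-channel coefficients `beta` and the veiling-light colour `veil` are illustrative assumptions, not values from the paper.

```python
import numpy as np

def simulate_underwater(image, depth, beta=(0.6, 0.25, 0.1), veil=(0.05, 0.35, 0.45)):
    """Apply range-dependent attenuation and backscatter to an in-air RGB image.

    image: H x W x 3 float array in [0, 1]
    depth: H x W range map in metres
    beta:  assumed per-channel attenuation coefficients (1/m);
           red attenuates fastest, matching typical ocean water
    veil:  assumed per-channel veiling (background) light colour
    """
    image = np.asarray(image, dtype=np.float64)
    beta = np.asarray(beta, dtype=np.float64)
    veil = np.asarray(veil, dtype=np.float64)
    # Direct signal decays exponentially with range (Beer-Lambert law).
    transmission = np.exp(-depth[..., None] * beta)
    # Backscattered veiling light takes over as the direct signal fades.
    return image * transmission + veil * (1.0 - transmission)
```

At zero depth the transmission is 1 and the image is unchanged; as depth grows, each pixel converges to the veiling colour, with the red channel fading first. A learned generator such as SubmergeStyleGAN replaces these hand-set coefficients with a style learned from real underwater imagery.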


Other data

Title SubmergeStyleGAN: Synthetic Underwater Data Generation with Style Transfer for Domain Adaptation
Authors Fathy, Mohamed E.; Mohamed, Samer A.; Awad, Mohammed I.; Abdelmunim, Hossam El Din Hassan
Keywords Deep Learning; Generative Adversarial Network; Image Enhancement; Style Transfer; Underwater Perception
Issue Date 1-Jan-2023
Conference 2023 International Conference on Digital Image Computing: Techniques and Applications (DICTA 2023)
ISBN 9798350382204
DOI 10.1109/DICTA60407.2023.00081
Scopus ID 2-s2.0-85185226320




Items in Ain Shams Scholar are protected by copyright, with all rights reserved, unless otherwise indicated.