Chinese Space Science and Technology, 2025, Vol. 45, Issue (5): 49-59. doi: 10.16708/j.cnki.1000-758X.2025.0075
LI Junyong,CHEN Keyan,LIU Liqin,ZOU Zhengxia,SHI Zhenwei*
Abstract: Cloud image generation is an important branch of remote sensing image generation. Nevertheless, prevailing approaches predominantly target the production of homogeneous cloud types and offer inadequate control over cloud coverage and opacity. Furthermore, the failure to disentangle cloud attributes from terrestrial features seriously affects the diversity and veracity of the generated cloud images, which cannot meet simulation requirements. This research introduces DecoupleGAN, a dual-branch GAN framework for cloud image generation based on the decoupling of cloud and background. DecoupleGAN employs a pair of separate GANs to independently capture the characteristic representations of cloud formations and the underlying background. By decoupling cloud features from the remote sensing background, each branch extracts features more efficiently and without cross-interference, culminating in cloud imagery of superior quality. Complementarily, this study also introduces a dataset comprising varying cloud coverage categories, broadening the generative scope of the model. The algorithm has been verified to exhibit superior performance in simulation, with an FID value of 49.0012 and a KID value of 0.0253, representing improvements of 33.11% and 16.98% respectively over single-branch networks. Moreover, compared with existing cloud generation methods, this algorithm can generate more realistic and diverse cloud types, and is capable of simultaneously generating multiple different types of land-cover backgrounds, significantly expanding its scope of application and practicality. DecoupleGAN achieves more realistic and harmonious cloud image simulation by decoupling the clouds from the background and processing the two branches independently, effectively preventing interference during the feature learning process.
Key words: remote sensing image, cloud generation, generative models, generative adversarial networks, deep learning
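The authors' code is not reproduced on this page; the sketch below only illustrates the dual-branch decoupling idea described in the abstract, assuming two independent generator branches, one modelling clouds (with a per-pixel opacity map) and one modelling the land-cover background, combined by alpha blending. All class names, layer configurations, and the compositing step are illustrative assumptions and do not reproduce the authors' DecoupleGAN implementation.

```python
# Minimal PyTorch sketch of a dual-branch, decoupled cloud-image generator.
# All names, layer sizes, and the alpha-blending composition are assumptions,
# NOT the authors' DecoupleGAN code.
import torch
import torch.nn as nn


class BranchGenerator(nn.Module):
    """One independent generator branch (reused for both cloud and background)."""

    def __init__(self, z_dim: int = 128, out_channels: int = 3, img_size: int = 64):
        super().__init__()
        self.img_size = img_size
        self.fc = nn.Linear(z_dim, 256 * (img_size // 8) ** 2)
        self.net = nn.Sequential(
            nn.BatchNorm2d(256), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1),
            nn.BatchNorm2d(128), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),
            nn.BatchNorm2d(64), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, out_channels, 4, stride=2, padding=1),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        h = self.fc(z).view(z.size(0), 256, self.img_size // 8, self.img_size // 8)
        return self.net(h)


class DecoupledCloudGenerator(nn.Module):
    """Two independent branches: one models clouds (RGB plus an opacity mask),
    the other models the land-cover background; the outputs are alpha-blended."""

    def __init__(self, z_dim: int = 128, img_size: int = 64):
        super().__init__()
        # Cloud branch emits 4 channels: cloud RGB + a per-pixel opacity (alpha) map.
        self.cloud_branch = BranchGenerator(z_dim, out_channels=4, img_size=img_size)
        # Background branch emits a plain 3-channel land-cover image.
        self.background_branch = BranchGenerator(z_dim, out_channels=3, img_size=img_size)

    def forward(self, z_cloud: torch.Tensor, z_bg: torch.Tensor) -> torch.Tensor:
        cloud_out = self.cloud_branch(z_cloud)
        cloud_rgb = torch.tanh(cloud_out[:, :3])     # cloud appearance in [-1, 1]
        alpha = torch.sigmoid(cloud_out[:, 3:4])     # coverage / opacity in [0, 1]
        background = torch.tanh(self.background_branch(z_bg))
        # Composite: opaque cloud pixels hide the background, thin clouds let it show through.
        return alpha * cloud_rgb + (1.0 - alpha) * background


if __name__ == "__main__":
    gen = DecoupledCloudGenerator()
    z_cloud, z_bg = torch.randn(2, 128), torch.randn(2, 128)
    fake = gen(z_cloud, z_bg)   # (2, 3, 64, 64) composited cloud images
    print(fake.shape)
```

Because each branch receives its own latent code, cloud coverage/opacity and the land-cover background can in principle be varied independently, which is the property the abstract attributes to the decoupled design.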
LI Junyong, CHEN Keyan, LIU Liqin, ZOU Zhengxia, SHI Zhenwei. Dual-branch GAN for cloud image generation based on cloud and background decoupling[J]. Chinese Space Science and Technology, 2025, 45(5): 49-59.
URL: https://journal26.magtechjournal.com/kjkxjs/EN/10.16708/j.cnki.1000-758X.2025.0075
https://journal26.magtechjournal.com/kjkxjs/EN/Y2025/V45/I5/49