Abstract
To address the large modality difference between synthetic aperture radar (SAR) images and optical remote sensing images, which makes mutual translation difficult, a cycle-consistent generative adversarial network (GAN) based on spatially disentangled representations is proposed, building on an existing spatially separated image translation framework. The model separates an image into style and content features using deeper network layers and skip connections; the content features are translated by learning a content mapping relationship and are then recombined with the target style features to complete the image translation. A PatchGAN discriminator is used to strengthen the model's ability to generate detailed image information, and a newly added target error loss and generation-reconstruction loss restrict the translation task to a one-to-one mapping, reducing extraneous information and constraining the generative network. Experiments on the SEN1-2, SARptical, and WHU-SEN-City datasets show that, compared with other image translation algorithms, the proposed method can effectively translate between the two types of remote sensing images, and the generated images have high clarity, complete detail features, and a strong sense of realism.
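The abstract names a PatchGAN discriminator as the component that sharpens local detail in the translated images. As a rough illustration only, the following is a minimal sketch of a standard 70x70 PatchGAN discriminator in the pix2pix/CycleGAN style, written in PyTorch; the channel widths, InstanceNorm choice, and layer count are assumptions, since the paper's exact configuration is not given here.

```python
import torch
import torch.nn as nn


class PatchDiscriminator(nn.Module):
    """Sketch of a 70x70 PatchGAN discriminator (pix2pix/CycleGAN style).

    Instead of one real/fake score per image, it outputs a grid of scores,
    each judging only a local patch of the input; this local judgment is
    what the abstract relies on to improve fine detail in translated images.
    """

    def __init__(self, in_channels=3, base_filters=64):
        super().__init__()

        def block(c_in, c_out, stride, norm=True):
            layers = [nn.Conv2d(c_in, c_out, kernel_size=4,
                                stride=stride, padding=1)]
            if norm:
                layers.append(nn.InstanceNorm2d(c_out))
            layers.append(nn.LeakyReLU(0.2, inplace=True))
            return layers

        self.model = nn.Sequential(
            *block(in_channels, base_filters, stride=2, norm=False),
            *block(base_filters, base_filters * 2, stride=2),
            *block(base_filters * 2, base_filters * 4, stride=2),
            *block(base_filters * 4, base_filters * 8, stride=1),
            # Final 1-channel map: one real/fake logit per local patch.
            nn.Conv2d(base_filters * 8, 1, kernel_size=4, stride=1, padding=1),
        )

    def forward(self, x):
        return self.model(x)


if __name__ == "__main__":
    d = PatchDiscriminator(in_channels=3)
    fake_optical = torch.randn(1, 3, 256, 256)  # hypothetical translated image
    print(d(fake_optical).shape)                # torch.Size([1, 1, 30, 30])
```

In CycleGAN-style training, the resulting patch-score map is typically compared against all-ones (real) or all-zeros (fake) targets, for example with an MSE loss as in LSGAN; the paper's exact adversarial loss and the definitions of its target error and generation-reconstruction losses are not specified in this record.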
Authors
Han Zishuo; Wang Chunping; Fu Qiang; Zhao Bin (Department of Electronic and Optical Engineering, Shijiazhuang Campus, Army Engineering University, Shijiazhuang, Hebei 050003, China)
Source
Acta Optica Sinica
Indexed in: EI, CAS, CSCD, Peking University Core Journals
2021, Issue 7, pp. 160-173 (14 pages)
Funding
Military Scientific Research Project (LJ20191A040155).
Keywords
remote sensing
image translation
synthetic aperture radar
optical remote sensing image
cycle-consistent generative adversarial network
Author Information
Wang Chunping, E-mail: wang_c_p@163.com