Open Access
Detection of GAN-Synthesized Image Based on Discrete Wavelet Transform
Author(s) -
Guihua Tang,
Lei Sun,
Xiuqing Mao,
Song Guo,
Zhang Hongmeng,
Xiaoqin Wang
Publication year - 2021
Publication title -
Security and Communication Networks
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.446
H-Index - 43
eISSN - 1939-0114
pISSN - 1939-0122
DOI - 10.1155/2021/5511435
Subject(s) - computer science, artificial intelligence, discriminator, robustness (evolution), RGB color model, pattern recognition (psychology), computer vision, image (mathematics), telecommunications, biochemistry, chemistry, detector, gene
Recently, generative adversarial networks (GANs) and their variants have shown impressive ability in image synthesis. The synthesized fake images spread widely on the Internet, and it is challenging for Internet users to verify their authenticity, which poses a huge security risk to society. However, compared with the powerful image synthesis technology, the detection of GAN-synthesized images is still in its infancy and faces a variety of challenges. In this study, a method named fake images discriminator (FID) is proposed, which detects GAN-synthesized fake images by exploiting the strong spectral correlation in the imaging process of natural color images. The proposed method first decomposes the color image into its three color components, R, G, and B. The discrete wavelet transform (DWT) is then applied to each component separately. Finally, the correlation coefficients between the subband images are used as a feature vector for authenticity classification. Experimental results show that the proposed FID method achieves impressive effectiveness on StyleGAN2-synthesized faces and on multitype fake images synthesized with state-of-the-art GANs. The FID method also exhibits good robustness against four common perturbation attacks.
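
The pipeline described in the abstract (channel split, per-channel DWT, pairwise subband correlations as features) can be sketched as below. This is a minimal illustration assuming NumPy and PyWavelets; the Haar wavelet, the single decomposition level, the use of all subband pairs, and the suggested downstream classifier are assumptions made for illustration, not the authors' exact configuration.

```python
# Minimal sketch of an FID-style feature extractor, assuming NumPy and
# PyWavelets. Wavelet choice (Haar), one decomposition level, and the
# set of correlation pairs are illustrative assumptions.
import numpy as np
import pywt
from itertools import combinations

def fid_features(rgb_image: np.ndarray) -> np.ndarray:
    """Correlation-coefficient feature vector from per-channel DWT subbands.

    rgb_image: H x W x 3 array with channels ordered R, G, B.
    """
    subbands = []
    for c in range(3):  # split into the R, G, and B components
        # One-level 2-D DWT: approximation (LL) plus the detail
        # subbands (LH, HL, HH) for this color channel.
        ll, (lh, hl, hh) = pywt.dwt2(rgb_image[:, :, c].astype(np.float64), "haar")
        subbands.extend([ll, lh, hl, hh])

    # Pearson correlation coefficient between every pair of flattened
    # subband images forms the feature vector.
    feats = [np.corrcoef(a.ravel(), b.ravel())[0, 1]
             for a, b in combinations(subbands, 2)]
    return np.asarray(feats)

if __name__ == "__main__":
    img = np.random.rand(256, 256, 3)  # stand-in for a real face image
    # 3 channels x 4 subbands = 12 subbands -> C(12, 2) = 66 features.
    print(fid_features(img).shape)
```

In a full pipeline, such feature vectors extracted from real and GAN-synthesized images would be fed to a binary classifier (for example, an SVM) to make the authenticity decision.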
