Open Access
Virtual microstructure design for steels using generative adversarial networks
Author(s) - Lee JinWoong, Goo Nam Hoon, Park Woon Bae, Pyo Myungho, Sohn KeeSun
Publication year - 2021
Publication title - Engineering Reports
Language(s) - English
Resource type - Journals
ISSN - 2577-8196
DOI - 10.1002/eng2.12274
Subject(s) - artificial intelligence, microstructure, computer science, deep learning, materials science, algorithm, metallurgy
The prediction of macro-scale materials properties from microstructures, and vice versa, should be a key part of modeling quantitative microstructure-physical property relationships (PMPR). It would be helpful if the microstructural input and output were in the form of visual images rather than parameterized descriptors. However, a typical supervised learning technique alone is insufficient to build a model with real-image output; a generative adversarial network (GAN) is required to treat visual images as the output of a promising PMPR model. Recently developed deep-learning-based GAN techniques such as the deep convolutional GAN (DCGAN), the cycle-consistent GAN (Cycle GAN), and conditional-GAN-based image-to-image translation (Pix2Pix) can be of great help in creating realistic microstructures. In this regard, we generated virtual micrographs for various types of steels using a DCGAN, a Cycle GAN, and Pix2Pix, and confirmed that the generated micrographs are qualitatively indistinguishable from the ground truth.
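To make the generative setup concrete, below is a minimal DCGAN sketch in PyTorch, illustrating one of the three techniques named in the abstract. This is not the authors' implementation: the 64x64 grayscale image size, the latent dimension, the channel widths, and the optimizer settings are all illustrative assumptions, and a real micrograph dataset would replace the random stand-in batch.

```python
# Minimal DCGAN sketch (hypothetical; not the paper's code).
# The generator maps a latent vector z to a 64x64 grayscale "micrograph";
# the discriminator scores images as real or generated.
import torch
import torch.nn as nn

LATENT_DIM = 100  # assumed latent size, following the original DCGAN paper

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            # z: (N, 100, 1, 1) -> (N, 256, 4, 4)
            nn.ConvTranspose2d(LATENT_DIM, 256, 4, 1, 0, bias=False),
            nn.BatchNorm2d(256), nn.ReLU(True),
            # -> (N, 128, 8, 8)
            nn.ConvTranspose2d(256, 128, 4, 2, 1, bias=False),
            nn.BatchNorm2d(128), nn.ReLU(True),
            # -> (N, 64, 16, 16)
            nn.ConvTranspose2d(128, 64, 4, 2, 1, bias=False),
            nn.BatchNorm2d(64), nn.ReLU(True),
            # -> (N, 32, 32, 32)
            nn.ConvTranspose2d(64, 32, 4, 2, 1, bias=False),
            nn.BatchNorm2d(32), nn.ReLU(True),
            # -> (N, 1, 64, 64), pixel values in [-1, 1]
            nn.ConvTranspose2d(32, 1, 4, 2, 1, bias=False),
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            # (N, 1, 64, 64) -> (N, 32, 32, 32)
            nn.Conv2d(1, 32, 4, 2, 1, bias=False),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(32, 64, 4, 2, 1, bias=False),
            nn.BatchNorm2d(64), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, 2, 1, bias=False),
            nn.BatchNorm2d(128), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(128, 256, 4, 2, 1, bias=False),
            nn.BatchNorm2d(256), nn.LeakyReLU(0.2, inplace=True),
            # -> (N, 1, 1, 1): real/fake logit
            nn.Conv2d(256, 1, 4, 1, 0, bias=False),
        )

    def forward(self, x):
        return self.net(x).view(-1)

# One adversarial training step (binary cross-entropy on real vs. generated batches)
if __name__ == "__main__":
    G, D = Generator(), Discriminator()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
    loss = nn.BCEWithLogitsLoss()

    real = torch.rand(8, 1, 64, 64) * 2 - 1  # stand-in for a real micrograph batch
    z = torch.randn(8, LATENT_DIM, 1, 1)

    # Discriminator step: push real images toward label 1, fakes toward label 0
    fake = G(z).detach()
    d_loss = loss(D(real), torch.ones(8)) + loss(D(fake), torch.zeros(8))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to make the discriminator label fakes as real
    g_loss = loss(D(G(z)), torch.ones(8))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

After enough alternating discriminator/generator steps on real micrographs, sampling G(z) for fresh latent vectors yields synthetic micrographs; the qualitative check described in the abstract would then compare these against held-out ground-truth images.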
