NoRM: No‐Reference Image Quality Metric for Realistic Image Synthesis
Author(s) -
Herzog Robert,
Čadík Martin,
Aydın Tunç O.,
Kim Kwang In,
Myszkowski Karol,
Seidel Hans-Peter
Publication year - 2012
Publication title -
Computer Graphics Forum
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.578
H-Index - 120
eISSN - 1467-8659
pISSN - 0167-7055
DOI - 10.1111/j.1467-8659.2012.03055.x
Subject(s) - computer science , rendering (computer graphics) , artificial intelligence , inpainting , computer vision , image based modeling and rendering , ground truth , view synthesis , image quality , image (mathematics)
Synthetically generating images and video frames of complex 3D scenes with photo-realistic rendering software is often prone to artifacts and requires expert knowledge to tune the rendering parameters. The manual work required for detecting and preventing artifacts can be automated through objective quality evaluation of synthetic images. Most practical objective quality assessment methods for natural images rely on a ground-truth reference, which is often not available in rendering applications. While general-purpose no-reference image quality assessment is a difficult problem, we show in a subjective study that the performance of a dedicated no-reference metric, as presented in this paper, can match state-of-the-art metrics that do require a reference. This level of predictive power is achieved by exploiting information about the underlying synthetic scene (e.g., 3D surfaces, textures) instead of merely considering color, and by training our learning framework with typical rendering artifacts. We show that our method successfully detects various non-trivial types of artifacts, such as noise, clamping bias due to insufficient virtual point light sources, and shadow map discretization artifacts. We also briefly discuss an inpainting method for automatic correction of detected artifacts.
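To make the core idea concrete, the abstract's point that scene information (not just color) enables no-reference detection can be illustrated with a toy sketch. The example below is an illustrative assumption, not the paper's actual method or code: it flags pixels whose luminance variance is high while the underlying geometry (a depth buffer) is smooth, distinguishing rendering noise from legitimate detail at geometric edges. The real NoRM metric uses a trained learning framework over richer scene features; here a fixed threshold stands in for that learned model.

```python
# Hypothetical sketch in the spirit of a no-reference artifact detector:
# combine a rendered-image feature (local luminance variance) with an
# auxiliary scene feature (depth discontinuity) so that high variance on
# smooth geometry is flagged as likely noise. All names, thresholds, and
# the decision rule are illustrative assumptions, not the paper's code.

def local_variance(img, x, y, r=1):
    """Variance of luminance in a (2r+1)x(2r+1) window (toy noise feature)."""
    h, w = len(img), len(img[0])
    vals = [img[j][i]
            for j in range(max(0, y - r), min(h, y + r + 1))
            for i in range(max(0, x - r), min(w, x + r + 1))]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def depth_discontinuity(depth, x, y):
    """Max depth difference to the 4-neighbours: marks geometric edges,
    where high colour variance is expected and should NOT count as noise."""
    h, w = len(depth), len(depth[0])
    d = depth[y][x]
    diffs = [abs(d - depth[j][i])
             for i, j in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1))
             if 0 <= i < w and 0 <= j < h]
    return max(diffs)

def detect_noise(img, depth, var_thresh=0.01, edge_thresh=0.1):
    """Flag pixels whose luminance varies strongly although the underlying
    geometry is flat -- a crude stand-in for a learned per-pixel classifier."""
    h, w = len(img), len(img[0])
    return [[local_variance(img, x, y) > var_thresh and
             depth_discontinuity(depth, x, y) < edge_thresh
             for x in range(w)] for y in range(h)]

# Usage: one bright outlier pixel on a flat surface is flagged as noise,
# while a colour edge that coincides with a depth edge is not.
img = [[0.5] * 5 for _ in range(5)]
img[2][2] = 1.0                       # isolated outlier (e.g. a "firefly")
flat_depth = [[1.0] * 5 for _ in range(5)]
mask = detect_noise(img, flat_depth)
```

The design point mirrors the abstract: without the depth buffer, a variance-only detector could not tell a firefly from a legitimate texture or silhouette edge; the auxiliary scene data supplies the missing reference-like information.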
