DEEP LEARNING-BASED NUMERICAL DISPERSION MITIGATION IN SEISMIC MODELLING
Author(s) - Kseniia Gadylshina, Kirill Gadylshin, Vadim Lisitsa, D. Vishnevsky
Publication year - 2021
Publication title - Interexpo GEO-Siberia
Language(s) - English
Resource type - Journals
ISSN - 2618-981X
DOI - 10.33764/2618-981x-2021-2-2-17-25
Subject(s) - polygon mesh , dispersion (optics) , computer science , set (abstract data type) , computer simulation , algorithm , geology , deep learning , computational science , artificial intelligence , simulation , computer graphics (images) , optics , physics , programming language
Seismic modelling is the most computationally intensive and time-consuming part of seismic processing and imaging algorithms. Indeed, generating a typical seismic dataset requires approximately 10 core-hours on a standard CPU-based cluster. Such high resource demand is due to the use of fine spatial discretizations to achieve a low level of numerical dispersion (numerical error). This paper presents an original approach to seismic modelling in which the wavefields for all sources (right-hand sides) are simulated inaccurately on coarse meshes. A small number of the wavefields are generated on computationally intensive fine meshes and then used as a training dataset for the deep learning algorithm - the Numerical Dispersion Mitigation network (NDM-net). Once trained, the NDM-net is applied to suppress the numerical dispersion of the entire seismic dataset.
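The abstract does not specify the network architecture or data layout, so the following is only a minimal sketch of the described workflow: train a network on the few shot gathers simulated on both coarse and fine meshes, then apply it to the full coarse-mesh dataset. The residual convolutional model, the tensor shapes, and the random stand-in data are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch of the coarse-to-fine dispersion-mitigation idea.
# Assumptions: shot gathers are 2D arrays (receivers x time samples),
# a small residual CNN stands in for the NDM-net, and random tensors
# stand in for the coarse-/fine-mesh simulations.
import torch
import torch.nn as nn


class DispersionMitigationNet(nn.Module):
    """Maps a coarse-mesh (dispersed) shot gather to a fine-mesh-like gather."""

    def __init__(self, channels: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, 1, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Predict a correction and add it to the input (residual learning).
        return x + self.net(x)


def train_ndm(coarse: torch.Tensor, fine: torch.Tensor,
              epochs: int = 50, lr: float = 1e-3) -> DispersionMitigationNet:
    """coarse, fine: tensors of shape (n_shots, 1, n_receivers, n_time)."""
    model = DispersionMitigationNet()
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(coarse), fine)
        loss.backward()
        opt.step()
    return model


if __name__ == "__main__":
    # Stand-in training pairs: in practice these come from the small subset
    # of sources simulated on both the coarse and the expensive fine mesh.
    coarse_train = torch.randn(8, 1, 64, 256)
    fine_train = coarse_train + 0.1 * torch.randn_like(coarse_train)
    model = train_ndm(coarse_train, fine_train, epochs=5)

    # Apply the trained network to the full coarse-mesh dataset.
    coarse_full = torch.randn(100, 1, 64, 256)
    with torch.no_grad():
        corrected = model(coarse_full)
```

The key saving is that only the small training subset needs the fine-mesh simulation; every other shot is modelled cheaply on the coarse mesh and corrected by the trained network.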
