
Open Access
FaultSeg Swin-UNETR: Transformer-Based Self-Supervised Pretraining Model for Fault Recognition
Author(s)
Zeren Zhang,
Ran Chen,
Jinwen Ma
Publication year: 2024
This paper introduces an approach to enhance seismic fault recognition through self-supervised pretraining. Seismic fault interpretation holds great significance in the fields of geophysics and geology. However, conventional methods for seismic fault recognition encounter various issues, including dependence on data quality and quantity, as well as susceptibility to interpreter subjectivity. Currently, automated fault recognition methods based on small synthetic datasets experience performance degradation when applied to actual seismic data. To address these challenges, we introduce self-supervised learning, utilizing a substantial amount of relatively easily obtainable unlabeled seismic data for pretraining. Specifically, we employ the Swin Transformer model as the core network and the SimMIM pretraining task to capture features related to discontinuities in seismic data. During the fine-tuning phase, inspired by edge detection techniques, we also refine the structure of the Swin-UNETR model, enabling multiscale decoding and fusion for more effective fault detection. Experimental results demonstrate that our proposed method attains state-of-the-art performance on the Thebe dataset, as measured by the OIS and ODS metrics.
Language(s): English
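The SimMIM pretraining task mentioned in the abstract masks random patches of the input and trains the network to reconstruct only the masked pixels with an L1 loss. A minimal NumPy sketch of that idea, applied to a 2D seismic section, might look as follows; the function names and parameter values are illustrative, not taken from the paper, and a real implementation would use a deep-learning framework with a Swin Transformer backbone:

```python
import numpy as np

def simmim_mask(image, patch=8, mask_ratio=0.6, rng=None):
    """Randomly mask square patches of a 2D section (SimMIM-style).

    Returns the masked image and a boolean pixel mask where True marks
    pixels belonging to a masked patch."""
    if rng is None:
        rng = np.random.default_rng(0)
    h, w = image.shape
    gh, gw = h // patch, w // patch
    # Choose which patches to mask on a coarse (gh x gw) grid.
    patch_mask = rng.random((gh, gw)) < mask_ratio
    # Expand the patch-level mask to full pixel resolution.
    pixel_mask = np.repeat(np.repeat(patch_mask, patch, axis=0),
                           patch, axis=1)
    masked = image.copy()
    masked[pixel_mask] = 0.0  # zero out masked pixels
    return masked, pixel_mask

def simmim_loss(pred, target, pixel_mask):
    """L1 reconstruction loss computed only on the masked pixels."""
    return np.abs(pred - target)[pixel_mask].mean()
```

In the paper's setting, the encoder would be the Swin Transformer and the reconstruction target the raw seismic amplitudes; the pretrained encoder is then fine-tuned in the modified Swin-UNETR for fault segmentation.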
