Open Access
The Fourier transform of controlled‐source time‐domain electromagnetic data by smooth spectrum inversion
Author(s) - Mitsuhata Yuji, Uchida Toshihiro, Murakami Yutaka, Amano Hiroshi
Publication year - 2001
Publication title - Geophysical Journal International
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.302
H-Index - 168
eISSN - 1365-246X
pISSN - 0956-540X
DOI - 10.1046/j.1365-246x.2001.00324.x
Subject(s) - algorithm , frequency domain , fourier transform , synthetic data , inversion (geology) , a priori and a posteriori , weighting , akaike information criterion , time domain , mathematics , computer science , mathematical analysis , acoustics , physics , statistics , geology , paleontology , philosophy , epistemology , structural basin , computer vision
SUMMARY In controlled‐source electromagnetic measurements in the near zone or at low frequencies, the real (in‐phase) frequency‐domain component is dominated by the primary field, whereas it is the imaginary (quadrature) component that carries the signal from a target deeper than the source–receiver separation. In practice, the dominance of the primary field makes the imaginary component difficult to measure accurately. Data acquired in the time domain, by contrast, are more sensitive to the deeper target because the primary field is absent. To estimate the frequency‐domain responses reliably from time‐domain data, we have developed a Fourier transform algorithm based on least‐squares inversion with a smoothness constraint (smooth spectrum inversion). The smoothness constraint is imposed as a priori information, and the frequency response is estimated by maximizing the a posteriori distribution derived from Bayes' rule. The weighting between the data misfit and the smoothness constraint is adjusted by minimizing Akaike's Bayesian Information Criterion (ABIC). Tests on synthetic and field data from the long‐offset transient electromagnetic method yield reasonable results. The algorithm can handle time‐domain data over a wide range of delay times and is effective for analysing noisy data.
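To make the estimation step in the summary more concrete, the sketch below shows one generic way a smoothness-constrained least-squares fit with an ABIC-style choice of the trade-off parameter can be set up. It is a simplified illustration, not the authors' implementation: the forward matrix G, the data vector d, the second-difference roughening matrix, and the particular ABIC expression (a standard linear-Gaussian form with alpha-independent constants dropped) are assumptions made for this example only.

import numpy as np

def smooth_spectrum_fit(G, d, n_model, log10_alphas=np.linspace(-4.0, 4.0, 41)):
    """Smoothness-constrained least squares with an ABIC-style selection of
    the trade-off parameter alpha (generic linear-Gaussian sketch).

    G        : (N, M) forward matrix relating the model to the data (assumed given)
    d        : (N,) observed data vector
    n_model  : M, number of model parameters
    """
    N, M = d.size, n_model
    # Second-difference roughening matrix enforcing a smooth model.
    W = np.zeros((M - 2, M))
    for i in range(M - 2):
        W[i, i:i + 3] = [1.0, -2.0, 1.0]
    GtG, Gtd, WtW = G.T @ G, G.T @ d, W.T @ W

    best = None
    for la in log10_alphas:
        a2 = 10.0 ** (2.0 * la)                      # alpha squared
        A = GtG + a2 * WtW
        m = np.linalg.solve(A, Gtd)                  # regularized least-squares solution
        U = np.sum((d - G @ m) ** 2) + a2 * np.sum((W @ m) ** 2)
        _, logdetA = np.linalg.slogdet(A)
        # ABIC-like criterion for a linear-Gaussian model, with constants that
        # do not depend on alpha dropped; rank(W) = M - 2 multiplies log(alpha^2).
        abic = N * np.log(U / N) - (M - 2) * np.log(a2) + logdetA
        if best is None or abic < best[0]:
            best = (abic, m, 10.0 ** la)
    return best[1], best[2]                          # smoothed model, selected alpha

In the method described in the summary, the model vector would hold the frequency-domain response (real and imaginary parts) at a set of frequencies, and G would contain the corresponding Fourier-type kernels evaluated at the measured delay times; the grid search over alpha mirrors the adjustment of the misfit/smoothness weighting by ABIC minimization.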