Seismic coherence is a routine measure of seismic reflection similarity for interpreters seeking structural boundary and discontinuity features that may not be properly highlighted on the original amplitude volumes. Interpreters generally prefer the broadest band seismic data for interpretation. However, because of thickness tuning effects, spectral components at specific frequencies can highlight features of certain thicknesses with higher signal-to-noise ratio than others. Seismic stratigraphic features may be buried in the full-bandwidth data but can be "lit up" at certain spectral components. For the same reason, coherence attributes computed from spectral voice components also often provide sharper images, with the "best" component being a function of the tuning thickness and the reflector alignment across faults. Although one can corender three coherence images using red-green-blue (RGB) blending, displaying the information contained in more than three volumes in a single image is difficult. We address this problem by computing a covariance matrix for each spectral component and summing them, resulting in a "multispectral" coherence algorithm. The multispectral coherence images provide better images of channel incisement and are less noisy than those computed from the full-bandwidth data. In addition, multispectral coherence provides a significant advantage over RGB-blended volumes: the information content from an arbitrary number of spectral voices can be combined into one volume, which is useful for subsequent processing, such as corendering with other related attributes (e.g., petrophysical parameters) plotted against a polychromatic color bar. We demonstrate the value of multispectral coherence by comparing it with RGB-blended volumes and with coherence computed from the spectrally balanced, full-bandwidth seismic amplitude volume of a megamerge survey acquired over the Red Fork Formation of the Anadarko Basin, Oklahoma.
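A minimal numpy sketch of the idea, assuming each spectral voice has already been windowed into a small (samples x traces) matrix around the analysis point; the window shapes and random data below are placeholders, not the survey data:

import numpy as np

def multispectral_coherence(voices):
    # voices: list of (n_samples, n_traces) arrays, one per spectral voice
    n_traces = voices[0].shape[1]
    cov_sum = np.zeros((n_traces, n_traces))
    for d in voices:                        # one covariance (energy) matrix per voice...
        cov_sum += d.T @ d                  # ...summed into a single multispectral matrix
    eigvals = np.linalg.eigvalsh(cov_sum)   # symmetric matrix -> real, ascending eigenvalues
    return eigvals[-1] / np.trace(cov_sum)  # energy of the best-fit pattern / total energy

rng = np.random.default_rng(0)
voices = [rng.standard_normal((11, 5)) for _ in range(3)]   # three hypothetical voices
print(multispectral_coherence(voices))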
During the past decade, the size of 3D seismic data volumes and the number of seismic attributes have increased to the extent that it is difficult, if not impossible, for interpreters to examine every seismic line and time slice. To address this problem, several seismic facies classification algorithms, including k-means, self-organizing maps, generative topographic mapping, support vector machines, Gaussian mixture models, and artificial neural networks, have been successfully used to extract features of geologic interest from multiple volumes. Although well documented in the literature, the terminology and complexity of these algorithms may bewilder the average seismic interpreter, and few papers have applied these competing methods to the same data volume. We have reviewed six commonly used algorithms and applied them to a single 3D seismic data volume acquired over the Canterbury Basin, offshore New Zealand, where one of the main objectives was to differentiate the architectural elements of a turbidite system. Not surprisingly, the most important parameter in this analysis was the choice of the correct input attributes, which in turn depended on careful pattern recognition by the interpreter. We found that supervised learning methods provided accurate estimates of the desired seismic facies, whereas unsupervised learning methods also highlighted features that might otherwise be overlooked.
Recent developments in seismic attributes and seismic facies classification techniques have greatly enhanced the capability of interpreters to delineate and characterize features that are not prominent in conventional 3D seismic amplitude volumes. The use of appropriate seismic attributes that quantify the characteristics of different geologic facies can accelerate and partially automate the interpretation process. Self-organizing maps (SOMs) are a popular seismic facies classification tool that extracts similar patterns embedded within multiple seismic attribute volumes. By preserving the distances of the input data space in the SOM latent space, the internal relations among data vectors are better represented on the SOM facies map, resulting in a more reliable classification. We have determined the effectiveness of the modified algorithm by applying it to a turbidite system in the Canterbury Basin, offshore New Zealand. By incorporating seismic attributes and distance-preserving SOM classification, we were able to observe architectural elements that are overlooked when using a conventional seismic amplitude volume for interpretation.
The main considerations for well planning and hydraulic fracturing in unconventional resource plays include the amount of total organic carbon and how much hydrocarbon can be extracted. Brittleness is a direct measure of a formation's ability to create pathways for hydrocarbons when hydraulic fracturing is applied. Brittleness can be estimated directly from laboratory stress-strain measurements, rock-elastic properties, and mineral content derived from petrophysical analysis of well logs. However, the brittleness estimated using these methods only provides "cylinder" estimates near the borehole. We proposed a workflow to estimate the brittleness of resource plays in 3D by integrating petrophysics and seismic data analysis. The workflow began with brittleness evaluation using mineralogy well logs at the borehole location. Then, we used a proximal support vector machine algorithm to construct a classification pattern between rock-elastic properties and brittleness for the selected benchmark well. The pattern was validated using well-log data that were not used for constructing the classification. Next, we performed prestack inversion of fidelity-preserved seismic gathers to generate a suite of rock-elastic-property volumes. Finally, we obtained a satisfactory brittleness index of the target formations by applying the trained classification pattern to the inverted rock-elastic-property volumes.
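A hedged sketch of the classification step only, with a standard support vector classifier standing in for the proximal SVM named above; the elastic-property arrays and brittle/ductile labels are synthetic placeholders:

import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X_well = rng.standard_normal((500, 3))                  # e.g., Young's modulus, Poisson's ratio, density
y_well = (X_well[:, 0] - X_well[:, 1] > 0).astype(int)  # placeholder brittle(1)/ductile(0) labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_well, y_well)                                 # train the classification pattern at the well

# Apply the trained pattern voxel by voxel to the inverted elastic-property volumes
inverted = rng.standard_normal((10, 10, 10, 3))         # (inline, xline, time, property)
brittleness_class = clf.predict(inverted.reshape(-1, 3)).reshape(10, 10, 10)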
The Barnett Shale in the Fort Worth Basin is one of the most important resource plays in the USA. The total organic carbon (TOC) and brittleness can help characterize a resource play to assist in the search for sweet spots. Higher TOC is generally associated with hydrocarbon storage and with rocks that are ductile in nature. However, brittle rocks are more amenable to fracturing, with the fracture faces being more resistant to proppant embedment. Productive intervals within a resource play should therefore contain a judicious mix of organics and a mineralogy that lends itself to hydraulic fracturing. Identification of these intervals through core acquisition and laboratory-based petrophysical measurements can be accurate but expensive in comparison with wireline logging. We have estimated TOC from wireline logs using Passey's method and attained a correlation of 60%. However, errors in the baseline interpretation can lead to inaccurate TOC. Using nonlinear regression with Passey's TOC, normalized stratigraphic height, and acquired wireline logs, the correlation increased to 80%. This regression can be applied to uncored wells with logs to estimate TOC, and we used it as ground truth in an integrated analysis of seismic and well-log data. The brittleness index (BI) is computed from core Fourier transform infrared mineralogy using Wang and Gale's formula. The correlation between core BI and BI estimated using elastic logs combined with wireline logs was 78%. However, this correlation decreases to 66% if the BI is estimated using only wireline logs; the latter therefore serves as a less reliable proxy. We have correlated production to volumetric estimates of TOC and brittleness by computing distance-weighted averages in 120 horizontal wells. We obtained a production correlation of 38% on blind wells, which is encouraging, suggesting that the geologic component in completions provides an important contribution to well success.
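A sketch of the DlogR calculation behind Passey's method, using the commonly published constants (Passey et al., 1990); the baseline resistivity, baseline sonic, and level of organic metamorphism (LOM) values below are illustrative placeholders that must be chosen per interval:

import numpy as np

def passey_toc(resistivity, sonic, res_baseline, sonic_baseline, lom):
    """resistivity in ohm-m, sonic in us/ft, lom = level of organic metamorphism."""
    dlogr = np.log10(resistivity / res_baseline) + 0.02 * (sonic - sonic_baseline)
    return dlogr * 10.0 ** (2.297 - 0.1688 * lom)   # TOC in weight percent

print(passey_toc(resistivity=20.0, sonic=90.0, res_baseline=2.0, sonic_baseline=65.0, lom=10.0))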
Differentiating brittle and ductile rocks from surface seismic data is the key to efficient well location and completion. Brittleness average estimates based only on elastic parameters are easy to use but require empirical calibration. In contrast, brittleness index (BI) estimates are based on laboratory mineralogy measurements and thus cannot be directly measured from surface seismic data. These two measures correlate reasonably well in the quartz-rich Barnett Shale, but they provide conflicting estimates of brittleness in the calcite-rich Viola, Forestburg, Upper Barnett, and Marble Falls limestone formations. Specifically, the BI accurately predicts the limestone formations that form fracture barriers to be ductile, whereas the brittleness average does not. We used elemental capture spectroscopy and elastic logs measured in the same cored well to design a 2D λρ-μρ to brittleness template. We computed λρ and μρ volumes through prestack seismic inversion and calibrated the results with the λρ-μρ template from well logs. We then used microseismic event locations from six wells to calibrate our prediction, showing that most of the microseismic events occur in the brittle regions of the shale, avoiding more ductile shale layers and the ductile limestone fracture barriers. Our λρ-μρ to brittleness template is empirical and incorporates basin- and perhaps even survey-specific correlations of mineralogy and elastic parameters through sedimentation, oxygenation, and diagenesis. We do not expect this specific template to be universally applicable in other mudstone basins; rather, we recommend interpreters generate similar site-specific templates from logs representative of their area, following the proposed workflow.
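For reference, both template axes follow directly from the inverted P- and S-impedances via the standard Lamé-attribute relations; a small numpy sketch with arbitrary impedance values:

import numpy as np

def lambda_rho(ip, is_):
    return ip**2 - 2.0 * is_**2     # lambda*rho, incompressibility-related axis

def mu_rho(is_):
    return is_**2                   # mu*rho, rigidity-related axis

ip = np.array([9.5e6, 1.1e7])       # P-impedance, (m/s)*(kg/m^3), example values
is_ = np.array([5.0e6, 6.2e6])      # S-impedance, example values
print(lambda_rho(ip, is_), mu_rho(is_))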
One of the key components of traditional seismic interpretation is to associate or "label" a specific seismic amplitude package of reflectors with an appropriate seismic or geologic facies. The objective of seismic clustering algorithms is to use a computer to accelerate this process, allowing one to generate interpreted facies for large 3D volumes. Determining which attributes best quantify a specific amplitude or morphology component seen by the human interpreter is critical to successful clustering. Unfortunately, many patterns, such as coherence images of salt domes, result in a salt-and-pepper classification. Application of 3D Kuwahara median filters smooths the interior attribute response and sharpens the contrast between neighboring facies, thereby preconditioning the attribute volumes for subsequent clustering. In our workflow, the interpreter manually painted a suite of target facies using traditional interpretation techniques, resulting in attribute training data for each facies. Candidate attributes were evaluated by crosscorrelating their histograms across facies, with low correlation implying good facies discrimination; Kuwahara filtering significantly increased this discrimination. Multiattribute voxels for the interpreter-painted facies were projected onto a generative topographic mapping manifold, resulting in one probability density function (PDF) per facies. The Bhattacharyya distance between the PDF of each unlabeled voxel and each of the facies PDFs resulted in a probability volume for each user-defined facies. We have determined the effectiveness of this workflow on a large 3D seismic volume acquired offshore Louisiana, USA.
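A sketch of the Bhattacharyya distance used to compare an unlabeled voxel's PDF with each facies PDF, with both PDFs represented as normalized histograms over the same bins (the histograms below are placeholders):

import numpy as np

def bhattacharyya_distance(p, q, eps=1e-12):
    p = p / p.sum()
    q = q / q.sum()
    bc = np.sum(np.sqrt(p * q))       # Bhattacharyya coefficient
    return -np.log(bc + eps)          # 0 for identical PDFs, large when they barely overlap

p = np.array([0.1, 0.4, 0.4, 0.1])    # voxel PDF
q = np.array([0.3, 0.3, 0.2, 0.2])    # one facies PDF
print(bhattacharyya_distance(p, q))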
Seismic attributes are routinely used to accelerate and quantify the interpretation of tectonic features in 3D seismic data. Coherence cubes delineate the edges of megablocks and faulted strata, curvature delineates folds and flexures, and spectral components delineate lateral changes in thickness and lithology. Seismic attributes are at their best in extracting subtle, easy-to-overlook features from high-quality seismic data. However, seismic attributes can also exacerbate otherwise subtle effects such as acquisition footprint and velocity pull-up/push-down, as well as small processing and velocity errors in seismic imaging. As a result, the chance that an interpreter will suffer a pitfall is inversely proportional to his or her experience. Interpreters with a history of making conventional maps from vertical seismic sections will have previously encountered problems associated with acquisition, processing, and imaging. Because they know that attributes are a direct measure of the seismic amplitude data, they are not surprised that such attributes "accurately" represent these familiar errors. Less experienced interpreters may encounter these errors for the first time. Regardless of their level of experience, all interpreters are faced with increasingly larger seismic data volumes in which seismic attributes become valuable tools that aid in mapping and communicating geologic features of interest to their colleagues. In terms of attributes, structural pitfalls fall into two general categories: false structures due to seismic noise and processing errors (including velocity pull-up/push-down caused by lateral variations in the overburden), and errors made in attribute computation by not accounting for structural dip. We evaluate these errors using 3D data volumes and identify areas where present-day attributes do not provide the images we want.
Using 3D seismic attributes and the support of a clay model that served as an analog, we mapped and analyzed a 32 km long, north–south-striking, right-lateral fault in the Woodford Shale, Anadarko Basin, Oklahoma, USA. Volumetric coherence, dip azimuth, and curvature delineated an approximately 1.5 km wide damage zone with multiple secondary faults, folds, and flexures. The clay analog enabled us to identify these features as belonging to a complex transpressional Riedel structure. We also suggest that the damage zone contains dense subseismic fractures associated with multiscale faulting and secondary folding that may correspond to highly permeable features within the Woodford Shale.
During the past two decades, the number of volumetric seismic attributes has increased to the point at which interpreters are overwhelmed and cannot analyze all of the information that is available. Principal component analysis (PCA) is one of the best-known multivariate analysis techniques; it decomposes the input data based on second-order statistics, maximizing variance to obtain mathematically uncorrelated components. Unfortunately, projecting the information in the multiple input data volumes onto an orthogonal basis often mixes rather than separates geologic features of interest. To address this issue, we have implemented and evaluated a relatively new unsupervised multiattribute analysis technique called independent component analysis (ICA), which is based on higher order statistics. We apply our algorithm to study the internal architecture of turbiditic channel complexes present in the Moki A sands Formation, Taranaki Basin, New Zealand. We input 12 spectral magnitude components ranging from 25 to 80 Hz into the ICA algorithm and plot 3 of the resulting independent components against a red-green-blue color scheme to generate a single volume in which the colored independent components correspond to different seismic facies. The results obtained using ICA proved to be superior to those obtained using PCA. Specifically, ICA provides improved resolution and separates geologic features from noise. Moreover, with ICA, we can geologically analyze the different seismic facies and relate them to sand- and mud-prone seismic facies associated with axial and off-axis deposition and cut-and-fill architectures.
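A hedged sketch of this unsupervised step using scikit-learn's FastICA, with synthetic stand-ins for the 12 spectral-magnitude attributes flattened to one row per voxel; the 0-1 scaling at the end is the simple RGB mapping:

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
n_voxels, n_inputs = 5000, 12
X = rng.standard_normal((n_voxels, n_inputs))      # 12 spectral magnitudes per voxel (placeholder)

ica = FastICA(n_components=3, random_state=0)
ics = ica.fit_transform(X)                         # (n_voxels, 3) independent components

# Scale each independent component to 0-1 so the triplet can be displayed as RGB per voxel
rgb = (ics - ics.min(axis=0)) / (ics.max(axis=0) - ics.min(axis=0))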
Automated seismic facies classification using machine-learning algorithms is becoming more common in the geophysics industry. Seismic attributes are frequently used as input because they may express geologic patterns or depositional environments better than the original seismic amplitude. Selecting appropriate attributes therefore becomes a crucial part of the seismic facies classification analysis. For unsupervised learning, principal component analysis can reduce the dimensionality of the data while maintaining the highest variance possible. For supervised learning, the best attribute subset can be built by selecting input attributes that are relevant to the output class and by avoiding redundant attributes that are similar to each other. Multiple attributes are tested to classify salt diapirs, mass transport deposits (MTDs), and the conformal reflector "background" for a 3D marine seismic survey acquired on the northern Gulf of Mexico shelf. We have analyzed attribute-to-attribute correlation and the correlation of the input attributes with the output classes to understand which attributes are relevant and which are redundant. We found that the amplitude and texture attribute families are able to differentiate salt, MTDs, and conformal reflectors. Our attribute selection workflow is also applied to the Barnett Shale play to differentiate limestone and shale facies. Multivariate analysis using filter, wrapper, and embedded algorithms was used to rank attributes by importance so that the best attribute subset for classification could be chosen. We find that attribute selection algorithms for supervised learning not only reduce computational cost but also enhance the performance of the classification.
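A sketch of a simple filter-style selection in the spirit described above: rank candidate attributes by their correlation with the class labels, then drop attributes strongly correlated with an already selected (redundant) one. Data, labels, and thresholds are illustrative only:

import numpy as np

def select_attributes(X, y, relevance_min=0.1, redundancy_max=0.8):
    n_attr = X.shape[1]
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_attr)])
    order = np.argsort(relevance)[::-1]              # most relevant first
    selected = []
    for j in order:
        if relevance[j] < relevance_min:             # irrelevant attributes are discarded
            break
        redundant = any(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > redundancy_max
                        for k in selected)
        if not redundant:                            # keep only non-redundant attributes
            selected.append(j)
    return selected

rng = np.random.default_rng(3)
X = rng.standard_normal((1000, 6))
y = X[:, 0] + 0.5 * X[:, 2] + 0.1 * rng.standard_normal(1000)
print(select_attributes(X, y))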
Seismic coherence is commonly used to delineate structural and stratigraphic discontinuities. We generally use full-bandwidth seismic data to calculate coherence. However, some seismic stratigraphic features may be buried in the full-bandwidth data but can be highlighted by certain spectral components. Due to thin-bed tuning phenomena, discontinuities in a thicker stratigraphic feature may be tuned and thus better delineated at a lower frequency, whereas discontinuities in thinner units may be tuned and thus better delineated at a higher frequency. Additionally, whether due to the seismic data quality or the underlying geology, certain spectral components exhibit higher quality than others, resulting in correspondingly higher quality coherence images. Multispectral coherence provides an effective tool to exploit these observations. We evaluate the performance of multispectral coherence using different spectral decomposition methods: the continuous wavelet transform (CWT), maximum entropy, the amplitude volume technique (AVT), and spectral probes. Applications to a 3D seismic data volume indicate that multispectral coherence images are superior to full-bandwidth coherence, providing better delineation of incised channels with less noise. From the CWT experiments, we find that exponentially spaced CWT components provide better coherence images than equally spaced components for the same computational cost. The multispectral coherence image computed using maximum entropy spectral voices further improves the resolution of the thinner channels and small-scale features. The coherence computed from the AVT data set provides continuous images of the thicker channel boundaries but poor images of the small-scale features inside the thicker channels. Additionally, multispectral coherence computed using the nonlinear spectral probes exhibits a more balanced response and reveals clear small-scale geologic features inside the thicker channel. However, because amplitudes are not preserved in the nonlinear spectral probe decomposition, noise in the noisier shorter period components carries equal weight when building the covariance matrix, resulting in increased noise in the generated multispectral coherence images.
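A small sketch of the linear versus exponential spacing choice for the CWT center frequencies; the band limits and voice count below are placeholders:

import numpy as np

f_min, f_max, n_voices = 10.0, 80.0, 8                # Hz, assumed band and voice count
linear = np.linspace(f_min, f_max, n_voices)          # equally spaced voices
exponential = np.geomspace(f_min, f_max, n_voices)    # constant ratio between adjacent voices

print(np.round(linear, 1))        # 10, 20, ..., 80 Hz
print(np.round(exponential, 1))   # denser sampling at the low end, coarser at the high end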
Very little research has been done on volcanic rocks by the oil industry due to the misconception that these rocks cannot be "good reservoirs." However, in the past two decades, significant quantities of hydrocarbons have been produced from volcanic rocks in China, New Zealand, and Argentina. In frontier basins, volcanic piles are sometimes misinterpreted to be hydrocarbon anomalies and/or carbonate buildups. Unlike clastic and carbonate systems, the 3D seismic geomorphology of igneous systems is only partially documented. We have integrated 3D seismic data, well logs, well reports, core data, and clustering techniques such as self-organizing maps to map two distinct facies within a Miocene submarine volcano in the Taranaki Basin, New Zealand. Three wells (Kora-1, Kora-2, and Kora-3) drilled the pyroclastic facies within the volcano and encountered evidence of a petroleum system, whereas the Kora-4 well drilled the lava-flow facies, which was barren of hydrocarbons. By integrating results from geochemistry and basin modeling reports prepared for Crown Minerals, New Zealand, we concluded that Kora-4 was dry due to a lack of source charge, not to an absence of reservoir quality. Moreover, the Kora-1 well drilled a thick sequence of pyroclastic flows in this submarine volcano by chance and found high gas peaks in the mud logs within the top 25 m of this sequence. A long-term test in this upper volcanic section flowed 668 barrels of 32° API oil per day for 254 h, a result that challenges the misconception that volcanic rocks cannot be good reservoirs.
One of the key tasks of a seismic interpreter is to map lateral changes in surfaces, including not only faults, folds, and flexures, but also incisements, diapirism, and dissolution features. Volumetrically, coherence provides rapid visualization of faults, and curvature provides rapid visualization of folds and flexures. Aberrancy measures the lateral change of curvature along a picked or inferred surface and thus complements curvature and coherence. In normally faulted terrains, the aberrancy anomaly will track the coherence anomaly and fall between the most positive curvature anomaly defining the footwall and the most negative curvature anomaly defining the hanging wall. Aberrancy can delineate faults whose throw falls below the seismic resolution or is distributed across a suite of smaller conjugate faults that do not exhibit a coherence anomaly. Aberrancy was previously limited to horizon computations; we extend it to uninterpreted seismic data volumes. We apply our volumetric aberrancy calculation to a data volume acquired over the Barnett Shale gas reservoir of the Fort Worth Basin, Texas. In this area, the Barnett Shale is bound on the top by the Marble Falls Limestone and on the bottom by the Ellenburger Dolomite. Basement faulting controls karstification in the Ellenburger, resulting in the well-known "string of pearls" pattern seen on coherence images. Aberrancy delineates small karst features, which are, in many places, too smoothly varying to be detected by coherence. Equally important, aberrancy provides the azimuthal orientation of the fault and flexure anomalies.
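A simplified sketch of aberrancy as the lateral gradient of curvature on a mapped horizon, with curvature approximated here by the surface Laplacian rather than the full volumetric computation; the synthetic flexure surface and bin sizes are placeholders:

import numpy as np

dx = dy = 25.0                                     # assumed bin size in meters
x = np.arange(0, 2000, dx)
y = np.arange(0, 2000, dy)
X, Y = np.meshgrid(x, y, indexing="ij")
z = 50.0 * np.tanh((X - 1000.0) / 200.0)           # synthetic flexure surface

# Approximate curvature by the Laplacian of the surface, then take its lateral gradient
zxx = np.gradient(np.gradient(z, dx, axis=0), dx, axis=0)
zyy = np.gradient(np.gradient(z, dy, axis=1), dy, axis=1)
curvature = zxx + zyy
dcx = np.gradient(curvature, dx, axis=0)
dcy = np.gradient(curvature, dy, axis=1)

aberrancy_magnitude = np.hypot(dcx, dcy)                 # how fast curvature changes laterally
aberrancy_azimuth = np.degrees(np.arctan2(dcy, dcx))     # orientation of that change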
Much of seismic interpretation is based on pattern recognition, such that experienced interpreters are able to extract subtle geologic features that a new interpreter may easily overlook. Seismic pattern recognition is based on the identification of changes in amplitude, phase, frequency, dip, continuity, and reflector configuration. Seismic attributes, which provide quantitative measures that can be subsequently used in risk analysis and data mining, partially automate the pattern recognition problem by extracting key statistical, geometric, or kinematic components of the 3D seismic volume. Early attribute analysis began with recognition of bright spots and quickly moved into the mapping of folds, faults, and channels. Although a novice interpreter may quickly recognize faults and channels on attribute time slices, karst terrains provide more complex patterns. We illustrate the attribute expression of a karst terrain in the western part of the Fort Worth Basin, Texas, United States of America. Karst provides a specific expression on almost every attribute. Specifically, karst in the Fort Worth Basin Ellenburger Group exhibits strong dip, negative curvature, low coherence, and a shift to lower frequencies. Geomorphologically, the inferred karst geometries seen in our study areas indicate strong structural control, whereby large-scale karst collapse is associated with faults and karst lineaments are aligned perpendicular to faults associated with reflector rotation anomalies.
Artificial intelligence methods have a very wide range of applications. From speech recognition to self-driving cars, the development of modern deep-learning architectures is helping researchers achieve new levels of accuracy in different fields. Although deep convolutional neural networks (CNNs) have reached or surpassed human-level performance in image recognition tasks, little has been done to apply this new image classification technology to geoscientific problems. We have developed what we believe to be the first use of CNNs to identify lithofacies in cores. We use highly accurate models and transfer learning to classify images of cored carbonate rocks. We found that different modern CNN architectures can achieve high levels of lithologic image classification accuracy and can aid in the core description task. This core image classification technique has the potential to greatly standardize and accelerate the description process. We also provide the community with a new set of labeled data that can be used for further geologic/data science studies.
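A hedged sketch of the transfer-learning idea using an ImageNet-pretrained network from torchvision, with the feature extractor frozen and only a new classification head trained; the class count, dummy batch, and choice of ResNet-18 are placeholders, not the authors' dataset or architecture:

import torch
import torch.nn as nn
from torchvision import models

n_lithofacies = 6                                            # placeholder number of classes
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False                                  # freeze the pretrained features
model.fc = nn.Linear(model.fc.in_features, n_lithofacies)    # new trainable classification head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 224x224 RGB core images
images = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, n_lithofacies, (4,))
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()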
Seismic attenuation, generally related to the presence of hydrocarbon accumulations, fluid-saturated fractures, and rugosity, is extremely useful for reservoir characterization. The classic constant-attenuation estimation model, focusing on intrinsic attenuation, detects the seismic energy loss due to the presence of hydrocarbons, but it works poorly when spectral anomalies exist due to rugosity, fractures, thin layers, and so on. Instead of trying to adjust the constant-attenuation model to such phenomena, we have evaluated a suite of seismic spectral attenuation attributes to quantify the apparent attenuation response. We have applied these attributes to a conventional and an unconventional reservoir, and we found the seismic attenuation attributes to be effective and robust for seismic interpretation. Specifically, the spectral bandwidth attribute correlated with the production of a gas sand in the Anadarko Basin, whereas the spectral slope of high frequencies correlated with production in the Barnett Shale of the Fort Worth Basin.
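A minimal sketch of two such apparent-attenuation measures computed from a single trace's amplitude spectrum, an RMS spectral bandwidth and a slope fit to the high-frequency side of the spectrum; the synthetic trace, sampling interval, and exact attribute definitions are simplified assumptions:

import numpy as np

def spectral_attributes(trace, dt):
    freqs = np.fft.rfftfreq(len(trace), dt)
    spec = np.abs(np.fft.rfft(trace))
    p = spec / spec.sum()                                    # spectrum normalized as a weight
    f_mean = np.sum(freqs * p)
    bandwidth = np.sqrt(np.sum(p * (freqs - f_mean) ** 2))   # RMS spectral bandwidth
    f_peak = freqs[np.argmax(spec)]
    hi = freqs >= f_peak
    slope = np.polyfit(freqs[hi], spec[hi], 1)[0]            # slope of the high-frequency side
    return bandwidth, slope

rng = np.random.default_rng(4)
trace = rng.standard_normal(256)
print(spectral_attributes(trace, dt=0.002))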
The coherence attribute computation is typically carried out as a poststack application on 3D prestack-migrated seismic data volumes. However, since its inception, interpreters have applied coherence to band-pass-filtered data, azimuthally limited stacks, and offset-limited stacks to enhance discontinuities seen at specific frequencies, azimuths, and offsets. The limitation of this approach is the multiplicity of coherence volumes. Of the various coherence algorithms that have evolved over the past 25 years, the energy ratio coherence computation stands apart from the others, being more sensitive to changes in the seismic waveform than to changes in its amplitude. The energy ratio algorithm is based on the crosscorrelation of five or more adjacent traces to form a symmetric covariance matrix that can then be decomposed into eigenvalues and eigenvectors. The first eigenvector represents a vertically variable, laterally consistent pattern that best represents the data in the analysis window. The first eigenvalue represents the energy of the data represented by this pattern. Coherence is then defined as the ratio of the energy represented by the first eigenvalue to the sum of the energy of the original data. An early generalization of this algorithm was to compute the sum of two covariance matrices, one from the original data and the other from the 90° phase-rotated data, thereby eliminating artifacts about low-amplitude zero crossings. More recently, this concept has been further generalized by computing a sum of covariance matrices of traces represented by multiple spectral components, by their azimuthally limited stacks, and by their offset-limited stacks. These more recently developed algorithms capture many of the benefits of discontinuities seen at specific frequencies, azimuths, and offsets, but they present the interpreter with a single volume. We compare the results of multispectral, multiazimuth, and multioffset coherence volumes with the traditional coherence computation, and we find that these newer coherence computation procedures produce superior results.
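A sketch of the analytic-trace generalization described above: sum a covariance matrix built from the windowed traces with one built from their 90° phase-rotated (Hilbert-transformed) counterparts, then take the energy ratio; the window contents are synthetic:

import numpy as np
from scipy.signal import hilbert

def energy_ratio_coherence(window):
    """window: (n_samples, n_traces) array of adjacent traces in the analysis window."""
    d = window
    h = np.imag(hilbert(window, axis=0))      # 90-degree phase-rotated traces
    cov = d.T @ d + h.T @ h                   # sum of the two covariance matrices
    eigvals = np.linalg.eigvalsh(cov)         # ascending eigenvalues of the symmetric matrix
    return eigvals[-1] / np.trace(cov)        # first-eigenvalue energy / total energy

rng = np.random.default_rng(5)
window = rng.standard_normal((21, 9))         # 21 samples by 9 adjacent traces
print(energy_ratio_coherence(window))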
Seismic interpretation is based on the identification of reflector configuration and continuity, with coherent reflectors having a distinct amplitude, frequency, and phase. Skilled interpreters may classify reflector configurations as parallel, converging, truncated, or hummocky, and use their expertise to identify stratigraphic packages and unconformities. In principle, a given pattern can be explicitly defined as a combination of waveform and reflector configuration properties, although such "clustering" is often done subconsciously. Computer-assisted classification of seismic attribute volumes builds on the same concepts. Seismic attributes not only quantify characteristics of the seismic reflection events, but also measure aspects of reflector configuration. The Mississippi Lime resource play of northern Oklahoma and southern Kansas provides a particularly challenging problem. Instead of defining the facies stratigraphically, we need to define them either diagenetically or structurally. Using a 3D seismic survey acquired in Osage County, Oklahoma, we use Kohonen self-organizing maps to classify different diagenetically altered facies of the Mississippi Lime play. The 256 prototype vectors reduce to only three or four distinct "natural" clusters. We use ground truth of seismic facies seen on horizontal image logs to fix three average attribute data vectors near the well locations, resulting in three "known" facies, and then perform a minimum Euclidean distance supervised classification. The predicted clusters correlate well with the poststack impedance inversion result.
Pattern recognition-based seismic facies analysis techniques are commonly used in modern quantitative seismic interpretation. However, interpreters often treat techniques such as artificial neural networks and self-organizing maps (SOMs) as a "black box" that somehow correlates a suite of attributes to a desired geomorphological or geomechanical facies. Even when the statistical correlations are good, the inability to explain such correlations through principles of geology or physics results in suspicion of the results. The most common multiattribute facies analysis begins by correlating a suite of candidate attributes to a desired output, keeping those that correlate best for subsequent analysis. The analysis then takes place in attribute space rather than in physical space, removing spatial trends often observed by interpreters. We add a stratigraphic layering component to an SOM model that attempts to preserve the intersample relation along the vertical axis. Specifically, we use a mode decomposition algorithm to capture the sedimentary cycle pattern as an "attribute." If we correlate this attribute to the training data, it will favor SOM facies maps that follow stratigraphy. We apply this workflow to a Barnett Shale data set and find that the constrained SOM facies map shows layers that are easily overlooked on a traditional, unconstrained SOM facies map.
All color monitors display images by mixing red, green, and blue (RGB) components. These RGB components can be defined mathematically in terms of hue, lightness, and saturation (HLS) components. A fourth alpha-blending component provides a means to corender multiple images. Most, but not all, modern commercial interpretation workstation software vendors provide multiattribute display tools using an opacity model. A smaller subset of vendors provide tools to interactively display two or three attributes using HLS, CMY, and RGB color models. I evaluated a technique to simulate the HLS color model using monochromatic color bars and only opacity. This same trick only approximates true color blending of RGB or CMY components. There are three basic objectives in choosing which attributes to display together. The first objective is to understand the correlation of one attribute to another and, most commonly, of a given attribute to the original seismic amplitude data. The second objective is to visualize the confidence or relevance of a given attribute by modulating it with a second attribute. The third objective is to provide a more integrated image of the seismic data volume by choosing attributes that are mathematically independent but correlated through the underlying geology. I demonstrate the interpretation value of the HLS display technique on a 3D data volume acquired over the Central Basin Platform of west Texas exhibiting faults, fractures, folds, channels, pinch-outs, and karst features. To be a useful "technique," I need to demonstrate these workflows within a specific package. Although I implemented the workflow in Petrel 2014, similar images can be generated using any software with flexible opacity capabilities. I also developed a short list of attribute combinations that are particularly amenable to corendering in HLS.
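A small sketch of the HLS-to-RGB mapping that underlies this kind of corendering, with one attribute driving hue, a second driving lightness, and saturation held constant; the attribute arrays are placeholders already scaled to the 0-1 ranges that the standard-library colorsys module expects:

import colorsys
import numpy as np

def hls_corender(hue_attr, light_attr, saturation=0.9):
    """hue_attr and light_attr are arrays already normalized to 0-1."""
    rgb = [colorsys.hls_to_rgb(h, l, saturation)
           for h, l in zip(hue_attr.ravel(), light_attr.ravel())]
    return np.array(rgb).reshape(hue_attr.shape + (3,))

rng = np.random.default_rng(6)
hue_attr = rng.random((4, 4))      # e.g., scaled dip azimuth
light_attr = rng.random((4, 4))    # e.g., scaled coherence
print(hls_corender(hue_attr, light_attr).shape)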
Fault picking is a critical but human-labor-intensive component of seismic interpretation. In a bid to improve fault imaging in seismic data, we have applied a directional Laplacian of a Gaussian (LoG) operator to sharpen fault features within a coherence volume. We computed a 3 × 3 second-moment tensor of the distance-weighted coherence values that fell within a 3D analysis window about each voxel. The eigenvectors of this matrix defined the orientation of planar discontinuities, whereas the corresponding eigenvalues determined whether these discontinuities were significant. The eigenvectors, which quantified the fault dip magnitude and dip azimuth, defined a natural coordinate system for smoothing of the planar discontinuity. We rotated the data to the new coordinate system and applied the sharpening operator. By comparing the vector dip of the discontinuity with the vector dip of the reflectors, we could apply a filter to either suppress or enhance discontinuities associated with unconformities or low-signal-to-noise-ratio shale-on-shale reflectors. We demonstrate the value and robustness of the technique by application to two 3D data volumes from offshore New Zealand, which exhibit polygonal faulting, shale dewatering, and mass transport complexes.
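A simplified sketch of such a 3 × 3 second-moment (structure) tensor built from gradients of a coherence subvolume with Gaussian distance weights; its dominant eigenvector approximates the normal to any planar discontinuity, and the eigenvalue spread indicates how significant that plane is. The subvolume, weighting, and gradient-based formulation are assumptions for illustration:

import numpy as np

def structure_tensor(coh_window, sigma=2.0):
    gz, gy, gx = np.gradient(coh_window)                       # gradients of the coherence subvolume
    zz, yy, xx = np.indices(coh_window.shape)
    c = (np.array(coh_window.shape) - 1) / 2.0                 # analysis-point center
    w = np.exp(-((zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2) / (2 * sigma**2))
    g = np.stack([gx.ravel(), gy.ravel(), gz.ravel()])
    t = (g * w.ravel()) @ g.T                                   # 3x3 distance-weighted second-moment tensor
    eigvals, eigvecs = np.linalg.eigh(t)
    return eigvals, eigvecs[:, -1]                              # dominant eigenvector ~ plane normal

rng = np.random.default_rng(7)
coh_window = rng.random((9, 9, 9))
vals, normal = structure_tensor(coh_window)
print(vals, normal)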
Seismic facies estimation is a critical component in understanding the stratigraphy and lithology of hydrocarbon reservoirs. With the adoption of 3D technology and increasing survey size, manual techniques of facies classification have become increasingly time consuming. In addition, the number of seismic attributes has increased dramatically, providing increasingly accurate measurements of reflector morphology. However, these seismic attributes add multiple "dimensions" to the data, greatly expanding the amount of data to be analyzed. Principal component analysis and self-organizing maps (SOMs) are popular techniques to reduce such dimensionality by projecting the data onto a lower order space in which clusters can be more readily identified and interpreted. After dimensionality reduction, popular classification algorithms such as neural networks, k-means, and Kohonen SOMs are routinely applied to well-log prediction and seismic facies modeling. Although these clustering methods have been successful in many hydrocarbon exploration projects, they have some inherent limitations. We explored a relatively recent technique known as generative topographic mapping (GTM), which addresses the shortcomings of Kohonen SOMs and aids in data classification. We applied GTM to perform multiattribute seismic facies classification of a carbonate conglomerate oil field in the Veracruz Basin of southern Mexico. The conglomeratic carbonates make the reservoir units laterally and vertically highly heterogeneous, as observed at well-log, core-slab, and thin-section scales. We applied unsupervised GTM classification to determine the "natural" clusters in the data set. Finally, we introduced supervision into GTM and calculated the probability of occurrence of the seismic facies seen at the wells over the reservoir units. In this manner, we were able to assign a level of confidence to encountering facies that correspond to good and poor production.
Prestack seismic inversion techniques provide valuable information about rock properties, lithology, and fluid content for reservoir characterization. The confidence of the inverted results increases with increasing incident angle of the seismic gathers. The most accurate result of simultaneous prestack inversion of P-wave seismic data is P-impedance. S-impedance estimation becomes reliable with incident angles approaching 30°, whereas density evaluation becomes reliable with incident angles approaching 45°. However, as offset increases, we often encounter "hockey sticks" and severe stretch. Hockey sticks and stretch not only lower the seismic resolution but also hinder long-offset prestack seismic inversion analysis. The inverted results are also affected by the random noise present in the prestack gathers. We developed a three-step workflow to condition the data prior to simultaneous prestack inversion. First, we mitigated the hockey sticks by using an automatic nonhyperbolic velocity analysis. Then, we minimized the stretch at far offsets by using an antistretch workflow. Last, we improved the signal-to-noise ratio by applying prestack structure-oriented filtering. We evaluated our workflow by applying it to a prestack seismic volume acquired over the Fort Worth Basin, Texas, USA. The results inverted from the conditioned prestack gathers have higher resolution and better correlation coefficients with well logs when compared with those inverted from conventional time-migrated gathers.
Machine learning algorithms, such as principal component analysis, independent component analysis, self-organizing maps, and artificial neural networks, have been used by geoscientists not only to accelerate the interpretation of their data, but also to provide a more quantitative estimate of the likelihood that any voxel belongs to a given facies. Identifying the best combination of attributes needed to perform either supervised or unsupervised ML tasks continues to be the most-asked question by interpreters. In past decades, stepwise regression and genetic algorithms have been used together with supervised learning algorithms to select the best number and combination of attributes. For reasons of computational efficiency, these techniques do not test all of the seismic attribute combinations, potentially leading to a suboptimal classification. In this study, we have developed an exhaustive probabilistic neural network (PNN) algorithm that exploits the PNN's capacity to explore nonlinear relationships and obtain the optimal attribute subset that best differentiates the target seismic facies of interest. We determine the efficacy of our proposed workflow in differentiating salt from nonsalt seismic facies in a Eugene Island seismic survey, offshore Louisiana. We find that, from seven input candidate attributes, the exhaustive PNN is capable of removing irrelevant attributes by selecting a smaller subset of four seismic attributes. The enhanced classification using fewer attributes also reduces the computational cost. We then use the resulting facies probability volumes to construct the 3D distribution of the salt diapir geobodies embedded in a stratigraphic matrix.
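A hedged sketch of the exhaustive-search idea, with a k-nearest-neighbor classifier standing in for the probabilistic neural network: every combination of the candidate attributes is scored by cross-validation and the best-scoring subset is kept (the data and salt/nonsalt labels are synthetic):

from itertools import combinations
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
X = rng.standard_normal((600, 7))                   # seven candidate attributes per voxel
y = (X[:, 0] + X[:, 3] - X[:, 5] > 0).astype(int)   # placeholder salt(1)/nonsalt(0) labels

best_score, best_subset = -np.inf, None
for k in range(1, X.shape[1] + 1):
    for subset in combinations(range(X.shape[1]), k):        # exhaustive: every subset is tested
        score = cross_val_score(KNeighborsClassifier(), X[:, subset], y, cv=5).mean()
        if score > best_score:
            best_score, best_subset = score, subset
print(best_subset, round(best_score, 3))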
Knowledge of induced fractures can help evaluate the success of reservoir stimulation. Seismic P-waves traveling through fractured media can exhibit azimuthal variation in traveltime, amplitude, and thin-bed tuning, so amplitude variation with azimuth (AVAz) can be used to evaluate the anisotropy caused by hydraulic fracturing. The Barnett Shale of the Fort Worth Basin was the first large-scale commercial shale gas play. We have analyzed two adjacent Barnett Shale seismic surveys: one acquired before hydraulic fracturing and the other acquired after hydraulic fracturing by more than 400 wells. Although not a rigorous time-lapse experiment, comparison of the AVAz anisotropy of these two surveys provided valuable insight into the possible effects of hydraulic fracturing. We found that in the survey acquired prior to hydraulic fracturing, AVAz anomalies were stronger and highly correlated with major structural lineaments measured by curvature. In contrast, AVAz anomalies in the survey acquired after hydraulic fracturing were weaker and compartmentalized by, rather than correlated with, the most-positive curvature lineaments. Five microseismic experiments within the survey indicate that these ridge lineaments form fracture barriers. These findings suggest that future time-lapse experiments may be valuable in mapping the modified horizontal stress field to guide future drilling and in recognizing zones of bypassed pay.
Seismic interpretation is dependent on the quality and resolution of the seismic data. Unfortunately, seismic amplitude data are often insufficient for detailed sequence-stratigraphic interpretation. We review a method to derive high-resolution seismic attributes based on complex continuous wavelet transform pseudodeconvolution (PD) and phase-residue techniques. The PD method is based on the assumption of a blocky earth model, which allowed us to increase the frequency content of the seismic data such that, for our data, it better matched the well-log control. The phase-residue technique allowed us to extract information not only from thin layers but also from interference patterns, such as unconformities, in the seismic amplitude data. Using data from a West Texas carbonate environment, we show how PD can be used not only to improve seismic well ties but also to provide sharper sequence terminations. Using data from an Anadarko Basin clastic environment, we show how phase residues delineate incised valleys seen on well logs that are difficult to see on vertical slices through the original seismic amplitude.
Prestack seismic analysis provides information on rock properties, lithology, fluid content, and the orientation and intensity of anisotropy. However, such analysis demands high-quality seismic data. Unfortunately, noise is always present in seismic data, even after careful processing. Noise in the prestack gathers may not only contaminate the seismic image, thereby lowering the quality of seismic interpretation, but it may also bias the prestack seismic inversion for rock properties, such as acoustic- and shear-impedance estimation. Common postmigration data conditioning includes running-window median and Radon filters applied to flattened common-reflection-point gathers. We have combined filters across offset and azimuth with edge-preserving filters along structure to construct a true "5D" filter that preserves amplitude, thereby preconditioning the data for subsequent quantitative analysis. We have evaluated our workflow by applying it to a prestack seismic volume acquired over the Fort Worth Basin, Texas. The results inverted from the noise-suppressed prestack gathers are more laterally continuous and have higher correlation with well logs when compared with those inverted from conventional time-migrated gathers.
Brittleness in unconventional reservoirs is mainly controlled by mineralogy; it increases with quartz and dolomite content, whereas an increase in clay content represents an increase in ductility. To generate regional brittleness maps, we have correlated the mineralogy-based brittleness index to elastic parameters measured from well logs. This correlation can then be used to predict brittleness from surface seismic estimates of the elastic parameters λρ and μρ. We applied the workflow to a 3D seismic survey acquired in an area where more than 400 wells were drilled and hydraulically fractured prior to seismic acquisition. Combining λρ and μρ into a single 3D volume allowed conversion to brittleness using a template based on the well-log and core data. Neither of these seismic estimates is a direct measure of reservoir completion quality. We therefore used production logs and extracted surface seismic estimates at microseismic event locations to analyze the completion effectiveness along several horizontal wellbores in the reservoir. We defined four petrotypes in λρ-μρ space depending on their brittleness and gas saturation, and we found that most of the microseismic events fell into the zone described as brittle in the λρ-μρ crossplots. These observations support the well-known idea that, regardless of where the well is perforated, microseismic events appear to preferentially grow toward the more brittle areas, suggesting the growth of hydraulic fractures into the brittle petrotype.
Volcanic rocks with intermediate magma composition exhibit distinctive patterns in seismic amplitude data. Depending on the processes by which they were extruded to the surface, these patterns may appear as chaotic, moderate-amplitude reflectors or as continuous, high-amplitude reflectors. We have identified appropriate seismic attributes that highlight the characteristics of such patterns and use them as input to self-organizing maps to isolate these volcanic facies from their clastic counterparts. Our analysis indicates that such clustering is possible when the patterns are approximately self-similar, such that the appearance of objects does not change at different scales of observation. We adopt a workflow that can help interpreters decide which methods and which attributes to use as input to machine learning algorithms, depending on the nature of the target pattern of interest, and we apply it to the Kora 3D seismic survey acquired offshore in the Taranaki Basin, New Zealand. The resulting clusters are then interpreted using the limited well control and principles of seismic geomorphology.
The term acquisition footprint is commonly used to define patterns in seismic time and horizon slices that are closely correlated to the acquisition geometry. Seismic attributes often exacerbate footprint artifacts and may pose pitfalls to the less experienced interpreter. Although removal of the acquisition footprint is the focus of considerable research, the sources of such footprint artifacts are less commonly discussed or illustrated. Based on real data examples, we have hypothesized possible causes of footprint occurrence and recreated them through synthetic prestack modeling. We then processed these models using the same workflows used for the real data. Computation of geometric attributes from the migrated synthetics reproduced the same footprint artifacts seen in the real data. These models showed that acquisition footprint can be caused by residual ground roll, inaccurate velocities, and far-offset migration stretch. With this understanding, we examined the real seismic data volume and found that the key cause of the acquisition footprint was inaccurate velocity analysis.
The Chicontepec Formation in east-central Mexico comprises complex unconventional reservoirs consisting of low-permeability, disconnected turbidite reservoir facies. Hydraulic fracturing increases permeability and connects these otherwise tight reservoirs. We use a recently acquired 3D seismic survey and well control to divide the Chicontepec reservoir interval in the northern part of the basin into five stratigraphic units, equivalent to global third-order seismic sequences. By combining well-log and core information with principles of seismic geomorphology, we are able to map deepwater facies within these stratigraphic units that resulted from the complex interaction of flows from different directions. Correlating these stratigraphic units to producing and nonproducing wells provides the link between rock properties and Chicontepec reservoirs that can be delineated from surface seismic data. The final product is a prestack inversion-driven map of stacked pay that correlates to currently producing wells and indicates potential untapped targets.
Recent developments in attribute analysis and machine learning have significantly enhanced interpretation workflows for 3D seismic surveys. Nevertheless, even in 2018, many sedimentary basins are covered only by grids of 2D seismic lines. These 2D surveys are suitable for regional feature mapping and often identify targets in areas not covered by 3D surveys. With continuing pressure to cut costs in the hydrocarbon industry, it is crucial to extract as much information as possible from these 2D surveys. Unfortunately, many if not most modern interpretation software packages are designed to work exclusively with 3D data. To determine whether we can apply 3D volumetric interpretation workflows to grids of 2D seismic lines, we have applied data conditioning, attribute analysis, and a machine-learning technique called self-organizing maps to 2D data acquired over the Exmouth Plateau, North Carnarvon Basin, Australia. We find that these workflows allow us to significantly improve image quality, interpret regional geologic features, identify local anomalies, and perform seismic facies analysis. However, these workflows are not without pitfalls. We need to be careful in choosing the order of filters in the data conditioning workflow and be aware of reflector misties at line intersections. Vector data, such as reflector convergence, need to be extracted and then mapped component by component before combining the results. We are also unable to perform attribute extraction along a surface or geobody extraction for 2D data in our commercial interpretation software package. To address this issue, we devise a point-by-point attribute extraction workaround to overcome the incompatibility between the 3D interpretation workflow and 2D data.
Patterns of recent seismogenic fault reactivation in the granitic basement of north-central Oklahoma necessitate an understanding of the structural characteristics of the inherited basement-rooted faults. Here, we focus on the Nemaha Uplift and Fault Zone (NFZ) and the surrounding areas, within which we analyze the top-basement and intrabasement structures in eight poststack time-migrated 3D seismic reflection data sets. Overall, our results reveal 115 fault traces at the top of the Precambrian basement with subvertical dips and dominant trends of west-northwest–east-southeast, northeast–southwest, and north–south. We observe that proximal to the NFZ, faults dominantly strike north–south and are fewer in number; farther from the NFZ, faults exhibit predominantly northeast–southwest trends, fault areal density and intensity increase, and maximum vertical separation decreases steadily. Of the analyzed faults, approximately 49% are confined to the basement, approximately 28% terminate within the Arbuckle Group, and approximately 23% transect units above the Arbuckle Group. These observations suggest that proximal to the NFZ, deformation is dominantly accommodated along a few but longer fault segments, most of the mapped faults cut into the sedimentary rocks, and most of the through-going faults propagate farther up-section above the Arbuckle Group; with distance away from the NFZ, deformation is diffuse and distributed across relatively shorter fault segments, and most basement faults do not extend into the sedimentary cover. The existence of through-going faults suggests the potential for spatially pervasive fluid movement along faults. Furthermore, our observations reveal pervasive, subhorizontal intrabasement reflectors that terminate at the basement-sediment interface. These results have direct implications for wastewater injection and seismicity in north-central Oklahoma and southern Kansas. Additionally, they provide insight into the characteristics of basement-rooted structures around the NFZ and suggest a means by which to characterize basement structures where seismic data are available.
The interpretation of faults on 3D seismic data is often aided by the use of geometric attributes such as coherence and curvature. Unfortunately, these same attributes also delineate stratigraphic boundaries and apparent discontinuities due to crosscutting seismic noise. Effective fault mapping thus requires enhancing piecewise-continuous faults while suppressing stratabound edges, unconformities, and seismic noise. To achieve this objective, we apply two passes of edge-preserving structure-oriented filtering followed by a recently developed fault-enhancement algorithm based on a directional Laplacian of a Gaussian operator. We determine the effectiveness of this workflow on a 3D seismic volume from central British Columbia, Canada.
The Fort Worth Basin (FWB) is one of the most fully developed shale gas fields in North America. Although there are hundreds of drilled wells in the basin, almost none of them reach the Precambrian basement. Despite imaging by perhaps 100 3D seismic surveys, the focus on the relatively shallow, flat-lying Barnett Shale objective has resulted in little published work on the basement structures underlying the Lower Paleozoic strata. Subtle folds and systems of large joints are present in almost all 3D seismic surveys in the FWB. At the Cambro-Ordovician Ellenburger level, these joints are often diagenetically altered and exhibit collapse features at their intersections. We investigate how the basement structures relate to the overlying Paleozoic reservoirs in the Barnett Shale and Ellenburger Group. In support of our investigation, the Marathon Oil Company provided a high-quality, wide-azimuth 3D seismic data set acquired near the southeast fringe of the FWB. In addition to the seismic volume, we integrated the seismic results with gravity, magnetic, well-log, and geospatial data to understand the basement and subbasement structures in the southeast FWB. Major tectonic features, including the Ouachita frontal thrust belt, Lampasas arch, Llano uplift, and Bend arch, surround the southeast FWB. Euler deconvolution and integrated forward gravity modeling helped us extend our interpretation beyond the 3D seismic survey into a regional context.
Many tight sandstone, limestone, and shale reservoirs require hydraulic fracturing to provide pathways that allow hydrocarbons to reach the well bore. Most of these tight reservoirs are now produced using multiple stages of fracturing through horizontal wells drilled perpendicular to the present-day azimuth of maximum horizontal stress. In a homogeneous medium, the induced fractures are thought to propagate perpendicular to the well and parallel to the azimuth of maximum horizontal stress, thereby efficiently fracturing the rock and draining the reservoir. We evaluate what may be the first anisotropic analysis of a Barnett shale-gas reservoir after extensive hydraulic fracturing, focusing on mapping the orientation and intensity of induced fractures and any preexisting fractures, with the objective of identifying reservoir compartmentalization and bypassed pay. The Barnett Shale we studied has near-zero permeability and few if any open natural fractures. We therefore hypothesized that the anisotropy is due to the regional northeast–southwest maximum horizontal stress and subsequent hydraulic fracturing. We found the anisotropy to be highly compartmentalized, with the compartment edges being defined by ridges and domes delineated by the most-positive principal curvature k1. Microseismic work by others in the same survey indicates that these ridges contain healed natural fractures that form fracture barriers. Mapping such a heterogeneous anisotropy field could be critical in planning the location and direction of any future horizontal wells drilled to restimulate the reservoir as production drops.
Legacy seismic surveys cover much of the midcontinent USA and Texas, with almost all 3D surveys acquired in the 1990s considered today to be low fold. Fortunately, recent advances in 5D interpolation have not only enhanced the quality of structural and stratigraphic images, but they have also improved the data sufficiently to allow more quantitative interpretation, such as impedance inversion. Although normal-moveout-corrected, common-midpoint-based 5D interpolation does an excellent job of amplitude balancing and the suppression of acquisition footprint, it appears to misinterpolate undercorrected diffractions, thus smearing fault and stratigraphic edges. We described a least-squares migration-driven 5D interpolation workflow, in which data were interpolated by demigrating the current subsurface image to the missing offsets and azimuths. Such demigration accurately interpolates fault edges and other diffractors, thereby preserving lateral discontinuities, while suppressing footprint and balancing the amplitudes. We have applied this workflow to a highly aliased, low-fold survey acquired in the early 1990s that is now of use in mapping the newly reinvigorated Mississippi Lime play. This workflow improves reflector continuity, preserves faults delineated by coherence, balances the amplitude, and provides improved well ties.
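Written schematically (in our notation rather than the authors'), such migration-driven interpolation first solves a least-squares imaging problem for the recorded geometry and then demigrates the resulting image to the unrecorded offsets and azimuths:

    \hat{m}=\arg\min_{m}\lVert L_{\mathrm{rec}}\,m-d_{\mathrm{rec}}\rVert^{2},\qquad d_{\mathrm{missing}}\approx L_{\mathrm{missing}}\,\hat{m},

where L_rec and L_missing are demigration (modeling) operators for the acquired and missing offset-azimuth bins, respectively.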
The organic-rich, silty Woodford Shale in west-central Oklahoma is a prolific resource play producing gas and liquid hydrocarbons. We calibrated seismic attributes and prestack inversion using well logs and core information within a seismic geomorphologic framework to define the overall basin architecture, major stratigraphic changes, and related variations in lithologies. Core measurements of elastic moduli and total organic content (TOC) indicated that the Woodford Shale can be broken into three elastic petrotypes important to well completion and hydrocarbon enrichment. Upscaling these measurements facilitates regional mapping of petrotypes from prestack seismic inversion of surface data. Seismic attributes highlight the rugged topography of the basin floor of the Paleo Woodford Sea, which controls the lateral and vertical distribution of different lithofacies containing variable quantities of TOC as well as quartz, which in turn controls brittleness. Depressions on the basin floor contain TOC-lean cherty lithofacies alternating with TOC-rich lithofacies, resulting in brittle-ductile rock couplets. In contrast, basin-floor highs are characterized by overall TOC-rich ductile lithofacies. Seismic attributes illuminate complex post-Woodford tectonic deformation. The Woodford Shale is known to be naturally fractured in outcrop. Image log analysis in other shale plays has shown a good correlation between such tectonic features and natural fractures. These features need to be correlated with well trajectories and production data to determine which hypothesized “fracture sets,” if any, improve well performance.
Mississippian Meramec reservoirs of the Sooner Trend Anadarko Basin Canadian and Kingfisher Counties (STACK) play are composed of silty limestones, calcareous siltstones, argillaceous calcareous siltstones, argillaceous siltstones, and mudstones. We found that core-defined reservoir lithologies are related to petrophysics-based rock types derived from porosity-permeability relationships using a flow-zone indicator approach. We classified lithologies and rock types in noncored wells using an artificial neural network (ANN) with overall accuracies of 93% and 70%, respectively. We observed that mudstone-rich rock type 1 exhibits high clay and relatively low calcite content, whereas calcareous-rich rock type 3 has high calcite and low clay content, with rock type 2 falling in between. Results of the ANN were applied to a suite of well logs in noncored wells, in which we generated lithology and rock-type logs for the Meramec. We identified that the Meramec consists of seven stratigraphic units characterized as strike-elongate, shoaling-upward parasequences; each parasequence is capped by a marine-flooding surface. The lower three parasequences form a retrogradational parasequence set that backsteps to the northwest and is capped by a maximum flooding surface. The upper Meramec is characterized by parasequences that form an aggradational to progradational stacking pattern followed again by a retrogradational trend. The parasequence stacking, associated lithology distribution, and diagenetic cements appear to control the spatial distribution of petrophysical properties, pore volume, and hydrocarbon pore volume (HCPV). Calcareous-rich lithologies exhibit lower porosity, permeability, and HCPV, and higher water saturation. We deduced that argillaceous-rich lithologies that occur near the maximum flooding surface are the most favorable reservoir intervals because they exhibit relatively higher porosity, permeability, and HCPV, and lower water saturation. Productivity could not be directly correlated to rock types because operational and completion factors, as well as overpressure and oil phase, play important roles in production.
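The flow-zone indicator (FZI) approach mentioned above conventionally groups samples of similar pore-throat character using the standard relations (with permeability k in millidarcies and porosity phi as a fraction; whether exactly these constants were used here is an assumption):

    \mathrm{RQI}=0.0314\sqrt{k/\phi},\qquad \phi_{z}=\frac{\phi}{1-\phi},\qquad \mathrm{FZI}=\frac{\mathrm{RQI}}{\phi_{z}},

with samples of similar FZI assigned to the same petrophysics-based rock type.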
Seismic resolution significantly affects the quality of seismic interpretation. Processing parameters that affect resolution, such as velocities picked in the presence of interbed multiples, benefit from an understanding of the underlying geology. Three-dimensional migration is almost always performed by an external service company or an internal specialty processing group, with the “final” product being migrated gathers and the final migration-stack section. In the Chicontepec Basin, Mexico, we have evaluated improvements in data quality made after 3D prestack time migration. By first mapping shallow volcanics that generated strong interbed multiples, we performed a new velocity analysis to better image the weaker, underlying primaries of interest. We removed the local migration stretch through an inverse NMO correction, followed by a nonstretch NMO correction and prestack structure-oriented filtering. Such compensation for migration stretch improves the vertical resolution and preserves far-offset data valuable to subsequent prestack inversion that would otherwise need to be muted. Because S-impedance inversion depends heavily on the farther offsets, the resulting S-impedance images have a resolution that is in general equivalent to, and in the target area of rapid S-impedance variation seen in the well logs exceeds, that of the P-impedance images. Attributes such as coherence and curvature show improved fault resolution, whereas noisy areas look more chaotic because of the increased frequency content.
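The stretch being compensated for can be quantified with the usual hyperbolic moveout relation: a wavelet recorded at offset x and zero-offset time t0 is stretched on moveout correction by the factor

    S=\frac{t(x)}{t_{0}}=\sqrt{1+\frac{x^{2}}{v^{2}t_{0}^{2}}},\qquad t(x)=\sqrt{t_{0}^{2}+x^{2}/v^{2}},

lowering its dominant frequency to roughly f0/S, which is why the far offsets are normally muted; the specific inverse-NMO and nonstretch-NMO implementation used in the study is not reproduced here.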
Semblance and other coherence measures are routinely used in seismic processing, such as velocity spectra analysis, in seismic interpretation to estimate volumetric dip and to delineate geologic boundaries, and in poststack and prestack data conditioning, such as edge-preserving structure-oriented filtering. Although interpreters readily understand the significance of outliers for such measures as seismic amplitude, described by a Gaussian distribution, and root-mean-square amplitude, described by a log-normal distribution, the significance of a given coherence measure of poststack seismic data is much more difficult to grasp. We have followed early work on the significance of events seen in semblance-based velocity spectra, and we used an F-statistic to quantify the significance of coherence measures at each voxel. The accuracy and resolution of these measures depended on the bandwidth of the data, the signal-to-noise ratio (S/N), and the size of the spatial and temporal analysis windows used in their numerical estimation. In 3D interpretation, low coherence arises not only from seismic noise but also from geologic signal, such as fault planes and channel edges. Therefore, we have estimated the S/N as the product of coherence and two alternative measures of randomness, the first being the disorder attribute and the second being an estimate based on the eigenvalues of a window of coherence values. The disorder attribute is fast and easy to compute, whereas the eigenvalue calculation is computationally intensive but more accurate. We have demonstrated the value of this measure through application to two 3D surveys, in which we modulated the coherence measures by our F-statistic to show where discontinuities were significant and where they corresponded to more chaotic features.
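As a point of reference (the exact statistic and degrees of freedom used in the study may differ), the semblance S of N traces a_j(t) within an analysis window, and the classical statistic used to test it against a pure-noise hypothesis, are

    S=\frac{\sum_{t}\left(\sum_{j=1}^{N}a_{j}(t)\right)^{2}}{N\sum_{t}\sum_{j=1}^{N}a_{j}(t)^{2}},\qquad F\approx(N-1)\,\frac{S}{1-S},

with larger F values indicating similarity that is unlikely to arise from random noise alone.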
The iconic coherence attribute is very useful for imaging geologic features such as faults, deltas, submarine canyons, karst collapse, mass-transport complexes, and more. In addition to the preconditioning applied to the seismic data, the interpretation of discrete stratigraphic features is also limited by the data bandwidth: in general, data with higher bandwidth yield crisper features than data with lower bandwidth. Some form of spectral balancing applied to the seismic amplitude data can help achieve this objective, so that coherence run on spectrally balanced seismic data yields a better definition of the geologic features of interest. The quality of the generated coherence attribute also depends in part on the algorithm used for its computation. In the eigenstructure decomposition procedure for coherence computation, spectral balancing equalizes each contribution to the covariance matrix, and thus it yields crisper features on coherence displays. There are other ways to modify the spectrum of the input data in addition to simple spectral balancing, including the amplitude-volume technique, taking the derivative of the input amplitude, spectral bluing, and thin-bed spectral inversion. We compare some of these techniques and show their added value in seismic interpretation as part of a more elaborate exercise. In other work, we discuss how different spectral components derived from the input seismic data allow interpretation of different scales of discontinuities, what additional information is provided by coherence computed from narrowband spectra, and the different ways to integrate them.
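As a minimal sketch of the eigenstructure procedure referred to above (illustrative only, not the published implementation), coherence can be computed as the fraction of the analysis-window energy captured by the largest eigenvalue of the trace covariance matrix:

    import numpy as np

    def eigenstructure_coherence(window):
        # window: 2D array of shape (n_samples, n_traces) holding the
        # amplitudes of neighboring traces over a short vertical window.
        cov = window.T @ window              # trace-by-trace covariance matrix
        eigvals = np.linalg.eigvalsh(cov)    # eigenvalues in ascending order
        return eigvals[-1] / eigvals.sum()   # energy fraction of the dominant eigenvector

Spectral balancing of the input, as noted above, tends to equalize the contributions of the different frequency bands to this covariance matrix.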
With the advent of horizontal drilling and hydraulic fracturing in the Midcontinent, USA, fields once thought to be exhausted are now experiencing renewed exploitation. However, traditional Midcontinent seismic analysis techniques no longer provide satisfactory reservoir characterization for these unconventional plays; new seismic analysis methods are needed to properly characterize these radically innovative play concepts. Time processing and filtering are applied to a raw 3D seismic data set from Osage County, Oklahoma, paying careful attention to velocity analysis, residual statics, and coherent noise filtering. The use of a robust prestack structure-oriented filter and spectral whitening greatly enhances the results. After prestack time migrating the data using a Kirchhoff algorithm, new velocities are picked. A final normal-moveout correction is applied using the new velocities, followed by a final prestack structure-oriented filter and spectral whitening. Simultaneous prestack inversion uses the reprocessed and time-migrated seismic data as input, along with a well from within the bounds of the survey. With offsets out to 3048 m and a target depth of approximately 880 m, we can invert for density in addition to P- and S-impedance. Prestack inversion attributes are sensitive to lithology and porosity, whereas surface seismic attributes such as coherence and curvature are sensitive to lateral changes in waveform and structure. We use these attributes in conjunction with interpreted horizontal image logs to identify zones of high porosity and high fracture density.
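The feasibility of inverting for density follows from the wide angles of incidence implied by those numbers; under a simple straight-ray, constant-velocity assumption,

    \theta_{\max}\approx\arctan\!\left(\frac{3048/2}{880}\right)\approx 60^{\circ},

well beyond the roughly 40-50 degrees of incidence often quoted as necessary for a stable three-term (density-sensitive) AVO inversion.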
Seismic inversion has become almost routine in quantitative 3D seismic interpretation. To ensure the quality of the seismic inversion, the input seismic data need to have a high signal-to-noise ratio. In the current low oil price environment, seismic reprocessing is often preferred over reacquisition to improve data quality. Common filter pairs include forward and inverse f-k and Radon transforms. Forward and inverse migrations are a more recently introduced transform pair that, when used together in an iterative workflow, results in a least-squares migration algorithm. Least-squares migration compensates for surface variations in data density and, when combined with a filter applied to the prestack migrated images, suppresses operator and data aliasing. We apply a least-squares migration workflow to a fractured-basement data set from the Texas Panhandle to demonstrate the enhancement in signal-to-noise ratio, the reduction in acquisition footprint and migration artifacts, and the improvement in the P-impedance inversion result.
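In schematic form (our notation, not the authors'), least-squares migration replaces the single adjoint image m0 = L^T d with an iterative solution of the least-squares problem, for example by steepest descent or conjugate gradients:

    \hat{m}=\arg\min_{m}\lVert L\,m-d\rVert^{2},\qquad m_{k+1}=m_{k}+\alpha_{k}\,L^{\mathsf{T}}\!\left(d-L\,m_{k}\right),

where L is the demigration (forward-modeling) operator and L^T the migration operator; the repeated application of this forward and inverse migration pair is what compensates for irregular surface data density.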
Various approaches exist for quantitative or qualitative prediction of seismically thin beds and their physical properties. The evolving definition of thin beds, the use of seismic attributes indicative of thin beds, thin-bed imaging on geologic-time surfaces, and thin-bed thickness estimation represent some of the most active areas of research and application. We reviewed some theoretical and technological developments in thin-bed analysis over recent decades. We also reviewed the data processing steps that affect seismic resolution and thin-bed evaluation.
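As a rule-of-thumb anchor for what “seismically thin” means (a standard approximation rather than a result of the review itself), the tuning thickness is about a quarter of the dominant wavelength:

    z_{\mathrm{tune}}\approx\frac{\lambda}{4}=\frac{v}{4f_{\mathrm{dom}}},\qquad\text{e.g., }\frac{3000\ \mathrm{m/s}}{4\times 30\ \mathrm{Hz}}=25\ \mathrm{m}.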
Although the structures associated with overthrust terrains form important targets in many basins, accurately imaging them remains challenging. Steep dips and strong lateral velocity variations associated with these complex structures require prestack depth migration instead of simpler time migration. The associated rough topography, coupled with older, more indurated, and thus high-velocity rocks that crop out at or near the surface, often leads to seismic data that suffer from severe statics problems, strong head waves, and backscattered energy from the shallow section, giving rise to a low signal-to-noise ratio that increases the difficulty of building an accurate velocity model for subsequent depth migration. We applied a multidomain cascaded noise-attenuation workflow to suppress much of the linear noise. Strong lateral velocity variations occur not only at depth but also near the surface, distorting the reflections and degrading all deeper images. Conventional elevation corrections followed by refraction statics methods fail in these areas because of poor data quality and the absence of a continuous refracting surface. Although a seismically derived tomographic solution provides an improved image, constraining the solution with the near-surface depth-domain interval velocities measured along the surface outcrop provides further improvement. Although a one-way wave-equation migration algorithm accounts for the strong lateral velocity variations and complicated structures at depth, modifying the algorithm to account for lateral variations in illumination caused by the irregular topography significantly improves the image, preserving the subsurface amplitude variations. We believe that our step-by-step workflow of addressing data quality, velocity model building, and seismic imaging developed for the Tuha Basin of China can be applied to overthrust plays in other parts of the world.
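The illumination compensation mentioned above is commonly implemented (whether in exactly this form here is an assumption) by normalizing the crosscorrelation imaging condition by the source illumination:

    I(\mathbf{x})=\frac{\sum_{s}\int S_{s}^{*}(\mathbf{x},\omega)\,R_{s}(\mathbf{x},\omega)\,d\omega}{\sum_{s}\int\lvert S_{s}(\mathbf{x},\omega)\rvert^{2}\,d\omega+\varepsilon^{2}},

where S_s and R_s are the downward-continued source and receiver wavefields for shot s, and epsilon stabilizes the division where illumination is weak.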