We propose a new relevance sensitive model for representing and revising belief structures, which relies on a notion of partial language splitting and tolerates some amount of inconsistency while retaining classical logic. The model preserves an agent's ability to answer queries in a coherent way using Belnap's four-valued logic. Axioms analogous to the AGM axioms hold for this new model. The distinction between implicit and explicit beliefs is represented, and psychologically plausible, computationally tractable procedures for query answering and belief base revision are obtained.
The coherence attribute computation is typically carried out as a poststack application on 3D prestack migrated seismic data volumes. However, since its inception, interpreters have applied coherence to band-pass-filtered data, azimuthally limited stacks, and offset-limited stacks to enhance discontinuities seen at specific frequencies, azimuths, and offsets. The limitation of this approach is the multiplicity of coherence volumes. Of the various coherence algorithms that have evolved over the past 25 years, the energy ratio coherence computation stands apart from the others, being more sensitive to changes in the seismic waveform than to changes in amplitude. The energy ratio algorithm is based on the crosscorrelation of five or more adjacent traces to form a symmetric covariance matrix that can then be decomposed into eigenvalues and eigenvectors. The first eigenvector represents a vertically variable, laterally consistent pattern that best represents the data in the analysis window. The first eigenvalue represents the energy of the data represented by this pattern. Coherence is then defined as the ratio of the energy represented by the first eigenvalue to the sum of the energy of the original data. An early generalization of this algorithm was to compute the sum of two covariance matrices, one from the original data and the other from the 90° phase-rotated data, thereby eliminating artifacts about low-amplitude zero crossings. More recently, this concept has been further generalized by computing a sum of covariance matrices of traces represented by multiple spectral components, by their azimuthally limited stacks, and by their offset-limited stacks. These more recently developed algorithms capture many of the benefits of discontinuities seen at specific frequencies, azimuths, and offsets, while presenting the interpreter with a single volume. We compare the results of multispectral, multiazimuth, and multioffset coherence volumes with the traditional coherence computation, and we find that these newer coherence computation procedures produce superior results.
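To make the energy-ratio computation concrete, here is a minimal NumPy sketch of the idea described above: form the covariance (crosscorrelation) matrix of a handful of adjacent traces over a vertical analysis window, take its largest eigenvalue, and divide by the total energy (the trace of the matrix). Function and variable names are illustrative; real implementations add analysis-window tapering, dip steering, and noise handling.

```python
import numpy as np

def energy_ratio_coherence(window):
    """Minimal sketch of energy-ratio coherence for a single analysis point.

    window : 2D array of shape (n_samples, n_traces) holding the vertical
             analysis window extracted from five or more adjacent traces.
    Returns a value between 0 (incoherent) and 1 (perfectly coherent).
    """
    # Symmetric covariance (crosscorrelation) matrix of the adjacent traces
    c = window.T @ window
    # Energy captured by the dominant lateral pattern = largest eigenvalue
    lam_max = np.linalg.eigvalsh(c)[-1]
    # Total energy of the original data in the window = trace of the matrix
    total_energy = np.trace(c)
    return lam_max / total_energy if total_energy > 0 else 1.0

# Toy check: five nearly identical traces should give coherence close to 1
rng = np.random.default_rng(0)
wavelet = np.sin(np.linspace(0, 4 * np.pi, 64))
window = np.column_stack([wavelet + 0.05 * rng.standard_normal(64) for _ in range(5)])
print(round(energy_ratio_coherence(window), 3))
```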
The Utica Shale is one of the major source rocks in Ohio, and it extends across much of the eastern United States. Its organic richness, high content of calcite, and development of extensive organic porosity make it a perfect unconventional play, and it has gained the attention of the oil and gas industry. The primary target zone in the Utica Play includes the Utica Formation, Point Pleasant Formation, and Trenton Formation intervals. We attempt to identify the sweet spots within the Point Pleasant interval using 3D seismic data, available well data, and other relevant data. This has been done by way of organic richness and brittleness estimation in the rock intervals. The organic richness is determined by weight percent of total organic carbon content, which is derived by transforming the inverted density volume; core-log petrophysical modeling provides the necessary relationship for doing so. The brittleness is derived using rock-physics parameters such as Young's modulus and Poisson's ratio. Deterministic simultaneous inversion, along with a neural network approach, is used to compute the rock-physics parameters and density from the seismic data. The correlation of sweet spots identified based on the seismic data with the available production data emphasizes the significance of integrating seismic data with all other relevant data.
The Utica Formation in eastern Ohio possesses all the prerequisites for being a successful unconventional play. Attempts at seismic reservoir characterization of the Utica Formation have been discussed in part 1, in which, after providing the geologic background of the area of study, the preconditioning of prestack seismic data, well-log correlation, and building of robust low-frequency models for prestack simultaneous impedance inversion were explained. All these efforts were aimed at identification of sweet spots in the Utica Formation in terms of organic richness as well as brittleness. We elaborate on some aspects of that exercise, such as the challenges we faced in the determination of the total organic carbon (TOC) volume and the computation of brittleness indices based on mineralogical and geomechanical considerations. The prediction of TOC in the Utica play using a methodology suited to cases in which only limited seismic and well-log data are available is demonstrated first. Thereafter, given that no universally accepted indicator of brittleness exists, mechanical as well as mineralogical attempts to extract brittleness information for the Utica play are discussed. Although an attempt is made to determine brittleness from mechanical rock-physics parameters derived from seismic data, the available X-ray diffraction data and regional petrophysical modeling make it possible to define a brittleness index based on mineralogical data that can thereafter be derived from seismic data.
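Since the abstract does not spell out the formulas, the LaTeX block below records two widely cited formulations that mechanical and mineralogical brittleness estimates of the kind referred to above are commonly based on (a Rickman-style average of normalized elastic moduli and a Jarvie-style mineralogical ratio); the study's own definitions may differ.

```latex
% Commonly used brittleness formulations (illustrative background only;
% the exact definitions adopted for the Utica study may differ).
\[
\mathrm{BI}_{\mathrm{mech}}
 = \frac{1}{2}\left(
     \frac{E - E_{\min}}{E_{\max} - E_{\min}}
   + \frac{\nu_{\max} - \nu}{\nu_{\max} - \nu_{\min}}
   \right),
\qquad
\mathrm{BI}_{\mathrm{min}}
 = \frac{W_{\mathrm{quartz}}}{W_{\mathrm{quartz}} + W_{\mathrm{carbonate}} + W_{\mathrm{clay}}},
\]
where $E$ is Young's modulus, $\nu$ is Poisson's ratio, and $W$ denotes mineral weight fractions from XRD.
```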
The Devonian Duvernay Formation in northwest-central Alberta, Canada, has become a hot play in the past few years due to its richness in liquid and gaseous hydrocarbon resources. The oil and gas generation in this shale formation made it the source rock for many oil and gas fields in its vicinity. We attempt to showcase the characterization of the Duvernay Formation using 3D multicomponent seismic data and integrating it with the available well-log and other relevant data. This has been done by deriving rock-physics parameters through deterministic simultaneous and joint impedance inversion, with appropriate quantitative interpretation. In particular, we determine the brittleness of the Duvernay interval, which helps us identify the sweet spots therein. The scope of this characterization exercise was extended to explore the induced seismicity observed in the area, which is perceived to be associated with hydraulic fracture stimulation of the Duvernay and has attracted considerable media coverage lately. We attempt to integrate our results with the induced seismicity data available in the public domain and elaborate on the learning experience gained so far.
The interpretation of faults on 3D seismic data is often aided by the use of geometric attributes such as coherence and curvature. Unfortunately, these same attributes also delineate stratigraphic boundaries and apparent discontinuities due to cross-cutting seismic noise. Effective fault mapping thus requires enhancing piecewise continuous faults and suppressing stratabound edges and unconformities as well as seismic noise. To achieve this objective, we apply two passes of edge-preserving structure-oriented filtering followed by a recently developed fault enhancement algorithm based on a directional Laplacian of a Gaussian operator. We determine the effectiveness of this workflow on a 3D seismic volume from central British Columbia, Canada.
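As an illustration of the general idea behind a directional Laplacian of a Gaussian filter (a simplified stand-in, not the published algorithm), the sketch below smooths a coherence slice along an assumed fault strike and takes a second derivative of Gaussian across it using SciPy; the function name, parameters, and strike-aligned axes are assumptions for the example.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def directional_log_enhance(coherence_slice, strike_axis=0,
                            sigma_along=4.0, sigma_across=1.0):
    """Illustrative directional second-derivative-of-Gaussian filter.

    Smooths a coherence (or similarity) time slice along the assumed fault
    strike and differentiates twice across it, which sharpens elongated
    low-coherence fault anomalies relative to isotropic noise. This is a
    simplified stand-in for a directional Laplacian of a Gaussian operator,
    not the published algorithm.
    """
    sigmas = [0.0, 0.0]
    orders = [0, 0]
    sigmas[strike_axis] = sigma_along        # heavy smoothing along strike
    sigmas[1 - strike_axis] = sigma_across   # light smoothing across strike
    orders[1 - strike_axis] = 2              # 2nd derivative across strike
    response = gaussian_filter(coherence_slice, sigma=sigmas, order=orders)
    # Faults are coherence lows, so their across-strike second derivative is
    # positive; keep the positive part as the fault-likelihood response.
    return np.clip(response, 0.0, None)

# Toy check on a synthetic slice with a vertical low-coherence lineament
slice_ = np.ones((128, 128))
slice_[:, 64] = 0.2
print(directional_log_enhance(slice_, strike_axis=0).max() > 0)
```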
The axiom of recovery, while capturing a central intuition regarding belief change, has been the source of much controversy. We argue briefly against putative counterexamples to the axiom—while agreeing that some of their insight deserves to be preserved—and present additional recovery-like axioms in a framework that uses epistemic states, which encode preferences, as the objects of revision. This makes iterated revision possible and renders explicit the connection between iterated belief change and the axiom of recovery. We provide a representation theorem that connects the semantic conditions we impose on iterated revision with our additional syntactic properties. We show interesting similarities between our framework and that of Darwiche–Pearl (Artificial Intelligence 89:1–29, 1997). In particular, we show that intuitions underlying the controversial (C2) postulate are captured by the recovery axiom and our recovery-like postulates (the latter can be seen as weakenings of (C2)). We present postulates for contraction, in the same spirit as the Darwiche–Pearl postulates for revision, and provide a theorem that connects our syntactic postulates with a set of semantic conditions. Lastly, we show a connection between the contraction postulates and a generalisation of the recovery axiom.
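For reference, the recovery axiom and the Darwiche–Pearl postulate (C2) discussed above are standardly written as follows (standard AGM/DP notation; the paper's own formulation, stated for epistemic states, may differ in detail):

```latex
% AGM recovery: contracting by \varphi and then adding \varphi back loses nothing.
\[
(\mathrm{Recovery})\qquad K \subseteq \mathrm{Cn}\bigl((K \div \varphi) \cup \{\varphi\}\bigr)
\]
% Darwiche--Pearl (C2), stated here for revision of epistemic states \Psi.
\[
(\mathrm{C2})\qquad \text{if } \alpha \models \lnot\mu \text{, then } (\Psi \ast \mu) \ast \alpha \equiv \Psi \ast \alpha
\]
```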
Traditional accounts of belief change have been criticized for placing undue emphasis on the new belief provided as input. A recent proposal to address such issues is a framework for non-prioritized belief change based on default theories (Ghose and Goebel, 1998). A novel feature of this approach is the introduction of disbeliefs alongside beliefs which allows for a view of belief contraction as independently useful, instead of just being seen as an intermediate step in the process of belief revision. This approach is, however, restrictive in assuming a linear ordering of reliability on the received inputs. In this paper, we replace the linear ordering with a preference ranking on inputs from which a total preorder on inputs can be induced. This extension brings along with it the problem of dealing with inputs of equal rank. We provide a semantic solution to this problem which contains, as a special case, AGM belief change on closed theories.
Shale resource plays are associated with low permeability; hence, hydraulic fracturing is required for their stimulation and production. Even though considerable nonuniqueness exists in identifying favorable zones for hydraulic fracturing, geophysicists seem to be avid followers of the low-Poisson's-ratio and high-Young's-modulus brittleness criterion proposed a decade ago. We highlight the misinterpretation that one may run into in following such a criterion for any shale play and develop a new attribute that makes use of strain energy density and fracture toughness. Although the former controls fracture initiation, the propagation of fractures is governed by the latter. Because hydraulic fracturing involves both of these properties, we believe that the newly proposed attribute can be used to highlight favorable intervals for fracturing. Core data, well-log curves, and mud logs have been used to authenticate the proposed attribute. Finally, the computation of the new attribute is implemented on seismic data with encouraging results.
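The abstract does not give the attribute's formula, so the block below only records the standard linear-elastic relations through which strain energy density and fracture toughness enter; it is background, not the authors' definition of the new attribute.

```latex
% Standard linear-elastic relations through which the two quantities enter
% (background only; not the authors' definition of the proposed attribute).
\[
U = \frac{\sigma^{2}}{2E}
\qquad\text{(strain energy density stored under uniaxial stress } \sigma\text{),}
\]
\[
G_{IC} = \frac{K_{IC}^{2}\,(1-\nu^{2})}{E}
\qquad\text{(critical energy release rate from mode-I fracture toughness, plane strain),}
\]
where $E$ is Young's modulus and $\nu$ is Poisson's ratio.
```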
We present a method for relevance sensitive non-monotonic inference from belief sequences which incorporates insights pertaining to prioritized inference and relevance sensitive, inconsistency tolerant belief revision. Our model uses a finite, logically open sequence of propositional formulas as a representation for beliefs and defines a notion of inference from maxiconsistent subsets of formulas guided by two orderings: a temporal sequencing and an ordering based on relevance relations between the putative conclusion and formulas in the sequence. The relevance relations are ternary (using context as a parameter) as opposed to standard binary axiomatizations. The inference operation thus defined easily handles iterated revision by maintaining a revision history, blocks the derivation of inconsistent answers from a possibly inconsistent sequence and maintains the distinction between explicit and implicit beliefs. In doing so, it provides a finitely presented formalism and a plausible model of reasoning for automated agents.
We provide a formal study of belief retraction operators that do not necessarily satisfy the (Inclusion) postulate. Our intuition is that a rational description of belief change must do justice to cases in which dropping a belief can lead to the inclusion, or 'liberation', of others in an agent's corpus. We provide two models of liberation via retraction operators: ρ-liberation and linear liberation. We show that the class of ρ-liberation operators is included in the class of linear ones and provide axiomatic characterisations for each class. We show how any retraction operator can be 'converted' into either a withdrawal operator or a revision operator via the Harper Identity and the Levi Identity, respectively.
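For reference, the two identities mentioned in the last sentence are standardly written as follows (with ∗ for revision, − for contraction/withdrawal, and Cn for logical closure):

```latex
% Standard statements of the Levi and Harper identities.
\[
(\text{Levi})\qquad  K \ast \varphi = \mathrm{Cn}\bigl((K - \lnot\varphi) \cup \{\varphi\}\bigr)
\qquad\qquad
(\text{Harper})\qquad K - \varphi = K \cap (K \ast \lnot\varphi)
\]
```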
The iconic coherence attribute is very useful for imaging geologic features such as faults, deltas, submarine canyons, karst collapse, mass-transport complexes, and more. In addition to its preconditioning, the interpretation of discrete stratigraphic features on seismic data is also limited by the data's bandwidth, where in general data with higher bandwidth yield crisper features than data with lower bandwidth. Some form of spectral balancing applied to the seismic amplitude data can help in achieving such an objective, so that coherence run on spectrally balanced seismic data yields a better definition of the geologic features of interest. The quality of the generated coherence attribute also depends in part on the algorithm used for its computation. In the eigenstructure decomposition procedure for coherence computation, spectral balancing equalizes each contribution to the covariance matrix, and thus it yields crisper features on coherence displays. There are other ways to modify the spectrum of the input data in addition to simple spectral balancing, including the amplitude-volume technique, taking the derivative of the input amplitude, spectral bluing, and thin-bed spectral inversion. We compare some of these techniques and show their added value in seismic interpretation, as part of a more elaborate exercise that we carried out. In other work, we discuss how different spectral components derived from the input seismic data allow interpretation of different scales of discontinuities, what additional information is provided by coherence computed from narrow-band spectra, and the different ways to integrate them.
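A minimal single-trace sketch of the kind of spectral balancing referred to above is given below: the amplitude spectrum is divided by a smoothed version of itself while the phase is preserved. This is an illustrative simplification (names and parameters are assumptions); production implementations work on whole volumes, honor the usable signal band, and protect against boosting noise.

```python
import numpy as np

def spectral_balance(trace, smooth_len=11, eps=1e-6):
    """Crude single-trace spectral balancing sketch: divide the amplitude
    spectrum by a boxcar-smoothed version of itself (whitening toward a
    flatter spectrum) while preserving phase."""
    spec = np.fft.rfft(trace)
    amp = np.abs(spec)
    # Smoothed amplitude spectrum used as the balancing operator
    kernel = np.ones(smooth_len) / smooth_len
    smooth_amp = np.convolve(amp, kernel, mode="same")
    balanced = spec / (smooth_amp + eps)
    return np.fft.irfft(balanced, n=len(trace))

# Toy usage: a band-limited two-tone trace gets its spectrum flattened
t = np.linspace(0, 1, 512, endpoint=False)
trace = np.sin(2 * np.pi * 30 * t) + 0.3 * np.sin(2 * np.pi * 60 * t)
print(spectral_balance(trace).shape)
```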
Thinking about how the law might decide whether to extend legal personhood to artificial agents provides a valuable testbed for philosophical theories of mind. Further, philosophical and legal theorising about personhood for artificial agents can be mutually informing. We investigate two case studies, drawing on legal discussions of the status of artificial agents. The first looks at the doctrinal difficulties presented by the contracts entered into by artificial agents. We conclude that it is not necessary or desirable to postulate artificial agents as legal persons in order to account for such contracts. The second looks at the potential for according sophisticated artificial agents legal personality, with attendant constitutional protections similar to those accorded to humans. We investigate the validity of attributes that have been suggested as pointers of personhood, and conclude that they will take their place within a broader matrix of pragmatic, philosophical and extra-legal concepts.
Possible-world semantics are provided for Parikh's relevance-sensitive model for belief revision. Having Grove's system-of-spheres construction as a base, we consider additional constraints on measuring distance between possible worlds, and we prove that, in the presence of the AGM postulates, these constraints characterize precisely Parikh's axiom (P). These additional constraints essentially generalize a criterion of similarity that predates axiom (P) and was originally introduced in the context of Reasoning about Action. A by-product of our study is the identification of two possible readings of Parikh's axiom (P), which we call the strong and the weak versions of the axiom. An interesting feature of the strong version is that, unlike classical AGM belief revision, it makes associations between the revision policies of different theories.
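For orientation, one common way of writing Parikh's axiom (P) is sketched below; this only captures the general shape, and the distinction between the strong and weak readings identified in the paper turns on how the local revision of the relevant part is understood.

```latex
% General shape of Parikh's axiom (P). L(x) denotes the set of propositional
% variables occurring in x; the strong/weak distinction concerns how the
% local revision of the relevant part x is to be read.
\[
\text{If } K = \mathrm{Cn}(x \wedge y),\quad L(x) \cap L(y) = \emptyset,\quad L(\varphi) \subseteq L(x),
\]
\[
\text{then}\qquad K \ast \varphi = \mathrm{Cn}\bigl((\mathrm{Cn}(x) \ast \varphi) \cup \{y\}\bigr).
\]
```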
We have previously discussed some alternative means of modifying the frequency spectrum of the input seismic data to modify the resulting coherence image. The simplest method was to increase the high-frequency content by computing the first and second derivatives of the original seismic amplitudes. We also evaluated more sophisticated techniques, including the application of structure-oriented filtering to different spectral components before spectral balancing, thin-bed reflectivity inversion, bandwidth extension, and the amplitude volume technique. We further examine the value of coherence computed from individual spectral voice components, and alternative means of combining three or more such coherence images, providing a single volume for interpretation.
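One simple way of combining several such coherence images into a single volume (not necessarily the combination used in the paper) is to keep the strongest discontinuity, i.e., the minimum coherence, at each sample; a short NumPy sketch follows.

```python
import numpy as np

def combine_coherence(coherence_volumes):
    """Combine coherence volumes computed from different spectral voice
    components by keeping the minimum (strongest discontinuity) at each
    sample. This is one simple combination strategy; alternatives include
    averaging or corendering with RGB blending."""
    stack = np.stack(coherence_volumes, axis=0)
    return stack.min(axis=0)

# Toy usage with three small random "volumes"
rng = np.random.default_rng(1)
vols = [rng.uniform(0.5, 1.0, size=(4, 8, 8)) for _ in range(3)]
print(combine_coherence(vols).shape)
```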
The standard theory for belief revision provides an elegant and powerful framework for reasoning about how a rational agent should change its beliefs when confronted with new information. However, the agents considered are extremely idealized. Some recent models attempt to tackle the problem of plausible belief revision by adding structure to the belief bases and using nonstandard inference operations. One of the key ideas is that not all of an agent's beliefs are relevant for an operation of belief change. In this paper we incorporate the insights pertaining to local change and relevance sensitivity with the use of approximate inference relations. These approximate inference relations offer us partial solutions at any stage of the revision process. The quality of the approximations improves as we allow more and more resources to be used, and we are provided with upper and lower bounds on what would be obtained with the use of classical inference.
The introduction of explicit notions of rejection, or disbelief, into logics for knowledge representation can be justified in a number of ways. Motivations range from the need for versions of negation weaker than classical negation, to the explicit recording of classic belief contraction operations in the area of belief change, and the additional levels of expressivity obtained from an extended version of belief change which includes disbelief contraction. In this paper we present four logics of disbelief which address some or all of these intuitions. Soundness and completeness results are supplied and the logics are compared with respect to applicability and utility.
Claims about the potential of free software to reform the production and distribution of software are routinely countered by skepticism that the free software community fails to engage the pragmatic and economic ‘realities’ of a software industry. We argue to the contrary that contemporary business and economic trends definitively demonstrate the financial viability of an economy based on free software. But the argument for free software derives its true normative weight from social justice considerations: the evaluation of the basis for a software economy should be guided by consideration of the social and cultural states which are the ultimate goals of any economic arrangement. That is, the software economy should be evaluated in light of its ability to provide justice. We conclude with a discussion of possible avenues for reform.
The “free” in “free software” refers to a cluster of four specific freedoms identified by the Free Software Definition. The first freedom, termed “Freedom Zero,” intends to protect the right of the user to deploy software in whatever fashion, towards whatever end, he or she sees fit. But software may be used to achieve ethically questionable ends. This highlights a tension in the provision of software freedoms: while the definition explicitly forbids direct restrictions on users’ freedoms, it does not address other means by which software may indirectly restrict freedoms. In particular, ethically inflected debate has featured prominently in the discussion of restrictions on digital rights management and privacy-violating code in version 3 of the GPL (GPLv3). The discussion of this proposed language revealed the spectrum of ethical positions and valuations held by members of the free software community. In our analysis, we provide arguments for upholding Freedom Zero; we embed the problem of possible uses of software in the broader context of the uses of scientific knowledge, and go on to argue that the provision of Freedom Zero militates against too great a moral burden—of anticipating possible uses of software—being placed on the programmer and that, most importantly, it facilitates deliberative discourse in the free software community.
This thesis proposes and presents two new models for belief representation and belief revision. The first model is the B-structures model, which relies on a notion of partial language splitting and tolerates some amount of inconsistency while retaining classical logic. The model preserves an agent's ability to answer queries in a coherent way using Belnap's four-valued logic. Axioms analogous to the AGM axioms hold for this new model. The distinction between implicit and explicit beliefs is represented, and psychologically plausible, computationally tractable procedures for query answering and belief base revision are obtained. The second model presents a method for relevance sensitive non-monotonic inference from belief sequences which incorporates insights pertaining to prioritized inference and relevance sensitive, inconsistency tolerant belief revision. Our model uses a finite, logically open sequence of propositional formulas as a representation for beliefs and defines a notion of inference from maxiconsistent subsets of formulas guided by two orderings: a temporal sequencing and an ordering based on relevance relations between the conclusion and formulas in the sequence. The relevance relations are ternary as opposed to standard binary axiomatizations. The inference operation thus defined easily handles iterated revision by maintaining a revision history, blocks the derivation of inconsistent answers from a possibly inconsistent sequence and maintains the distinction between explicit and implicit beliefs. In doing so, it provides a finitely representable formalism and a plausible model of reasoning for automated agents.
The complete characterization of a reservoir requires accurate determination of properties such as porosity, gamma ray, and density, among others. A common workflow is to relate properties measured by well logs to attributes that can be computed from the seismic data, and then predict the spatial distribution of those properties away from the wells. In general, a high degree of scatter of data points is seen on crossplots between P-impedance and porosity, or P-impedance and gamma ray, suggesting great uncertainty in the determined relationship. Although for many rocks there is a well-established petrophysical model correlating the P-impedance to porosity, there is not a comparable model correlating the P-impedance to gamma ray. To address this issue, interpreters can use crossplots to graphically correlate two seismically derived variables to well measurements plotted in color. When there are more than two seismically derived variables, the interpreter can use multilinear regression or artificial neural network analysis that uses a percentage of the upscaled well data for training to establish an empirical relation with the input seismic data and then uses the remaining well data to validate the relationship. Once validated at the wells, this relationship can then be used to predict the desired reservoir property volumetrically. We describe the application of deep neural network (DNN) analysis for the determination of porosity and gamma ray over the Volve field in the southern Norwegian North Sea. After using several quality-control steps in the DNN workflow and observing encouraging results, we validate the final prediction of the porosity and gamma-ray properties using blind well correlation. The application of this workflow promises significant improvement to reservoir property determination for fields that have good well control and exhibit lateral variations in the sought properties.
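A compact stand-in for the neural-network prediction step described above is sketched below using scikit-learn's MLPRegressor on synthetic attribute data, with part of the "well" data held back for validation; the real workflow uses upscaled well logs, multiple QC steps, and blind-well tests rather than a random split.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in data: rows = samples along wells,
# columns = seismically derived attributes (e.g., impedances, spectral attributes)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
porosity = 0.25 - 0.05 * X[:, 0] + 0.02 * X[:, 1] + 0.01 * rng.normal(size=500)

# Hold back part of the well data for validation, analogous to the workflow above
X_train, X_test, y_train, y_test = train_test_split(X, porosity, test_size=0.3, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("validation R^2:", round(model.score(X_test, y_test), 3))
```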
Seismic interpreters frequently use seismic geometric attributes, such as coherence, dip, curvature, and aberrancy, for defining geologic features, including faults, channels, angular unconformities, etc. Some of the commonly used coherence attributes, such as crosscorrelation or energy-ratio similarity, are sensitive only to waveform shape changes, whereas the dip, curvature, and aberrancy attributes are based on changes in reflector dip. There is another category of seismic attributes, which includes attributes that are sensitive to amplitude values. Root-mean-square amplitude is one of the better-known amplitude-based attributes, whereas coherent energy, Sobel-filter similarity, normalized amplitude gradients, and amplitude curvature are among the lesser-known amplitude-based attributes. We have computed these not-so-common amplitude-based attributes on the Penobscot seismic survey from the Nova Scotia continental shelf, off the east coast of Canada, to bring out their interpretive value. We analyze seismic attributes at the level of the top of the Wyandot Formation, which exhibits different geologic features, including a synthetic transfer zone with two primary faults and several secondary faults, polygonal faults associated with differential compaction, as well as features related to basement-related faults. The application of the amplitude-based seismic attributes defines such features accurately. We take these applications forward by describing a situation in which some geologic features do not display any bending of reflectors but only exhibit changes in amplitude. One such example is the Cretaceous Cree Sand channels present in the same 3D seismic survey used for the previous applications. We compute amplitude curvature attributes and identify the channels, whereas these channels are not visible on the structural curvature display. In both of the applications, we observe that appropriate corendering of not-so-common amplitude-based seismic attributes leads to convincing displays, which can be of immense aid in seismic interpretation and help define the different subsurface features with more clarity.
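As a concrete reference for the best-known attribute in this family, here is a minimal NumPy sketch of root-mean-square amplitude computed in a sliding vertical window (names and window handling are illustrative).

```python
import numpy as np

def rms_amplitude(trace, window=11):
    """Root-mean-square amplitude of a single trace in a sliding vertical
    window (window length in samples, assumed odd). The 'same' convolution
    zero-pads, so values taper toward the trace ends."""
    kernel = np.ones(window) / window
    mean_square = np.convolve(trace ** 2, kernel, mode="same")
    return np.sqrt(mean_square)

# Toy usage: an amplitude-modulated sinusoid
trace = np.sin(np.linspace(0, 10 * np.pi, 300)) * np.linspace(0.1, 1.0, 300)
print(round(rms_amplitude(trace, window=21).max(), 3))
```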
The shallow migrating hydrocarbon fluids in the western Barents Sea are usually found to be associated with high seismic amplitudes. We have attempted to characterize such shallow high-amplitude anomalies in the Hoop Fault Complex area of the western Barents Sea. The workflow is devised to discriminate anomalies that are associated with the presence of hydrocarbons from those that are not, and to quantify them further; it includes the computation of a set of seismic attributes and their analysis. These attributes comprise coherence, spectral decomposition, prestack simultaneous impedance inversion, and extended elastic impedance attributes, followed by their analysis in an appropriate crossplot space, as well as with the use of rock-physics templates. Finally, we briefly discuss the efforts being devoted toward the integration of diverse data types, such as P-cable seismic and controlled-source electromagnetic data, so as to arrive at an integrated assessment of the prospects and to mitigate risk.
We present a framework that provides a logic for science by generalizing the notion of logical (Tarskian) consequence. This framework introduces hierarchies of logical consequence, the first level of each of which is identified with deduction. We argue for identification of the second level of the hierarchies with inductive inference. The notion of induction presented here has some resonance with Popper's notion of scientific discovery by refutation. Our framework rests on the assumption of a restricted class of structures, in contrast to the permissiveness of classical first-order logic. We make a distinction between deductive and inductive inference via the notions of compactness and weak compactness. Connections with the arithmetical hierarchy and formal learning theory are explored. For the latter, we argue against the identification of inductive inference with the notion of learnability in the limit. Several results highlighting desirable properties of these hierarchies of generalized logical consequence are also presented.
To walk or not to walk: Should a batsman acknowledge his own dismissal by leaving the wicket without even waiting for the umpire's decision? David Coady and Samir Chopra examine this flashpoint ethical debate in cricket.
Software is much more than sequences of instructions for a computing machine: it can be an enabler (or disabler) of political imperatives and policies. Hence, it is subject to the same assessment in a normative dimension as other political and social phenomena. The core distinction between free software and its proprietary counterpart is that free software makes available to its user the knowledge and innovation contributed by the creator(s) of the software, in the form of the created source code. From an ethical perspective, one of the most pressing questions raised by this form of collaboration is the question of the rights, and the restrictions on them, that are passed on to users and collaborators by the creators of programs. That is, what freedoms do software users deserve, and how can they best be protected? In this study we analyze free software licensing schemes in order to determine which most effectively protects such freedoms. We conclude that so-called copyleft licensing schemes are the morally superior alternative.
Free and open source software (FOSS) is taking an increasingly significant role in our software infrastructure. Yet many questions still exist about whether a software economy based on FOSS would be viable. We argue that contemporary trends definitively demonstrate this viability. Claiming that an economy must be evaluated as much by the ends it brings about as by its size or vigor, we draw on widely accepted notions of redistributive justice to show the ethical superiority of a software economy based on FOSS.
Our freedoms in cyberspace are those granted by code and the protocols it implements. When man and machine interact, co-exist, and intermingle, cyberspace comes to interpenetrate the real world fully. In this cyborg world, software retains its regulatory role, becoming a language of interaction with our extended cyborg selves. The mediation of our extended selves by closed software threatens individual autonomy. We define a notion of freedom for software that does justice to our conception of it as language, sketching the outlines of a social and political philosophy for a cyborg world. In a cyberspace underwritten by free software, political structures become contingent and flexible: the polity can choose to change the extent and character of its participation. The rejection of opaque power is an old anarchist ideal: free software, by making power transparent, carries the potential to place substantive restrictions on the regulatory power of cyborg government.
The application of curvature attributes to seismic horizons or 3D seismic volumes has been discussed in the literature in several ways. Such discussion largely ignores the detail of parameter selection that must be made by the working interpreter or the expert processor. Parameter selections, such as window size and filtering methods for seismic curvature estimates, have not been extensively compared in the literature and have never been validated using quantitative ground truthing against log or drilling data. Of even greater concern to the interpreter is the lack of discussion of curvature parameters as they relate to interpretive and operational issues. We focus on the seismic most-positive curvature attribute, its parameterization, and filtering for the overpressured tight-sand target in the Falher F formation of the deep basin of Alberta, Canada. This sand has numerous natural fractures that constitute an occasional drilling hazard due to mud losses. Various parameterizations of horizon- and volume-based curvature extractions are made and examined in the context of the drilling results of four horizontal wells, one of which has image-log fracture density along the lateral portion of the well. We compared different lateral window sizes in the initial curvature estimates, as well as different postcurvature filtering approaches, including unfiltered, Gaussian-filtered, and Fourier-filtered products. The different curvature attribute estimates have been evaluated by way of map comparisons, cross-section seismic line comparisons, and correlations with the upscaled fracture density log data. We found that our horizon-based estimates of positive curvature suffered from mechanical artifacts related to the horizon-picking process, and that the volume-based methods were generally superior. Of the volume-based methods, we found that the Fourier-filtered curvature estimates were the most stable through smaller analysis windows. Gaussian-filtering methods on volumetric curvature gave results of varying quality. Unfiltered volumetric curvature estimates were only stable when very large time windows were used, which affected the time localization of the estimate. The comparisons give qualitative and quantitative perspective on the best curvature parameters for predicting the key properties of the geologic target, which in this case are the potentially hazardous natural fractures within the overpressured Falher F sandstone.
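For readers unfamiliar with the attribute, most-positive curvature is commonly obtained (in the formulation widely attributed to Roberts, 2001) by locally fitting a quadratic surface to the horizon and combining its second-derivative coefficients. The sketch below shows a horizon-based 3x3-patch version; the volume-based estimates favored in the paper are more involved. Treat the formula and names as standard background rather than the paper's exact implementation.

```python
import numpy as np

def most_positive_curvature(patch, dx=1.0, dy=1.0):
    """Fit z = a*x^2 + b*y^2 + c*x*y + d*x + e*y + f to a 3x3 horizon patch
    (two-way time or depth) by least squares and return the most-positive
    curvature k_pos = (a + b) + sqrt((a - b)^2 + c^2), following the
    commonly cited Roberts (2001) formulation."""
    ys, xs = np.meshgrid(np.array([-1, 0, 1]) * dy,
                         np.array([-1, 0, 1]) * dx, indexing="ij")
    x, y = xs.ravel(), ys.ravel()
    A = np.column_stack([x ** 2, y ** 2, x * y, x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, patch.ravel(), rcond=None)
    a, b, c = coeffs[0], coeffs[1], coeffs[2]
    return (a + b) + np.sqrt((a - b) ** 2 + c ** 2)

# Toy usage: a bowl-shaped patch z = x^2 + y^2 gives k_pos ~ 2
patch = np.fromfunction(lambda i, j: (i - 1.0) ** 2 + (j - 1.0) ** 2, (3, 3))
print(round(most_positive_curvature(patch), 3))
```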
We present a model for first-order belief revision that is characterized by an underlying relevance-like relation and a background proof system. The model is extremely general in order to allow for a wide variety in these characterizing parameters. It allows some weakenings of beliefs which were initially implicit to become explicit and survive the revision process. The effects of revision are localized to the part of the theory that is influenced by the new information. Iterated revision in this model is handled trivially since the revision operator is constructive by definition. The usage of deductively limited proof systems permits an inconsistency-tolerant model. The notion of a part of a theory capable of being influenced by new information (designed to accommodate the specific character of first-order languages) is shown to satisfy some intuitive and desirable properties. We show that for particular parametrizations, standard revision schemes can be embedded into our paradigm.
The discrimination of fluid content and lithology in a reservoir is important because it has a bearing on reservoir development and its management. Among other things, rock-physics analysis is usually carried out to distinguish between the lithology and fluid components of a reservoir by way of estimating the volume of clay, water saturation, and porosity using seismic data. Although these rock-physics parameters are easy to compute for conventional plays, there are many uncertainties in their estimation for unconventional plays, especially where multiple zones need to be characterized simultaneously. We have evaluated such uncertainties with reference to a data set from the Delaware Basin, where the Bone Spring, Wolfcamp, Barnett, and Mississippian Formations are the prospective zones. Attempts at seismic reservoir characterization of these formations were described in Part 1 of this paper, where the geologic background of the area of study, the preconditioning of prestack seismic data, well-log correlation, accounting for the temporal and lateral variation in the seismic wavelets, and the building of a robust low-frequency model for prestack simultaneous impedance inversion were discussed. Here, we describe the challenges and the uncertainty in the characterization of the Bone Spring, Wolfcamp, Barnett, and Mississippian sections and explain how we overcame them. In light of these uncertainties, we decide that any deterministic approach for characterization of the target formations of interest may not be appropriate, and we build a case for adopting a robust statistical approach. Making use of neutron porosity and density porosity well-log data in the formations of interest, we determine how the type of shale, volume of shale, effective porosity, and lithoclassification can be carried out. Using the available log data, multimineral analysis was also carried out using a nonlinear optimization approach, which lent support to our facies classification. We then extend this exercise to derived seismic attributes for determination of the lithofacies volumes and their probabilities, together with their correlations with the facies information derived from mud log data.
Multicomponent seismic data offer several advantages for characterizing reservoirs with the use of the vertical component and mode-converted data. Joint impedance inversion inverts both of these data sets simultaneously; hence, it is considered superior to simultaneous impedance inversion. However, the success of joint impedance inversion depends on how accurately the PS data are mapped into the PP time domain. Normally, this is attempted by performing well-to-seismic ties for the PP and PS data sets and matching different horizons picked on the PP and PS data. Although it seems to be a straightforward approach, there are a few issues associated with it. One of them is the lower resolution of the PS data compared with the PP data, which presents difficulties in the correlation of equivalent reflection events on the two data sets. Even after a few consistent horizons are tracked, the horizon-matching process introduces some artifacts on the PS data when they are mapped into PP time. We have evaluated such challenges using a data set from the Western Canadian Sedimentary Basin and developed a novel workflow for addressing them. The effectiveness of our workflow was demonstrated by comparing data examples generated with and without its adoption.
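For background on why this mapping is needed, the single-flat-layer relation between PP and PS traveltimes is recalled below; the registration workflow described above refines this with horizon matching and interval Vp/Vs estimates rather than using the relation directly.

```latex
% Single-layer PP/PS traveltime relation underlying PS-to-PP registration.
% For a reflector at depth h: t_{PP} = 2h/V_P and t_{PS} = h/V_P + h/V_S, so
\[
\frac{t_{PS}}{t_{PP}} = \frac{1 + V_P/V_S}{2}
\qquad\Longrightarrow\qquad
t_{PP} = \frac{2\,t_{PS}}{1 + V_P/V_S}.
\]
```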
The Delaware and Midland Basins are multistacked plays with production being drawn from different zones. Of the various prospective zones in the Delaware Basin, the Bone Spring and Wolfcamp Formations are the most productive and thus are the most drilled zones. To understand the reservoirs of interest and identify the hydrocarbon sweet spots, a 3D seismic inversion project was undertaken in the northern part of the Delaware Basin in 2018. We examine the reservoir characterization exercise for this data set in two parts. In addition to a brief description of the geology, we evaluate the challenges faced in performing seismic inversion for characterizing multistacked plays. The key elements that lend confidence to seismic inversion and the quantitative predictions made therefrom are well-to-seismic ties, proper data conditioning, robust initial models, and adequate parameterization of inversion analysis. We examine the limitations of a conventional approach associated with these individual steps and determine how to overcome them. Later work will first elaborate on the uncertainties associated with input parameters required for executing rock-physics analysis and then evaluate the proposed robust statistical approach for defining the different lithofacies.