# Browsing by Subject "Model selection"

Now showing 1 - 7 of 7


## Assessing reservoir performance and modeling risk using real options (2012-05)

Singh, Harpreet; Srinivasan, Sanjay; Lake, Larry W.

Reservoir economic performance is based on the future cash flows that a reservoir can generate. Future cash flows are a function of the hydrocarbon volumetric flow rates the reservoir can produce and of market conditions, and both of these functions are subject to uncertainty. Estimates of future hydrocarbon flow rates are uncertain because of uncertainty in the geological model, the limited availability and type of data, and the complexity of the reservoir modeling process. The second source of uncertainty in future cash flows comes from market dynamics: changing oil prices, rates of return, and so on. Robust integration of these two sources of uncertainty, future hydrocarbon flow rates and market dynamics, into a model that predicts cash flows from a reservoir is an essential but difficult part of risk assessment. Current practice assesses a reservoir's economic performance with Deterministic Cash Flow (DCF) methods, whose predictions have been unsuccessful because they lack the parametric capability to incorporate both types of uncertainty robustly and completely. This thesis presents a procedure that accounts for uncertainty in hydrocarbon production forecasts due to incomplete geologic information, and a novel real options methodology for assessing project economics in the upstream petroleum industry. The modeling approach entails determining future hydrocarbon production rates under incomplete geologic information, with and without secondary information. The price of hydrocarbons is modeled separately, and the costs to produce them are determined from market dynamics. A real options methodology is used to assess the effective cash flows from the reservoir and hence to determine the project economics.
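The contrast between a deterministic cash-flow valuation and a real-options valuation can be illustrated with a minimal one-period binomial sketch; all figures are hypothetical and are not taken from the thesis:

```python
# Hypothetical one-period binomial comparison: static DCF NPV vs. the value
# of an option to defer development. All figures are illustrative.

def npv_invest_now(value, cost):
    """DCF view: commit today; value of the developed reserve minus cost."""
    return value - cost

def defer_option(value, cost, up, down, rf):
    """Value of waiting one period before deciding whether to develop."""
    q = ((1 + rf) - down) / (up - down)         # risk-neutral up-probability
    payoff_up = max(value * up - cost, 0.0)     # develop only if profitable
    payoff_down = max(value * down - cost, 0.0)
    return (q * payoff_up + (1 - q) * payoff_down) / (1 + rf)

reserve_value, capex = 100.0, 105.0             # hypothetical $MM figures
now = npv_invest_now(reserve_value, capex)      # negative: DCF says reject
wait = defer_option(reserve_value, capex, up=1.4, down=0.7, rf=0.05)
# `wait` is positive: the flexibility to walk away in the down state gives
# the project value that the deterministic calculation cannot see.
```

The point of the sketch is only the sign flip: a project DCF rejects outright can still carry positive option value once downside outcomes can be avoided.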
This methodology associates realistic probabilities, quantified through the method's parameters, with benefits and costs. Its results are compared against those of the DCF methodology to examine whether real options can uncover potential in a reservoir's performance that DCF cannot. The methodology is then applied to various case studies and to strategies for planning and decision making.

## Assigning g in Zellner's g prior for Bayesian variable selection (2015-05)

Wang, Mengjie; Walker, Stephen G., 1945-; Lin, Lizhen

There are numerous frequentist variable selection methods, such as stepwise regression, AIC, and BIC; the latter two criteria include a penalty term that discourages overfitting. Within Bayesian variable selection, a popular approach is the Bayes factor (Kass & Raftery 1995), which also has a natural built-in penalty term (Berger & Pericchi 2001). Zellner's g prior (Zellner 1986) is a common prior for the coefficients of a linear regression model because the posterior has an analytic form that is fast to compute. The choice of g, however, has attracted a great deal of attention. Zellner (1986) pointed out that if g is unknown, a prior can be introduced and g integrated out; one such choice is the hyper-g prior proposed by Liang et al. (2008). Instead of proposing a prior for g, we assign a fixed value of g by controlling the Type I error of the test based on the Bayes factor. Since the Bayes factor is our model selection criterion, it is also the test statistic. Every test carries a Type I error, so it is reasonable to restrict this error to a benchmark value such as 0.1 or 0.05; doing so automatically determines a value of g.
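The calibration idea can be sketched with a small Monte Carlo experiment: simulate data under the null model, compute the g-prior Bayes factor using the closed form in Liang et al. (2008), and estimate the Type I error for a candidate g. The sample sizes and the rejection rule BF > 1 below are assumptions chosen for illustration:

```python
# Monte Carlo sketch: under the null (all slopes zero), how often does the
# g-prior Bayes factor favor the full model? Sample sizes and the BF > 1
# rejection rule are illustrative assumptions.
import numpy as np

def bf10_gprior(y, X, g):
    """Bayes factor of the full model vs. intercept-only under Zellner's g prior."""
    n, p = X.shape
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
    resid = yc - Xc @ beta
    r2 = 1.0 - (resid @ resid) / (yc @ yc)
    return (1 + g) ** ((n - p - 1) / 2) * (1 + g * (1 - r2)) ** (-(n - 1) / 2)

def type1_error(g, n=50, p=3, reps=2000, seed=0):
    """Fraction of null datasets on which the Bayes factor rejects the null."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(reps):
        X = rng.standard_normal((n, p))
        y = rng.standard_normal(n)        # null model: y unrelated to X
        rejections += bf10_gprior(y, X, g) > 1.0
    return rejections / reps

# Larger g penalizes model complexity more and drives the Type I error down;
# the calibrated g is the one whose error rate hits the chosen benchmark
# (e.g. 0.05), which fixes g without placing a prior on it.
```

Sweeping g over a grid and reading off where `type1_error(g)` crosses the benchmark is then a one-line search.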
Based on this idea, a fixed g can be selected, avoiding the need to find a prior for g.

## Fast assessment of uncertainty in buoyant fluid displacement using a connectivity-based proxy (2016-05)

Jeong, Hoonyoung; Sepehrnoori, Kamy, 1951-; Srinivasan, Sanjay; Wheeler, Mary; Delshad, Mojdeh; Sen, Mrinal

It is crucial to estimate the uncertainty in the flow characteristics of an injected fluid. However, because a large suite of geological models is probable given sparse static data, it is impractical to run full-physics flow simulations on the entire suite in order to quantify the uncertainty in fluid displacements; a fast alternative to a full-physics simulator is needed. Most proxies proposed thus far are ill-suited to approximating the buoyant flow of injected fluid in 3D heterogeneous rock during the injection period. This dissertation proposes a new proxy for quickly predicting the buoyant flow of injected fluid during CO₂ sequestration. Geological models are ranked by the extent of their approximated CO₂ plumes, and by selecting a representative group among the ranked models, the uncertainty in the spatial and temporal characteristics of CO₂ plume migration can be quantified quickly. About 90% of the computational cost of quantifying the uncertainty in plume extent was saved using the proposed connectivity-based proxy. In a geological carbon storage project, the spatial and temporal characteristics of CO₂ plume migration can be monitored by 4D seismic surveys, and the plume images obtained from these surveys are used as observed data to find subsurface models honoring the spatial and temporal characteristics of the observed plumes.
However, manually comparing an observed CO₂ plume with the prior plumes in a large suite of subsurface models is inefficient, so an automatic measure of the dissimilarity between plumes is necessary. The most intuitive measure is the Euclidean distance between vectors representing the plumes, but it is inappropriate because it ignores the spatial relation between the elements of the vectors. A shape dissimilarity that reflects this spatial relation can be computed with the Hausdorff distance, and its computational cost is significantly reduced by computing the distance between representations of the plumes, such as the perimeter, surface, or skeleton, instead of the original plumes. An appropriate representation should be chosen according to the spatial characteristics of the CO₂ plumes.

## Model selection for CO₂ sequestration using surface deflection and injection data (2015-08)

Nwachukwu, Chiazor; Srinivasan, Sanjay; Sepehrnoori, Kamy

In recent years, sequestration of CO₂ in the subsurface has been studied extensively as an approach to curbing carbon emissions into the atmosphere. Monitoring the fate and migration of the CO₂ plume in the aquifer is of utmost interest to regulators and operators, but current monitoring techniques such as time-lapse seismic are expensive and of limited applicability, and they have little predictive value unless embedded within a feedback-style control scheme. Provided that field data such as bottom-hole pressures, well rates, or even surface deformation are available, geologic models of the aquifer can be created and used as input to a flow simulator to predict the migration of CO₂.
A history matching approach has been developed within a model selection framework to select and refine geologic models until they represent the spatial heterogeneity of the target aquifer and produce forecasts with relatively small uncertainty. An initial large suite of models can be created from prior information about the aquifer, but predicting the response of every model is costly in computational time. A particle-tracking algorithm has therefore been developed to estimate the flow response of the geologic models at greatly reduced cost; it serves as a fast approximation to finite-difference flow simulation and provides a rapid estimate of aquifer connectivity. A finite element method (FEM) solver was also developed to approximate the geomechanical effects of CO₂ injection on the rock, using a partial coupling scheme that solves the flow and geomechanical equilibrium equations sequentially. The validity of these proxies is tested on both 2D and 3D field cases, and their solutions correlate reasonably well with full-physics simulations. We also demonstrate the model selection algorithm on a 3D reservoir with complex topography. The algorithm has three main steps: (1) predict the flow and geomechanical responses of a large prior ensemble of models using the proxies; (2) group models with similar responses into clusters using multidimensional scaling together with k-means clustering; and (3) select the model cluster that deviates least from the observed field data. The procedure can be repeated on the sub-group of models within a selected cluster to further refine the forecasts of future plume migration.
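Steps (2) and (3) of such a workflow can be sketched as follows, with synthetic proxy responses standing in for the real flow and geomechanical outputs:

```python
# Sketch of the clustering-and-selection steps: embed proxy responses with
# classical MDS, cluster with k-means, and pick the cluster whose mean
# response deviates least from the observed data. Responses are synthetic.
import numpy as np

def classical_mds(D, dims=2):
    """Embed a pairwise-distance matrix D into `dims` coordinates."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:dims]        # largest eigenvalues first
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

def kmeans(X, k, iters=100, seed=0):
    """Plain Lloyd's algorithm; returns a cluster label per row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

# Synthetic proxy responses: 30 models, 10 time steps, two loose groups.
rng = np.random.default_rng(1)
responses = np.vstack([rng.normal(0, 1, (15, 10)), rng.normal(5, 1, (15, 10))])
observed = rng.normal(5, 1, 10)                  # field data resembles group 2

D = np.linalg.norm(responses[:, None] - responses[None], axis=-1)
coords = classical_mds(D)
labels = kmeans(coords, k=2)

# Step (3): the cluster with minimum deviation from the field data.
dev = [np.linalg.norm(responses[labels == j].mean(0) - observed) for j in range(2)]
best = int(np.argmin(dev))
```

In practice the responses would come from the particle-tracking and FEM proxies and the cluster count would be chosen from the MDS scree, but the selection logic is the same.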
This entire iterative model selection scheme is demonstrated using injection data for the Krechba reservoir in Algeria, an active CO₂ sequestration site.

## Particle tracking proxies for prediction of CO₂ plume migration within a model selection framework (2014-05)

Bhowmik, Sayantan; Srinivasan, Sanjay; Bryant, Steven L.

Geologic sequestration of CO₂ in deep saline aquifers has been studied extensively over the past two decades as a viable method of reducing anthropogenic carbon emissions. Monitoring and predicting the movement of injected CO₂ is important for assessing containment of the gas within the storage volume and for taking corrective measures if required. Given the uncertainty in the geologic architecture of storage aquifers, it is reasonable to depict our prior knowledge of the project area with a vast suite of aquifer models, but simulating so many models with traditional numerical flow simulators to evaluate uncertainty is computationally expensive. A novel stochastic workflow for characterizing plume migration, based on a model selection algorithm developed by Mantilla in 2011, has been implemented. The approach has four main steps: (1) assess the connectivity and dynamic characteristics of a large prior ensemble of models using proxies; (2) cluster the models using principal component analysis or multidimensional scaling coupled with k-means clustering; (3) select models by applying Bayes' rule on the reduced model space; and (4) expand the selected models using an ensemble pattern-based matching scheme. In this dissertation, two proxies based on particle tracking have been developed to assess the flow connectivity of the models in the initial set. The proxies serve as fast approximations of finite-difference flow simulation models and provide rapid estimates of aquifer connectivity.
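A particle-tracking connectivity proxy of this general flavor can be sketched as a weighted random walk on a permeability grid; the grid, transition rule, and step counts below are illustrative assumptions, not the dissertation's proxies:

```python
# Minimal particle-tracking connectivity proxy: particles released at an
# injector cell walk to neighboring cells with probability proportional to
# permeability, giving a cheap surrogate for the swept region. The grid and
# transition rule are illustrative assumptions.
import numpy as np

def track_particles(perm, start, n_particles=200, n_steps=50, seed=0):
    """Return a boolean map of cells visited by any particle."""
    rng = np.random.default_rng(seed)
    nx, ny = perm.shape
    visited = np.zeros_like(perm, dtype=bool)
    for _ in range(n_particles):
        i, j = start
        for _ in range(n_steps):
            visited[i, j] = True
            nbrs = [(i + di, j + dj)
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= i + di < nx and 0 <= j + dj < ny]
            w = np.array([perm[c] for c in nbrs], dtype=float)
            i, j = nbrs[rng.choice(len(nbrs), p=w / w.sum())]
    return visited

# Two hypothetical 20x20 models: a high-permeability channel vs. uniform rock.
channel = np.full((20, 20), 1.0)
channel[9:11, :] = 100.0
uniform = np.full((20, 20), 1.0)
swept_channel = track_particles(channel, start=(10, 0)).sum()
swept_uniform = track_particles(uniform, start=(10, 0)).sum()
# Models can then be ranked or clustered by the size and shape of the swept
# region instead of by full finite-difference simulation.
```

A real proxy would also carry travel-time information and honor buoyancy, which is precisely the refinement the second proxy in this dissertation addresses.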
Modifications have also been made within the model selection workflow to accommodate the particular problem of application to a carbon sequestration project. The applicability of the proxies is tested on both synthetic models and real field case studies. The first proxy is shown to capture areal migration reasonably well while failing to adequately capture the vertical, buoyancy-driven flow of CO₂; the second proxy addresses this limitation and captures both horizontal migration and buoyancy-driven flow. Both proxies are tested as standalone approximations of numerical simulation and within the larger model selection framework.

## Predicting the migration of CO₂ plume in saline aquifers using probabilistic history matching approaches (2012-05)

Bhowmik, Sayantan; Srinivasan, Sanjay; Bryant, Steven L.

During the operation of a geological carbon storage project, verifying that the CO₂ plume remains within the permitted zone is of particular interest to both regulators and operators, but the cost of many monitoring technologies, such as time-lapse seismic, limits their application. Adequate prediction of plume migration requires proper representation of heterogeneous permeability fields. Previous work has shown that injection data (pressures, rates) from wells can characterize complex permeability fields in saline aquifers. Because injection data are readily available, they offer an inexpensive alternative for monitoring; combined with a flow model like the one developed in this work, they can even be used to predict plume migration. These predicted migration pathways can then be compared with field observations, such as time-lapse seismic or satellite measurements of surface deformation, to ensure containment of the injected CO₂ within the storage area.
In this work, two novel methods for creating heterogeneous permeability fields constrained by injection data are demonstrated. The first is a probabilistic history matching algorithm that builds aquifer models for predicting the movement of the CO₂ plume: the geologic property of interest, for example hydraulic conductivity, is updated conditioned to geological information and injection pressures, and the resulting geologically consistent aquifer model can be used to reliably predict plume movement in the subsurface. The second is a model selection algorithm that refines an initial suite of subsurface models representing the prior uncertainty into a posterior set whose injection performance is consistent with that observed; these posterior models can be used to represent uncertainty in the future migration of the plume. The applicability of both methods is demonstrated using a field data set from central Algeria.

## Selection, calibration, and validation of coarse-grained models of atomistic systems (2015-05)

Farrell, Kathryn Anne; Oden, J. Tinsley (John Tinsley), 1936-; Prudhomme, Serge M.; Babuska, Ivo; Bui-Thanh, Tan; Demkowicz, Leszek; Elber, Ron

This dissertation examines the development of coarse-grained models of atomistic systems for predicting target quantities of interest in the presence of uncertainties. It addresses fundamental questions in computational science and engineering concerning the model selection, calibration, and validation processes used to construct predictive reduced-order models within a unified Bayesian framework. This framework, enhanced with concepts from information theory, sensitivity analysis, and Occam's razor, provides a systematic means of constructing coarse-grained models suitable for use in a prediction scenario.
A general framework of statistical calibration and validation is applied to molecular systems for the first time. Atomistic models, which themselves contain uncertainties, are treated as the ground truth and provide data for the Bayesian updating of model parameters. The open problem of selecting appropriate coarse-grained models is addressed through the notion of Bayesian model plausibility. A new adaptive algorithm for model validation is presented: the Occam-Plausibility ALgorithm (OPAL), named for its adherence to Occam's razor and its use of Bayesian model plausibilities, identifies, among a large set of models, the simplest model that passes the Bayesian validation tests and may therefore be used to predict the chosen quantities of interest. By discarding unnecessarily complex models, the algorithm can reduce computational expense, both through its systematic consideration of subsets of models and by carrying out the prediction scenario with the simplest valid model. An application to the construction of a coarse-grained model of polyethylene demonstrates the molecular modeling techniques; the Bayesian selection, calibration, and validation of reduced-order models; and OPAL itself. The polyethylene example illustrates the potential of the Bayesian framework for coarse graining and of OPAL as a means of determining a computationally conservative valid model.
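The plausibility-driven selection loop can be sketched on a toy regression problem, with polynomial models and BIC-approximated evidence standing in for the coarse-grained molecular models and full Bayesian evidence of the dissertation:

```python
# OPAL-flavored sketch: candidate models ordered simplest-first, evidence
# approximated by BIC, and the simplest model passing a validation tolerance
# accepted. Data, models, and the tolerance are synthetic illustrations.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 40)
y = 1.0 + 2.0 * x**2 + rng.normal(0, 0.1, x.size)    # "ground truth" data
x_val, y_val = x[::4], y[::4]                        # held-out checkpoints

def fit_poly(deg):
    """Least-squares fit plus a BIC-style evidence proxy."""
    coef = np.polyfit(x, y, deg)
    resid = y - np.polyval(coef, x)
    sigma2 = resid @ resid / x.size
    log_like = -0.5 * x.size * (np.log(2 * np.pi * sigma2) + 1)
    bic = (deg + 1) * np.log(x.size) - 2 * log_like
    return coef, bic

degrees = [0, 1, 2, 3, 4]                            # simplest first
fits = {d: fit_poly(d) for d in degrees}

# Plausibility: normalized exp(-BIC/2), a standard evidence approximation.
w = np.exp(-0.5 * np.array([fits[d][1] for d in degrees]))
plausibility = w / w.sum()

accepted = None
for d in degrees:                                    # OPAL-style sweep
    val_rmse = np.sqrt(np.mean((np.polyval(fits[d][0], x_val) - y_val) ** 2))
    if val_rmse < 0.2:                               # assumed tolerance
        accepted = d                                 # simplest valid model
        break
```

The constant and linear models fail validation, so the sweep stops at the quadratic: the simplest model that survives, which is the spirit of using Occam's razor together with plausibility rather than always keeping the most complex candidate.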