# Browsing by Author "Lake, Larry W."

Now showing 1 - 20 of 129


## A decision-based approach to establish non-informative prior sample space for decision-making in geotechnical applications (2022-12-21)

Feng, Kai; Gilbert, Robert B.; Lake, Larry W.; Rathje, Ellen M.; Nadim, Farrokh; Boyles, Stephen

Bayes’ theorem is widely adopted for risk-informed decision-making in natural hazards (which often have limited data), but prior sample spaces based on existing methods may lead to inconsistent, irrational, and indefensible results. Decision Entropy Theory (DET) is therefore under development to improve the assessment of small probabilities when limited information is available for a non-informative prior sample space in Bayesian decision-making. The key idea in establishing a non-informative prior sample space with DET is that the value of new information is as uncertain as possible, i.e., the entropy of the new information is maximized. The mathematical formulation includes prior decision analysis, which maximizes the relative entropy of the value of perfect information, and pre-posterior decision analysis, which maximizes the relative entropy of the value of imperfect information given each value of perfect information. The goals of this research are to (1) apply the theory to simple problems to demonstrate and study its rigorous implementation, evaluate possible approximations that reduce the computational effort required to implement it rigorously, and develop insight into the results; (2) propose and characterize likelihood functions that represent subjective judgment for small-probability events in the decision analysis; and (3) demonstrate the application of the theory to real-world case histories.
From this research, the following conclusions are drawn: (1) results of illustrative decision analysis examples show that the non-informative prior probabilities obtained from DET are sensible and address concerns that have been raised about other approaches to establishing non-informative prior probabilities that do not consider their impact on decision making; moreover, the DET-based non-informative prior is invariant to transformations of uncertain variables because it depends on the decisions rather than on how the states of nature are defined; (2) an approximation to the rigorous DET reduces the computational effort considerably (many orders of magnitude) and provides reasonable results for the prior decision and value of perfect information, but is less able to approximate the value of imperfect information; (3) likelihood functions proposed for fractional occurrence models with the Binomial, Poisson, and Multinomial distributions have a maximum at the estimated fraction of occurrences and a Fisher information quantity that is inversely proportional to the estimated fraction and proportional to the length of the record used to estimate the fraction; and (4) the non-informative prior probabilities obtained with DET for the dam case history provide useful insight into the potential impacts of not making assumptions beyond what is actually known. When uncertainty in frequencies of overtopping and the chance of dam failure given overtopping (fragility) are included, the decision to rehabilitate the dam is justified with a cost of dam breach that is more than 100 times smaller than when this uncertainty is neglected and more than 10 times smaller than when uncertainty in the hazard but not the fragility is neglected. In addition, the maximum value of obtaining additional information about frequencies of hazard and fragility is 35% of the cost of rehabilitation.
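The value-of-perfect-information quantity at the core of this kind of prior decision analysis can be illustrated with a toy two-action, two-state problem. All numbers below are hypothetical, and this is the standard VoPI computation rather than DET itself:

```python
import numpy as np

# Toy two-state, two-action decision (hypothetical numbers, not the
# dissertation's dam example): states = {failure, no failure}.
p = np.array([0.1, 0.9])             # prior probabilities of the states
# utility[action, state]: row 0 = rehabilitate, row 1 = do nothing
utility = np.array([[-10.0, -10.0],  # rehabilitation costs 10 either way
                    [-100.0, 0.0]])  # doing nothing risks a loss of 100

# Prior decision: pick the action with the highest expected utility.
prior_value = utility @ p            # expected utility of each action
best_prior = prior_value.max()

# With perfect information we would learn the state first, then choose.
perfect_value = (p * utility.max(axis=0)).sum()

# Value of perfect information: the gain from resolving the uncertainty.
vopi = perfect_value - best_prior
print(best_prior, perfect_value, vopi)
```

DET then goes further than this sketch by choosing the prior probabilities `p` so that the entropy of quantities like `vopi` is maximized, rather than fixing them up front.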
Future work will advance the theory by developing more efficient algorithms that reduce the time and space complexity of the numerical implementation of DET and by applying it to more complicated and realistic problems.

## Advances in the development and application of a capacitance-resistance model (2013-05)

Laochamroonvorap, Rapheephan; Lake, Larry W.

Much of a reservoir engineer’s effort is devoted to the time-consuming process of history matching in a simulator to understand the reservoir’s complexity. Its accuracy is debatable because only a few inputs are known. Several analytical tools have been developed to investigate reservoir heterogeneity. The reciprocal productivity index (RPI) measures the pressure support observed at a producer. A plot of the logarithm of the water-oil ratio (WOR) can indicate the presence of a channel. A capacitance-resistance model (CRM) is a simple tool to estimate the connectivity of a producer-injector pair from production/injection and pressure data. Field operators generally implement improved-recovery plans, such as a water-alternating-gas (WAG) flood, to improve displacement efficiency; however, heterogeneity compromises their performance. The first objective of this study is to improve the assessment of tertiary flood performance by integrating the CRM with other analytical tools. The integrated method was applied to a miscible flood field in West Texas. The results suggest strong interwell connectivity found more frequently in the NE-SW direction and different preferential flow paths for injected CO2 and water. Overall, the results provide insight into the current flood status. The operating conditions of a producer change dynamically because of well/field constraints. These changes can induce significant interference in other wells, which cannot be captured by the CRM.
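The material-balance idea behind the CRM can be sketched for a single injector-producer pair. The parameters below are hypothetical; in practice the time constant and connectivity are fitted to field data:

```python
import numpy as np

# Single injector-producer pair CRM in discrete form (hypothetical values).
tau = 5.0      # time constant (the reservoir "capacitance"), days
f = 0.8        # connectivity: fraction of injection reaching the producer
dt = 1.0       # time step, days
n = 60
inj = np.full(n, 500.0)   # constant injection rate, rb/day
q = np.empty(n)
q[0] = 100.0              # initial production rate, rb/day

a = np.exp(-dt / tau)
for k in range(1, n):
    # Exponential decay of the previous rate plus the injection response.
    q[k] = a * q[k - 1] + (1.0 - a) * f * inj[k]

# At steady state the producer is supported at f * inj = 400 rb/day.
print(round(q[-1], 1))
```

The fitted `tau` and `f` are the physically interpretable outputs: `f` quantifies interwell connectivity and `tau` the pressure-support lag.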
The second objective of this study is to develop a capacitance-resistance model with producer-producer interaction (CRMP-P). The CRMP-P, derived from the continuity and Darcy equations, accounts for producer-producer interactions. The CRMP-P was applied to data from three different reservoir models. The results suggest that the CRMP-P fits the data with higher precision than the CRM; consequently, its estimates of reservoir properties are more accurate. Moreover, the estimated transmissibility between producers agrees with the reservoir models. The CRMP-P was also applied to Omani field data, where the transmissibility results are consistent with a previous study and with the drilling information. More accurate information on producer-producer interactions and reservoir properties can assist in history matching, locating infill wells, and reservoir management planning.

## Algorithm-aided decision-making in reservoir management (2019-05)

Lee, Boum Hee; Lake, Larry W.; DiCarlo, David; Gilbert, Robert; Sepehrnoori, Kamy; Mohanty, Kishore

Sound reservoir management involves making decisions in the presence of uncertainty and complexity. Because projects in the oil and gas industry are often highly risky and uncertain, the decision-making methods geoscientists employ must be self-consistent, systematic, and defensible. This dissertation addresses three example problems commonly encountered in reservoir management: water injection allocation optimization, horizontal well refracturing scheduling, and infill drilling scheduling. The solution to each problem employs different algorithms and data-analytic techniques that allow a coherent integration of uncertainty and decisions. The specific algorithms and statistical tools used for each problem are described below. The solution to water injection allocation draws on simple models as well as appropriate statistical methods.
The capacitance-resistance model (CRM) is used to model interactions between injectors and producers to help predict the reservoir’s fluid production response. The CRM is paired with Koval’s K-factor method to decouple oil and water from total fluid production. The models are fitted using a bootstrapped dataset to generate a diverse distribution of history-matched solutions. Next, the best injection scheme corresponding to each history-matched model is determined using ensemble optimization (EnOpt). Finally, a sampling algorithm called Thompson sampling is used to determine the optimal injection scheme while reducing the number of less promising simulations. This way, one can select the injection scheme that is robust to uncertainties in history matching while minimizing unnecessary simulation runs. Validation against a reservoir simulation model confirms that the injection scheme selected is indeed optimal. The refracturing scheduling problem examines a horizontal gas well that is a candidate for refracturing. The analysis employs a real options approach to find the current and future conditions in which refracturing is the best decision, and to provide a valuation that reflects the managerial flexibility of the project. An algorithm called least-squares Monte Carlo (LSM) is used to achieve these two goals. In parallel, an Ornstein-Uhlenbeck model is calibrated using the ensemble Kalman filter (EnKF) to account stochastically for gas price changes through time. The resulting valuations are compared against a myopic Monte Carlo/discounted cash flow (MC-DCF) method to demonstrate that the latter underestimates the true value. The underestimation arises because the MC-DCF approach neglects the alternatives available in managing the project. The difference between the two estimates of project value is the value of flexibility.
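The Ornstein-Uhlenbeck price model can be sketched with its exact discretization. The parameters below are hypothetical; the dissertation calibrates them with the EnKF rather than fixing them:

```python
import numpy as np

# Ornstein-Uhlenbeck gas price model, exact discretization
# (all parameters hypothetical).
theta = 4.0    # long-run mean price, $/MMBtu
kappa = 1.5    # mean-reversion speed, 1/year
sigma = 0.8    # volatility
dt = 1.0 / 52  # weekly steps
n = 520        # ten years of weekly prices

rng = np.random.default_rng(0)
p = np.empty(n)
p[0] = 7.0     # start well above the long-run mean
a = np.exp(-kappa * dt)
# Exact one-step standard deviation of the OU transition density.
sd = sigma * np.sqrt((1.0 - a**2) / (2.0 * kappa))
for k in range(1, n):
    p[k] = theta + a * (p[k - 1] - theta) + sd * rng.standard_normal()

# Mean reversion pulls the path from 7.0 back toward theta = 4.0.
print(p[-1])
```

Mean reversion is what makes the wait-versus-refrac timing decision nontrivial: a temporarily high price is expected to decay toward `theta`, so acting on it has option-like value.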
Finally, the optimal policies are examined to confirm that the recommended responses to realizations of uncertainty are intuitively consistent. In the third problem, a Monte Carlo tree search (MCTS) algorithm is paired with a reservoir simulator to optimize the infill drilling schedule in a reservoir undergoing waterflooding. Because of the permutative nature of sequence-dependent actions, the problem suffers from the curse of dimensionality. MCTS finds an approximate solution to a scheduling problem that is otherwise intractable. The final optimized schedule specifies (1) whether an infill well should be drilled at each candidate location, (2) whether an injector or producer should be drilled, and (3) when the well should be drilled. A provisional validation compares the cumulative oil production and NPV of the MCTS-optimized schedule against those of randomly generated schedules. Overall, the goal of this dissertation is to demonstrate that different algorithms can be tailored to optimize decisions or policies. The proposed solutions systematically integrate the relevant uncertainties into the analysis as they search for the most preferred action. Such a rational approach, in which uncertainty plays an active role in decision-making, gives geoscientists confidence that the final optimized decision is the best action to take.
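The Thompson sampling idea used for injection-scheme selection can be illustrated in its classic Bernoulli-bandit form. This is a generic textbook sketch, not the dissertation's implementation, and the "schemes" and success rates below are made up:

```python
import numpy as np

# Classic Bernoulli-bandit Thompson sampling (illustration of the idea only).
rng = np.random.default_rng(1)
true_rates = [0.3, 0.5, 0.7]      # hypothetical success rates of 3 "schemes"
alpha = np.ones(3)                # Beta posterior parameters per arm
beta = np.ones(3)

for _ in range(2000):
    # Sample a plausible rate for each arm from its posterior...
    draws = rng.beta(alpha, beta)
    arm = int(np.argmax(draws))   # ...and play the arm that looks best.
    reward = rng.random() < true_rates[arm]
    alpha[arm] += reward          # Bayesian update of the chosen arm only
    beta[arm] += 1 - reward

# Sampling effort concentrates on the truly best arm (index 2),
# so poor arms receive few "simulation runs".
counts = alpha + beta - 2
print(int(np.argmax(counts)))
```

The same logic applies when each "arm" is a candidate injection scheme and each "pull" is an expensive simulation run: unpromising schemes are starved of runs automatically.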
The workflows designed and recommended in this dissertation are strongly preferred over alternatives in which uncertainty and sensitivity analyses are conducted after decisions have already been made using deterministic methods.

## An investigation of the properties of geological simulation techniques based on orthogonal decompositions (2016-12)

Raina, Arindam; Lake, Larry W.; Srinivasan, Sanjay

Geological modeling is an important aspect of reservoir exploration and field development planning in which data obtained from the reservoir are interpolated to locations in the field where the values of a given property are unknown. This is accomplished by statistically characterizing the pattern of variability exhibited by the data and then using this characterization to estimate values at locations where the actual value is unknown. Many algorithms have been developed to model geological properties. Conventional geological simulation methods solve a system of equations for each point where an interpolated value is required. This approach is somewhat inefficient because simulation nodes are visited sequentially, which increases computation time for processes such as history matching. Moreover, the sequential simulation approach is hampered by statistical constraints such as ergodicity, which imply that the generated models reflect the target statistics only in an average sense; any one realization may deviate significantly from the target. To reduce the computational time needed to generate these simulations, a new set of modeling methods based on orthogonal decompositions has been developed. In an orthogonal decomposition, an image is transformed from the original data space to a space defined by a set of orthogonal bases.
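One familiar instance of such a decomposition is the singular value decomposition. The sketch below decomposes a synthetic 2-D field and truncates it; the thesis's rotational-basis construction differs, so this is only an illustration of the general mechanism:

```python
import numpy as np

# Orthogonal decomposition of a synthetic "property field" via SVD
# (illustration only; the thesis's rotational bases are built differently).
rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 50)
# Hypothetical smooth 2-D field: one broad trend plus small noise.
field = np.outer(np.sin(3 * x), np.cos(2 * x)) + 0.01 * rng.standard_normal((50, 50))

U, s, Vt = np.linalg.svd(field, full_matrices=False)

def reconstruct(k):
    """Keep only the k most significant orthogonal bases."""
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

err5 = np.linalg.norm(field - reconstruct(5))
err20 = np.linalg.norm(field - reconstruct(20))
# Retaining more bases shrinks the reconstruction error, but a handful
# of bases already captures most of the structure.
print(err20 < err5)
```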
These bases have different levels of significance; the less significant bases can be ignored, allowing the remaining bases to provide a reasonable representation of the data while reducing the number of degrees of freedom. In this research, we investigate properties of decomposition-based simulations. First, we demonstrate a generalized method to condition models constructed using decomposition methods to known data points. Next, we show how these decomposition-based methods can be used to analyze the heterogeneity of a reservoir. Then, we combine the concept of dimensionality reduction with that of conditional simulation using orthogonal bases and demonstrate the properties of the resulting models. Two orthogonal-decomposition-based methods are developed in this thesis. The first is a novel method based on “rotational bases,” in which a simulation is performed using somewhat arbitrary basis images generated by a rotational transformation. The second is based on principal component analysis. These methods are analyzed to provide insight into the characteristics of proper orthogonal decompositions.

## Analysis of Areal Permeability Variations - San Andres Formation (Guadalupian): Algerita Escarpment, Otero County, New Mexico (1988-08)

Kittridge, Mark Gerard; Lake, Larry W.

This paper presents the results of an integrated outcrop and subsurface characterization study of the San Andres Formation of the Permian basin. More than 1600 permeability measurements were obtained from an outcrop section along the Algerita Escarpment in southeastern New Mexico using an experimental mechanical field permeameter (MFP). Subsurface core data (permeability and porosity) were available from ten closely spaced wells in the Wasson field on the adjacent Northwest Shelf of the Permian basin.
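Permeability transects like these are commonly summarized with an experimental semivariogram, the tool from which correlation lengths are read. A minimal sketch on synthetic correlated data (the study's actual permeameter data and fitted models are not reproduced here):

```python
import numpy as np

# Experimental semivariogram along a 1-D transect (synthetic AR(1) data
# standing in for a measured permeability log).
rng = np.random.default_rng(3)
n = 400
z = np.empty(n)
z[0] = 0.0
for i in range(1, n):
    z[i] = 0.9 * z[i - 1] + rng.standard_normal()

def semivariogram(z, lag):
    """Half the mean squared difference between points 'lag' apart."""
    d = z[lag:] - z[:-lag]
    return 0.5 * np.mean(d * d)

gammas = [semivariogram(z, h) for h in (1, 5, 20)]
# Semivariance grows with lag until the correlation length is exceeded,
# then levels off at the sill.
print([round(g, 2) for g in gammas])
```

The lag at which the curve flattens is the correlation length; computing it at several sample spacings mimics the spacing-dependence reported below.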
Standard population statistics, contour plots and vertical profiles, and geostatistical techniques were used to characterize the extremely heterogeneous formation. The outcrop permeability data were found to be log-normally to power-normally distributed, with 12 of 16 data sets having a negative p value. Mean permeability and variance were lowest in the fusulinid dolowackestones, while the highest mean was found in the dolopackstone and dolograinstone intervals. Permeability contour maps of the outcrop grid data typically revealed isolated 'pods' of high permeability in a generally low-permeability matrix. The measured vertical transect displayed rapidly varying permeability, with values changing over a very short interval. Geostatistical analysis with the variogram predicted three distinct correlation lengths, 40 feet, 3 to 5 feet, and approximately 0.25 feet, depending on the spacing of the data used; the predicted correlation length decreased with a decrease in sample spacing. The correlation length was found to be invariant with respect to direction, indicating that the formation is isotropic. Subsurface permeability and porosity data were analyzed in a similar manner. The permeability data were found to be log-normally distributed, while the porosity data were power normal. The associated variance of the core plug data was much larger than that of the whole core data. Vertical permeability and porosity profiles were similar to the outcrop vertical transect: alternating high and low values occurring over very short distances. Variograms indicated a correlation length of approximately 10.0 feet (vertically) for both permeability and porosity.

## An Analysis of Monte Carlo Simulation as an Estimator of Original Oil In Place and Original Gas In Place (2004-12)

Williams, John David; Lake, Larry W.

The Monte Carlo method has been increasingly used in the petroleum industry as a means of quantifying uncertainty.
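A minimal volumetric Monte Carlo of this kind can be sketched as follows. The input distributions are entirely hypothetical; the report derives its distributions from core, log, and seismic data and runs them through Crystal Ball:

```python
import numpy as np

# Volumetric original-oil-in-place Monte Carlo (hypothetical distributions).
# OOIP in STB = 7758 * A[acres] * h[ft] * phi * (1 - Sw) / Bo.
rng = np.random.default_rng(4)
n = 100_000
area = rng.triangular(800, 1000, 1400, n)      # drainage area, acres
h = rng.triangular(20, 30, 45, n)              # net pay thickness, ft
phi = rng.normal(0.18, 0.02, n).clip(0.05, 0.35)   # porosity
sw = rng.normal(0.30, 0.05, n).clip(0.05, 0.8)     # water saturation
bo = rng.normal(1.2, 0.05, n).clip(1.0, None)      # formation volume factor

ooip = 7758.0 * area * h * phi * (1.0 - sw) / bo

p10, p50, p90 = np.percentile(ooip, [10, 50, 90])
# The deliverable is the spread of the CDF, not any single number.
print(f"P10={p10:.3e}  P50={p50:.3e}  P90={p90:.3e}")
```

Comparing the known ultimate recovery of a depleted field against this distribution, as the report does, tests whether the method's uncertainty range was honest.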
Most commonly, this technique is used to calculate a range of values for hydrocarbon volumes originally in place. The distributions of the variables input into the Monte Carlo simulator are estimated by obtaining a sufficient number of measurements of reservoir and fluid properties. This report analyzes several depleted or very mature fields for which the ultimate hydrocarbon recovery is known. For each of these fields, distributions of porosity, water saturation, reservoir thickness, and reservoir size are obtained from cores, openhole logs, seismic surveys, and other data. These distributions are input into the Crystal Ball computer program to obtain the cumulative distribution function (CDF) and probability density function (PDF) of the oil or gas volume originally in place for each field. The accuracy of the Monte Carlo method is then assessed by comparing the actual ultimately recovered hydrocarbon volumes to the range of original in-place volumes predicted by the Monte Carlo calculations.

## Analytic Methods to Calculate an Effective Permeability Tensor and Effective Relative Permeabilities for Cross-Bedded Flow Units (1990-05)

Kasap, Ekrem; Lake, Larry W.

Most naturally occurring permeable media are heterogeneous on too small a scale to include all the detailed heterogeneity in a numerical simulation. An alternative is to lump the effects of those heterogeneities in a form that can be easily inserted into simulators. Many of the effects of those heterogeneities can be quantified analytically by calculating an effective permeability tensor, with non-zero off-diagonal terms, when the heterogeneity is non-uniform. If some prototype regularities exist, effective relative permeabilities can be generated in addition to the effective permeability tensor to account for an uneven displacement front in the direction normal to the main flow in viscously dominated flows.
For non-uniform heterogeneities, an analytic method is proposed to calculate effective cell permeabilities as a tensor based on the geometry and size of the numerical cell, tensorial local permeabilities, and the geology within the cell. The method is based on flow through parallel and serial cross-beds, which is subsequently rotated to arrive at tensorial permeabilities with non-zero off-diagonal terms. The procedure is applied to a simulation of flow through an outcrop of the eolian Page Sandstone. The results of the fluid-flow simulations show that the relative positions of the main geologic features and the ratio between the grainflow and windripple permeabilities are more important than bounding surfaces, cross-bedding, and dispersion in determining flow behavior. For uniform heterogeneities, an analytical method is proposed to generate effective relative permeabilities that account for an uneven displacement front. The procedure considers only viscously dominated flows and consists of discretizing the flow unit into subunits and homogenizing each subunit by calculating an effective permeability tensor that resolves cross-bedding and cross-bed orientation. Effective relative permeabilities are then generated analytically to account for differences in sweep between the subunits. The method is applied to one-dimensional simulations of fluid flow in the C2 and B units of the Page Sandstone with less detail (36 elements instead of the 11,520 elements of the detailed simulations). The resulting recovery predictions for different mobility ratios are compared with those from the detailed simulations. The comparisons indicate that the calculated effective relative permeabilities can capture the effect of heterogeneity on sweep efficiency. Both methods have been validated using a finite element numerical simulator that models the permeability discontinuities explicitly.
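The tensor rotation at the heart of the method can be sketched in two dimensions. The permeabilities and dip angle below are hypothetical, and the paper's full serial/parallel averaging over cross-beds is not reproduced; the point is only how rotation produces off-diagonal terms:

```python
import numpy as np

# A local permeability tensor is diagonal in its bedding-aligned axes.
k_parallel, k_normal = 500.0, 50.0       # md, along and across laminae
K_local = np.diag([k_parallel, k_normal])

dip = np.deg2rad(25.0)                   # hypothetical cross-bed dip
c, s = np.cos(dip), np.sin(dip)
R = np.array([[c, -s], [s, c]])          # rotation to global coordinates

# Tensor transformation rule: K_global = R K_local R^T.
K = R @ K_local @ R.T

# Symmetric, with off-diagonal coupling induced purely by bed geometry.
print(np.round(K, 1))
```

The off-diagonal term `K[0,1] = cos(dip) sin(dip) (k_parallel - k_normal)` is exactly the cross-flow coupling that a diagonal-only simulator cell would miss.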
Comparisons of analytical and numerical effective permeabilities and effective relative permeabilities indicate that the analytically calculated quantities are valid, easy to implement, and practical alternatives for accounting for detailed heterogeneities in numerical simulations.

## Analyzing databases using data analytics (2015-12)

Lee, Boum Hee; Lake, Larry W.; Mohanty, Kishore K.

There are many public and private databases of oil field properties whose analysis could lead to insights in several areas. The recent trend of Big Data has given rise to novel analytic methods for effectively handling multidimensional data and visualizing them to discover new patterns. The main objective of this research is to apply some of the methods of data analytics to datasets of reservoir data. Using a commercial reservoir-properties database, we created and tested three data-analytic models to predict ultimate oil and gas recovery efficiencies using the following methods borrowed from data analytics: linear regression, linear regression with feature selection, and a Bayesian network. We also adopted similarity ranking with principal component analysis to create a reservoir analog recommender system, which recognizes and ranks reservoir analogs from the database. Among the models designed to estimate recovery factors, the linear regression models created with variables selected by the sequential feature selection method performed best, showing strong positive correlations between actual and predicted values of reservoir recovery efficiencies. Compared to this model, the Bayesian network and simple linear regression models performed poorly. For the reservoir analog recommender system, an arbitrary reservoir was selected, and different distance metrics were used to rank analog reservoirs.
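Analog ranking of this kind can be sketched as a distance computation in principal-component space. The reservoir features below are made up (the commercial database is proprietary), and a near-twin is planted so the ranking has a known right answer:

```python
import numpy as np

# Analog ranking by Euclidean distance in PCA space (hypothetical features).
rng = np.random.default_rng(5)
X = rng.standard_normal((30, 6))        # 30 reservoirs x 6 properties
X[0] = X[7] + 0.05 * rng.standard_normal(6)   # reservoir 7: near-twin of 0

# Standardize each feature, then project onto leading principal components.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:3].T                   # keep 3 components

target = scores[0]                      # the reservoir we want analogs for
dist = np.linalg.norm(scores - target, axis=1)
ranking = np.argsort(dist)              # ranking[0] is the target itself
print(ranking[:3])
```

Swapping the Euclidean norm for another metric (e.g., Manhattan or Mahalanobis) changes the recommended list, which is the sensitivity the study examines.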
Because no one distance metric (and hence no one reservoir analog list) is superior to the others, the reservoirs in the recommended lists are compared along with the characteristics of the distance metrics.

## Approximations, simulation, and accuracy of multivariate discrete probability distributions in decision analysis (2012-05)

Montiel Cendejas, Luis Vicente; Bickel, J. Eric; Morton, David P.; Hasenbein, John J.; Dyer, James S.; Lake, Larry W.

Many important decisions must be made without full information. For example, a woman may need to make a treatment decision regarding breast cancer without full knowledge of important uncertainties, such as how well she might respond to treatment. In the financial domain, in the wake of the housing crisis, the government may need to monitor the credit market and decide whether to intervene. A key input in this case would be a model describing the chance that one person (or company) will default given that others have defaulted. However, such a model requires addressing the lack of knowledge regarding the correlation between groups or individuals. How to model and make decisions when only partial information is available is a significant challenge. In the past, researchers have made arbitrary assumptions regarding the missing information. In this research, we developed a modeling procedure that can be used to analyze many possible scenarios subject to strict conditions. Specifically, we developed a new Monte Carlo simulation procedure to create a collection of joint probability distributions, all of which match whatever information we have. Using this collection of distributions, we analyzed the accuracy of different approximations such as maximum entropy or copula models. In addition, we proposed several new approximations that outperform previous methods. The objective of this research is four-fold. First, provide a new framework for approximation models.
In particular, we presented four new models to approximate joint probability distributions based on geometric attributes and compared their performance to existing methods. Second, develop a new joint distribution simulation procedure (JDSIM) to sample joint distributions from the set of all possible distributions that match the available information. This procedure can then be applied to different scenarios to analyze the sensitivity of a decision or to test the accuracy of an approximation method. Third, test the accuracy of seven approximation methods under a variety of circumstances. Specifically, we addressed the following questions within the context of multivariate discrete distributions: Are there new approximations that should be considered? Which approximation is the most accurate, according to different measures? How accurate are the approximations as the number of random variables increases? How accurate are they as we change the underlying dependence structure? How does accuracy improve as we add lower-order assessments? What are the implications of these findings for decision analysis practice and research? While these questions are easy to pose, they are challenging to answer. For decision analysis, the answers open a new avenue to address partial information, which brings us to the last contribution. Fourth, propose a new approach to decision making with partial information. The exploration of old and new approximations and the capability of creating large collections of joint distributions that match expert assessments provide new tools that extend the field of decision analysis. In particular, we presented two sample cases that illustrate the scope of this work and its impact on decision making under uncertainty.

## Aqueous solution of ketone solvent for enhanced oil recovery in tight reservoirs (2021-05-07)

Wang, Mingyuan; Okuno, Ryosuke; Lake, Larry W.; DiCarlo, David; Espinoza, D. Nicolas; Leung, Juliana Y.

Horizontal drilling and multi-stage hydraulic fracturing have made it possible to recover oil from tight formations at economically feasible production rates. However, tight oil reservoirs often show a rapid decline in production rate, and primary recovery factors in tight reservoirs are typically smaller than 10%; there is a critical need for enhanced oil recovery in tight reservoirs. Most tight oil reservoirs are originally intermediate- to oil-wet. Wettability alteration agents have been studied to facilitate water imbibition into tight rock matrices to enhance oil recovery, but many factors affect their efficacy and efficiency. Conventional wettability modifiers, such as surfactants, decrease the interfacial tension between the aqueous and oleic phases, which tends to limit the imbibition rate. The performance of wettability modifiers also depends on their mass transfer from the fracture to the matrix; however, the mass transfer of components between the fracture and the matrix has not been studied quantitatively in the literature. In addition, the initial water saturation in the matrix, the concentration of wettability modifier in the injection fluid, and the injection/production pressures also affect the efficacy and efficiency of enhanced oil recovery by wettability alteration agents. This research aims to identify a practical solvent that can alter rock wettability without affecting interfacial tension and that transfers efficiently from fracture to matrix. The effects of initial water saturation on enhanced water imbibition and of the chemical concentration of the injected aqueous solution are also investigated. In this research, we identified 3-pentanone, a symmetric dialkyl ketone, as a wettability alteration agent that does not affect the interfacial tension between the aqueous and oleic phases.
It is conceivable that the wettability change caused by 3-pentanone is related to the polar-polar interaction between 3-pentanone molecules and the calcite surface. This interaction may reduce the polar-polar interaction between the carboxylate groups of naphthenic acids in the oil and the calcite surface. Next, we compared 3-pentanone with a common wettability modifier, a surfactant. Dynamic imbibition experiments demonstrated that 3-pentanone was more efficient at transferring from a fracture to the surrounding matrices than the surfactant. The results indicated that an optimal process with a wettability modifier would have a large imbibed fraction to rapidly enhance the oil displacement by brine in the matrix. We then demonstrated through huff-n-puff experiments that the 3-pentanone solution increased the oil recovery from the shale matrix relative to the injection brine. Last, we developed a new method for reliable determination of saturation pressure from constant-mass-expansion data even when the total compressibility of the fluid does not show a detectable change near the saturation pressure. The new method has been used successfully to design the live-oil experiments in this and other research projects.

## Assess the risk of extreme floods : case study : Houston, Texas (2018-01-26)

Gao, Yang; Gilbert, Robert B.; Lake, Larry W.

The catastrophic flooding caused by Hurricane Harvey, ranked as the third 500-year flood in three years in the Houston area, inflicted nearly $200 billion in damage. In this report, a traditional hydro-economic model of a specific stream station along Buffalo Bayou in the Houston area is established following the standard of practice. A non-informative prior sample space based on Decision Entropy Theory (DET) is also built, based on the preference between two alternatives, to evaluate the risks in decision analysis.
The recommendations from DET and from the statistical extrapolation of the standard of practice are compared for selecting the first-floor elevation of a building. The extrapolation method tends to choose a higher first-floor elevation regardless of the shape of the utility function, while the value preferred by DET is sensitive to the utility function. When the cost of implementation is extremely high for higher elevations, the extrapolation method recommends a lower first-floor elevation while DET prefers a higher elevation. The DET framework takes into consideration the shape of the utility function as well as the decision faced with an extreme event, and turns out to be a rational way to analyze flood risks.

## Assessing reservoir performance and modeling risk using real options (2012-05)

Singh, Harpreet; Srinivasan, Sanjay; Lake, Larry W.

A reservoir’s economic performance is based upon the future cash flows it can generate. Future cash flows are a function of the hydrocarbon volumetric flow rates a reservoir can produce and of market conditions, and both are associated with uncertainties. There is uncertainty in estimates of future hydrocarbon flow rates because of uncertainty in the geological model, the limited availability and type of data, and the complexities involved in the reservoir modeling process. The second source of uncertainty in future cash flows comes from changing oil prices, rates of return, and other functions of market dynamics. Robust integration of these two sources of uncertainty, i.e., future hydrocarbon flow rates and market dynamics, in a model to predict cash flows from a reservoir is an essential part of risk assessment, but a difficult task.
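Why optionality adds value relative to a rigid cash-flow forecast can be shown with a one-period binomial sketch. The numbers are hypothetical, and the thesis's model, which couples geologic and market uncertainty, is far richer than this:

```python
# One-period binomial real-option sketch (all numbers hypothetical).
invest_cost = 100.0
up_payoff, down_payoff = 180.0, 60.0   # project value next year, two scenarios
p, r = 0.5, 0.05                        # scenario probability, discount rate

# Rigid DCF: commit now and take the expected value regardless of outcome.
dcf = (p * up_payoff + (1 - p) * down_payoff) / (1 + r) - invest_cost

# Real option: wait one period and invest only if the outcome justifies it.
option = (p * max(up_payoff - invest_cost, 0.0)
          + (1 - p) * max(down_payoff - invest_cost, 0.0)) / (1 + r)

# Truncating the downside is what DCF misses; option >= max(dcf, 0).
print(round(dcf, 2), round(option, 2))
```

The gap `option - dcf` is the value of managerial flexibility, the quantity a DCF analysis implicitly sets to zero.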
Current practices that assess a reservoir's economic performance using Deterministic Cash Flow (DCF) methods have been unsuccessful in their predictions because they lack the parametric capability to robustly and completely incorporate both types of uncertainty. This thesis presents a procedure that accounts for uncertainty in hydrocarbon production forecasts due to incomplete geologic information, and a novel real options methodology to assess project economics for the upstream petroleum industry. The modeling approach entails determining future hydrocarbon production rates under incomplete geologic information, with and without secondary information. The price of hydrocarbons is modeled separately, and the costs to produce them are determined based on market dynamics. A real options methodology is used to assess the effective cash flows from the reservoir and, hence, to determine the project economics. This methodology associates realistic probabilities, which are quantified using the method's parameters, with benefits and costs. The results from this methodology are compared against the results from the DCF methodology to examine whether the real options methodology can identify hidden potential in a reservoir's performance that DCF might not uncover. The methodology is then applied to various case studies and strategies for planning and decision making.

Item Assessing the predictability and uncertainty of Capacitance Resistance Models (2022-05-05) Potla, Akhil; Foster, John T., Ph. D.; Lake, Larry W.

Capacitance resistance models (CRMs) are a specialized set of mathematical models that aid in predicting the total production rate of an oil well. These models use an electrical analogy to model the reservoir capacity and the fraction of flow from injectors to producers. Because of the recent interest in machine learning, it is useful to compare the predictive ability of this physics-based model to machine learning (ML) models.
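Uncertainty distributions for rate predictions are often built by resampling model residuals; a moving block bootstrap keeps neighboring residuals together so that serial correlation survives the resampling. A minimal sketch (function and parameter names are illustrative):

```python
import numpy as np

def moving_block_bootstrap(series, block_len, rng):
    """Rebuild a series of the same length by concatenating blocks of
    consecutive samples drawn (with replacement) from the original."""
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    blocks = [series[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]

rng = np.random.default_rng(0)
residuals = rng.normal(size=120)           # stand-in for model residuals
replicate = moving_block_bootstrap(residuals, block_len=12, rng=rng)
assert replicate.shape == residuals.shape
```

Repeating this many times and re-adding the resampled residuals to the model prediction yields an ensemble of production-rate paths from which prediction intervals can be read off.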
The working hypothesis was that CRM would perform better than the ML models given limited data. It was thought improbable that an ML model would be able to accurately model the flow of reservoir fluids during secondary oil recovery. We show that CRM predicts the production rate with greater accuracy, i.e., lower error, than the machine learning models tested. In some cases the differences in prediction quality were substantial, and in other cases they were more modest. In addition to predicting the production rate better, CRM also produces more accurate and precise distributions than the machine learning models, which is important for decision makers who must take risk into account when making financial decisions. These distributions were constructed using a specific bootstrap technique, the moving block bootstrap, and their quality is quantified using a metric that assesses the "goodness" of distributions.

Item Capacitance resistance modeling for improved characterization in waterflooding and thermal recovery projects (2016-12) Duribe, Victor Chijioke; Edgar, Thomas F.; Lake, Larry W.; Sanchez, Isaac C; Baldea, Michael; Lasdon, Leon S

Rates are typically among the most measured quantities in an oil recovery project. The abundance of these data is explained partly by their relative ease of collection. Additionally, their collection and reporting are often required for logistical as well as financial purposes. Numerous researchers have shown the potency of using these data for characterization and management of oil reservoirs under primary or secondary recovery. Reduced-order models typically use these measurements as input to characterize reservoirs. The capacitance resistance model (CRM) is one such reduced-order modeling method. This model uses well rates (and bottomhole pressure data, if available) to characterize a reservoir cheaply and quickly.
In characterizing an oil reservoir, the CRM and its linear counterpart (the integrated capacitance resistance model, or ICRM) use historical data available at the wells to infer connectivity and flow paths between these wells through a set of model parameters. This use of readily available data, enabled by the speed of these models, creates a powerful tool that can be used as an alternative or a complement to more expensive and time-consuming traditional reservoir management tools. The CRM was initially developed for secondary recovery (i.e., waterflooding) but has been shown to work very well for primary recovery and many enhanced oil recovery (EOR) processes. The increasing industry acceptance of this modeling method is due to the work of researchers who have expanded its capabilities. However, key questions, such as the impact of noise on CRM and ICRM performance, remain. Additionally, a rigorous way of designing injection rates (a key input to the CRM) such that parameter estimation is optimal has not been addressed. Finally, the applicability of the CRM modeling method to thermal EOR processes has not been explored. This research aims to address these questions and, by doing so, to deepen current understanding of the CRM modeling method and to open new avenues for its application and research.

Item Capacitance resistance modeling for primary recovery, waterflood and water-CO₂ flood (2012-08) Nguyen, Anh Phuong; Edgar, Thomas F.; Lake, Larry W.; Lasdon, Leon S.; Bonnecaze, Roger T.; Sharma, Mukul M.

Reservoir characterization is very important in reservoir management to plan, monitor, predict and optimize oil production.
Reservoir simulation is well accepted in reservoir management, but it requires many inputs, needs months to set up and complete a set of simulation runs, and carries large uncertainty in physical and geological properties. Therefore, simpler methods that provide quick results to complement or substitute for reservoir simulation are important in decision making. The capacitance resistance model (CRM) is one such method. The CRM is an input-output model derived from a continuity equation to quantify producer-injector connection strength during a waterflood using solely production data. This work improves the CRM application method for waterfloods and develops CRM theories and application methods for other recovery periods such as primary recovery and water-CO₂ flood. A West Texas field test was carried out to validate the CRM for a waterflood. The CRM fit was evaluated and used to optimize oil production by changing injection rates. Through this first field experiment, a CRM application procedure was developed. With the CRM-optimized injection schedule, the field gained 5,372 bbl of additional oil production after one year. This research also quantitatively validates the CRM gain and time constant using synthetic fields and compares them to parameters of the streamline model, a complex model with similar purposes to the CRM. The CRM provides results similar to the streamline model with fewer inputs. The CRM was extended to primary recovery and water-CO₂ flood. A new CRM equation, the integrated CRM (ICRM), was developed for primary recovery and validated on many synthetic fields and an Oman field. The model can estimate dynamic pore volume, productivity index and average reservoir pressure that compare closely to simulated values and field knowledge. Additionally, the ability of the CRM to quantify injector-producer connection strength and predict fluid production was examined on a synthetic water-CO₂ flood field.
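The continuity-equation origin of the CRM can be made concrete for the simplest case: one producer supported by one injector at constant bottomhole pressure, where the material balance reduces to τ dq/dt + q = f I(t). Discretizing gives an exponential filter; the numbers below are illustrative, not from the thesis:

```python
import numpy as np

def crm_rate(injection, q0, tau, gain, dt=1.0):
    """Discrete single-pair CRM solution at constant bottomhole pressure:
        q_n = q_{n-1} * exp(-dt/tau) + gain * (1 - exp(-dt/tau)) * I_n
    where gain is the injector-producer connectivity and tau the time
    constant (dissipation/capacity) of the drainage volume."""
    decay = np.exp(-dt / tau)
    q = np.empty(len(injection))
    prev = q0
    for n, inj in enumerate(injection):
        prev = prev * decay + gain * (1.0 - decay) * inj
        q[n] = prev
    return q

inj = np.full(50, 500.0)        # constant injection rate (illustrative units)
q = crm_rate(inj, q0=100.0, tau=10.0, gain=0.8)
# At late time the producer rate approaches gain * injection = 400
assert abs(q[-1] - 400.0) < 5.0
```

Fitting the gain and time constant of each injector-producer pair to measured rates is what lets the CRM infer connectivity from production data alone.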
A new oil production model to be used with the CRM in water-CO₂ floods was developed and validated on synthetic data. The model predicts oil production from injection rate and relative permeability. The CRM successfully optimized a waterflood on a West Texas field by reallocating water from ineffective to effective injectors. New interpretations of the CRM parameters enable quantitative validation and integration of CRM results with other methods. In primary recovery, the ICRM can estimate reservoir properties without requiring well testing, which can cause loss of production. The CRM and the new oil production model can quickly characterize a water-CO₂ flood for short-term production monitoring.

Item Chemical and Thermochemical Wave Behavior in Multiphase Fluid Flow Through Permeable Media: Wave-Wave Interactions (1988-05) Dria, Myra Ann; Lake, Larry W.; Schechter, Robert S.

The flow of reactive fluids through permeable media creates regions of constant composition for purely reactive flow. These regions are separated by waves, which mark the change in composition from one region to the next. We develop a theory that elucidates the interactions of these chemical waves with those formed by other flowing phenomena. We consider the following interactions: (1) the intersection of precipitation/dissolution waves with other precipitation/dissolution waves formed from the sequential injection of fluid of a different composition, from a change in the direction of fluid flow, or from finite changes in the initial composition of the rock; (2) the interactions of precipitation/dissolution waves with ion exchange waves; (3) the interaction of thermal waves with chemical precipitation/dissolution waves, considering coupled and uncoupled thermal/chemical effects. Waves of a nature not previously found under constant (Riemann) boundary conditions are formed.
Through the nondimensionalization of the chemical-energy balance, we define six dimensionless parameters and investigate their relative effects on temperature and composition. One dimensionless number can indicate when heat effects are important. The interaction of a thermal wave with precipitation/dissolution waves in nonadiabatic cases results in the formation of stationary waves and precipitation/dissolution waves with varying velocity. We assume a Marx-Langenheim formulation sufficiently describes the movement of a thermal wave with thermal losses to the under- and overburden. A sequence may develop which contains mineral entities different from either the low- or high-temperature mineral sequence. We also show important effects of one additional dimensionless number obtained from restating the traditional Marx-Langenheim equation. The location of this thermal wave with respect to the precipitation/dissolution waves has a profound influence on the resulting fluid composition, the mineral sequence, and the manner in which these compositions propagate within the permeable media. We further elucidate the nature of the Riemann problem for precipitation/dissolution reactions. Multiple discontinuities are regions of zero width in purely convective flow but appear with nonzero width in convective-dispersive flow. We show two cases which indicate the nonuniversality of the direction-dependent solution technique.

Item Choke management and production optimization in oil and gas fields (2019-09-16) Karantinos, Emmanouil; Sharma, Mukul M.; Lake, Larry W.; Bommer, Paul M.; Nguyen, Quoc P.; Baldea, Michael

When a well is brought on production, the choke management strategy should be selected to maximize well productivity and minimize the risk of completion or wellbore failures.
Until recently, ramp-up practices were based on liquid-rate recommendations or empirical guidelines on choke sizes for the early life of a well. The objective of this dissertation is to establish a systematic method for the design of choke management strategies and flowback operations under wellbore, completion and reservoir constraints. To account for multi-well pressure interference through the surface facilities, an integration scheme is proposed for the effective coupling of the well models with the surface gathering network. Finally, an optimization framework is deployed to maximize the daily operating income by properly adjusting well and network controls. In the first part of the dissertation, we study choke management on an individual-well basis. A general framework is introduced for comparing drawdown strategies for conventional and unconventional wells. Using analytical and numerical reservoir models, we conclude that in conventional open-hole completions no more than 70% of the drawdown should be applied in less than 30% of the ramp-up period. In formations characterized by high diffusivity (e.g., high-permeability gas formations), the bottomhole pressure should be reduced linearly with time. Using nodal analysis, a systematic method is proposed for translating a set of wellbore, completion and reservoir constraints into a choke management schedule. Illustrative examples are presented for both conventional and unconventional wells. For hydraulically fractured wells, we introduce a coupled rate-stress criterion for mitigating proppant flowback and fracture closure near the wellbore. Application of the method suggests drawdown rates that are in agreement with successfully implemented field practices (5-10 psi/hour). To capture well interference through the surface network, a multiphase (black-oil) pipeline network model has been developed.
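Coupling a well model to a surface network can be sketched as a fixed-point iteration between an inflow relation and a backpressure relation; the linear models and all numbers below are purely illustrative assumptions:

```python
def couple_well_network(pi=0.1, p_res=100.0, p_sep=20.0, r_net=5.0,
                        tol=1e-6, max_iter=200):
    """Alternate between the network (which sets wellhead pressure for a
    given rate) and the well (which sets rate for a given wellhead
    pressure) until the two agree:
        network: p_wh = p_sep + r_net * q
        well:    q    = pi * (p_res - p_wh)
    """
    q = 0.0
    for _ in range(max_iter):
        p_wh = p_sep + r_net * q        # network backpressure response
        q_new = pi * (p_res - p_wh)     # well inflow response
        if abs(q_new - q) < tol:
            return q_new, p_wh
        q = q_new
    raise RuntimeError("fixed-point iteration did not converge")

q, p_wh = couple_well_network()
# Analytic solution of the two linear relations:
#   q* = pi * (p_res - p_sep) / (1 + pi * r_net)
assert abs(q - 0.1 * 80.0 / 1.5) < 1e-4
```

With linear models the iteration converges linearly, at a rate set by the product of the well and network sensitivities (here 0.5 per iteration).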
The network solver is formulated using fractional-flow theory, assuming steady-state, concurrent flow of the oil, water and gas phases. Using network topology, closed pipeline loops are unified into clusters in which the loop equations are solved with the Fletcher-Reeves conjugate gradient method. The network solver is validated against published network solutions, compared with field data and benchmarked against commercial network solvers. The well models are integrated with the surface gathering network using an explicit scheme that performs multi-point surface nodal analysis via fixed-point iteration. The integration scheme converges linearly and accurately captures well interference both for naturally flowing wells and for wells on artificial lift. The integration scheme (forward model) is combined with various gradient-based and derivative-free optimization routines to optimize the well and network controls for a synthetic field. We observe that the use of integrated modeling can improve daily operating income significantly (by up to 30%). Finally, we introduce a reduced-variable-range approach which can accelerate the performance of sampling and global search methods in complex production systems. This work introduces a systematic method for the design of choke management practices and presents new methods for integrating well models with the surface pipeline network.

Item Classification of Hydrocarbon Recovery Factor Based on Reservoir Databases (2008-08) Sharma, Aviral; Lake, Larry W.; Srinivasan, Sanjay

In this thesis, data analyses of oil and gas reservoir data sets are performed to produce deterministic and probabilistic values of ultimate recovery factor for both oil and gas reservoirs.
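A probabilistic recovery-factor classifier of this general kind can be sketched with a naïve Bayes model that fits a Gaussian to each predictor within each class; the data, class labels and function names below are synthetic and purely illustrative:

```python
import numpy as np

def fit_gaussian_nb(X, y):
    """Fit per-class, per-feature Gaussian likelihoods plus class priors."""
    model = {}
    for c in np.unique(y):
        Xc = X[y == c]
        model[c] = (Xc.mean(axis=0), Xc.std(axis=0) + 1e-9, len(Xc) / len(X))
    return model

def predict_nb(model, x):
    """Pick the class with the highest log-posterior under the
    feature-independence (naive) assumption."""
    def log_post(c):
        mu, sigma, prior = model[c]
        ll = -0.5 * np.sum(((x - mu) / sigma) ** 2
                           + np.log(2 * np.pi * sigma ** 2))
        return ll + np.log(prior)
    return max(model, key=log_post)

rng = np.random.default_rng(0)
# Two synthetic "recovery factor" classes, separated in feature space
X = np.vstack([rng.normal(0.0, 1.0, (50, 3)), rng.normal(4.0, 1.0, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
model = fit_gaussian_nb(X, y)
assert predict_nb(model, np.array([4.1, 3.9, 4.0])) == 1
assert predict_nb(model, np.array([0.1, -0.2, 0.0])) == 0
```

The per-feature independence assumption keeps the likelihood cheap to calibrate from a sparse database, at the cost of ignoring dependence between predictors.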
Such values could be of interest in exploration: because knowledge of a newly discovered reservoir is limited, a proxy value of the recovery factor can guide later flow simulations and help project the possible revenues the reservoir could generate. The deterministic models are based on multivariate linear regression. The probability models include the calibration of the likelihood of recovery factor using naïve Bayesian classification. For the oil reservoirs, classification accuracies of the recovery factor were compared using the geological and engineering parameters. For the gas reservoirs, the Bayesian classifier was implemented by fitting a multivariate Gaussian distribution to the predicting variables. The linear regression model performed well compared to the empirical correlations given by Arps et al. (1967) and Guthrie et al. (1995). For gas reservoirs, good prediction was achieved by using recovery instead of recovery factor as the response function in the regression. The likelihood functions of the recovery factor for both gas and oil reservoirs are multimodal and non-Gaussian. For the oil reservoirs, both geological and engineering parameters played an important role in the prediction of recovery factor, which led to the conclusion that the engineering and geological parameters are not independent.

Item Cluster Analysis in Reservoir Characterization (1994-08) Muneta, Yasuhiro; Lake, Larry W.

Any raw data sampled in an oil field contain a certain amount of noise; the sample may be called an obscure image of the real thing. We may eliminate the noise by a process of "image enhancement" in statistical pattern recognition. Image enhancement is an important step in processing large data sets to make them more suitable for classification than the original data.
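Unsupervised grouping of noisy field samples can be illustrated with k-means, one common cluster-analysis algorithm (the choice of k-means and the synthetic data are illustrative, not taken from the thesis):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means: assign each point to its nearest centroid, then
    move each centroid to the mean of its members."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

rng = np.random.default_rng(1)
# Two synthetic populations, e.g. (log-permeability, porosity) pairs
X = np.vstack([rng.normal([1.0, 0.10], 0.05, (40, 2)),
               rng.normal([2.5, 0.25], 0.05, (40, 2))])
labels, _ = kmeans(X, k=2)
# Points from the same population should share a cluster label
assert len(set(labels[:40])) == 1 and len(set(labels[40:])) == 1
```

Each recovered centroid plays the role of an "averaged image": a typical pattern that summarizes one parent population in the noisy sample.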
In this work, cluster analysis, a method of image enhancement, is applied to reservoir characterization problems such as permeability distributions of core samples, sand/shale sequences observed in wells, and pressure distributions in heterogeneous porous media, in order to classify the sample data and find the intrinsic patterns (averaged images) in the original data sets. Cluster analysis is a multivariate statistical method. It is very general and can be applied to a wide range of scientific investigations. It is often called a tool of discovery, or an unsupervised approach, because it does not depend on a priori information: it searches for previously unknown but significant categories (patterns). Once typical patterns are obtained, they provide a basis for inferring the underlying reality. We find that cluster analysis is applicable to finding appropriate parent populations of a permeability distribution, theoretical indicator variograms of sand/shale sequences, and trends of the effective permeability distribution.

Item CO2-brine relative permeability and capillary pressure of Tuscaloosa sandstone: Effect of anisotropy (2020-01-01) Bakhshian, Sahar; Hosseini, Seyyed A.; Lake, Larry W.