Hector Klie

Houston, Texas, United States
11K followers 500+ connections

About

With over 25 years of expertise in computational and data science, I proudly lead…

Experience & Education

  • DeepCast.ai

Volunteer Experience

  • Springer-VDI-Verlag

    Associate Editor of the Computational Geosciences Journal

    Springer-VDI-Verlag

    - Present 6 years 2 months

    Science and Technology

    1) Advise the Editors-in-Chief on policy issues related to the journal, occasionally guiding the review of submitted manuscripts;
    2) Advise the Editors-in-Chief on potential reviewers for papers; and
    3) Encourage prospective authors to submit suitable papers to Computational Geosciences.

  • Society of Petroleum Engineers International

    Data Science and Engineering Analytics Award

    Society of Petroleum Engineers International

    - Present 3 years 4 months

    Select the annual recipients of the SPE Data Science and Engineering Analytics Award, which recognizes outstanding achievements and contributions to the advancement of petroleum engineering in the area of data science and engineering analytics.

  • SEG

    Chair at the Unconventional Resources Technology Conference (URTeC)

    SEG

    - 4 months

    Chaired the session on Applications in Reserves Estimation and Production Forecasting.

  • Member of Organizing Committee

    2nd Annual Workshop on Machine Learning for Unconventional Resources

    - 10 months

    Science and Technology

    FACT Inc., in collaboration with the University of Houston, is organizing its second workshop on Machine Learning for Unconventional Resources (MLUR 2020) with a focus on well stimulation. The goal is to bring together participants from different disciplines to introduce new concepts and discuss solutions to existing challenges in well stimulation, with a focus on the use of Machine Learning, Artificial Intelligence, and Data Analytics (ML-AI-DA). We are soliciting papers of both a theoretical and an applied nature on the variety of topics listed below. The workshop will also include a forum for brainstorming on current and future directions for research and development in digital transformation, toward more efficient exploration and production, especially for unconventional resources, with reduced cost and increased efficiency.

  • Society of Petroleum Engineers

    Committee Member of the SPE Reservoir Simulation Conference

    Society of Petroleum Engineers

    - Present 12 years 11 months

    Science and Technology

    The SPE Reservoir Simulation Symposium offers engineers and scientists the chance to see and discuss leading-edge technologies and applications in reservoir simulation. It also provides an unparalleled opportunity to network with other technical professionals in the field. This 3-day event is a must for anyone working with simulation tools and techniques.

  • Rice University

    Board member of Professional Master's Program in Science and Engineering

    Rice University

    - Present 8 years 6 months

    Education

    The professional master's program is designed for those who seek to round out their engineering education with advanced analytical and technical expertise. It gives you a chance to add depth in your areas of interest and to broaden your training, and it prepares you for a leadership role in engineering management. Employers value the knowledge and maturity it takes to complete the degree, which further signals your interest in the field.

  • MDPI

    Guest Editor

    MDPI

    - 1 year 3 months

    Science and Technology

    Provide editorial review of papers on the following topics:

    - Advances in shale reservoir characterization techniques and workflows;
    - Analysis of physical-chemical interactions of shale rocks with drilling, injected, or in-situ fluids;
    - Novel technologies to address the complex challenges in modeling and simulation of hydrocarbon production from shale formations;
    - Geomechanical aspects and impacts on shale reservoirs;
    - Novel methods for enhanced hydrocarbon recovery in shale reservoirs;
    - Machine Learning and Data Science applications for unlocking new insights in shale resources exploitation.

  • Society of Petroleum Engineers International

    Organizer of the SPE Workshop: Merging Data-Driven and Physics-Based Models

    Society of Petroleum Engineers International

    - 9 months

    Science and Technology

    Reservoir performance predictions and optimization of field development have traditionally relied on computationally expensive physics-based models for flow and transport in porous media. More recently, there is an increasing trend to use purely data-driven models based on big data and machine learning techniques. The goal here is to exploit the multitude of data sources to extract intelligence, improve operational efficiency and optimize reservoir performance. In this workshop, we explore the opportunities presented by combining the data-driven models (data scientists) with physics-based models (domain experts), to provide a balanced and informed view of reservoir insights and create predictive and generalizable models while enforcing known physical constraints and addressing gaps in the data.

  • Society of Petroleum Engineers International

    Chair of Reservoir Simulation Conference

    Society of Petroleum Engineers International

    - 2 years 7 months

    Science and Technology

    Led the organizing effort for the SPE Reservoir Simulation Conference 2019. This conference provides the large community of engineers and scientists with the chance to learn about leading-edge technologies and to discuss applications in reservoir simulation. Additionally, the conference offers an unparalleled opportunity to network with other technical professionals working with simulation tools and techniques across the oil and gas industry and academia.

  • Society of Petroleum Engineers

    Vice-Chair of the Reservoir Simulation Conference 2017

    Society of Petroleum Engineers

    - 2 years 2 months

    Science and Technology

    The SPE Reservoir Simulation Symposium offers engineers and scientists the chance to see and discuss leading-edge technologies and applications in reservoir simulation. It also provides an unparalleled opportunity to network with other technical professionals in the field.

  • Society for Industrial and Applied Mathematics (SIAM)

    Diversity Advisory Committee

    Society for Industrial and Applied Mathematics (SIAM)

    - 2 years 1 month

    Education

    The purpose of the SIAM Diversity Advisory Committee is to assist SIAM in addressing policy issues that arise in relationship to underrepresented groups. The committee will consist of a chair and up to twelve members. Designated within the committee are a working group to oversee the Workshop Celebrating Diversity at the SIAM Annual Meetings, a liaison from the AWM, and a liaison from the Joint Committee on Women. All appointments are subject to the approval of the SIAM President.

  • Society of Petroleum Engineers

    Committee Member of Mathematical Methods in Fluid Dynamics & Simulation of Giant Oil & Gas Reservoir

    Society of Petroleum Engineers

    - 8 months

    Science and Technology

    This conference will gather mathematicians and engineers to address challenges in mathematical modelling of compressible multiphase flow in porous media with reactions, fractured media, flow in pipes and pipe networks, coupled numerical solution of porous and non-porous media, geomechanics, diffusion, dispersion problems, unstructured grid generation, linear and nonlinear solvers, multi-grid methods, new discretisation methods, parallel computing, hybrid computing involving multicore CPUs and GPUs, scientific visualisation of large data, and real field studies for giant oil and gas reservoirs using simulators.

  • IEEE

    Member Technical Committee for Applications of Supercomputing 2016

    IEEE

    - 10 months

    Science and Technology

    Review and approve submissions to the conference across a variety of application themes relying on the latest trends in HPC, networking, storage, and algorithms.

  • KAUST

    Board Member of SRI-Center for Uncertainty Quantification in Computational Science & Engineering

    KAUST

    - Present 12 years 7 months

    Education

    Provide feedback and guidance on diverse education, research, and development initiatives at the SRI-Center.

  • Society of Petroleum Engineers

    Committee Member of Large Scale Computing and Big Data Challenges in Reservoir Simulation

    Society of Petroleum Engineers

    - 9 months

    Science and Technology

    This conference will gather mathematicians and engineers to address challenges in mathematical modelling of compressible multiphase flow in porous media with reactions, fractured media, flow in pipes and pipe networks, coupled numerical solution of porous and non-porous media, geomechanics, diffusion, dispersion problems, unstructured grid generation, linear and nonlinear solvers, multi-grid methods, new discretisation methods, parallel computing, hybrid computing involving multicore CPUs and GPUs, scientific visualisation of large data, real field studies for giant oil and gas reservoirs using simulators.

  • Society for Industrial and Applied Mathematics (SIAM)

    Secretary of SIAM Activity Group in Geosciences

    Society for Industrial and Applied Mathematics (SIAM)

    - 2 years 1 month

    Science and Technology

    SIAM Activity Groups (SIAGs) provide a more focused forum for SIAM members interested in exploring one of the areas of applied mathematics, computational science, or applications.

    SIAGs organize conferences, minisymposia, newsletters, electronic communications and Web sites, and they award prizes. SIAG members receive targeted communications from peers, access to electronic membership directories and additional discounts on SIAG-sponsored conferences.

  • Society for Industrial and Applied Mathematics (SIAM)

    SIAM Conference on Mathematical and Computational Issues in the Geosciences

    Society for Industrial and Applied Mathematics (SIAM)

    - 1 year 1 month

    Science and Technology

    Support the review process for paper submissions. Promote the conference and interaction between the industrial and academic communities.

  • Board Member

    Federacion Internacional de Scrabble en Español

    - Present 8 years 7 months

    Education

    Assist in leading and coordinating competitive and promotional activities in Spanish Scrabble worldwide.

  • Chair

    Federacion Internacional de Scrabble en Español

    - 6 years

    Lead and coordinate worldwide competitive and promotional activities in Spanish Scrabble.

  • Selection of top-ranked high-school candidates for the UWC network

    FUNDACEA

    - 9 years 1 month

    Education

Publications

  • Dynamic Data-Driven Application Systems for Reservoir Simulation-Based Optimization: Lessons Learned and Future Trends

    Springer

    Chapter in the Handbook of Dynamic Data Driven Applications Systems
    M. Parashar, Tahsin Kurc, H. Klie, M. F. Wheeler, Joel H. Saltz, M. Jammoul & R. Dong
    First Online: 06 September 2023

    Abstract
    Since its introduction in the early 2000s, the Dynamic Data-Driven Applications Systems (DDDAS) paradigm has served as a powerful concept for continuously improving the quality of both models and data embedded in complex dynamical systems. The DDDAS unifying concept enables capabilities to integrate multiple sources and scales of data, mathematical and statistical algorithms, advanced software infrastructures, and diverse applications into a dynamic feedback loop. DDDAS has not only motivated notable scientific and engineering advances on multiple fronts, but it has also been invigorated by the latest technological achievements in artificial intelligence, cloud computing, augmented reality, robotics, edge computing, Internet of Things (IoT), and Big Data. Capabilities to handle more data in a much faster and smarter fashion are paving the way for expanding automation capabilities. The purpose of this chapter is to review the fundamental components that have shaped reservoir-simulation-based optimization in the context of DDDAS. The foundations of each component will be systematically reviewed, followed by a discussion on current and future trends oriented to highlight the outstanding challenges and opportunities of reservoir management problems under the DDDAS paradigm. Moreover, this chapter should be viewed as providing pathways for establishing a synergy between the renewable energy and oil and gas industries with the advent of the DDDAS method.
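The dynamic feedback loop at the heart of DDDAS can be sketched in miniature: a running model is continually corrected by incoming measurements, and the corrected state steers the next prediction. This is only an illustrative toy, assuming a scalar "pressure-like" state, a made-up decay factor, and an arbitrary correction gain; none of these values come from the chapter.

```python
# Hypothetical sketch of a DDDAS predict-correct cycle. The scalar
# state, decay factor, and correction gain are invented for illustration.

def dddas_loop(x0, measurements, decay=0.97, gain=0.5):
    """Predict with the model, then blend in each new measurement."""
    x, history = x0, []
    for z in measurements:
        x_pred = decay * x                  # model advances the state
        x = x_pred + gain * (z - x_pred)    # measurement corrects it
        history.append(x)
    return history

states = dddas_loop(100.0, [96.0, 93.0, 90.5])
print([round(s, 2) for s in states])  # → [96.5, 93.3, 90.5]
```

The corrected state after each step feeds the next prediction, which is the defining loop of the paradigm.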

  • Towards Automated Development and Evaluation of Optimal Corrosion Inhibitor Products

    AMPP

    In this work, a novel analysis approach using an artificial intelligence (AI) framework that automatically helps identify the drivers of a formulation from lab measurements was showcased. This AI framework builds and optimizes models in the form of physics-based equations from small amounts of measurement data. With this approach, the user can overcome the large data requirements of machine learning while building tailored models that outperform traditional analytical and statistical tools.

    The effectiveness of this modeling framework in helping scientists reduce uncertainty early in the experiment process was demonstrated. This solution allows a significant reduction in the number of experiments required to achieve an optimal formulation. This was accomplished by generating new, custom models from existing data and well-known equations in electrochemistry. Then, these models were used to predict or hypothesize the performance of unseen formulations by altering their control parameters. This study showed the accuracy of these predictions by calculating its error against unseen measurement data.

  • Data Connectivity Inference and Physics-AI Models for Field Optimization

    SPE/AAPG/SEG Latin America Unconventional Resources

    The primary objective of the present work is to propose a new methodology that combines topological data analysis (TDA) and physics-informed artificial intelligence (PhysAI) models to enable automated reserve estimation and reliable optimal field recommendations as new data becomes available. Due to the resilient and efficient nature of the approach, we can deliver a massive number of forecasts on a regular basis. In this way, engineers and decision-makers can rely on a practical approach to explore multiple unconventional field development strategies on a timely basis.

  • Automated Lease Operating Statements for Cost Optimization and Reserve Evaluation Using Artificial Intelligence

    SPE Annual Technical Conference and Exhibition

    The objective of the present work is to streamline the analysis of Lease Operating Statements (LOS) with advanced learning paradigms from artificial intelligence (AI). The proposed approach aims at the: (a) consolidation of disparate expense data; (b) timely expense assessment at the field, pad, or well level; (c) prevention and quick identification of negative cash flow trends; (d) robust LOS predictions; and (e) optimal budget planning under uncertainty. To achieve this objective, a LOS acceleration platform was developed for the automatic integration of volume estimation with Lease Operating Expenses (LOE). Historical data from production volumes, revenues, price differentials, LOEs, marketing and transportation costs, taxes, CAPEX, P&A costs, and others were considered to strengthen the accuracy of individual expense categories and the overall LOS predictive model. The predictive model consists of a combined blend of analytical and machine learning models that allows trends to be reliably forecast and anomalies that may be negatively affecting operational cash flow to be proactively detected. Intuitive and portable visualizations allow quick interpretation and communication of results among engineers and managers. The implemented platform fills a gap between traditional LOS analysis and preemptive expense planning involving many wells and expense categories that are hard to track daily. It is shown that the proposed approach can lead to savings on the order of 30% in incurred expenses.
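One ingredient of such a platform, anomaly flagging against a trailing baseline, can be illustrated with a simple rolling z-score. The paper's blended analytical/ML model is far richer; the function name, window, and threshold below are invented for illustration.

```python
import statistics

# Hypothetical sketch of expense-anomaly flagging: a month is flagged
# when it deviates strongly from the trailing baseline of prior months.

def flag_anomalies(expenses, window=6, z_cut=3.0):
    """Return indices of months whose expense deviates more than
    z_cut standard deviations from the preceding `window` months."""
    flagged = []
    for i in range(window, len(expenses)):
        hist = expenses[i - window:i]
        mu, sigma = statistics.mean(hist), statistics.stdev(hist)
        if sigma > 0 and abs(expenses[i] - mu) / sigma > z_cut:
            flagged.append(i)
    return flagged

loe = [100, 102, 98, 101, 99, 103, 100, 180, 101, 97]  # month 7 spikes
print(flag_anomalies(loe))  # → [7]
```

A production system would pair such a detector with per-category forecasts, but the trailing-baseline idea carries over.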

  • Data-Driven Prediction of Unconventional Shale-Reservoir Dynamics

    SPE Journal

    The present work introduces extended dynamic mode decomposition (EDMD) as a suitable data-driven framework for learning the reservoir dynamics entailed by flow/fracture interactions in unconventional shales. The proposed EDMD approach builds on the approximation of infinite-dimensional linear operators combined with the power of deep learning autoencoder networks to extract salient transient features from pressure/stress fields and bulks of production data. The data-driven model is demonstrated on three illustrative examples involving single- and two-phase coupled flow/geomechanics simulations and a real production data set from the Vaca Muerta unconventional shale formation in Argentina. We demonstrated that we could attain a high level of predictability from unseen field-state variables and well-production data given relatively moderate input requirements. As the main conclusion of this work, EDMD stands as a promising data-driven choice for efficiently reconstructing flow/fracture dynamics that are either partially or entirely unknown, or that are too complex to formulate using known simulation tools on unconventional plays.
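The linear-operator step at the core of this framework can be sketched as follows: fit an operator that advances state snapshots one step in time. In EDMD the snapshots would first be lifted by a dictionary (the paper uses autoencoder networks); this toy skips the lifting, and the test system is invented.

```python
import numpy as np

# Minimal sketch of the (E)DMD fitting step: learn A so that each
# snapshot maps to the next one, via a least-squares pseudoinverse.

def dmd_operator(snapshots):
    """Fit A with snapshots[:, k+1] ~= A @ snapshots[:, k]."""
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    return Y @ np.linalg.pinv(X)

# Toy "reservoir state" trajectory generated by a known linear system.
rng = np.random.default_rng(0)
A_true = np.array([[0.95, 0.05], [-0.05, 0.90]])
states = [rng.standard_normal(2)]
for _ in range(50):
    states.append(A_true @ states[-1])
snapshots = np.array(states).T  # shape (n_state, n_steps)

A_hat = dmd_operator(snapshots)
print(np.allclose(A_hat, A_true, atol=1e-6))  # → True
```

The eigenvalues and eigenvectors of the fitted operator are the dynamic modes used for forecasting.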

  • Transfer Learning for Scalable Optimization of Unconventional Field Operations

    SPE/AAPG/SEG Unconventional Resources Technology Conference

    We present an integrated transfer learning approach that enables the reuse of modeling and optimization knowledge accumulated from previously learned development scenarios to accelerate the execution of forthcoming development scenarios. Transfer of this knowledge can be applied between different wells, pads, or fields. Therefore, the main objective of the proposed methodology is to create efficient and scalable field development workflows involving many well/pad configurations, decision parameters, and uncertainty parameters.
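The warm-start idea behind this kind of transfer can be illustrated with a deliberately simple example: the optimum found for one scenario seeds the search for a similar one, reducing iterations. The quadratic objectives and step size below are invented stand-ins for real field-development objectives.

```python
# Hypothetical sketch of transfer by warm-starting an optimizer:
# the solution for "pad A" initializes the search for the similar "pad B".

def minimize(grad, x0, lr=0.2, tol=1e-6, max_iter=10_000):
    """Plain gradient descent; returns the minimizer and step count."""
    x, steps = x0, 0
    while abs(grad(x)) > tol and steps < max_iter:
        x -= lr * grad(x)
        steps += 1
    return x, steps

grad_pad_a = lambda x: 2.0 * (x - 3.0)   # toy objective, optimum at 3.0
grad_pad_b = lambda x: 2.0 * (x - 3.2)   # similar pad, optimum at 3.2

x_a, cold_steps = minimize(grad_pad_a, x0=0.0)
x_b, warm_steps = minimize(grad_pad_b, x0=x_a)   # transfer: warm start
print(warm_steps < cold_steps)  # → True
```

The closer the scenarios, the larger the savings, which is the premise of reusing knowledge across wells, pads, or fields.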

  • Shale Oil and Shale Gas Resources

    Energies, MDPI

    This multidisciplinary book covers a wide range of topics addressing critical challenges for advancing the understanding and management of shale oil and shale gas resources. Both fundamental and practical issues are considered. By covering a variety of technical topics, we aim to contribute to building a more integrated perspective to meet major challenges faced by shale resources. Combining complementary techniques and examining multiple sources of data serve to advance our current knowledge about these unconventional reservoirs. The book is a result of interdisciplinary and collaborative work. The content includes contributions authored by active scientists with ample expertise in their fields. Each article was carefully peer-reviewed by researchers, and the editorial process was performed by an experienced team of Senior Editors, Guest Editors, Topic Editors, and Editorial Board Members. The first part is devoted to fundamental topics, mostly investigated on the laboratory scale. The second part elaborates on larger scales (at near-wellbore and field scales). Finally, two related technologies, which could be relevant for shale plays applications, are presented. With this Special Issue, we provide a channel for sharing information and lessons learned collected from different plays and from different disciplines.

  • Middle East Steamflood Field Optimization Demonstration Project

    Abu Dhabi International Petroleum Exhibition & Conference

    Occidental Mukhaizna completed a steamflood field optimization demonstration project involving about 100 Mukhaizna wells from Mid-December 2018 to Mid-March 2019.

    The field demonstration involves a data analytics process that provides recommendations on the best steam injection allocation among wells in order to improve overall steamflood performance. The process uses a low fidelity physics-based proxy model and cloud-based parallel processing. A field optimization engineer history matches and anchors a proxy model to current well and field operating constraints. The engineer completes hundreds of forward runs as part of an optimization algorithm to identify scenarios most likely to help increase value (oil production per steam injected) over the short term in the field, while honoring all producing and injection well operating ranges. The reservoir management team vets the rate change ideas generated and provides their recommendations for changes so the likely best and most practical overall scenario is implemented. The process is refreshed monthly so field performance results are included immediately, and the optimization process is kept evergreen. The field results so far have been encouraging, yielding an increase in oil production that has exceeded expectations.

    This paper will describe the data analytics field optimization process and workflow, present the baseline performance versus field demonstration results, and share lessons learned.
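The steam-allocation idea can be sketched as a greedy marginal-value loop: a fixed steam budget is handed out in small increments to whichever well currently offers the best incremental oil-per-steam response. The concave response curves below are hypothetical stand-ins for the history-matched proxy model used in the project.

```python
import math

# Hypothetical sketch of greedy steam allocation under a fixed budget.
# Diminishing-returns curves stand in for the physics-based proxy model.

def oil_response(steam, scale):
    """Toy proxy: oil produced for a given steam rate (concave)."""
    return scale * math.log1p(steam)

def allocate(budget, scales, step=1.0):
    """Hand out `budget` in `step` increments to the well with the
    best marginal oil-per-steam at each iteration."""
    steam = [0.0] * len(scales)
    while budget > 0:
        gains = [oil_response(s + step, c) - oil_response(s, c)
                 for s, c in zip(steam, scales)]
        best = max(range(len(scales)), key=gains.__getitem__)
        steam[best] += step
        budget -= step
    return steam

print(allocate(budget=10.0, scales=[3.0, 1.0]))  # → [8.0, 2.0]
```

The stronger responder receives most, but not all, of the steam, mirroring how the field process rebalances injection monthly while honoring well operating ranges.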

  • Improving Field Development Decisions in the Vaca Muerta Shale Formation by Efficient Integration of Data, AI, and Physics

    SPE/AAPG/SEG Unconventional Resources Technology

    We created a data platform that regularly extracts geological, drilling, completion, and production data from multiple open data sources in Argentina. Data cleansing and consolidation are done via the integration of fast cross-platform database services and natural language processing algorithms. A set of AI algorithms adapted to best capture engineering judgment is employed for identifying multiple flow regimes and selecting the most suitable decline curve models to perform production forecasting and EUR estimation. Based on conceptual models generated from minimum available data, a coupled flow-geomechanics simulator is used to forecast production in other field areas where no well information is available. New data is assimilated as it becomes available, improving the reliability of the fast forecasting algorithm.

    In a matter of minutes, we are able to achieve high forecasting accuracy and reserves estimation in the Vaca Muerta formation for over eight hundred wells. This workflow can be executed on a regular basis or as soon as new data becomes available. A moderate number of high-fidelity simulations based on coupled flow and geomechanics allows for inferring production scenarios where there is an absence of data covering space and time. With this approach, engineers and managers are able to quickly examine a set of viable in-fill scenarios. The autonomous integration of data and proper combination of AI approaches with high-resolution physics-based models enable opportunities to reduce operational costs and improve production efficiency. The integration of physics-based simulations with AI as a cost-effective workflow on a business-relevant shale formation such as Vaca Muerta seems to be lacking in the current literature. With the proposed solution, engineers should be able to focus more on business strategy rather than on manually performing time-consuming data wrangling and modeling tasks.
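The decline-curve step of such a workflow can be illustrated with an Arps hyperbolic model: evaluate the rate and integrate cumulative production down to an economic rate limit. The parameter values below are hypothetical, not fitted Vaca Muerta data, and the fit itself is assumed done upstream.

```python
# Illustrative sketch of decline-curve forecasting and EUR estimation
# with an Arps hyperbolic model. Parameter values are hypothetical.

def arps_rate(t, qi, di, b):
    """Arps hyperbolic rate: q(t) = qi / (1 + b * di * t)^(1/b)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

def eur(qi, di, b, q_limit, dt=0.1):
    """Cumulative production until the rate falls below the economic limit."""
    total, t = 0.0, 0.0
    while arps_rate(t, qi, di, b) > q_limit:
        total += arps_rate(t, qi, di, b) * dt
        t += dt
    return total

# EUR shrinks as the economic rate limit rises.
print(eur(qi=500.0, di=0.08, b=1.2, q_limit=5.0) >
      eur(qi=500.0, di=0.08, b=1.2, q_limit=50.0))  # → True
```

Running this per well, refitting as new production arrives, gives the kind of fast, repeatable forecast refresh the abstract describes.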

  • Leveraging US Unconventional Data Analytics Learnings in Vaca Muerta Shale Formation

    SPE Europec featured at 81st EAGE Conference and Exhibition

    Data Analytics is progressively gaining traction as a viable resource to improve forecasts and reserve estimations in the most prospective US shale plays. Part of those learnings has been tested for the reserves and resources estimation of the next worldwide top-class shale play, the Vaca Muerta formation in Argentina. In this work, we rely on advanced artificial intelligence methods to automate workflows for production forecasting and reserve estimation in the Vaca Muerta formation. To achieve this goal, we develop a computational platform capable of integrating several sequential operations into a single automated workflow: (1) data gathering; (2) data preparation; (3) model fitting and forecasting; and (4) EUR estimation. As new data becomes available, each of these steps is performed automatically. The proposed platform also integrates with advanced business intelligence tools that aid in facilitating graphical interpretation and communication among specialists and decision-makers. Hence, the suggested workflow can deliver production forecasts several orders of magnitude faster than traditional workflows while maintaining accurate and engineering-sound results. Having fast and reliable forecast turnarounds allows for timely tracking of key differences and commonalities among multiple shale plays to facilitate informed decision strategies in unconventional field evaluation and development.

    Other authors
    See publication
  • Data-Driven Discovery of Unconventional Shale Reservoir Dynamics

    SPE Reservoir Simulation Conference

    The need to deliver well-informed decisions within stringent production cycles in unconventional plays is motivating the quest for practical models that can assimilate increasing volumes of data and satisfactorily account for observed production trends. The present work introduces the extended Dynamic Mode Decomposition (EDMD) as a suitable data-driven framework for learning the reservoir dynamics entailed by flow/fracture interactions in unconventional shales. The proposed EDMD approach builds on the approximation of infinite-dimensional linear operators combined with the power of deep learning autoencoder networks to extract salient transient features from pressure/stress fields and bulk production data. The data-driven model is demonstrated on three illustrative examples involving single- and two-phase coupled flow/geomechanics simulations and a real production dataset from the Vaca Muerta unconventional shale formation in Argentina. Given relatively moderate data requirements, we show that it is possible to attain a high level of predictability from hidden field state variables and well production data. As the main conclusion of this work, EDMD stands as a promising data-driven choice for efficiently reconstructing flow/fracture dynamics that are either partially or entirely unknown, or that are too complex to formulate using known simulation tools on unconventional plays.
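
The linear core of (E)DMD can be sketched in a few lines; the paper's EDMD additionally lifts the state through autoencoder observables, which is omitted here, and the 2-state linear system below is a made-up stand-in for reservoir dynamics:

```python
import numpy as np

# Plain Dynamic Mode Decomposition: learn a linear operator from
# consecutive snapshot pairs of a system x_{k+1} = A x_k.
rng = np.random.default_rng(0)
A_true = np.array([[0.9, 0.1],
                   [0.0, 0.8]])
x = rng.standard_normal(2)
snaps = [x]
for _ in range(20):
    x = A_true @ x
    snaps.append(x)
X = np.column_stack(snaps[:-1])   # states at times 0..k-1
Y = np.column_stack(snaps[1:])    # states at times 1..k

# DMD operator via the pseudo-inverse of the snapshot matrix
A_dmd = Y @ np.linalg.pinv(X)
eigvals = np.linalg.eigvals(A_dmd)   # recovered dynamic modes' growth rates
```

For high-dimensional fields one would first project the snapshots onto a truncated SVD basis; the eigenvalues of the recovered operator characterize decay/growth of the dominant modes.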

    Other authors
    See publication
  • Data-Driven Modeling Of Fractured Shale Reservoirs

    EAGE - ECMOR XVI, 16th European Conference on the Mathematics of Oil Recovery

    In this work, we strive to emulate first-order flow dynamics with data-driven models that have recently emerged in model reduction and machine learning. We rely on the assumption that complex flows on fractured systems can be decomposed into a simple representation based on coherent spatiotemporal structures. When field and simulation data are both integrated with the proposed approach, it is possible to extract additional patterns that enhance our capabilities for understanding predictions on different unconventional reservoir systems. We implement a single-phase flow model on structured curvilinear grids to capture first-order physics associated with unconventional shale production dynamics. Latin hypercube sampling is carried out to represent different numbers of fractures (stages), fracture lengths, and geological uncertainty across distinct field scenarios. The data-driven model consists of the recently proposed Dynamic Mode Decomposition (DMD) approach for modeling the evolution of the pressure field and a Long Short-Term Memory (LSTM) network, a powerful class of Recurrent Neural Network (RNN), to track gas production consistently and accurately. Our experiments show that our approach is accurate for a relatively small number of samples and reflects the relevant dynamics determining production. Our model may not be as practical as the empirical models employed in decline curve analysis, but it offers the potential to be more reliable, as it can be based on complex simulations and field data. Numerical results support the accuracy of our approach, with the possibility to impact forecasting, reserves estimation and economic studies of unconventional assets in much shorter turnarounds.
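
The Latin hypercube sampling step can be sketched as follows; the three uncertain factors named in the comment echo the text, while the helper name and sample counts are made up for illustration:

```python
import numpy as np

# Latin hypercube sampling: one stratified sample per interval in each
# dimension, with the columns independently permuted so the strata are
# paired at random across dimensions.
def latin_hypercube(n_samples, n_dims, rng):
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_dims))) / n_samples
    for j in range(n_dims):
        u[:, j] = rng.permutation(u[:, j])
    return u

rng = np.random.default_rng(42)
# e.g. 3 normalized factors: number of stages, fracture length, permeability
samples = latin_hypercube(10, 3, rng)
```

Each column lands exactly once in each of the 10 equal-width bins of [0, 1), which is what distinguishes this design from plain Monte Carlo sampling.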

    Other authors
    See publication
  • Optimal Learning of Field Operation and Well Placement in the Presence of Uncertainty

    International Petroleum Technology Conference

    We propose a novel learning optimization approach to find nearly optimal field solutions with high probability and at a relatively low computational expense. Key attributes of the approach include capabilities to handle nonlinear constraints in a dynamic fashion; automatic control of search directions; use of machine learning-based proxies; real-time control of the optimization execution flow; and risk management decisions. The resulting algorithm retains the attractive property that the number of simulations is independent of the number of decision parameters, while progressively improving its rate of convergence as more information becomes available. We illustrate the new approach by determining multiple well locations and operation strategies to optimize field production in deepwater and unconventional resource applications.

    Other authors
    See publication
  • Physics-Based and Data-Driven Surrogates for Production Forecasting

    SPE Reservoir Simulation Symposium 2015

    The present work proposes a novel combination of physically sound analytical growth models with data-driven techniques to generate surrogate models that can replace or mitigate the computational burden entailed by intensive numerical simulations and thus unlock fast and reliable forecasts to decrease the turnaround time within the field development decision-making process. The analytical model relies on the association of cumulative production curves with growth functions and diffusion phenomena that have been extensively used in the description of population and cell growth models in several branches of the Life, Social and Economic Sciences. The use of growth function models enables the description of production profiles arising from the complex interaction of main flow drivers across a connectivity network or continuous heterogeneous system. The data-driven component allows for discovering secondary physical flow or production trends that reside hidden in the data and that, in turn, may aid in complementing and extending the predictability of the whole surrogate model. The data-driven component relies on machine learning techniques to construct universal interpolators via radial basis functions and/or artificial neural networks. Both the analytical and data-driven approaches are conceived in a non-intrusive fashion. Hence, the proposed surrogate workflow can be realized from either field data or data generated from black-box commercial simulation software. The proposed class of non-intrusive physics-based surrogate models is promising for the generation of moderate- to long-range forecasts for SAGD operations in oil sands, as well as for improving the reliability of forecasting and reserves estimation in unconventional resources. The proposed methodology is demonstrated with a variety of forecasting scenarios and for predicting inflow performance relationships.
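
The growth-function idea can be sketched with a logistic curve, one member of the family the paper draws on; the observed cumulatives, parameter grids and the crude grid-search calibration below are illustrative assumptions, not the paper's method:

```python
import numpy as np

# Cumulative production Q(t) approaching a carrying capacity K, which
# plays the role of the EUR in this toy setting.
def logistic_cum(t, K, r, t0):
    return K / (1.0 + np.exp(-r * (t - t0)))

t = np.linspace(0, 120, 121)                       # months
Q_obs = logistic_cum(t, K=2.0e6, r=0.08, t0=24.0)  # synthetic "observations"

# Crude grid-search calibration of (K, r, t0) against observed cumulatives
K_grid = np.linspace(1.5e6, 2.5e6, 21)
r_grid = np.linspace(0.04, 0.12, 17)
t0_grid = np.linspace(12.0, 36.0, 25)
best = min(((np.sum((logistic_cum(t, K, r, t0) - Q_obs) ** 2), K, r, t0)
            for K in K_grid for r in r_grid for t0 in t0_grid))
_, K_fit, r_fit, t0_fit = best
```

Here K_fit is the surrogate's long-time cumulative (EUR analogue); in practice a gradient-based fit would replace the grid search.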

    See publication
  • Data-driven Model Inference and its Application to Optimal Control under Reservoir Uncertainty

    ECMOR XIV / EAGE

    The present work describes an efficient optimization methodology for black-box simulations consisting of inferring and calibrating a “twin model” representation of the black-box simulator. The twin model is a non-intrusive model that mirrors the behavior of the black-box simulation using data assimilation techniques. Once the inferred twin model is available, adjoint operators based on gradient-driven techniques can be easily computed to perform efficient optimization. Computational experiments illustrate the significant reduction in the number of simulations needed to estimate the optimal water injection strategy under various geological uncertainty scenarios when compared with traditional approaches using gradient-free methods.

    Other authors
    See publication
  • Reduced-Order Modelling for Thermal Recovery Process

    Computational Geosciences - Springer

    Thermal recovery can entail considerably higher costs than conventional oil recovery, so the use of computational optimization techniques in designing and operating these processes may be beneficial. Optimization, however, requires many simulations, which results in substantial computational cost. Here, we implement a model-order reduction technique that aims at large reductions in computational requirements. The technique considered, trajectory piecewise linearization (TPWL), entails the representation of new solutions in terms of linearizations around previously simulated (and saved) training solutions. The linearized representation is projected into a low-dimensional space, with the projection matrix constructed through proper orthogonal decomposition of solution “snapshots” generated in the training step. Two idealized problems are considered here: primary production of oil driven by downhole heaters and a simplified model for steam-assisted gravity drainage, where water and steam are treated as a single “effective” phase. The strong temperature dependence of oil viscosity is included in both cases. TPWL results for these systems demonstrate that the method can provide accurate predictions relative to full-order reference solutions. Observed runtime speedups are very substantial, over 2 orders of magnitude for the cases considered. The overhead associated with TPWL model construction is equivalent to the computation time for several full-order simulations (the precise overhead depends on the number of training runs), so the method is only applicable if many simulations are to be performed.
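
The POD ingredient of TPWL can be sketched independently of the thermal physics; the snapshot matrix below is synthetic low-rank data standing in for saved training solutions, and all names are illustrative:

```python
import numpy as np

# POD: collect solution "snapshots" from training runs, take their SVD,
# and keep the leading left singular vectors as the reduced basis Phi.
rng = np.random.default_rng(1)
n, m, k = 200, 30, 5                  # grid size, snapshots, basis size
modes = rng.standard_normal((n, k))   # synthetic low-rank "dynamics"
coeffs = rng.standard_normal((k, m))
snapshots = modes @ coeffs            # n x m snapshot matrix (rank k)

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
Phi = U[:, :k]                        # orthonormal POD basis

# Projection error of the snapshots onto the k-dimensional subspace
err = np.linalg.norm(snapshots - Phi @ (Phi.T @ snapshots))
```

TPWL then writes new states as perturbations around saved training states and projects the linearized equations with Phi, which is where the runtime speedup comes from.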

    Other authors
    See publication
  • A Black-Box Interpolation Method to Accelerate Reservoir Simulation Solutions.

    SPE Reservoir Simulation Symposium

    The present work aims at predicting space-time pressure solutions via a novel non-intrusive reduced-order simulation model. The construction of low-dimensional spaces entails the combination of the Discrete Empirical Interpolation Method (DEIM) with a suitable regressor, such as an artificial neural network, to accurately approximate the pressure solutions arising in an IMPES formulation. Two basic assumptions are key in the present work: (a) physics invariance and (b) stencil locality. The first allows for coping with the curse of dimensionality associated with the training of a lower-dimensional surrogate model. The second enables a significant reduction of the input parameter space and therefore allows the global solution to be inferred from local mass conservation principles. These assumptions are inspired by the discretization of the PDEs governing flow in porous media and serve as a powerful vehicle for generating physics-based surrogate models at a low computational cost. Hence, with little or no explicit knowledge of the simulation equations and numerical schemes, a sequence of pressure solutions or initial guesses can be obtained from inexpensive solutions of reduced-order models (ROMs). Numerical examples are provided to illustrate the potential of the present approach.
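
The classic greedy DEIM point-selection step reads, in sketch form, as below; a random orthonormal basis stands in for the snapshot-derived basis, so the numbers are illustrative only:

```python
import numpy as np

# Greedy DEIM index selection: given a basis U of nonlinear-term
# snapshots, pick one interpolation index per mode, each time at the
# location where the current interpolation residual is largest.
def deim_indices(U):
    p = [int(np.argmax(np.abs(U[:, 0])))]
    for j in range(1, U.shape[1]):
        # interpolate column j from previously chosen points, take residual
        c = np.linalg.solve(U[np.ix_(p, range(j))], U[p, j])
        r = U[:, j] - U[:, :j] @ c
        p.append(int(np.argmax(np.abs(r))))
    return np.array(p)

rng = np.random.default_rng(2)
U, _ = np.linalg.qr(rng.standard_normal((50, 4)))  # stand-in POD basis
idx = deim_indices(U)
```

The nonlinear term then only needs to be evaluated at the selected indices, which is what makes the reduced model cheap.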

    Other authors
    See publication
  • Unlocking Fast Reservoir Predictions via Nonintrusive Reduced-Order Models

    SPE Reservoir Simulation Symposium 2013

    The present paper proposes a non-intrusive model reduction approach based on proper orthogonal decomposition (POD) and radial basis function (RBF) networks to efficiently predict production of oil and gas reservoirs. Provided a representative set of training reservoir scenarios, the POD method allows for effectively projecting input parameters (e.g., permeability, porosity), states (e.g., pressure, saturations) and outputs (e.g., well production curves) into a much lower dimension that retains the main features contained in the simulation system. In this work, these projections are applied across multiple levels in order to collapse a large number of spatio-temporal correlations. It is observed that these projections can be performed effectively, to a large extent regardless of the underlying geological complexity and operational constraints associated with the reservoir model. The RBF network provides a powerful means for learning functions from the input-output relationships described by the reservoir dynamics entailed by multiple combinations of inputs and controls. In order to achieve a high degree of predictability from the resulting reduced model, the RBF network exploits locality by means of Gaussian basis functions that are maximal at the sampled point and decrease monotonically with distance. Compared to multilayer perceptron networks (i.e., traditional artificial neural networks), RBF networks require less training and are less sensitive to the presence of noise in the data. In this regard, POD acts as a data filter that additionally aids in designing a more compact RBF network representation suitable for fast reservoir predictions. Numerical results show computational accelerations of several orders of magnitude with respect to running the original simulation model on a wide range of real field scenarios.
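
A Gaussian RBF interpolator of the kind described, reduced to its essentials; the 2-D inputs, the stand-in output function and the shape parameter are illustrative assumptions:

```python
import numpy as np

# Gaussian RBF interpolation: weights solve K w = y on the training set,
# so the surrogate reproduces the training outputs exactly.
def rbf_fit(X, y, eps=10.0):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-eps * d2)              # Gaussian kernel matrix
    return np.linalg.solve(K, y)

def rbf_eval(X, w, Xq, eps=10.0):
    d2 = ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps * d2) @ w

rng = np.random.default_rng(3)
X = rng.random((25, 2))               # e.g. normalized (porosity, perm) samples
y = np.sin(3 * X[:, 0]) + X[:, 1]     # stand-in for a simulator output
w = rbf_fit(X, y)
y_train = rbf_eval(X, w, X)           # reproduces y at the training points
```

In the paper's setting the inputs and outputs would first be compressed with POD, which keeps the kernel system small.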

    See publication
  • Enabling Optimal Production Strategies Under Uncertainties With The Aid Of Non-Intrusive Model Reduction Methods

    ECMOR XIII/ EAGE

    Real-time reservoir management, optimization and uncertainty assessment require an effective assimilation of all available data into multiple reservoir models in order to generate accurate forecasts and, in turn, fast and reliable decisions. This process entails thousands of complex simulations that generally require days or weeks to complete, even with the most powerful supercomputer facilities available today. Consequently, there have been increasing attempts to develop surrogate models that could eventually replace full-fledged simulations in an accurate and consistent way within optimization and uncertainty analysis workflows. The present work proposes an alternative approach to generate nonlinear reduced-order models for optimization and control under uncertainty without explicit knowledge of the equations governing the physics of the simulation. Hence, the proposed method is amenable to legacy simulation codes. In order to cope with the lack of physical information, in conjunction with the inherent curse of dimensionality associated with the number of parameter coefficients, control and state variables of the problem, we combine the projection operators obtained from Proper Orthogonal Decomposition (POD) with neural net interpolation. In this way, the proposed Black-Box Stencil Interpolation Method (BSIM) is capable of exploiting both spatial and temporal variable locality. The method can be seen as a competitive but non-intrusive alternative to the TPWL (Trajectory Piece-Wise Linear) method and the DEIM (Discrete Empirical Interpolation Method), both recently proposed in the literature. We illustrate the capabilities of BSIM on a suite of different black-oil and compositional field models subject to multiple well controls under geological uncertainty. We show that the results are comparable in accuracy to DEIM despite the non-intrusive character of BSIM.

    Other authors
  • A Parallel Stochastic Framework for Reservoir Characterization and History Matching

    Journal of Applied Mathematics

    The spatial distribution of parameters that characterize the subsurface is never known to any reasonable level of accuracy required to solve the governing PDEs of multiphase flow or species transport through porous media. This paper presents a numerically cheap, yet efficient, accurate and parallel framework to estimate reservoir parameters, for example, medium permeability, using sensor information from measurements of the solution variables such as phase pressures, phase concentrations, fluxes, and seismic and well log data. Numerical results are presented to demonstrate the method.

    Other authors
    See publication
  • Exploiting Capabilities of Many Core Platforms in Reservoir Simulation

    SPE Reservoir Simulation Symposium/ SPE

    The forthcoming generation of many-core architectures suggests a strong paradigm shift in the way algorithms have been designed to achieve maximum performance in reservoir simulations. In this work, we propose a novel poly-algorithmic solver approach to develop hybrid CPU-multicore and GPU computations for solving the large sparse linear systems arising in realistic black-oil and compositional flow scenarios. The GPU implementation exploits data parallelism through the simultaneous deployment of thousands of threads while reducing memory overhead per floating-point operation in most BLAS kernels and in a suite of preconditioner options such as BILU(k), BILUT and multicoloring SSOR. On the other hand, multicore CPU computations exploit functional parallelism to perform system partitioning and reordering, algebraic multigrid preconditioning, sparsification and model reduction operations in order to accelerate and reduce the number of GCR iterations. The efficient orchestration of these operations relies on carefully designed heuristics depending on the timestep evolution, the degree of nonlinearity and current system properties. Hence, we also propose several criteria to automatically decide the type of solver configuration to be employed at every time step of the simulation. To illustrate the potential of the proposed solver approach, we perform numerical computations on state-of-the-art multicore CPU and GPU platforms. Computational experiments on a wide range of highly complex reservoir cases reveal that the solver approach yields significant speedups with respect to conventional CPU multicore solver implementations. The solver performance gain is on the order of 3x, which translates into roughly a 2x improvement in overall compositional simulation turnaround time. These results demonstrate the potential that many-core solvers have to offer in improving the performance of near-future reservoir simulations.

    Other authors
    See publication
  • Parallel Sparsified Solvers for Reservoir Simulation

    ECMOR XII/ EAGE

    We propose two parallel algorithms to sparsify a given linear system: (a) random sampling sparsification (RSS), and (b) percolation-based sparsification (PBS). The former relies on the idea that coefficients are included in the sparsified system with a probability proportional to their effective transmissibility. The latter relies on capturing highly connected flow paths described by the whole set of transmissibility coefficients. Depending on the case, the RSS and PBS algorithms have the potential to reduce the number of floating-point operations associated with the preconditioner by orders of magnitude. Results confirming the benefits of sparsified solvers are illustrated on a wide set of field cases arising in black-oil and compositional simulations.
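
One hypothetical reading of the RSS idea in code form; the actual algorithm and probability model may differ, and the test matrix, the threshold rule and all names here are assumptions:

```python
import numpy as np

# Keep each off-diagonal coefficient with probability proportional to its
# magnitude (a stand-in for effective transmissibility), capped at 1, and
# always keep the diagonal so the sparsified operator stays well-posed.
def sparsify(A, rng, frac=0.1):
    diag = np.diag(np.diag(A))
    off = np.abs(A - diag)
    p = np.minimum(1.0, off / (frac * off.max()))   # inclusion probabilities
    keep = rng.random(A.shape) < p
    B = diag.copy()
    B[keep] = A[keep]
    return B

rng = np.random.default_rng(4)
n = 100
A = (np.diag(4.0 * np.ones(n))
     + np.diag(-np.ones(n - 1), 1)
     + np.diag(-np.ones(n - 1), -1))
A += 0.01 * rng.random((n, n))        # weak couplings likely to be dropped
B = sparsify(A, rng)
nnz_A = np.count_nonzero(A)
nnz_B = np.count_nonzero(B)
```

Strong (near-unit) couplings survive with probability 1 while the weak dense fill is mostly discarded, so the sparsified system is far cheaper to factor as a preconditioner.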

    See publication
  • Solving Sparse Linear Systems on NVIDIA Tesla GPUs

    Lecture Notes in Computer Science - Springer

    Current many-core GPUs have enormous processing power, and unlocking this power for general-purpose computing is very attractive due to their low cost and efficient power utilization. However, the fine-grained parallelism and the stream-programming model supported by these GPUs require a paradigm shift, especially for algorithm designers. In this paper we present the design of a GPU-based sparse linear solver using the Generalized Minimum RESidual (GMRES) algorithm in the CUDA programming environment. Our implementation achieved a speedup of over 20x on the Tesla T10P-based GTX280 GPU card for benchmarks ranging from a few thousand to a few million unknowns.
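
For reference, the GMRES algorithm itself (here unpreconditioned, unrestarted, dense NumPy rather than the paper's sparse CUDA kernels) fits in a few lines; the test matrix is a synthetic well-conditioned example:

```python
import numpy as np

# GMRES: build an orthonormal Krylov basis with Arnoldi (modified
# Gram-Schmidt), then minimize the residual via a small least-squares solve.
def gmres(A, b, m=50, tol=1e-10):
    n = len(b)
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    Q[:, 0] = b / beta
    for j in range(m):
        v = A @ Q[:, j]
        for i in range(j + 1):
            H[i, j] = Q[:, i] @ v
            v -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < tol:          # lucky breakdown: exact solution found
            m = j + 1
            break
        Q[:, j + 1] = v / H[j + 1, j]
    e1 = np.zeros(m + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[:m + 1, :m], e1, rcond=None)
    return Q[:, :m] @ y

rng = np.random.default_rng(5)
n = 40
A = 4.0 * np.eye(n) + 0.5 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
x = gmres(A, b, m=n)
residual = np.linalg.norm(A @ x - b)
```

The inner products, SpMV and vector updates in the loops are exactly the BLAS-like kernels that map well onto thousands of GPU threads.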

    Other authors
    See publication
  • A Novel Percolative Aggregation Approach for Solving Highly Ill-Conditioned Systems.

    XI European Conference on Mathematics of Oil Recovery (ECMOR)

    A key aspect in any algebraic multilevel procedure is to be able to reliably capture the physical behavior behind the system coefficients. However, the modeling of complex reservoir scenarios generally involves the computation of extremely discontinuous and nonlinear coefficients that, in turn, compromise the robustness and efficiency of smoothing and coarsening strategies. In addition, the need to deal with large discretized domains leads to highly ill-conditioned systems that represent a true challenge to any linear solver technology known today. In this work, we exploit the fact that flow trend information can be used as a basis for developing a robust percolative aggregation (PA) two-stage preconditioning method. To this end, we identify coefficient aggregates by means of an efficient version of the Hoshen-Kopelman percolation algorithm suitable for any flow network structure. By partitioning and reordering unknowns according to these aggregates, we obtain a set of high-conductive blocks followed by a low-conductive block. Diagonal scaling weakens the interaction between these high- and low-conductive blocks and improves the overall conditioning of the system. The solution of the high-conductive blocks approximates well the solution of the original problem. The remaining sources of error in this approximation are mainly due to small eigenvalues, which are properly eliminated with a deflation step. The combination of the algebraic solution of the high-conductive blocks with deflation is realized as a two-stage preconditioner strategy. The deflation stage is carried out by further isolating the aggregate blocks with a matrix compensation procedure. We validate the performance of the PA approach against ILU preconditioning. Preliminary numerical results indicate that PA two-stage preconditioning can be a promising alternative for employing existing algebraic multilevel methods more effectively.
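
The aggregation step can be sketched with a small union-find cluster labeling in the spirit of Hoshen-Kopelman (the original algorithm is a single raster scan with label bookkeeping; the 4x4 grid below is a toy stand-in for a thresholded conductivity field):

```python
import numpy as np

# Label 4-connected clusters of "high conductivity" cells on a 2D grid.
def label_clusters(high):
    ny, nx = high.shape
    parent = {}
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a
    def union(a, b):
        parent[find(a)] = find(b)
    for y in range(ny):
        for x in range(nx):
            if not high[y, x]:
                continue
            parent.setdefault((y, x), (y, x))
            for y2, x2 in ((y - 1, x), (y, x - 1)):   # up/left neighbors
                if y2 >= 0 and x2 >= 0 and high[y2, x2]:
                    union((y, x), (y2, x2))
    out = -np.ones(high.shape, dtype=int)   # -1 marks low-conductivity cells
    labels = {}
    for cell in parent:
        out[cell] = labels.setdefault(find(cell), len(labels))
    return out

grid = np.array([[1, 1, 0, 0],
                 [0, 1, 0, 1],
                 [0, 0, 0, 1],
                 [1, 0, 0, 1]], dtype=bool)
lab = label_clusters(grid)
n_clusters = lab.max() + 1
```

Reordering unknowns by cluster label is what produces the high-conductive blocks the preconditioner works on.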

    See publication
  • Stochastic collocation and mixed finite elements for flow in porous media

    Computer Methods in Applied Mechanics and Engineering

    The aim of this paper is to quantify uncertainty of flow in porous media through stochastic modeling and computation of statistical moments. The governing equations are based on Darcy’s law with stochastic permeability. Starting from a specified covariance relationship, the log permeability is decomposed using a truncated Karhunen–Loève expansion. Mixed finite element approximations are used in the spatial domain and collocation at the zeros of tensor product Hermite polynomials is used in the stochastic dimensions. Error analysis is performed and experimentally verified with numerical simulations. Computational results include incompressible and slightly compressible single and two-phase flow.
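
The truncated Karhunen-Loève step can be sketched on a 1D grid; the exponential covariance, the correlation length and the truncation level below are illustrative choices, not the paper's settings:

```python
import numpy as np

# Truncated KL expansion: eigendecompose a specified covariance on a grid
# and sample log-permeability fields from the leading modes.
n, L, corr_len, n_terms = 100, 1.0, 0.2, 10
x = np.linspace(0, L, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)   # covariance matrix

lam, phi = np.linalg.eigh(C)
lam, phi = lam[::-1], phi[:, ::-1]        # sort eigenpairs descending

rng = np.random.default_rng(6)
xi = rng.standard_normal(n_terms)          # i.i.d. Gaussian KL coefficients
log_perm = phi[:, :n_terms] @ (np.sqrt(lam[:n_terms]) * xi)

# fraction of total variance captured by the truncation
captured = lam[:n_terms].sum() / lam.sum()
```

In the stochastic collocation setting, the xi coefficients would instead be set to the tensor-product Hermite collocation points rather than sampled.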

    Other authors
    • Tim Wildey
    • Ivan Yotov
    • Mary F Wheeler
    • Dong Zhang
    • Ben Ganis
    See publication
  • Towards a rigorously justified algebraic preconditioner for high-contrast diffusion problems

    Computing and Visualization in Science/Springer-Verlag

    In this paper we present a new preconditioner suitable for solving linear systems arising from finite element approximations of elliptic PDEs with high-contrast coefficients. The construction of the preconditioner consists of two phases. The first phase is an algebraic one which partitions the degrees of freedom into “high” and “low” permeability regions which may be of arbitrary geometry. This partition yields a corresponding blocking of the stiffness matrix and hence a formula for the action of its inverse involving the inverses of both the high permeability block and its Schur complement in the original matrix. The structure of the required sub-block inverses in the high contrast case is revealed by a singular perturbation analysis (with the contrast playing the role of a large parameter). This shows that for high enough contrast each of the sub-block inverses can be approximated well by solving only systems with constant coefficients. The second phase of the algorithm involves the approximation of these constant coefficient systems using multigrid methods. The result is a general method of algebraic character which (under suitable hypotheses) can be proved to be robust with respect to both the contrast and the mesh size. While a similar performance is also achieved in practice by algebraic multigrid (AMG) methods, this performance is still without theoretical justification. Since the first phase of our method is comparable to the process of identifying weak and strong connections in conventional AMG algorithms, our theory provides to some extent a theoretical justification for these successful algebraic procedures. We demonstrate the advantageous properties of our preconditioner using experiments on model problems. Our numerical experiments show that for sufficiently high contrast the performance of our new preconditioner is almost identical to that of the Ruge and Stüben AMG preconditioner, both in terms of iteration count and CPU-time.

  • Integrated Time-lapse Seismic Inversion for Reservoir Petrophysics and Fluid Flow Imaging

    77th SEG International Exposition & Annual Meeting

    Seismic history matching has been used to reduce uncertainty and increase the accuracy of reservoir characterization. In this paper we propose a joint inversion scheme for quantitative reservoir petrophysics characterization and imaging of the fluid flow inside the reservoir, which integrates as many data sources as available, such as time-lapse seismic data, production data and sensor information. We demonstrate that this integrated framework leads to accurate seismic history matching and reservoir parameter estimation, and that the incorporation of time-lapse seismic data helps speed up the convergence process. In addition, an image of the fluid flow inside the reservoir is also obtained. Considering the unique suitability of Bayesian inference for data integration and uncertainty analysis, we also propose a formulation to simultaneously integrate all data sources in a Bayesian framework and solve the problem by stochastically constructing the posterior probability distribution (PPD) using Markov chain Monte Carlo (MCMC) methods. The coefficients of rock fluid physics models are also incorporated; that is, the coefficients are determined stochastically from the data rather than from lab measurements or empirical relationships. Based on the MCMC samples, uncertainty can be correctly quantified.
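    A minimal sketch of stochastically constructing a posterior by MCMC, in the spirit described above. A 1-D Gaussian stands in for the real reservoir posterior; the step size and chain length are illustrative only:

```python
import numpy as np

# Bare-bones Metropolis sampler: propose symmetrically, accept/reject
# against the log-posterior, and collect the chain as PPD samples.
def metropolis(log_post, x0, n, step=1.0, seed=5):
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    chain = np.empty(n)
    for i in range(n):
        xp = x + step * rng.standard_normal()     # symmetric proposal
        lpp = log_post(xp)
        if np.log(rng.random()) < lpp - lp:       # accept/reject
            x, lp = xp, lpp
        chain[i] = x
    return chain

log_post = lambda x: -0.5 * (x - 3.0) ** 2        # toy target: N(3, 1)
chain = metropolis(log_post, 0.0, 20000)
assert abs(chain[2000:].mean() - 3.0) < 0.2       # posterior mean near 3
```

    The same loop applies with a reservoir-simulator mismatch inside `log_post`; the chain (after burn-in) then quantifies parameter uncertainty directly.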

  • Algebraic Multigrid Methods (AMG) for the Efficient Solution of Fully Implicit Formulations in Reservoir Simulation

    SPE Reservoir Simulation Symposium

    Two-stage preconditioners are based on the idea that coupled system solutions are mainly determined by the solution of their elliptic components (i.e., pressure). Thus, the procedure consists of extracting and accurately solving pressure subsystems. Residuals associated with this solution are corrected with an additional preconditioning step that recovers part of the global information contained in the original system.

    Optimized and highly complex hierarchical methods such as algebraic multigrid (AMG) offer an efficient alternative for solving linear systems that show a "discretely elliptic" nature. When applicable, the major advantage of AMG is its numerical scalability; that is, the numerical work required to solve a given type of matrix problem grows only linearly with the number of variables. Consequently, interest in incorporating AMG methods as basic linear solvers in industrial oil reservoir simulation codes has been steadily increasing for the solution of pressure blocks.

    Generally, however, the preconditioner influences the properties of the pressure block to some extent by performing certain algebraic manipulations. Often, the modified pressure blocks are "less favorable" for an efficient treatment by AMG. In this work, we discuss strategies for solving the fully implicit systems that preserve (or generate) the desired ellipticity property required by AMG methods. Additionally, we introduce an iterative coupling scheme as an alternative to fully implicit formulations that is faster and also amenable to AMG implementations. Hence, we demonstrate that our AMG implementation can be applied to efficiently deal with the mixed elliptic-hyperbolic character of these problems. Numerical experiments reveal that the proposed methodology is promising for solving large-scale, complex reservoir problems.
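    The two-stage idea described above can be sketched with a toy coupled system, assuming pressure unknowns are ordered first. Stage 1 solves the elliptic pressure sub-block accurately (where AMG would be used); stage 2 corrects the residual with a cheap global step, here a single Jacobi sweep. Matrix and sizes are illustrative only:

```python
import numpy as np

def two_stage_apply(A, n_p, r):
    x = np.zeros_like(r)
    x[:n_p] = np.linalg.solve(A[:n_p, :n_p], r[:n_p])  # stage 1: pressure
    res = r - A @ x                                    # updated residual
    return x + res / np.diag(A)                        # stage 2: global sweep

rng = np.random.default_rng(0)
n, n_p = 6, 3
A = 4.0 * np.eye(n) + 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)

# Use the two-stage operator as a preconditioner in a Richardson loop
x = np.zeros(n)
for _ in range(30):
    x += two_stage_apply(A, n_p, b - A @ x)
assert np.linalg.norm(b - A @ x) < 1e-8
```

    In practice the dense pressure solve is replaced by an AMG cycle and the Richardson loop by a Krylov accelerator, but the two-stage composition is the same.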

  • Assessing Multiple Resolution Scales in History Matching With Metamodels

    SPE Reservoir Simulation Symposium

    In this paper we present a new multiscale approach to history matching assisted by a neural network metamodel. The method starts with the construction of a fine scale a priori model that includes geological and geostatistical information. We then apply a singular value decomposition (SVD) in order to obtain a parametric representation of the permeability field, in such a way that a fixed set of eigenimages is determined with the parameters to be inverted as weights in the expansion. Through this procedure not only is the number of parameters significantly reduced, but also the weights in the SVD expansion define a hierarchy that naturally separates the different resolution scales in the system. We show that the multiscale procedure alone helps to significantly reduce the CPU time required to accomplish the parameter estimation. Furthermore, the reduced parameter space facilitates the training of the neural network engine.
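    The SVD parameterization step can be sketched as follows: represent a permeability field by the weights of its leading eigenimages, so only a few parameters are inverted and scales are naturally ordered by singular value. The 32x32 field and the rank k are made up for the demo:

```python
import numpy as np

# Toy "permeability field" and its SVD-based low-rank parameterization
rng = np.random.default_rng(1)
K = rng.standard_normal((32, 32)) @ rng.standard_normal((32, 32))
U, s, Vt = np.linalg.svd(K, full_matrices=False)

k = 8                                   # keep k coarsest-scale eigenimages
weights = s[:k]                         # the parameters to be inverted
K_k = (U[:, :k] * weights) @ Vt[:k]     # low-rank reconstruction

# Truncation error equals the energy of the discarded scales exactly
assert np.isclose(np.linalg.norm(K - K_k), np.linalg.norm(s[k:]))
```

    During inversion, only the k weights are perturbed while the eigenimages stay fixed, which is what shrinks the search space and orders it from coarse to fine resolution.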

  • Deflation AMG Solvers for Highly Ill Conditioned Reservoir Simulation Problems

    SPE Reservoir Simulation Symposium

    In recent years, deflation methods have received increasing attention as a means of improving the convergence of linear iterative solvers. This is due to the fact that deflation operators provide a way to remove the negative effect that extreme (usually small) eigenvalues have on the convergence of Krylov iterative methods for solving general symmetric and non-symmetric systems.

    In this work, we use deflation methods to extend the capabilities of algebraic multigrid (AMG) for handling highly non-symmetric and indefinite problems, such as those arising in fully implicit formulations of multiphase flow in porous media. The idea is to ensure that components of the solution that remain unresolved by AMG (due to the coupling of roughness and indefiniteness introduced by different block coefficients) are removed from the problem. This translates into constraining the spectrum of the AMG iteration matrix to lie within the unit circle, as required for convergence. This approach interweaves AMG (V, W or V-W) cycles with deflation steps that are computable either from the underlying Krylov basis produced by the GMRES accelerator (Krylov-based deflation) or from the reservoir decomposition given by high property contrasts (domain-based deflation). This work represents an efficient extension to the Generalized Global Basis (GGB) method that was recently proposed for the solution of the elastic wave equation with geometric multigrid and an out-of-core computation of eigenvalues.

    Hence, the present approach offers the possibility of applying AMG to more general large-scale reservoir settings without further modifications to the AMG implementation or algebraic manipulation of the linear system (as suggested by two-stage preconditioning methods). Promising results are supported by a suite of numerical experiments with extreme permeability contrasts.
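    The deflation mechanism can be sketched algebraically: given vectors Z spanning the troublesome near-null modes, a projector removes those components from the residual so the iterative solver never sees them. The SPD toy matrix and its two tiny eigenvalues are constructed by hand for the demo:

```python
import numpy as np

# Build an SPD matrix with two extreme small eigenvalues
rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((20, 20)))
eigs = np.concatenate(([1e-8, 1e-6], np.linspace(1.0, 2.0, 18)))
A = (Q * eigs) @ Q.T                    # A = Q diag(eigs) Q^T

Z = Q[:, :2]                            # deflation (near-null) space
E = Z.T @ A @ Z                         # coarse Galerkin operator
P = np.eye(20) - A @ Z @ np.linalg.solve(E, Z.T)   # deflation projector

b = rng.standard_normal(20)
r = P @ b                               # deflated right-hand side
assert np.allclose(Z.T @ r, 0)          # slow modes removed exactly
```

    In the paper's setting, Z comes either from the GMRES-generated Krylov basis or from the high-contrast reservoir decomposition, and the projector is interwoven with AMG cycles.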

  • A neural stochastic multiscale optimization framework for sensor-based parameter estimation

    Integrated Computer-Aided Engineering/IOS Press

    This work presents a novel neural stochastic optimization framework for reservoir parameter estimation that combines two independent sources of spatial and temporal data: oil production data and dynamic sensor data of flow pressures and concentrations. A parameter estimation procedure is realized by minimizing a multi-objective mismatch function between observed and predicted data. In order to be able to efficiently perform large-scale parameter estimations, the parameter space is decomposed into different resolution levels by means of the singular value decomposition (SVD) and a wavelet upscaling process. The estimation is carried out incrementally from low to higher resolution levels by means of a neural stochastic multilevel optimization approach. At a given resolution level, the parameter space is globally explored and sampled by the simultaneous perturbation stochastic approximation (SPSA) algorithm. The sampling yielded by SPSA serves as training points for an artificial neural network that allows for evaluating the sensitivity of different multi-objective function components with respect to the model parameters. The proposed approach may be suitable for different engineering and scientific applications wherever the parameter space results from discretizing a set of partial differential equations on a given spatial domain.
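    The SPSA exploration step mentioned above can be sketched in a few lines: each iteration estimates the gradient from only two objective evaluations using a random simultaneous perturbation, regardless of the parameter dimension. Gains, iteration count, and the toy mismatch function are illustrative, not from the paper:

```python
import numpy as np

def spsa(f, theta, iters=400, a=0.1, c=0.1, seed=3):
    rng = np.random.default_rng(seed)
    for k in range(1, iters + 1):
        ak = a / k ** 0.602             # standard decaying gain sequences
        ck = c / k ** 0.101
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher
        ghat = (f(theta + ck * delta) - f(theta - ck * delta)) / (2 * ck * delta)
        theta = theta - ak * ghat
    return theta

# Toy mismatch function with minimum at (1, -2)
f = lambda t: (t[0] - 1.0) ** 2 + (t[1] + 2.0) ** 2
theta = spsa(f, np.array([5.0, 5.0]))
assert np.linalg.norm(theta - np.array([1.0, -2.0])) < 0.5
```

    The two evaluations per step are what make SPSA attractive when each objective evaluation is a full reservoir simulation; the perturbation pairs double as training points for the neural metamodel.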

  • Dynamic Data-Driven Systems Approach for Simulation Based Optimizations

    Computational Science – ICCS 2007/Springer-Verlag

    This paper reviews recent developments in our project that are focused on dynamic data-driven methods for efficient and reliable simulation based optimization, which may be suitable for a wide range of different application problems. The emphasis in this paper is on the coupling of parallel multiblock predictive models with optimization, the development of autonomic execution engines for distributing the associated computations, and deployment of systems capable of handling large datasets. The integration of these components results in a powerful framework for developing large-scale and complex decision-making systems for dynamic data-driven applications.

  • Development of Low-Order Controllers for High-Order Reservoir Models and Smart Wells

    Society of Petroleum Engineers

    Recently, the oil industry has started instrumenting and deploying various controls for the enhancement of hydrocarbon extraction. However, due to system complexity, data acquisition and, in turn, decision making, are still issues to be resolved in large-scale reservoir management. A very challenging large-scale control problem that has emerged recently is the real-time control of smart wells. A fundamental reason for using feedback control in this setting is to achieve desired performance in the presence of external disturbances and model uncertainties.

    This work introduces iterative Krylov subspace projection (KSP) methods to generate low-order feedback controllers for smart wells employing high-order reservoir models. The main motivation for using these methods is to enable the efficient computation of low-order reservoir models derived from the highly sparse structure of fluxes and pressure coefficients after discretization. Compared to other model reduction methods, such as modal decomposition, balanced realization, subspace identification and the proper orthogonal decomposition (POD), the proposed approach is very efficient since it is fundamentally based on sparse matrix-vector products. For a problem size of n simulation gridblocks, the KSP method has a complexity of O(n²k) (or even O(nk), by exploiting sparsity), where k is the number of iterations required to generate the low-order model. This is highly desirable for the timely design of low-order controller mechanisms in smart wells.

    We illustrate the potential of the method by linearizing an oil-water model and showing how to establish stability and error bounds for production responses under certain perturbations. We also discuss how the method can be extended to cases of non-stationary and nonlinear fluid coefficients.

  • A Neural Stochastic Optimization Framework for Oil Parameter Estimation

    International Conference on Intelligent Data Engineering and Automated Learning

    The main objective of the present work is to propose and evaluate a neural stochastic optimization framework for reservoir parameter estimation, in which a history matching procedure is implemented by combining three independent sources of spatial and temporal information: production data, time-lapse seismic and sensor information. In order to efficiently perform large-scale parameter estimation, a coupled multilevel, stochastic and learning search methodology is proposed. At a given resolution level, the parameter space is globally explored and sampled by the simultaneous perturbation stochastic approximation (SPSA) algorithm. The estimation and sampling performed by SPSA are further enhanced by a neural learning engine that evaluates the sensitivity of the objective function with respect to parameter estimates in the vicinity of the most promising optimal solutions.

  • Towards Dynamic Data-Driven Management of the Ruby Gulch Waste Repository

    ICCS

    Previous work in the Instrumented Oil-Field DDDAS project has enabled a new generation of data-driven, interactive and dynamically adaptive strategies for subsurface characterization and oil reservoir management. This work has led to the implementation of advanced multi-physics, multi-scale, and multi-block numerical models and an autonomic software stack for DDDAS applications. The stack implements a Grid-based adaptive execution engine, distributed data management services for real-time data access, exploration, and coupling, and self-managing middleware services for seamless discovery and composition of components, services, and data on the Grid. This paper investigates how these solutions can be leveraged and applied to address another DDDAS application of strategic importance – the data-driven management of Ruby Gulch Waste Repository.

  • Assessing the Value of Sensor Information in 4-D Seismic History Matching.

    76th SEG Int. Exposition & Annual Meeting. New Orleans, Oct. 1-6, 2006

    The main objective of the present work is to numerically determine how sensor information may aid in reducing the ill-posedness associated with permeability estimation via 4-D seismic history matching. These sensors are assumed to provide timely information of pressures, concentrations and fluid velocities at given locations in a reliable fashion. This information is incorporated into an objective function that additionally includes production and seismic components that are mismatched between observed and predicted data. In order to efficiently perform large-scale permeability estimation, a coupled multilevel, stochastic and learning search methodology is proposed. At a given resolution level, the parameter space is globally explored and sampled by the simultaneous perturbation stochastic approximation (SPSA) algorithm. The estimation and sampling performed by SPSA is further enhanced by a neural learning engine that estimates sensitivities in the vicinity of the most promising optimal solutions. Preliminary results shed light on future research avenues for optimizing the frequency and localization of 4-D seismic surveys when sensor data is available.

  • On optimization algorithms for the reservoir oil well placement problem

    Computational Geosciences/Springer Verlag

    Determining optimal locations and operation parameters for wells in oil and gas reservoirs has a potentially high economic impact. Finding these optima depends on a complex combination of geological, petrophysical, flow regime, and economic parameters that are hard to grasp intuitively. On the other hand, automatic approaches have in the past been hampered by the overwhelming computational cost of running thousands of potential cases using reservoir simulators, given that each of these runs can take on the order of hours. Therefore, the key issue for such automatic optimization is the development of algorithms that find good solutions with a minimum number of function evaluations. In this work, we compare and analyze the efficiency, effectiveness, and reliability of several optimization algorithms for the well placement problem. In particular, we consider the simultaneous perturbation stochastic approximation (SPSA), finite difference gradient (FDG), and very fast simulated annealing (VFSA) algorithms. None of these algorithms is guaranteed to find the optimal solution, but we show that both SPSA and VFSA are very efficient in finding nearly optimal solutions with a high probability. We illustrate this with a set of numerical experiments based on real data for single and multiple well placement problems.
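    A toy sketch of the VFSA variant for a well-placement-style search: heavy-tailed, temperature-dependent moves (Ingber-style) with an exponential cooling schedule. The quadratic "misfit" stands in for a reservoir simulator run, and all constants are illustrative:

```python
import numpy as np

def vfsa(f, x0, lo, hi, iters=3000, T0=1.0, c=2.0, seed=6):
    rng = np.random.default_rng(seed)
    D = len(x0)
    x = np.array(x0, dtype=float)
    fx = f(x)
    best, fbest = x.copy(), fx
    for k in range(1, iters + 1):
        T = T0 * np.exp(-c * k ** (1.0 / D))      # VFSA cooling schedule
        u = rng.random(D)
        # Heavy-tailed move: step magnitudes span all scales down to ~T
        y = np.sign(u - 0.5) * T * ((1.0 + 1.0 / T) ** np.abs(2.0 * u - 1.0) - 1.0)
        xp = np.clip(x + y * (hi - lo), lo, hi)
        fxp = f(xp)
        if fxp < fx or rng.random() < np.exp(-(fxp - fx) / max(T, 1e-12)):
            x, fx = xp, fxp                       # Metropolis acceptance
            if fx < fbest:
                best, fbest = x.copy(), fx
    return best, fbest

f = lambda p: (p[0] - 7.0) ** 2 + (p[1] - 3.0) ** 2  # toy placement misfit
lo, hi = np.zeros(2), 10.0 * np.ones(2)
best, fbest = vfsa(f, [0.0, 0.0], lo, hi)
assert fbest < 4.0
```

    The heavy-tailed proposal is what lets VFSA keep escaping local minima even at low temperature while still refining around the incumbent best.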

  • Upscaling Hydraulic Properties of Fractured Porous Media: Full Permeability Tensor and Continuum Scale Simulation

    SPE/DOE Symposium on Improved Oil Recovery (SPE 100057), 22-26 April, Tulsa, Oklahoma, 2006.

  • A Time-Stepping Scheme for Coupled Reservoir Flow and Geomechanics

    Society of Petroleum Engineers

    This paper presents numerical techniques for coupled simulations with different time scales and space discretizations for reservoir flow and geomechanics. We use an explicitly coupled approach together with an iterative coupling to increase stability and reduce time discretization error. An error indicator is proposed to determine when displacement must be updated and whether the explicit or iterative coupling technique is required. Under this setting, one geomechanics calculation is performed for several reservoir flow steps. For time steps without geomechanics updates, a linearly extrapolated pore volume is used for the porous flow calculations. The resulting algorithm is computationally more efficient than the iterative coupling, and it is more stable and accurate than the loosely coupled technique.

    In the event that different meshes are used for the reservoir flow and geomechanics models, special treatments are required for the integration of the coupling terms over each element. To avoid complex 3D grid intersection calculations we propose to divide an element into a number of subelements and apply the midpoint integration rule over each subelement. Numerical results are presented to demonstrate the efficiency and accuracy of the proposed method for coupled simulations with different time and space discretizations.
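    The subelement midpoint rule proposed above as an alternative to exact 3-D grid-intersection calculations can be illustrated in 2-D: split an element into n x n subelements and sum f(midpoint) times the subelement area. The integrand and element are made up for the demo:

```python
import numpy as np

def midpoint_integrate(f, n=16):
    h = 1.0 / n
    mids = (np.arange(n) + 0.5) * h        # subelement midpoints
    X, Y = np.meshgrid(mids, mids)
    return float(np.sum(f(X, Y)) * h * h)  # sum f(midpoint) * subarea

# Toy coupling-term integrand on the unit square
f = lambda x, y: x * y + np.sin(np.pi * x)
exact = 0.25 + 2.0 / np.pi                  # closed-form integral
assert abs(midpoint_integrate(f) - exact) < 1e-2
```

    The midpoint rule is second-order accurate in the subelement size, so a modest refinement replaces the complex intersection geometry at low cost.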

    Other authors
    • Mary Wheeler
    • Xiuli Gai
    • Shuyu Sun
  • Estimation of elastic constants from ellipsoidal velocities in orthorhombic media

    SEG Expanded Abstracts

    This paper introduces an ellipsoidal approximation of phase and group velocities of the P-, S1- and S2-wave propagation modes in orthorhombic media and shows how to estimate elastic constants from these velocities. The procedure is basically two-fold. First, we estimate seven ellipsoidal velocities near the vertical symmetry axis which represent both the direct and the NMO velocities (in the symmetry planes XZ and YZ) for each wave-propagation mode. Secondly, the ellipsoidal velocities are used to build a square linear system whose simple analytical solution returns an estimate of the elastic constants. The accuracy of this inversion process depends on how accurately NMO velocities are estimated. The whole procedure is valid for homogeneous media, although extensions to heterogeneous media can be performed through tomographic techniques.

    Other authors
    • R. Michelena
  • A parallel, implicit, cell‐centered method for two‐phase flow with a preconditioned Newton–Krylov solver

    Computational Geosciences/Springer-Verlag

    A new parallel solution technique is developed for the fully implicit three-dimensional two-phase flow model. An expanded cell-centered finite difference scheme which allows for a full permeability tensor is employed for the spatial discretization, and backward Euler is used for the time discretization. The discrete systems are solved using a novel inexact Newton method that reuses the Krylov information generated by the GMRES linear iterative solver. Fast nonlinear convergence can be achieved by composing inexact Newton steps with quasi-Newton steps restricted to the underlying Krylov subspace. Furthermore, robustness and efficiency are achieved with a line-search backtracking globalization strategy for the nonlinear systems and a preconditioner for each coupled linear system to be solved. This inexact Newton method also makes use of forcing terms suggested by Eisenstat and Walker which prevent oversolving of the Jacobian systems. The preconditioner is a new two-stage method which involves a decoupling strategy plus the separate solutions of both non-wetting-phase pressure and saturation equations. Numerical results show that these nonlinear and linear solvers are very effective.
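    A hedged sketch of an inexact Newton iteration with an Eisenstat-Walker style forcing term: the Jacobian system only needs to be solved to ||F + J s|| ≤ η_k ||F||, with η_k tightening as the nonlinear residual drops, which is what prevents oversolving. Here a GMRES solve is emulated by perturbing an exact solve within that tolerance, and the 2-D toy residual is made up for the demo:

```python
import numpy as np

def inexact_newton(F, J, x, tol=1e-10, max_it=50):
    fnorm_prev, eta = None, 0.5
    for _ in range(max_it):
        fx = F(x)
        fnorm = np.linalg.norm(fx)
        if fnorm < tol:
            break
        if fnorm_prev is not None:
            eta = min(0.5, (fnorm / fnorm_prev) ** 2)  # forcing term
        # Inexact step: allow a linear residual of size 0.5 * eta * ||F||
        # (stands in for stopping GMRES early; fixed 2-D direction here)
        r = 0.5 * eta * fnorm * np.array([1.0, -1.0]) / np.sqrt(2.0)
        s = np.linalg.solve(J(x), -(fx + r))
        x = x + s
        fnorm_prev = fnorm
    return x

# Toy coupled residual with a root at (1, 1)
F = lambda x: np.array([x[0] ** 2 + x[1] - 2.0, x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, -1.0]])
x = inexact_newton(F, J, np.array([2.0, 0.0]))
assert np.allclose(F(x), 0.0)
```

    The quadratic-ratio choice of η_k keeps early linear solves loose and final ones tight, so superlinear nonlinear convergence is retained without wasted GMRES iterations.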

  • A Family of Physics-Based Preconditioners for Solving Elliptic Equations on Highly Heterogeneous Media

    Applied Numerical Mathematics

    Eigenvalues of smallest magnitude become a major bottleneck for iterative solvers, especially when the underlying physical properties have severe contrasts. These contrasts are commonly found in many applications such as composite materials, geological rock properties and thermal and electrical conductivity. The main objective of this work is to construct a method as algebraic as possible. However, the underlying physics is utilized to distinguish between high and low degrees of freedom, which is central to the construction of the proposed preconditioner. Namely, we propose an algebraic way of separating binary-like systems according to a given threshold into high- and low-conductivity regimes of coefficient size m and 1, respectively, where m ≫ 1. The proposed preconditioner is thus essentially physics-based, because without the underlying physics such an algebraic distinction, and hence the construction of the preconditioner, would not be possible. The condition number of the linear system depends both on the mesh size Δx and the coefficient size m. For our purposes, we address only the m dependence, since the condition number of the linear system is mainly governed by the high-conductivity sub-block. Thus, the proposed strategy is inspired by capturing the relevant physics governing the problem. Based on the algebraic construction, a two-stage preconditioning strategy is developed as follows: (1) a first stage that approximates the components of the solution associated with small eigenvalues and, (2) a second stage that deals with the remaining solution components with a deflation strategy (if ever needed). Due to its algebraic nature, the proposed approach can support a wide range of realistic geometries (e.g., layered and channelized media). Numerical examples show that the proposed class of physics-based preconditioners is more effective and robust compared to a class of Krylov-based deflation methods on highly heterogeneous media.

  • Dynamic Decision and Data-Driven Strategies for the Optimal Management of Subsurface Geo-Systems

    -

    Effective geo-system management involves an understanding of the interplay
    between surface entities (e.g., locations of injection and production wells
    in an oil reservoir) and appropriately affecting subsurface characteristics.
    This, in turn, requires efficient integration of complex numerical models of
    the environment, optimization procedures, and decision making
    processes. The dynamic, data-driven application systems (DDDAS)
    paradigm offers a promising framework to address this requirement. To
    achieve this goal, we have developed advanced multi-physics, multiscale, and multi-block numerical models and autonomic systems software
    for dynamic, data-driven applications systems. This work has enabled a
    new generation of data-driven, interactive and dynamically adaptive
    strategies for subsurface characterization and management. These strategies have been applied to different aspects of subsurface
    management in strategically important application areas, including
    simulation-based optimization for the optimal oil well placement and the
    data-driven management of the Ruby Gulch Waste Repository. This paper
    summarizes the key outcomes and achievements of our work, as well as
    reports ongoing and future activities focused on uncertainty estimation
    and characterization.


Patents

  • GROWTH FUNCTIONS FOR MODELING OIL PRODUCTION

    Issued US 20170177992

    The present disclosure describes the use of growth models and data-driven models that are combined for quickly and efficiently modeling SAGD reservoir oil production. Growth-function surrogate models are used for efficient and reliable reservoir modeling and production forecasting, as opposed to CPU-intensive simulations based on finite-difference models. A data-driven technique can then compare the growth-function surrogate model with real field data to find discrepancies and inconsistencies between the two, allowing for updates and improvements of the growth-function model.
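    The patent abstract does not specify the growth-function family; as a hedged sketch, a logistic curve is one common choice of surrogate, and the comparison against field data reduces to flagging where the two diverge. All values below are hypothetical:

    ```python
    import numpy as np

    def logistic_growth(t, K, r, t0):
        """Hypothetical growth-function surrogate for cumulative production:
        K is the plateau, r the growth rate, t0 the inflection time."""
        return K / (1.0 + np.exp(-r * (t - t0)))

    t = np.arange(0.0, 11.0)                 # e.g., months on production
    surrogate = logistic_growth(t, K=100.0, r=1.0, t0=5.0)

    # Compare against (synthetic) field data; large deviations would
    # trigger an update of the growth-function model.
    field = surrogate.copy()
    field[8:] += 12.0                        # injected late-time mismatch
    mismatch = np.flatnonzero(np.abs(field - surrogate) > 5.0)
    ```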

  • SYSTEM AND METHOD FOR EVENT DETECTION USING STREAMING SIGNALS

    Issued US 20160370260

    Systems and methods compute dysfunctions via amplitude envelopes that deviate from a mean (normal) state behavior. The envelope function is constructed by recursively applying the maximum signal value within a given window size. These operations are causal and computationally affordable, since only relatively short moving windows trailing the current point are required. The proposed envelope and dysfunction calculations are therefore amenable to any source of data streams measured at high sample rates. The effectiveness of the computation is validated against multiple physical phenomena in real-time field drilling operations.
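    A minimal sketch of such a causal envelope, assuming it amounts to a trailing-window running maximum of the signal magnitude (implemented here with a monotonic deque; the window size and test signal are hypothetical):

    ```python
    from collections import deque

    def causal_envelope(signal, window):
        """Trailing-window running maximum of |signal|: each output point
        depends only on the current and previous samples (causal)."""
        buf, env = deque(), []
        for i, x in enumerate(signal):
            x = abs(x)
            while buf and buf[0][0] <= i - window:   # expire old samples
                buf.popleft()
            while buf and buf[-1][1] <= x:           # keep deque decreasing
                buf.pop()
            buf.append((i, x))
            env.append(buf[0][1])
        return env

    sig = [0.1, -0.2, 3.0, 0.1, 0.2, -0.1]
    env = causal_envelope(sig, window=3)
    # Dysfunctions would then be flagged where `env` deviates
    # significantly from its mean (normal) state behavior.
    ```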

  • POWER LOSS DYSFUNCTION CHARACTERIZATION

    Issued US 20160333672

    The invention relates to a method, system and apparatus for determining real-time drilling operations dysfunctions by measuring the power-loss of signal propagation associated with a drill string in a wellbore. The invention comprises acquiring a first time series from a mid-string drilling sub sensor associated with a drill string in a wellbore and acquiring a second time series from a sensor associated with the drill string wherein the sensor is on or near a drill rig on the surface of the earth. The process further comprises determining the geometry of the wellbore and determining model parameters alpha and beta for characterizing a wellbore using the first time series, the second time series and the geometry of the wellbore by deriving a power loss of signal propagation. The model parameters may then be used for drilling a subsequent well using surface sensor acquired data to detect drilling dysfunctions.
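    The abstract does not spell out the functional form of the power-loss model; a common assumption for signal attenuation along a drill string is exponential amplitude decay, A(d) = alpha · exp(−beta · d), whose two parameters can be recovered by log-linear least squares. The depths and values below are hypothetical:

    ```python
    import numpy as np

    def fit_power_loss(depth, amp_ratio):
        """Fit A(d) = alpha * exp(-beta * d) by least squares on log A.
        `amp_ratio` is the downhole-to-surface amplitude ratio per depth."""
        A = np.column_stack([np.ones_like(depth), -depth])
        coef, *_ = np.linalg.lstsq(A, np.log(amp_ratio), rcond=None)
        return np.exp(coef[0]), coef[1]      # alpha, beta

    # Synthetic data generated from alpha = 2.0, beta = 1e-3 (per ft).
    depth = np.array([500.0, 1000.0, 2000.0, 4000.0])
    ratio = 2.0 * np.exp(-1e-3 * depth)
    alpha, beta = fit_power_loss(depth, ratio)
    ```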

  • ADVANCED PARALLEL "MANY-CORE" FRAMEWORK FOR RESERVOIR SIMULATION

    Issued US 20150205005

    The disclosure relates generally to computerized numerical simulations of hydrocarbon reservoirs, and particularly to a parallel computing framework with an integrated set of software components designed to efficiently perform massive parallel computations for reservoir simulation. Heuristic rules will be used to determine the appropriate software and hardware configuration for efficient simulation.

    Other inventors
    • Hari Sudan
  • MULTILEVEL PERCOLATION AGGREGATION SOLVER FOR PETROLEUM RESERVOIR SIMULATIONS

    Issued EP 2531951 A2

    An efficient percolation aggregation solver methodology captures media connectivity and continuity to reliably incorporate relevant flow solution trends in subterranean formation models. The approach allows introduction of meaningful physical information that is generally overlooked by state-of-the-art algebraic algorithms in the solution process. Percolation aggregation extends the efficiency and robustness of solution methods used to solve scientific and engineering problems.
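    A toy sketch of the percolation-aggregation idea on a hypothetical 1-D medium (the actual solver operates on general reservoir grids): neighboring cells are merged into one aggregate when both lie in the connected high-conductivity phase, here via a small union-find:

    ```python
    def percolation_aggregates(coeffs, threshold):
        """Group 1-D cells into aggregates: adjacent cells are merged when
        both exceed the conductivity threshold (simple percolation rule)."""
        parent = list(range(len(coeffs)))

        def find(i):                          # root with path compression
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        for i in range(len(coeffs) - 1):
            if coeffs[i] >= threshold and coeffs[i + 1] >= threshold:
                parent[find(i)] = find(i + 1)  # union of connected cells

        return [find(i) for i in range(len(coeffs))]

    # Cells 0 and 1 form one connected high-conductivity aggregate.
    roots = percolation_aggregates([1e6, 1e6, 1.0, 1e6], threshold=1e3)
    ```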


Projects

  • Optimal Learning of Field Operations and Well Placement in the Presence of Uncertainty

    -


Languages

  • English

    -

  • Spanish

    -

Organizations

  • SPE, EAGE, SIAM, AGU, IEEE

    -
