An Efficient Transformer-based Surrogate Model with End-to-end Training Strategies for Automatic History Matching
Date: 2024-06-12

Recently, Zhang Kai's team has made significant progress in the field of automatic reservoir history matching. Their results have been published in the journal Geoenergy Science and Engineering, in a paper titled "An Efficient Transformer-based Surrogate Model with End-to-end Training Strategies for Automatic History Matching."

Innovation

We propose a novel hybrid hierarchical Vision Transformer (HHVT) method for automatic history matching. The method achieves efficient prediction of reservoir production data through a unified encoder-decoder architecture and an end-to-end training strategy. By supporting parallel computation across multiple timesteps, HHVT significantly improves the training speed and prediction accuracy of the surrogate model, outperforming traditional convolutional and recurrent neural network models. The method is further combined with a multimodal optimization algorithm to perform automatic history matching. Its application to a complex 3D reservoir model demonstrates its advantages in handling large-scale datasets and high-dimensional features, with the surrogate model training more than twice as fast as the convolutional and recurrent baselines.
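To make the architecture concrete, the sketch below shows how such an encoder-decoder Transformer surrogate could look in PyTorch: a patch-embedding encoder consumes a 2D reservoir property field, and a set of learned per-timestep queries is decoded in a single forward pass, which is what enables the multi-time-step parallelism. This is a minimal illustration under assumed dimensions and layer choices, not the authors' actual HHVT (whose hybrid hierarchical design is not reproduced here); all names such as SurrogateTransformer, grid, and n_wells are hypothetical.

```python
# Minimal sketch (NOT the paper's exact HHVT): an encoder-decoder Transformer
# mapping a 2D reservoir field to multi-well production series in one pass.
# All sizes and names are illustrative assumptions.
import torch
import torch.nn as nn

class SurrogateTransformer(nn.Module):
    def __init__(self, grid=64, patch=8, d_model=128, n_wells=4, n_steps=60):
        super().__init__()
        n_tokens = (grid // patch) ** 2
        # Patch-embed the spatial field (1 channel, e.g., log-permeability).
        self.embed = nn.Conv2d(1, d_model, kernel_size=patch, stride=patch)
        self.pos = nn.Parameter(0.02 * torch.randn(1, n_tokens, d_model))
        # One learned query per production timestep: all steps are decoded
        # in parallel, unlike an RNN that must unroll sequentially.
        self.time_queries = nn.Parameter(0.02 * torch.randn(1, n_steps, d_model))
        self.core = nn.Transformer(d_model=d_model, nhead=8,
                                   num_encoder_layers=4, num_decoder_layers=4,
                                   batch_first=True)
        self.head = nn.Linear(d_model, n_wells)  # per-step rates for each well

    def forward(self, field):                     # field: (B, 1, grid, grid)
        tok = self.embed(field).flatten(2).transpose(1, 2) + self.pos
        q = self.time_queries.expand(field.size(0), -1, -1)
        out = self.core(src=tok, tgt=q)           # (B, n_steps, d_model)
        return self.head(out)                     # (B, n_steps, n_wells)

model = SurrogateTransformer()
rates = model(torch.randn(2, 1, 64, 64))          # -> torch.Size([2, 60, 4])
```

Because every timestep is a separate decoder query, the entire production series is predicted in one forward pass, whereas a recurrent surrogate must step through time sequentially; this parallelism is the source of the training-speed advantage described above.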

Abstract

In recent years, surrogate models have gained popularity as a tool to tackle the challenges posed by time-consuming numerical simulations in automatic history matching (AHM). Although many surrogate models have been designed to alleviate this issue, most still struggle to handle the high dimensionality and strong non-linearity of reservoir models and dynamic production data. Inspired by the rapid development of the Transformer, we propose a novel hybrid hierarchical Vision Transformer (HHVT) approach for history matching, which uses a unified architecture to predict production data for specific physical fields with an end-to-end strategy. For predicting well production data, the Transformer supports parallel computation across multiple timesteps, giving it an advantage over traditional recurrent neural networks. Specifically, our approach constructs a novel encoder-decoder Transformer architecture that learns the implicit features of high-level spatial parameters and matches them to the features of time-series production data. With this architecture, HHVT achieves fast training and inference, making it suitable for large-scale datasets and high-dimensional features. The proposed HHVT model is integrated with a multimodal optimization algorithm to find history-matching solutions. We first validated the effect of HHVT's hyperparameters on a simple 2D reservoir; the method was then verified on the complex 3D Brugge model. The results demonstrate that the training speed of the Transformer-based model is approximately twice that of surrogates based on convolutional and recurrent neural networks. In both cases, HHVT also shows better prediction accuracy than other surrogate models, which enhances the applicability of surrogate-based history-matching methods in large-scale complex reservoir scenarios.
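As a rough illustration of how such a surrogate plugs into the history-matching loop, the sketch below minimizes the mismatch between surrogate-predicted and observed production data over a low-dimensional model parameterization, using a simple population-based random search that keeps many candidates alive so distinct matches can coexist. The paper's specific multimodal optimizer is not reproduced here; the optimizer, the latent parameterization decode_field, and all dimensions below are assumptions for illustration.

```python
# Schematic surrogate-based AHM loop (illustrative only): search a latent
# parameter space for models whose predicted production matches observations.
# decode_field (latent -> property field) and all sizes are assumed.
import torch

def data_mismatch(pred, obs, sigma=1.0):
    # Weighted least-squares misfit between predicted and observed rates.
    return ((pred - obs) ** 2).sum() / (2 * sigma ** 2)

@torch.no_grad()
def history_match(surrogate, decode_field, obs, latent_dim=32,
                  pop=64, iters=200, step=0.1):
    # Population of independent candidates, each mutated and kept only if it
    # improves its own misfit, so several distinct matches can survive.
    z = torch.randn(pop, latent_dim)
    best = torch.full((pop,), float('inf'))
    for _ in range(iters):
        cand = z + step * torch.randn_like(z)
        loss = torch.stack([data_mismatch(surrogate(decode_field(c[None]))[0], obs)
                            for c in cand])
        improved = loss < best
        z[improved], best[improved] = cand[improved], loss[improved]
    return z[best.argsort()[:5]]   # a few distinct plausible matches
```

In practice, decode_field would be something like a PCA or generative reparameterization of the permeability field, so the search runs over tens of latent variables rather than every grid cell, and each candidate evaluation costs one cheap surrogate forward pass instead of a full numerical simulation.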

Geoenergy Science and Engineering covers a broad range of topics in earth energy and sustainable hydrocarbon production, with a particular focus on the energy transition and net-zero emission targets. Formerly known as the Journal of Petroleum Science and Engineering, it has a 2022 impact factor of 4.4 and a CiteScore of 8.8, is ranked Q1 in the 2022 JCR, and sits in the second tier of the Chinese Academy of Sciences' engineering and technology category.

Paper link:

https://doi.org/10.1016/j.geoen.2024.212994


Citation:

Zhang Jinding, Kang Jinzheng, Zhang Kai, Zhang Liming, Liu Piyang, Liu Xingyu, Sun Weijia, Wang Guangyao. An Efficient Transformer-based Surrogate Model with End-to-end Training Strategies for Automatic History Matching [J]. Geoenergy Science and Engineering, 2024: 212994.


Copyright © The Zhang Group