High-Performance Computing Breaks Oil and Gas Simulations Record

May 4, 2017
Reservoir simulation calculations from IBM, Stone Ridge Technology and Nvidia show that large-scale modeling projects can be accessible to small and mid-size oil and gas companies as well as large ones.

Much of what can be achieved today with sensing, data analytics, connectivity and more is possible because of the enormous computing power now available to us. Combine that with the relatively newfound ability to get more fossil fuels out of the earth, and you get a very powerful tool for managing what is arguably any oil producer’s greatest asset—the reservoir.

Earlier this year, ExxonMobil announced a major breakthrough in complex oil and gas reservoir simulation models—using 716,800 processors operating in parallel to help ExxonMobil’s geoscientists and engineers make better predictions of reservoir performance. It was not only the largest number of processors used in the oil and gas industry; it was also one of the largest simulations reported by industry in general.

Now imagine achieving the same success with a whole heck of a lot fewer processors. IBM, Stone Ridge Technology and Nvidia recently showed off what can be done when you use graphics processing units (GPUs) to accelerate the capabilities of standard processors—central processing units (CPUs)—and apply them to engineering applications. Together, the companies shattered previous reservoir simulation records using only a 10th of the power and a 100th of the space. The news demonstrates the ability of Nvidia GPUs to simulate billion-cell models in a fraction of the published time, while delivering 10 times better performance and efficiency than legacy CPU codes.

The breakthrough achievement used 60 Power processors and 120 GPU accelerators, aiming to transform the price and performance for business-critical high-performance computing (HPC) applications for simulation and exploration.

Energy companies use reservoir modeling to predict the flow of oil, water and natural gas in the subsurface of the earth before they drill, in order to figure out how to extract the most oil as efficiently as possible. A billion-cell simulation is extremely challenging because of the level of detail it seeks to provide. Stone Ridge Technology, which develops the Echelon petroleum reservoir simulation software, completed the billion-cell reservoir simulation in 92 minutes using 30 IBM Power Systems S822LC for HPC servers equipped with 60 Power processors and 120 Nvidia Tesla P100 GPU accelerators.

“This calculation is a very salient demonstration of the computational capability and density of solution that GPUs offer. That speed lets reservoir engineers run more models and what-if scenarios than previously so they can have insights to produce oil more efficiently, open up fewer new fields and make responsible use of limited resources,” said Vincent Natoli, president of Stone Ridge Technology. “By increasing compute performance and efficiency by more than an order of magnitude, we're democratizing HPC for the reservoir simulation community.”

The democratization of HPC is also demonstrated through a significant difference in the cost structure—more in the range of $1 million to $2 million rather than hundreds of millions of dollars for the ExxonMobil system.

IBM pointed to the benefits of its Power architecture for data-intensive and cognitive workloads. “By running Echelon on IBM Power Systems, users can achieve faster run times using a fraction of the hardware,” said Sumit Gupta, IBM’s vice president for HPC, artificial intelligence and analytics. “The previous record used more than 700,000 processors in a supercomputer installation that occupies nearly half a football field. Stone Ridge did this calculation on two racks of IBM Power Systems machines that could fit in the space of half a ping-pong table.”

A common misconception about GPUs is that they are only suited to simple, naturally parallel applications such as seismic imaging. This project aimed to show that they are efficient on complex application codes like reservoir simulators as well. The project also shows that even small and medium-size oil and gas companies can take advantage of computer-based reservoir modeling and optimize production from their asset portfolios.

Though there are only a few places in the world where the resolution and detail offered by billion-cell simulations would be useful, the calculation highlights the performance differences between new fully GPU-based codes like the Echelon reservoir simulator and equivalent legacy CPU codes. Echelon scales from the cluster to the workstation. While it can simulate a billion cells on 30 servers, it can also run smaller models on a single server or even on a single Nvidia P100 board in a desktop workstation—the latter two use cases being more in the sweet spot for the energy industry.

About the Author

Aaron Hand | Editor-in-Chief, ProFood World

Aaron Hand has three decades of experience in B-to-B publishing with a particular focus on technology. He has been with PMMI Media Group since 2013, much of that time as Executive Editor for Automation World, where he focused on continuous process industries. Prior to joining ProFood World full time in late 2020, Aaron worked as Editor at Large for PMMI Media Group, reporting for all publications on a wide variety of industry developments, including advancements in packaging for consumer products and pharmaceuticals, food and beverage processing, and industrial automation. He took over as Editor-in-Chief of ProFood World in 2021. Aaron holds a B.A. in Journalism from Indiana University and an M.S. in Journalism from the University of Illinois.
