
AI And The Digital Oilfield Series, Part 2 – Present And Potential Applications


Although it is still relatively early days for artificial intelligence (AI), the technology is already being used in a wide range of digital oilfield applications, and it has the potential to revolutionize the Canadian oil and gas industry over the longer term.

Here, in the second part of a five-part series, we discuss AI’s present applications and explore its potential to bring autonomous systems to the Canadian oilpatch.

Present applications

“Hundreds of applications, or use cases, have been successfully implemented on a large scale for oil and gas companies,” according to Nick Robbins, executive advisor, energy, at AltaML, a leading AI-powered solutions company based in Edmonton. These applications span the industry’s value chain.

“In the upstream segment, opportunities include integrating data-driven and machine learning techniques into geological and reservoir modeling — to better understand resources and reserves, improving simulation and forecasting,” Bruce Duong, senior manager of recovery technologies at provincial Crown corporation Alberta Innovates, told DOB Energy.

Existing modeling and simulation are computationally intensive, and combining statistical models with physics-based models can significantly accelerate the time from resource delineation to production, he added.

“Data-driven techniques can be used to optimize drilling and completion operations, provide real-time drilling feedback, and guide well placement and the placement of hydraulic fractures, etc.”

In oilsands operations, machine learning techniques can be applied to automate artificial lift, optimize steam distribution and production, maintain uptime of steam generation, and monitor water quality and treatment.

“In the midstream segment, safety and measurement are major concerns, and real-time data from distributed sensors can be used to detect and predict failures and anomalies,” Duong said. “Improved tracking of commodities from receipt to delivery also enables opportunities to capture value, and trace emissions and environmental impacts associated with a particular commodity.”

“In the downstream segment, upgraders and refineries can benefit from leveraging their already heavily instrumented facilities and automate portions of operations, monitor product quality in real time, reduce planned maintenance costs, and allow human operators to analyze complex streams of data to improve throughput.”

According to Russ Erickson, VP of investment and partnerships at the Edmonton-based Amii [Alberta Machine Intelligence Institute], “predictive maintenance has been the easiest and lowest-hanging AI fruit for the oil and gas industry to pick, especially by the mid and downstream segments of the industry.”

AI-powered predictive maintenance allows oil and gas companies to perform work only as needed, analyzing vast amounts of sensor data, historical maintenance records, and real-time operational data to predict equipment failures. Traditionally, maintenance has been performed on fixed schedules, leading to unnecessary work and/or unexpected breakdowns.
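The core idea can be illustrated with a minimal, hypothetical sketch: instead of servicing equipment on a calendar, flag maintenance when sensor readings drift far from their recent baseline. The vibration figures, window size, and threshold below are illustrative assumptions, not values from any real deployment.

```python
# Minimal sketch of sensor-driven predictive maintenance: flag readings
# that deviate sharply from a rolling baseline, so maintenance is
# triggered by the data rather than by a fixed schedule.
from statistics import mean, stdev

def flag_anomalies(readings, window=20, threshold=3.0):
    """Return indices whose value deviates more than `threshold`
    standard deviations from the preceding `window` of readings."""
    flags = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            flags.append(i)
    return flags

# Steady pump vibration with one sudden spike at index 30.
vibration = [1.0 + 0.01 * (i % 5) for i in range(30)] + [5.0]
print(flag_anomalies(vibration))  # only the spike is flagged
```

Production systems would use richer models trained on historical failure records, but the principle is the same: deviation from learned normal behavior, not elapsed time, drives the work order.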

A ‘blue sky’ scenario

“In the longer term, we should think about integrating individual AI/ML projects on specific workflows or assets to fully automate operations across an organization, and focus on leveraging human insights on developing complex models with subject matter and technical expertise,” Duong said.

“There are opportunities to fundamentally change practices that haven’t been considered — beyond supporting existing workflows — i.e. discovery-based AI/ML that isn’t constrained by human biases or historical influence on operations,” he added.

“In a ‘blue sky’ scenario — we would talk about full automation, or robotic automation — things like task replacement (a facility that runs by itself, with no human intervention).”

One cornerstone of this possible scenario is generative AI, such as OpenAI’s ChatGPT, which excels at creating text responses using large language models (LLMs) trained on massive amounts of data, along with its derivative technology, retrieval-augmented generation (RAG).

Generative AI uses machine learning techniques to create new content rather than merely analyze data, as ML alone does, while RAG offers a convenient way to optimize an LLM’s output by supplying more up-to-date, industry-specific information than the underlying model was trained on.
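In outline, RAG works by retrieving the most relevant internal documents for a query and prepending them to the prompt sent to the LLM. The sketch below is a deliberately naive illustration: the document snippets are invented, the word-overlap ranking stands in for a real vector-similarity search, and no actual LLM is called.

```python
# Hypothetical RAG sketch: rank internal documents against a query,
# then build an augmented prompt that grounds the LLM's answer in
# retrieved context rather than its training data alone.

def retrieve(query, documents, top_k=2):
    """Rank documents by naive word overlap with the query
    (a stand-in for embedding-based similarity search)."""
    q_words = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:top_k]

def build_prompt(query, documents):
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Turnaround planning for Unit 7 requires a 48-hour steam purge.",
    "Vendor contact list for compressor spares.",
    "Safety permit workflow for confined-space entry.",
]
prompt = build_prompt("What does turnaround planning for Unit 7 require?", docs)
print(prompt)
```

This grounding step is what lets an enterprise deployment answer from current internal documents, which is the capability Robbins describes below for navigating vast document management systems.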

“Generative AI is already being explored as a viable solution for energy companies,” according to Robbins. “For example, several exploration and production … companies are currently advancing their RAG infrastructure within their enterprise resource planning systems. Essentially, for an energy company, this translates to unparalleled abilities to navigate vast internal databases and varied document management systems efficiently.”

“One area where we see not only increasing interest, but tangible results from implementing these solutions, is generative AI being used for shutdown, turnaround and outage management,” he added. “There are organizations that are able to reduce parts of their turnaround planning cycles by orders of magnitude (from multiple days spent hunting for an answer to seconds) by using RAG. These are scalable, enterprise-wide use cases that make for a great foundation to build from.”

A second cornerstone of this ‘blue sky’ scenario, computer vision technology, is also already in use. “Computer vision technology is being used for a broad range of solutions, from ensuring asset integrity to managing building information,” Robbins said.

“Globally, hundreds of companies are focusing on safety compliance and the measurement and quantification of emissions. In the Western Canadian Sedimentary Basin (WCSB) alone, upstream companies are currently deploying computer vision technology alongside drone surveillance for various applications.”

“One particular area under our close observation, for which we are actively developing solutions, involves the concept of an agentic workflow,” Robbins added.

Agentic workflows, yet another cornerstone of the ‘blue sky’ scenario, refer to a more iterative, multi-step approach to using LLMs and so-called AI agents. An AI agent is a system or program capable of autonomously performing tasks on behalf of a user or another system by designing its own workflow and utilizing available tools. This contrasts with the traditional “non-agent” approach of providing a prompt and receiving a single, direct response.

Agentic workflows introduce a number of possibilities, including enhanced automation capabilities, auditability, and opportunities for team collaboration, Robbins said.

“Fundamentally, agentic workflows are the next step for generative AI, offering the chance to automate a broader spectrum of decisions on an individual’s behalf.”

These workflows are not only adaptive but also capable of collaborating, learning from past experiences, and making decisions autonomously, all while “keeping a rigorous ‘audit trail’ of the discrete decisions made, including the explainability and methodology behind those decisions.”

According to Erickson, “for autonomous systems to be trusted they must be based on explainable AI. Black boxes are dangerous, as we’ve seen recently with generative AI making stuff up.”

Sep 04, 2024

