As energy processes and industrial assets become digitized, they move from a linear growth trajectory onto an exponential growth curve. This digital transition is rich with possibilities, whether in artificial intelligence (AI), remote diagnostics using digital twins or next-generation usage-based operating models powered by sensor data. Oil and gas companies need to prepare for five trends.
Most industrial AI applications are geared toward operational efficiency, improving cost-side outcomes such as increased uptime, higher well yields and reduced HSE risk. For example, Flutura is powering a “digital prognostics as a service” model for a major upstream company: instead of reacting to asset downtime, the company can proactively complete remote diagnostics and in-person interventions based on fault-mode predictions from an AI model that watches real-time equipment sensor streams.
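The prognostics loop described above can be sketched as a streaming anomaly check over a sensor channel. The window size, z-score threshold and pressure values below are illustrative assumptions for the sketch, not Flutura's actual fault-mode model.

```python
from collections import deque
from statistics import mean, stdev

class StreamAnomalyDetector:
    """Flags readings that drift beyond a z-score threshold.

    A toy stand-in for a fault-mode model: the window size and
    threshold are illustrative assumptions, not production values.
    """

    def __init__(self, window=50, z_threshold=3.0):
        self.window = deque(maxlen=window)
        self.z_threshold = z_threshold

    def update(self, reading):
        anomalous = False
        if len(self.window) >= 10:  # need some history before judging
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(reading - mu) / sigma > self.z_threshold:
                anomalous = True
        self.window.append(reading)
        return anomalous

detector = StreamAnomalyDetector()
# Stable pressure readings, small noise, then a sudden spike.
readings = [100.0] * 40 + [100.5, 99.8] + [140.0]
flags = [detector.update(r) for r in readings]
print(flags[-1])  # the spike is flagged: True
```

A real deployment would replace the z-score rule with a trained fault-mode model, but the shape of the loop is the same: score each reading as it arrives, and trigger a diagnostic workflow when the score crosses a threshold.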
Innovative business models will transform the market landscape for drilling service providers, equipment manufacturers and owner operators. Winners and losers will be decided by the ability of these traditional industrial sectors to deeply embed AI into core equipment and processes. This requires that many entrenched players reimagine their business operating models.
AI platforms in 2017 were generic and untuned to the nuances of oil and gas, but there has since been a great deal of momentum in upstream areas. For example, Flutura’s Cerebra industrial AI application center has preconfigured solvers for ultra-specific upstream problems such as deepwater asset diagnostics, hydraulic fracturing and LNG. Expect to see more AI apps this year that drive measurable outcomes using algorithms highly specialized for high-impact problems.
“Vanilla” data science will not suffice to solve mission-critical problems in the oil and gas industry. As deep-learning algorithms become democratized, novel AI applications that solve a specific, complicated problem will grow in importance. These applications will matter more than a horizontal AI platform, which requires immense tuning for the industry context.
A primary challenge in the practical execution of AI projects is blind spots in vital signals. For example, an upstream company realized through its work with Flutura that while its rotary assets had sufficient instrumentation (e.g., lube oil pressure and temperature, rpm and torque), there were critical blind spots around the vibration and shock sensors whose signals the deep-learning algorithm needed to spot anomalies leading to failure. Specific blind spots where significant sensor innovation will be seen this year include detecting fluid and gas quality using optics based on differential interferometry, detecting tampering with oil containers, and detecting emission and noise anomalies in close proximity to rotating assets.
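An instrumentation audit like the one above can be expressed as a simple set difference between the channels a model needs and the channels an asset actually streams. The channel names below mirror the example in the text but are illustrative, not an instrumentation standard.

```python
# Channels the anomaly model needs; names are illustrative only.
REQUIRED_CHANNELS = {
    "lube_oil_pressure", "lube_oil_temp", "rpm", "torque",
    "vibration", "shock",
}

def find_blind_spots(installed_channels):
    """Return the required signals an asset is not instrumented for."""
    return sorted(REQUIRED_CHANNELS - set(installed_channels))

# The rotary asset from the example: well covered except for
# vibration and shock sensing.
installed = ["lube_oil_pressure", "lube_oil_temp", "rpm", "torque"]
print(find_blind_spots(installed))  # → ['shock', 'vibration']
```

Running this inventory before model training makes the blind spots a procurement line item rather than a surprise discovered when the model underperforms.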
Making assets and process context aware requires increasing the asset sensitivity to events both within and around them. Model quality is directly correlated to the quality of sensor streams. The better the sensors get, the better the AI models become.
There are two types of intelligence: informational and actionable. For example, if a leased asset in an asset-as-a-service offering is repeatedly being misused by a worker, edge intelligence will notify the supervisor to intervene. This decision-making loop cannot afford the time needed to ship massive sensor event data over the network and then wait for the AI layer at the center to respond. Localized sense-and-respond layers are needed for operational effectiveness.
Edge intelligence is ideal for “fail operational” behaviors where an asset or process can complete its core operation even when a part of it fails. Edge intelligence also is ideal when reliability and latency are important. Large oil and gas projects have thousands of sensor events streaming across myriad wells with some decisions needing to be reliably made within milliseconds.
Today’s data networks cannot keep up with the transmission rates required by rising sensor density on upstream processes and assets combined with increased transmission frequency. Companies like Sigfox and Ingenu are focused on building dedicated next-generation infrastructures for moving sensor data. It will be like getting a dedicated lane on the national highway system for the sensor data streams that support mission-critical upstream processes and equipment.