

Jul-2019

The great data transformation: how the industry is capitalising on digitalisation (Part 1)

Real-life examples from TransCanada, Petronas, and ADNOC Onshore. As the oil and gas industry becomes increasingly digitised, torrents of data are being unleashed that can easily overwhelm companies if not properly managed.

Russell Herbert
OSIsoft



Article Summary

The deluge of new data raises important questions. How should it be organised? Can it be presented in a meaningful way? Who should have access to it? These questions need to be answered to make the most of the data and turn it into value-adding information.

At any modern-day oil and gas operation, three big changes are taking place that should be considered when planning the best approach to digital transformation.

The first is the accelerating growth of data generated by assets. The second is the growing impact of emerging technologies such as the Internet of Things (IoT), cloud computing and data analytics.

Finally, the third is a growing appetite within every company, across an increasing number of groups and specialisations, for collecting and leveraging data.

Together, these drivers are making oil and gas operators focus on designing a centralised strategy for managing their operational data as they pursue “intelligent information” – rather than merely collecting volumes of data that will languish unused in data swamps.

The opportunities offered by converting raw data into value-adding intelligence are immense, which is why industry leaders including Saudi Aramco, Shell, BP, Chevron and Total have been long-time customers of OSIsoft, the company behind the PI System – the world’s leading software platform for managing real-time operational data.

The PI System allows its users to collect and centrally manage all operational information from traditional process control networks, IoT devices, proprietary data historians or cloud-based SaaS offerings. With the PI System, real-time data can be visualised, analysed and sent to executives for decision making and reporting. It can also be shared with third parties to enhance operations or streamed into any of the ‘Big Data’ analytics platforms to deliver predictive insights.
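To make the data flow described above more concrete, the short sketch below shows the general fan-out pattern in plain Python: a central hub ingests a time-stamped reading and forwards it to whichever consumers (dashboards, reports, analytics) have registered an interest. The class names and tags are invented for illustration only and are not part of the PI System’s actual interfaces.

```python
# Conceptual sketch only: hypothetical names, not the PI System's actual API.
# It illustrates the fan-out pattern described above: one real-time reading is
# collected centrally and then pushed to several downstream consumers
# (dashboards, reports, third-party feeds, analytics platforms).

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable, List


@dataclass
class Reading:
    """A single time-stamped value from a sensor or historian tag."""
    tag: str            # e.g. "compressor_07.discharge_pressure" (hypothetical tag name)
    value: float
    timestamp: datetime


class CentralDataHub:
    """Central collection point that fans readings out to registered consumers."""

    def __init__(self) -> None:
        self._consumers: List[Callable[[Reading], None]] = []

    def register(self, consumer: Callable[[Reading], None]) -> None:
        self._consumers.append(consumer)

    def ingest(self, reading: Reading) -> None:
        # In a real deployment this would be fed by process control networks,
        # IoT devices or existing historians; here we simply forward the reading.
        for consumer in self._consumers:
            consumer(reading)


if __name__ == "__main__":
    hub = CentralDataHub()
    hub.register(lambda r: print(f"[dashboard]  {r.tag} = {r.value}"))
    hub.register(lambda r: print(f"[analytics]  queued {r.tag} for model scoring"))

    hub.ingest(Reading("compressor_07.discharge_pressure", 48.3,
                       datetime.now(timezone.utc)))
```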

As the industry adjusts to the new “less for longer” oil price era, OSIsoft’s customers are becoming increasingly open about sharing the success of their digital journeys and collaborating more closely on data-driven projects, a rare display of openness for a sector that is usually tight-lipped about its technologies.

At the 2017 OSIsoft User Conference in London, customers like Shell, Petronas, Veolia, Transocean, ADNOC Onshore, and many others discussed their real-time data strategies and shared the latest developments in the world of the PI System. Here’s the first of a two-part series of articles based on the success of OSIsoft’s customers who have converted potential data swamps into game-changing intelligence, improved their operations and saved millions of dollars in costs.

Keeping problems contained: how TransCanada minimises disruption
Before TransCanada embarked on a company-wide programme of real-time data analytics a few years ago, the group was vulnerable to unexpected equipment failures and network outages. With high-power turbines running 24 hours a day and 57,100 miles of pipelines1 connected to high-demand markets, such vulnerabilities were hardly surprising.

According to Keary Rogers, the Systems Reliability Manager at TransCanada, an unnoticed broken valve plate or a cracked bearing on a tier-one piece of equipment could have easily triggered a crisis.

TransCanada first installed the PI System in 1998, but only used it for network monitoring and after-event investigations to determine the cause of a failure: a reactive rather than a proactive approach. However, after a particularly big incident in 2010 and a narrow avoidance of a second incident, the group signed a full-scale Enterprise Agreement (EA) with OSIsoft to accelerate its digital transformation and forestall further problems.

For TransCanada, the first step of its analytics journey in 2011 was to adopt OSIsoft’s Asset Framework (AF) to structure its data into a coherent asset hierarchy. Once AF was in place, operational data could be analysed directly and streamed to a variety of advanced statistical tools; the sketch below illustrates the idea of such a hierarchy.
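As a rough illustration of what an asset hierarchy means in practice, the hypothetical sketch below models a pipeline system, a compressor station and a turbine, with each attribute bound to the tag of a raw data stream. The asset names, attributes and tags are invented and do not reflect TransCanada’s actual model or OSIsoft’s AF schema.

```python
# Hypothetical sketch of an asset hierarchy of the kind AF enables. All names,
# attributes and tags are invented for illustration; this is not OSIsoft's AF API.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Asset:
    name: str
    attributes: Dict[str, str] = field(default_factory=dict)  # attribute -> data-stream tag
    children: List["Asset"] = field(default_factory=list)

    def walk(self, depth: int = 0) -> None:
        """Print the hierarchy so every data stream can be located by asset context."""
        print("  " * depth + self.name)
        for attr, tag in self.attributes.items():
            print("  " * (depth + 1) + f"{attr} -> {tag}")
        for child in self.children:
            child.walk(depth + 1)


# Pipeline system -> compressor station -> turbine, each attribute bound to a raw tag.
fleet = Asset("Mainline System", children=[
    Asset("Compressor Station 12", children=[
        Asset("Gas Turbine A", attributes={
            "vibration": "CS12.GTA.VIB.001",
            "bearing_temperature": "CS12.GTA.TMP.004",
        }),
    ]),
])

fleet.walk()
```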
TransCanada started with a single installation of its new PI System-based anomaly detection system, which saved $94,500 in failure avoidance costs. By 2014, there were 32 installations in place, and the annual savings had risen more than twentyfold to $1.93m. Two years later TransCanada had 84 installations, and the savings fell just short of $8m. In 2017, 129 installations had already produced $10.65m in cost reductions by the end of the third quarter.

Because TransCanada’s engineers have steadily built new applications on the PI System, they can now identify anomalies and perform repairs well before they disrupt one of the largest natural gas transmission companies in the U.S. Detection rates have more than doubled in the last two years, leading to a significant improvement in operational safety and the bottom line.

Today, TransCanada relies on over 16,000 streams of real-time data. The PI System analyses each stream in near real time to detect whether any equipment is trending away from acceptable operating windows. If a potential problem is spotted, the PI System automatically notifies engineers, delivering the operational intelligence they need for a rapid response. “It’s all about fixing little things before they become big things,” explains Brendan Bell, Reliability Technical Consultant at TransCanada.
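The sketch below is a toy illustration of this kind of operating-window check: a stream of readings is compared against acceptable limits, and an alert is raised as soon as a value drifts outside them. The thresholds, tag name and notification hook are invented; in the PI System itself, such checks and notifications are configured features rather than hand-written code.

```python
# Toy illustration of an operating-window check on one data stream.
# Thresholds, tag names and the notification hook are invented for this sketch.

from typing import Iterable


def monitor_stream(tag: str,
                   readings: Iterable[float],
                   low: float,
                   high: float) -> None:
    """Flag any reading that leaves the acceptable operating window [low, high]."""
    for i, value in enumerate(readings):
        if value < low or value > high:
            notify_engineers(tag, i, value, low, high)


def notify_engineers(tag: str, index: int, value: float,
                     low: float, high: float) -> None:
    # Stand-in for an email or alerting integration.
    print(f"ALERT {tag}: reading #{index} = {value} "
          f"outside window [{low}, {high}]")


# Example: a bearing temperature (degC) drifting above its acceptable window.
monitor_stream("CS12.GTA.TMP.004", [71.5, 72.0, 74.8, 79.3, 83.1],
               low=60.0, high=78.0)
```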

As the group’s business expands, the importance of real-time data analytics continues to grow. By 2019, with its PI System-based “Enterprise Analytics” solution installed company-wide, TransCanada plans to monitor twice as much horsepower as it does today.

1. www.transcanada.com/globalassets/pdfs/investors/reports-and-filings/annual-and-quarterly-reports/2017/transcanada-2017-annual-report.pdf
 
Petronas goes it alone in the pursuit of data quality
In mid-2015, Petronas’ maintenance department had a data deficiency problem. The maintenance and engineering team oversaw 130 pieces of ageing, hard-working gas turbine-driven equipment, but had no accurate means of measuring their performance and reliability. The fact that the equipment had been supplied by several manufacturers made the job even more challenging.

Crews used hand-held data loggers to collect data about vibration, lubrication and other key asset benchmarks and manually compiled the data into monthly reports.

