Connected data improves refinery operations

Connecting data with a common knowledge management platform improves visibility, response, and analysis to benefit refinery operations

GE Water & Process Technologies

Article Summary

The transition from board-mounted, pneumatic refinery controls to electronic distributed control system (DCS)-centred control and data historian systems, which began in the 1970s, has helped refinery operations improve production, reliability, and product yield optimisation. Even today, however, much of the data originating from vendor service companies remains locked away in proprietary systems and is typically shared only as periodic snapshots of performance – collectively referred to as ‘dark data’. Dark data in a refinery often includes information on the performance of specialty chemical programmes, equipment vibration monitoring, corrosion measurements, inspection readings, and many other ‘non-core’ areas of refinery operations. In the last several years, new sensors, controllers, and data handling infrastructures have been developed to connect this once dark data with traditional refinery operating data, as part of a movement by dozens of companies to create an ‘Industrial Internet of Things’. Connecting these data sources allows more rapid, more transparent, and remote evaluation of system performance, automated application of proprietary analytic algorithms, and other system analyses that were previously impossible. The increase in analysis and performance measurement provides significant opportunities to better optimise the reliability and profitability of these once isolated systems.

Several factors are driving this leap in technological capability. First, the cost of producing sensors continues to drop, and the variety of operational and physical parameters that sensors can measure continues to grow. This allows more sensors to be deployed in more places in a refinery, making visible what was previously unmeasurable. In addition, wireless transmission has become cheaper and more reliable, reducing the cost of installing sensors and maintaining connections without the need to run miles of wires and cables. The increase in sensors is already happening today – the average refinery has seen a five-fold increase in the number of sensors deployed over just the last five years.1 The trend towards more measurements via sensors and other on-line equipment is expected to continue for some time, as the digitisation of industrial plants provides another leap in productivity and efficiency.

Robust data systems and large data storage capacity are needed to handle the additional data streams generated as sensors and measurements increase. Fortunately, the cost of data storage has also dropped considerably, economically accommodating the increased data load. Additionally, cloud based storage solutions are becoming increasingly popular, opening up even greater storage and processing capacity. While cyber security remains a concern for refiners and industry,2 as security measures improve there is a trend in the industry to allow data to be transmitted ‘beyond the fence’ and stored remotely, making it available to people and systems not physically located at the refinery. This movement contributes to the tight integration of the physical and digital worlds, which many are calling the Industrial Internet of Things, and places us on the verge of a new productivity revolution that could transform industry on the scale of the industrial and digital revolutions of the past.3 According to a 2014 study by Accenture and GE, ‘Big Data Analytics’ is within the top three corporate priorities for 87% of oil and gas companies surveyed, and ‘not a priority’ for only 3% (see Figure 1).4

The growing abundance of data provides great opportunities, but also the challenge of efficiently sifting through large amounts of data and extracting valuable information to improve refinery performance. The emergence of more advanced analytics is a growing trend in the industrial world, including refineries, and may indicate a shift in hiring practices. Forward-looking companies have already started investing in data scientists, programmers, and data analysts to capture the value that big data analysis can offer. More sophisticated analysis, enabled by better data visibility and advances in analytics, makes it easier to anticipate the impact of operational or feedstock changes on system performance and reliability. This field is growing rapidly across industry, so much so that ‘data scientist’ was listed as the number one job in America for 2016 by Glassdoor.5 It is quite possible that in the near future data scientists will join the regular ranks of refinery professionals such as engineers, instrument and controls teams, and reliability staff. In a recent interview with SAP, Peter Reynolds of the ARC Advisory Group argued that adopting the technologies of the Industrial Internet of Things could be a key to survival for many oil and gas companies, by optimising infrastructure to lower operating costs: “Why monitor 50 pumps when you can monitor 50,000 pumps [from a centralised location].”6

The bottom line: connecting dark data enables greater insight into system conditions, performance, and pitfalls, helping to turn data into information and information into action.

Common connected data types
• Real-time (streaming) data connection: a constant, always-on connection between local sensors/controllers and a cloud based control system. This provides the most dynamic data stream for analysis and action and could, in theory, enable remote closed-loop control of a system from outside the refinery fence. However, this method is the most difficult to secure from a cyber security perspective and requires an absolutely stable connection so that dropped signals do not compromise system control. Additionally, using some of the other methods described below minimises the need for an ‘always on’ data connection for the scope discussed in this article.
• Near-real-time data communication: blends local closed-loop control and monitoring with cloud based analytics and performance monitoring. This approach uses local, smart devices to record sensor data and perform closed-loop control on systems. The system periodically establishes a connection to the cloud based platform and bilaterally transmits data between the local device and the platform. This improves data security, since the information is only transmitted in ‘bursts’, and keeps closed-loop control local, which is more stable. The platform can still monitor system performance rapidly, which is usually sufficient to flag out-of-control conditions and enable timely corrective action.
• Edge computing: a variant on the above near-real-time communication and a growing trend, it increases the intelligence of the local smart device to enable more complex analytic execution locally and autonomously. This approach takes advantage of the greater processing power and lower power requirements of today’s computer processors to unload communications and server bandwidth while enabling controls beyond traditional PID-type control schemes. Data transparency and system performance are preserved by communicating critical data and analytic outputs to the platform, while pushing more ‘data crunching’ down to the local level.
• Manual, or on-demand, data upload: intended mostly for local data obtained on a spot basis, such as laboratory results or field readings not captured by on-line sensors or instruments. Data of this type is updated either on a set frequency or as it is generated by a person. There are several mechanisms to incorporate this data, such as manual template upload, live forms, or file transfer/email. Historically, this has been some of the most difficult data to incorporate into the data stream for refinery operations, and it is therefore a huge, mostly untapped resource of information.
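The near-real-time ‘burst’ pattern above can be sketched in a few lines of code. The sketch below is purely illustrative – the class and function names, limits, and upload interval are assumptions for the example, not any vendor’s API. Readings are buffered locally, closed-loop corrective action stays on the local device, and the buffer is transmitted to the platform only periodically, in a single burst.

```python
import json
import time
from collections import deque

# Minimal sketch of the near-real-time 'burst' communication pattern.
# All names (SensorBuffer, local_corrective_action) and limits are
# hypothetical, chosen only to illustrate the idea.

BATCH_INTERVAL_S = 60   # how often a connection to the platform is opened
HIGH_LIMIT = 8.5        # local control limit (e.g. a pH high limit)

class SensorBuffer:
    """Buffers local readings between periodic burst uploads."""

    def __init__(self):
        self.readings = deque()
        self.last_flush = time.time()

    def record(self, tag, value):
        self.readings.append({"tag": tag, "value": value, "ts": time.time()})
        # Closed-loop control stays local: act immediately,
        # without waiting for the cloud connection.
        if value > HIGH_LIMIT:
            self.local_corrective_action(tag, value)

    def local_corrective_action(self, tag, value):
        print(f"local action: {tag} at {value} exceeds {HIGH_LIMIT}")

    def due_for_flush(self):
        return time.time() - self.last_flush >= BATCH_INTERVAL_S

    def flush(self):
        """Serialise the buffered batch as one 'burst', then clear it."""
        batch = list(self.readings)
        payload = json.dumps(batch)  # in practice: sent over an encrypted link
        self.readings.clear()
        self.last_flush = time.time()
        return payload
```

Because transmission happens only in `flush()`, the connection can be opened briefly and closed again, which is the security and stability advantage the burst approach offers over an always-on stream.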

Advantages of connected data
In a 2012 AFPM Technology Forum presentation, Sam Lordo outlined the need to define key performance indicators (KPI), key control parameters (KCP), and key stress indicators (KSI) as essential elements of a well-defined performance management system.7 To be valuable to refinery operations, a performance management and communication system must not only identify when a system is out of compliance but also provide a simple mechanism to ensure action is taken to bring the system back into control. A common platform utilising connected data can provide a single, transparent data view, can automate specialised analytics for more advanced performance assessment, and can enforce defined control limits for KPIs, KCPs, and KSIs. Such a comprehensive system provides a clear mechanism to turn data into information, and information into corrective actions.
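The control-limit checking described above can be illustrated with a short sketch. The tags, limits, and classification into KPI/KCP/KSI below are hypothetical examples, not taken from the cited presentation; the point is simply that once connected data flows into a common platform, compliance checks against defined limits can be automated.

```python
# Illustrative sketch of automated control-limit checking.
# Tags, limits, and indicator types are invented for the example.

LIMITS = {
    # tag: (low_limit, high_limit, indicator_type)
    "overhead_pH": (5.5, 6.5, "KCP"),
    "Fe_ppm":      (0.0, 1.0, "KSI"),   # e.g. a corrosion stress indicator
    "salt_ppm":    (0.0, 2.0, "KPI"),
}

def check_compliance(readings):
    """Return a list of (tag, indicator_type, value, reason) exceptions."""
    exceptions = []
    for tag, value in readings.items():
        low, high, kind = LIMITS[tag]
        if value < low:
            exceptions.append((tag, kind, value, f"below low limit {low}"))
        elif value > high:
            exceptions.append((tag, kind, value, f"above high limit {high}"))
    return exceptions

readings = {"overhead_pH": 5.1, "Fe_ppm": 0.4, "salt_ppm": 3.2}
for tag, kind, value, reason in check_compliance(readings):
    # In a connected platform this would raise an alarm and open a
    # corrective-action workflow rather than just printing.
    print(f"{kind} {tag}={value}: {reason}")
```

In a real deployment the exception list would feed the action-tracking mechanism the paragraph describes, so that every out-of-compliance condition is tied to a corrective action.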
