

Aug-2021

Improve profit with augmented digital twins

Artificial intelligence and machine learning deliver greater, more sustainable value from simulation models.

MARK KRAWEC, MIKE AYLOTT and SIMON CALVERLEY
KBC (A Yokogawa Company)



Article Summary

Oil refiners are struggling to command their share of shrinking profit pools. Margins are tight, and the energy transition is exacerbating already tough operating conditions: reduced air and road travel has eroded the economics of significant parts of the barrel. In many regions, the growing availability of lower cost energy sources is increasingly undermining demand for transportation fuels. Even where alternative energy sources are not cheaper, ambitious net zero carbon policies are driving the switch away from conventional, fossil-based fuels.

Oil refiners must adapt or die. The more progressive refiners are taking deliberate steps to future-proof their relevance:
1. Scale: boldly increase scale to reduce the unit cost of output, on the assumption that markets will be found for the increased product volumes.
2. Scope: others are not betting on the availability of export markets to absorb expanded scale and are instead altering the scope of their operations:
a. Re-tool conventional refineries to produce ‘greener’ products such as biodiesel from alternative feedstocks, including waste agricultural products and animal fats. Processing materials such as cooking oil, fats, greases, and soybean oils also makes a more meaningful contribution to society.
b. Integrate downstream to produce a wider petrochemical product slate, including ethylene, propylene, polyethylene, ethylene glycol, paraxylene, and/or styrene.

Historically, improving refinery profit has focused on opening up the crude oil basket and changing associated process operating conditions to most efficiently drive a desired change in yield slate. Whilst these options remain relevant today, the upstream and downstream optimisation envelope has expanded greatly.

Feedstock optimisation must now cover alternative biofeedstocks as well as crude oils. Similarly, downstream integration must consider conventional transportation fuels and petrochemicals, as well as emerging inter-connected economies such as hydrogen.

The added complexity of these expanded inputs, outputs, and processes requires new technologies to help evaluate and realise the available opportunities for improving profit.

Same goal, different pressures, higher expectations
Operational excellence must deliver a measurable change in profitability. Unfortunately, this can be overlooked in the rush to adopt the latest digital technology, with its enthusiasm for proofs of concept and blue-sky visions. Understanding where value can be sourced, how to unlock it, and then putting in place the right people, processes, and technology to sustain it has been the staple of KBC’s Profit Improvement Program (PIP) over the past 40 years. Billions of dollars in benefits have been realised to date by exploiting the full range of refinery optimisation levers, in particular by improving margin through optimisation of yields and energy.

Subject matter experts, specialist reactor simulation models, and full Petro-SIM flowsheets have enabled significant benefits from complex inter-unit improvements. All of these have contributed to realigning multiple refineries to maximise profit within their limitations and constraints.

However, in a world where agile approaches are critical and delivery expectations have changed, value must be realised faster despite the increasing complexity of the problems arising and the range of options. This comes both from the ability to implement faster and from the ability to monitor and adapt continuously to market signals, disturbances, and changes in operating conditions.

Continuous, holistic optimisation on a real-time basis has become an imperative, and a shift away from ad hoc and highly manual approaches is needed. Achieving this shift requires a strong data foundation, digitally enabled technology, and agile delivery with progressive deployment of integrated tools that avoid large-scale retooling of IT systems and workflows. Certain sites may also need to reskill staff to capture the available benefits more rapidly and sustainably.

While a number of expectations have changed, the need for industry expertise, sound work processes, and accurate technology has not. Digital twin technology is core to integrating these fundamental expectations with the new ones.

In with the new — the digital twin that works
A digital twin is a virtual copy of a human, device, system, or process that accurately mimics actual performance in real time, and that can be executed and manipulated to develop better performance.

Digital twins of process units, built from rigorous simulation models and made operational with real-time data, are already used to monitor and optimise unit performance (see Figure 1). They respect the real asset's constraints and physics, providing the ability to forecast, predict, and optimise molecular behaviour under different pressures, temperatures, volumes/flows, and stages of catalyst service. In doing so, the rigorous models handle the associated non-linear relationships and infer information that cannot be directly measured. But physics alone cannot yet explain everything, or at least cannot provide answers at the speed and cost the business can readily afford.
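To make this pattern concrete, the following is a minimal Python sketch of how such a twin might work in principle: a stand-in "rigorous" unit model is run against real-time snapshots of measured conditions to infer a quantity (here, conversion) that has no on-line measurement. The class names, tags, and the toy correlation are hypothetical illustrations, not the Petro-SIM API or any actual refinery model.

from dataclasses import dataclass

@dataclass
class PlantSnapshot:
    """One set of real-time measurements pulled from the historian (hypothetical tags)."""
    feed_rate_m3h: float
    reactor_inlet_c: float
    reactor_outlet_c: float

class RigorousUnitModel:
    """Stand-in for a rigorous, physics-based unit model (not the Petro-SIM API).

    Given measured conditions, it predicts an unmeasured quantity, here an
    illustrative 'conversion', via a simple first-principles-style relation.
    """
    def __init__(self, activation_factor: float = 0.012):
        self.activation_factor = activation_factor  # tunable model parameter

    def predict_conversion(self, snap: PlantSnapshot) -> float:
        # Toy non-linear relationship: conversion rises with temperature lift
        # and falls with throughput (purely illustrative).
        delta_t = snap.reactor_outlet_c - snap.reactor_inlet_c
        return min(1.0, self.activation_factor * delta_t * (500.0 / snap.feed_rate_m3h))

def monitor(model: RigorousUnitModel, snapshots: list[PlantSnapshot]) -> None:
    """Digital-twin monitoring loop: run the model against each real-time snapshot
    and surface the inferred (unmeasurable) quantity to operations."""
    for snap in snapshots:
        conversion = model.predict_conversion(snap)
        delta_t = snap.reactor_outlet_c - snap.reactor_inlet_c
        print(f"feed={snap.feed_rate_m3h:6.1f} m3/h  dT={delta_t:5.1f} C  "
              f"inferred conversion={conversion:5.1%}")

if __name__ == "__main__":
    model = RigorousUnitModel()
    history = [
        PlantSnapshot(480.0, 355.0, 402.0),
        PlantSnapshot(520.0, 356.0, 398.0),
    ]
    monitor(model, history)

In a production twin the toy correlation would be replaced by the full flowsheet solution, but the monitoring pattern of running the model continuously against live data is the same.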

Because physics-based models alone cannot answer every question at the speed and cost the business can afford, artificial intelligence (AI) and machine learning (ML) techniques have become important for extending the reach of process simulation in critical areas such as fouling and corrosion prediction, and for increasing the speed of optimisation.
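As an illustration of the kind of ML extension described here, the sketch below fits a gradient-boosted regression model to predict an exchanger fouling resistance from operating variables, assuming NumPy and scikit-learn are available. The variables, bounds, and synthetic training data are invented for the example and do not come from the article or any plant.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Illustrative operating data: each row is [throughput, inlet temperature,
# wall shear proxy, days on stream]; the target is a fouling resistance that
# would normally be back-calculated from the rigorous exchanger model
# (synthetic here for the sake of a runnable example).
rng = np.random.default_rng(seed=0)
n = 500
X = np.column_stack([
    rng.uniform(300, 600, n),    # throughput, m3/h
    rng.uniform(120, 220, n),    # inlet temperature, degC
    rng.uniform(5, 25, n),       # wall shear proxy, Pa
    rng.uniform(0, 365, n),      # days on stream
])
# Synthetic target: fouling grows with temperature and time, shrinks with shear.
y = (0.002 * X[:, 3] * (1 + 0.01 * (X[:, 1] - 120)) / (1 + 0.05 * X[:, 2])
     + rng.normal(0, 0.01, n))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

print("MAE on held-out cases:", mean_absolute_error(y_test, model.predict(X_test)))
# In service, the trained model would be run against live historian data to
# flag exchangers whose predicted fouling trajectory threatens unit targets.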

One challenge of expanding the application of digital twin models has been the associated increase in model upkeep and maintenance caused by market disturbances and changing process and equipment conditions. However, rapid advances in technology have allowed AI and ML techniques to shoulder much of this burden. Optimisation of process units, whether structured or ad hoc, previously demanded manual effort and in-depth techno-economic expertise; it can now be undertaken continuously and automatically through AI and ML that:
-  Takes account of long-term operating history to ensure all interconnectivities are understood
-  Reviews inferred parameters and qualities to cover those that cannot be measured on-line
-  Provides understandable outputs to both the panel operator and advanced process control (APC) to deliver meaningful change
-  Deploys and operates in a modular way to ensure safe and controlled operation
-  Monitors equipment health and continuously identifies opportunities for improvement to keep the direction of the unit aligned with strategy
-  Adapts models to changes in unit performance, bounded by first principles and constraints to ensure safe operation (a sketch of this bounded adaptation follows the list)
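The last point, adapting a model within first-principles bounds, can be illustrated with a small Python sketch using SciPy's bounded least-squares fit. The toy yield model, the data points, and the bounds are hypothetical and serve only to show the bounded-adaptation pattern.

import numpy as np
from scipy.optimize import least_squares

def model_yield(params, severity):
    """Toy first-principles-style yield model: yield = a * (1 - exp(-k * severity))."""
    a, k = params
    return a * (1.0 - np.exp(-k * severity))

# Recent operating points: reactor severity vs. observed product yield (synthetic).
severity = np.array([0.8, 1.0, 1.2, 1.4, 1.6])
observed_yield = np.array([0.31, 0.36, 0.40, 0.43, 0.45])

def residuals(params):
    return model_yield(params, severity) - observed_yield

# Physically meaningful bounds keep the adapted model safe:
# asymptotic yield 'a' between 0 and 0.6, rate constant 'k' between 0.1 and 3.0.
fit = least_squares(residuals, x0=[0.5, 1.0], bounds=([0.0, 0.1], [0.6, 3.0]))
a_fit, k_fit = fit.x
print(f"adapted parameters: a={a_fit:.3f}, k={k_fit:.3f}")
# The adapted parameters feed back into the digital twin; because they are
# bounded, the updated model cannot drift outside its physically valid range.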

