

Mar-2015

Three steps to successful processing of shale and other opportunity feedstocks

Careful blending of incoming feedstocks makes it possible to create an input 
stream to the refinery that is a better match than any crude source on its own

Patrick Truesdale Emerson Process Management
Didier Lambert Topnir Systems s.a.s.



Article Summary

The nature of crude is changing and the refining industry must change with it. At one time, oil wells could be adequately characterised by a crude assay that would change only slowly with time. Thus, refineries could count on fairly stable feedstock.

That is no longer the case, with the arrival of shale oil and other opportunity crudes driving wider feedstock variability. Dealing with this situation requires three components:
• An ability to do accurate, on-the-fly, rapid analysis
• The capability to adjust feedstock blending as needed
• Smart process control to achieve the best possible processing.

Changing assay parameters
Consider the previous situation confronting refineries. Feedstocks largely came from traditional oil wells, typically characterised by crude assays that determined important parameters such as density, sulphur, salt content and volumetric yield. The crude assay would usually change little over time, with any drift happening slowly, over years, as the well moved toward the end of its productive life. What is more, one well’s crude assay could frequently serve as a proxy for all other wells within the field. Consequently, refineries dealt with stable feedstocks. In many ways, this aided process optimisation. Refineries consist of many process units operating on a fully integrated basis, which, as a whole, pose challenges in quickly adapting to changing feedstock. Of course, yesterday’s refineries still had to deal with variation, but the cycle of change often ran over years.

Now look at the world confronting refineries today, as illustrated by the accompanying chart (see Figure 1) of properties of various US and Canadian crude sources. Examine the last entry, Eagle Ford. In 2009, this source accounted for only 50 000 bpd. A year ago, thanks to hydraulic fracturing and horizontal drilling, that figure had climbed to 1.1 million bpd, a more than 20-fold increase. Understandably, this feedstock source has become more prominent, and thanks in part to this surging production the US has vaulted into the position of the world’s leading fossil fuel provider. Yet crude produced from the Eagle Ford shows the most variation of any of the 22 sources listed: the proportion of light crude relative to condensate ranges from a low of about 20% to a high of 65%. Thus it can require very different processing, depending upon where along that spectrum a particular batch of crude lies. Nor is Eagle Ford unique; other sources also exhibit large variations in important parameters.

Two more pieces of information complete the picture. The first is that one Eagle Ford cargo may differ radically in characteristics from another, even though both bear the same name. So the type of crude assay that was valuable in the past is no longer as useful, because assays that were once good for many years, if not decades, may no longer hold from one well to another, or even for the same well a few months later. Second, for those who want to dismiss this as mainly an Eagle Ford or US problem, realise that other areas around the world are also considered strong hydraulic fracturing candidates. For example, Argentina’s Vaca Muerta basin is thought to be similar to the richest drilling areas in the US, and there are also plays in Australia, China and Poland, as shown in Figure 2, developed by the US Energy Information Administration (EIA).

In general, such opportunity crudes offer discounted pricing and are increasingly abundant compared to traditional sources. However, they vary in properties and so may not match a refinery’s design and configuration. Dealing with the situation involves three steps: developing the right analysis capabilities, using the data thus generated to adjust the blend, and adjusting the process to create the optimum solution.

Analyse
Having a sufficiently advanced analysis capability is critical. Ideally, this analysis will be capable of rapidly discriminating between different types of crude, thereby providing information about distinguishing characteristics. Such data can then serve as the source for any plan to deal with incoming feedstock. Just as importantly, the analysis must also capture data about crude properties in general. The engineering and operations team at a refinery often does not know exactly what will cause problems with heat exchangers, distillation columns and other components of a crude processing unit. Casting as wide an analysis net as practical provides some protection against such unknowns.

The analysis data can be used to produce a history of properties for past crude shipments, which may be essential to diagnosing problems when they occur. There are many different ways to do such an analysis. Of all the possibilities, techniques based upon light perhaps come closest to the ideal of a fast, cost-effective and highly discriminating method. All matter interacts with electromagnetic radiation, but the nature of that interaction depends upon the material and the wavelength of the radiation. So, developing a light-based method involves finding and applying the correct spectral band to ‘interrogate’ the incoming feedstock stream, and then devising ways to interpret the data.

Researchers at large global oil companies have devoted years to investigating the feasibility of using light in the near infrared, a spectral region that runs from the edge of the visible at 700 nanometres (nm) out to about 2500 nm. This is well below the thermal IR region of 5000 nm and above, where objects can be detected by the heat they generate.

The near-IR has attracted interest as an analysis tool because chemical bonds absorb light in this region. This is particularly true for the varying hydrocarbon bonds found in crude, and the band running from 2000 to 2500 nm is among the best possible choices because it is far enough from the visible to be free of any absorption effects associated with light that can be seen with the naked eye. The result of absorption in this near-IR band is a spectral signature. Such a signature can identify specific hydrocarbons as clearly as fingerprints can identify people.
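To make the idea of a spectral signature concrete, the following minimal sketch shows how raw near-IR transmittance readings could be converted into an absorbance signature over the 2000-2500 nm band using the Beer-Lambert relation A = -log10(I/I0). The wavelengths, intensities and function name are illustrative assumptions, not the processing chain of any particular analyser.

# Minimal sketch only: building an absorbance "signature" over the 2000-2500 nm
# near-IR band from hypothetical transmitted-intensity readings.
import numpy as np

def absorbance_signature(wavelengths_nm, sample_intensity, reference_intensity,
                         band=(2000.0, 2500.0)):
    """Return (wavelengths, absorbances) restricted to the chosen near-IR band."""
    wavelengths_nm = np.asarray(wavelengths_nm, dtype=float)
    sample_intensity = np.asarray(sample_intensity, dtype=float)
    reference_intensity = np.asarray(reference_intensity, dtype=float)

    # Keep only the 2000-2500 nm window discussed in the text.
    mask = (wavelengths_nm >= band[0]) & (wavelengths_nm <= band[1])

    # Beer-Lambert: absorbance grows as less light passes through the sample.
    absorbance = -np.log10(sample_intensity[mask] / reference_intensity[mask])
    return wavelengths_nm[mask], absorbance

# Made-up numbers: the sample transmits less light than the empty reference cell,
# so absorbance is positive at each retained wavelength.
wl = [1900.0, 2100.0, 2300.0, 2500.0]
i_sample = [0.90, 0.40, 0.25, 0.55]
i_reference = [1.00, 1.00, 1.00, 1.00]
print(absorbance_signature(wl, i_sample, i_reference))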

Correctly classifying hydrocarbons is a multi-step process. First, the instrument takes spectral data from a sample of unknown composition. This is then compared to the results from a library of samples with a known make-up. The product is something like a scatter plot, with one axis being the wavelength and the other being a relative reading. Visually, this would group various grades of crude into different clusters.
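As a rough illustration of the library-matching step described above, the sketch below assigns an unknown spectrum to the nearest reference signature by simple Euclidean distance. The grade names, spectra and classify function are hypothetical; commercial analysers rely on more sophisticated chemometric models, but the principle of comparing an unknown against a library of known samples is the same.

# Hypothetical library: each known crude grade maps to an absorbance signature
# sampled on a common wavelength grid.
import numpy as np

library = {
    "Eagle Ford light": np.array([0.20, 0.45, 0.30, 0.15]),
    "Eagle Ford condensate": np.array([0.10, 0.25, 0.20, 0.05]),
    "WTI": np.array([0.35, 0.60, 0.50, 0.30]),
}

def classify(unknown):
    """Return the library grade whose signature lies closest to the unknown spectrum."""
    unknown = np.asarray(unknown, dtype=float)
    distances = {grade: np.linalg.norm(unknown - ref) for grade, ref in library.items()}
    return min(distances, key=distances.get)

# An incoming cargo whose signature sits near the 'Eagle Ford light' cluster.
print(classify([0.22, 0.43, 0.31, 0.16]))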

In the ideal world, there would be plenty of data points to easily place an unknown into the proper classification. In the real world, refinery managers cannot wait too long for an analysis to be complete. At the same time, they cannot afford to have an analysis be wrong, since that may actually be worse than not knowing anything at all.

One way out of this bind, being pursued by instrument makers, is a process called densification. This is based on interpolation between a sparse set of data points to create a much denser data cloud. Consequently, the accuracy of analysis can be significantly improved without sacrificing speed. Doing densification properly demands deep expertise in the how and why of spectral blending.
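The following sketch illustrates the densification idea in its simplest form: generating synthetic intermediate points by linearly blending pairs of measured reference spectra, so that an unknown falling between two known grades still finds a nearby match. The linear-mixing assumption, the step size and the function name are simplifications for illustration only; proprietary densification methods are considerably more involved.

# Simplified densification: interpolate between two measured reference spectra
# to fill the gaps in a sparse library with synthetic intermediate points.
import numpy as np

def densify(spectrum_a, spectrum_b, steps=9):
    """Return (blend ratio, synthetic spectrum) pairs between two references."""
    a = np.asarray(spectrum_a, dtype=float)
    b = np.asarray(spectrum_b, dtype=float)
    # Blend ratios strictly between 0 and 1, e.g. 0.1, 0.2, ..., 0.9 for steps=9.
    ratios = np.linspace(0.0, 1.0, steps + 2)[1:-1]
    return [(r, (1.0 - r) * a + r * b) for r in ratios]

# Two hypothetical reference signatures; densification adds nine points between them.
light = np.array([0.20, 0.45, 0.30, 0.15])
condensate = np.array([0.10, 0.25, 0.20, 0.05])
for ratio, synthetic in densify(light, condensate):
    print(f"{ratio:.1f} blend -> {np.round(synthetic, 3)}")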

