An explosion of data and ever more complex workflows require new ways of working in order to realize the digital oilfield goal. Ray Millward addresses the issue.
As sensor deployments proliferate, and digital oilfield technologies continue to mature, operators are increasingly looking at how to make best use of their data asset, from the raw seismic data through to generated data from reservoir simulations.
The goal for the digital oilfield is to create a more powerful fusion of historic data, present knowledge, and predictive modelling, allowing future operating decisions to be as near to optimal as makes economic sense. In an industry with so many expert disciplines, the biggest difficulty is to ensure that specialist engineers are equipped with the right analytics tools and able to access the right data, at the right time. Only then can they quickly gain the right insight through which to make better decisions.
In the past, subject matter experts (SMEs), such as geophysicists, reservoir engineers, and well test engineers, were able to learn enough software engineering to analyze the data directly available to them and, unaided, make improvements to their analytical tools and platforms. Typically, SMEs can write functional programs that work with files on their local drive, for example processing rate and pressure history data to find trends in order to formulate a model, or processing surfaces to find features and model time-dependent behavior.
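The kind of standalone script described here can be sketched in a few lines. The following is an illustrative example only, not a real SME tool: it fits a least-squares linear trend to a well's pressure history so a declining trend can be spotted, using synthetic data.

```python
# Hypothetical example of the kind of standalone script an SME might write:
# fit a linear trend to a well's pressure history to spot pressure decline.
def linear_trend(times, pressures):
    """Least-squares slope and intercept of pressure vs. time."""
    n = len(times)
    mean_t = sum(times) / n
    mean_p = sum(pressures) / n
    cov = sum((t - mean_t) * (p - mean_p) for t, p in zip(times, pressures))
    var = sum((t - mean_t) ** 2 for t in times)
    slope = cov / var
    return slope, mean_p - slope * mean_t

# Synthetic history: pressure declining 2 psi/day from 3,000 psi.
times = list(range(10))
pressures = [3000 - 2 * t for t in times]
slope, intercept = linear_trend(times, pressures)
print(slope, intercept)  # -2.0 3000.0
```

Scripts like this work well at local-file scale; the article's point is that they stop scaling once the data and the surrounding architecture grow.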
These localized innovations could then be shared with other specialist colleagues, or passed on to IT departments for commercialization and large scale release. Today, however, there is an explosion in data quantities from sources such as fiber equipped wells, continuous seismic monitoring, and increasingly large simulation grids, and there is a corresponding increase in the complexity of upstream architecture and workflows.
This makes it harder for SMEs to maintain the necessary expertise both in their specialist area and in professional software engineering, which itself continues to evolve apace. Even those SMEs with sharp software engineering expertise struggle to make writing new code a priority amongst the other pressing demands on their time.
To alleviate this, there has been a rise in the number of small teams and departments responsible for technical computing. Technical computing specialists have a more robust mathematical background, and are able to introduce new computational techniques to solve subsurface problems.
It is now feasible to ask new kinds of questions of huge data sets, and to use high performance computing to get answers in minutes instead of days. The difficulty, however, is in matching up the right kinds of algorithms to solve subsurface problems in the most efficient way.
Problems can arise when there is insufficient overlap between the domain knowledge of the SME and that of the technical computing specialists. This runs the risk of delivering a project with a set of tools that ask the wrong questions and deliver insight which does not meet the original brief.
An overlap in domain knowledge is necessary to ensure that when the SMEs and technical computing specialists collaborate, they are able to understand each other sufficiently to create a suitably broad problem space that fully defines and interprets the problem at hand, so that it is possible to explore a broad variety of relevant solution strategies. In this way, the SME is able to learn about how new computing techniques can be applied to existing problems, and, in turn, the technical computing specialist becomes more aware of the SME’s domain.
As an example of how technical computing specialists with domain expertise start to think about existing problems, consider the use of forward modelling as an alternative approach to seismic inversion. This would involve starting with a basic gridded model of planar geological features, and using this basic model to generate synthetic seismic data, which can then be compared with the original seismic survey data.
By employing a carefully designed iterative process, which uses genetic algorithms, pattern matching techniques, and optimization, the process would continue until the gridded model evolves to generate synthetic seismic data to appropriately match the observed data. Once techniques such as this are perfected and automated, geophysicists and reservoir engineers can spend less time fighting to create a simulation grid. More time becomes available for identifying good starting points for the optimization as well as incorporating complex features of the reservoir.
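The iterative loop described above can be illustrated with a deliberately toy sketch. Everything here is an assumption for illustration: a one-parameter "model" (a single layer depth), a trivial forward model standing in for synthetic seismic generation, and a mutation-based search in the spirit of a genetic algorithm that evolves the model until its synthetic data match the observations. A real inversion workflow would use a full gridded model and a proper seismic kernel.

```python
import random

def forward_model(depth, offsets):
    # Stand-in for synthetic seismic generation: travel distance vs. offset
    # for a single flat reflector at the given depth (illustrative only).
    return [(depth ** 2 + x ** 2) ** 0.5 for x in offsets]

def misfit(synthetic, observed):
    # Sum of squared differences between synthetic and observed data.
    return sum((s - o) ** 2 for s, o in zip(synthetic, observed))

def evolve(observed, offsets, generations=200, seed=0):
    rng = random.Random(seed)
    best = 1.0  # initial guess for depth
    best_err = misfit(forward_model(best, offsets), observed)
    for _ in range(generations):
        candidate = best + rng.gauss(0, 0.1)  # mutate the model
        err = misfit(forward_model(candidate, offsets), observed)
        if err < best_err:  # keep the fitter model, discard the rest
            best, best_err = candidate, err
    return best

offsets = [0.0, 1.0, 2.0, 3.0]
observed = forward_model(2.0, offsets)  # "survey" made from true depth 2.0
print(evolve(observed, offsets))        # converges near 2.0
```

The design point the article makes survives even in this toy: the human effort shifts from hand-building the model to choosing a good starting point and a sensible misfit measure, and the optimization does the rest.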
Other emerging techniques that technical computing specialists can offer involve the use of large scale complex platforms for processing 4D seismic data in a massively parallel fashion, either on a high performance cluster or on map-reduce platforms. The resulting fragments can then be combined back into a dynamic and evolving reservoir model.
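The map-reduce pattern mentioned here can be shown in miniature. In this hedged sketch the "map" step computes a per-trace attribute (RMS amplitude is used as a simple stand-in for a real seismic attribute) independently for each trace, and the "reduce" step combines the partial fragments; on a real platform the map calls would be fanned out across a cluster, whereas here they run serially for illustration.

```python
from functools import reduce

def map_trace(trace):
    # Map step: a per-trace partial result that can be computed
    # independently — (sum of squared samples, sample count).
    return (sum(s * s for s in trace), len(trace))

def reduce_fragments(a, b):
    # Reduce step: combine two partial results into one.
    return (a[0] + b[0], a[1] + b[1])

def rms_amplitude(traces):
    # Fan out the map over all traces, then fold the fragments together.
    total_sq, n = reduce(reduce_fragments, map(map_trace, traces))
    return (total_sq / n) ** 0.5

# Toy "survey" of three traces of differing lengths.
traces = [[1.0, -1.0, 2.0], [0.0, 2.0], [3.0]]
print(rms_amplitude(traces))
```

Because the reduce operation is associative, fragments can be combined in any order, which is exactly what lets the work scale across a cluster.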
When real-time data from intelligent fields is integrated with current well data, specialist engineers will be able to use analytics tools to spot trends and improve history matching for the field. This supports a more informed and timely choice of the appropriate extraction method.
In the long term, oil companies can then begin to capture the decision making processes that expert reservoir engineers have built up over decades of experience, and, as pattern matching techniques improve, make this data accessible when similar geological features are explored in future.
While it is possible to begin to articulate what the future of the digital oilfield might look like, these kinds of problems will be extremely challenging to solve, and will require software engineering and data analysis talent that can clearly understand how these complex systems work best together.
Operators will be increasingly reliant on broad vendor ecosystems to ensure that their SMEs can be matched with the right kinds of technical computing specialists to provide a bridge between them and the managed software development cycle. Finding the right partner that understands the domain can significantly shorten the software release cycle.
This allows the wider business units to benefit from the many innovations becoming available, but within a known workflow that they are comfortable with and trust. We must remember that human needs are just as important as those of the data.
Ray Millward joined Tessella Ltd two years ago, working within the upstream oil and gas industry. Before Tessella, Millward earned his PhD at the University of Bath, studying a new adaptive multiscale finite element method, with applications to high contrast interface problems.
The adoption of internet-based, or “open,” industrial control systems (ICS), has left the global energy sector vulnerable to cyber attacks and hacking, according to a new Marsh Risk Management research paper.
The paper, entitled “Advanced cyber attacks on global energy facilities,” says the adoption of internet-based ICS and supervisory control data acquisition (SCADA) systems has grown as companies seek greater business insight, remote access, and interoperability between systems.
“For the last quarter of a century, the global energy sector has relied on the protection offered by standalone and closed ICS as the primary barrier to the cyber security threat. Today, however, with energy facilities worldwide generally aging, upgrades and expansion projects are ushering in a wave of new ICS and SCADA systems, built on openness and interoperability.
“Unlike past industrial control systems, which were closed and predominantly exclusive to respective operating companies, these new systems have integrated control systems with other information technology networks, providing malicious persons with the opportunity to gain access to a facility’s IT software, without needing to be onsite,” the paper says.
“Once inside the system, an infiltrator could, in theory, open an emergency shutdown valve, or adjust alarm system settings at a gas or petrochemical plant.”
According to the US Department of Homeland Security, 53% of the 200 incidents responded to by its Industrial Control Systems Cyber Emergency Response Team, between October 2012 and May 2013, were directed toward the energy sector.
To date, cyber attacks on the energy sector have mostly been untargeted and data-driven. But this is starting to change, Marsh says, with hackers now seeking to control ICSs in order to inflict damage to property and operations.
“The energy sector’s resiliency to date is certainly not due to a lack of effort on the part of the hackers,” the report says.
In August 2012, Saudi Aramco was the victim of a malicious attack intended to halt the company’s crude oil and gas supplies.
Although the virus—given the nickname “Shamoon” by investigators—failed in its primary objective, it nevertheless destroyed the hard drives of more than 30,000 desktop computers and 2,000 servers, forcing IT systems to be disconnected from the internet for two weeks.
“Computer viruses such as Shamoon and the US-developed Stuxnet virus, the latter of which successfully disrupted uranium enrichment at the Iranian Natanz nuclear facility in 2010, have drawn the energy sector’s attention to the potential disruption that could be caused by a malicious piece of software.”
A 2013 report by Zpryme Research & Consulting found 63% of energy companies polled were “very concerned” about the prospects of cyber or network attacks.
Governments are taking note and acting. In the US, the Obama administration’s new Cybersecurity Framework has sought to define a common set of security standards for a list of 16 defined critical infrastructure sectors, including standards and approaches for ICS.
In Europe, the EU is close to finalizing its own cybersecurity directive to reduce the cyber threat posed to critical infrastructure, communications, and public services.
“The next and much more difficult challenge will be to identify common vulnerabilities before assessing the potential impacts of cyber risk to the energy sector – particularly from an economic perspective... It is imperative that energy companies consider the risk of cyber attack as an inevitable one, and focus on preparing scenarios to identify, respond to, and contain any attacks accordingly.” -OE