2017 became the year that the oil and gas industry embraced robotics. Elaine Maslin reports on initiatives in the North Sea (first published in the January 2018 issue of OE).
While the industry has long been using autonomous underwater systems and remotely operated tooling, and more recently subsea processing equipment and aerial drones, this year it took some tentative steps towards realizing the first topsides robots.
French oil major Total completed its Argos (Autonomous Robot for Gas and Oil Sites) Challenge, in Lacq, France (OE: May 2017), which saw five teams pitch their robotic creations against a string of tasks on a mock platform site. Total’s aim – part of a broader project to reduce offshore manning – was to create a robot able to detect and control leaks while weighing less than 100kg, and to move under its own power between floors and across different types of flooring, from grating and corrugated iron to cement and wet, slippery surfaces.
Chevron has also been testing so-called snake-arm robots for internal vessel inspection in the North Sea. This followed work it was involved in as part of the Petrobot robotic challenge (OE: April 2016), which involved vessel and tank inspection robot development and led to the formation of the Sprint Robotics Collaborative, based in the Netherlands.
Now, another initiative has been launched with nearly US$19 million (£14.3 million) in funding from the UK Industrial Strategy Challenge Fund (ISCF). The initiative is a new research center focused on offshore robotics, led by the University of Edinburgh.
The Offshore Robotics for Certification of Assets (ORCA) Hub will develop robotics and artificial intelligence (AI) technologies for use in extreme and unpredictable environments. The hub will work to develop robot-assisted asset inspection and maintenance technologies, which can make autonomous and semi-autonomous decisions and interventions across aerial, topside and marine domains.
“The application for robotics in the [oil and gas] industry is almost limitless, and we have only just scratched the surface,” says Rebecca Allison, asset integrity solution center manager at the publicly-funded Oil & Gas Technology Centre in Aberdeen. Allison is keen on robotics and its potential to improve inspections and reduce manual involvement in asset integrity management activities, such as pressure vessel and insulated pipework inspection (which require a production shutdown if human entry is involved).
But, she also highlights use of emerging underwater survey vehicles, which can carry out simple manipulation tasks and “the next generation of UAV (unmanned aerial vehicle) technology.”
“It’s about improving safety, removing personnel, and being more connected and competitive,” she says. “In 10 years or so, robotics will become commonplace; just as self-driving cars will not be viewed as robots, they will be just another piece of technology we use to manage assets. I would like to see future generations working alongside robots, even learning from them.”
Robotics is coming to the fore now because of a drive to increase productivity, but also to reduce offshore manning, says David Lane, autonomous systems engineering professor and director of the Edinburgh Centre for Robotics. It’s also being enabled thanks to the likes of the iPhone, the miniaturization of electronics, the number of transistors that can now be put on a chip and developments in machine learning.
Oskar von Stryk, professor and head of the simulation, systems optimization and robotics group at the Technical University of Darmstadt, Germany, was on the winning team in Total’s Argos Challenge – Argonauts – alongside Austrian robotics firm Taurob, and is now working with Total to further develop offshore robotic systems.
He says that robotic systems can take very different forms but that there are three key elements that make up a robot:
- Sensors, to understand its own status and its environment
- Algorithms, to integrate the sensors and plan what to do
- An ability to act, and interact with humans and the environment
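The three elements above form the classic sense-plan-act loop. A minimal sketch in Python, assuming a hypothetical gas-detection patrol robot (the readings, threshold and actions are illustrative placeholders, not drawn from any real system):

```python
# Minimal sense-plan-act loop for a hypothetical inspection robot.
# Sensor values, threshold and action names are illustrative only.

GAS_ALARM_PPM = 50  # assumed alarm threshold, for illustration

def sense(readings):
    """Sensors: reduce raw numbers to a summary of the robot's situation."""
    return {"gas_ppm": max(readings), "alarm": max(readings) >= GAS_ALARM_PPM}

def plan(state):
    """Algorithms: integrate the sensed state and decide what to do."""
    return "report_leak" if state["alarm"] else "continue_patrol"

def act(action):
    """Actuation: carry out the decision (here, just a message)."""
    return f"executing: {action}"

# One pass of the loop, with simulated point readings
state = sense([3, 7, 61])
print(act(plan(state)))  # executing: report_leak
```

As von Stryk notes, the robot never "understands" the leak: it only computes over the point data its sensors return.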
“The key difference between a robot and a human is that humans see and understand. Robots just collect data – i.e. numbers – and compute to try to get some information from those numbers.” For example, the information the Argonauts gathered during the Argos Challenge was just point data.
Three classic tasks for robots are those that are dirty, dangerous, and dull. Robots can be more capable, efficient and fast, Stryk says. To date, robotic systems have focused on working in planar environments, e.g. self-driving vehicles (passenger, farming, mining, etc.), logistics systems (moving shelves to pickers), automated delivery robots, and search and rescue robots. For these, navigation, self-localization and mapping systems are needed. Going into more 3D environments, such as offshore platforms, is harder, he says.
Existing robotic systems have also mostly been working in isolation, i.e. in zones without human presence. But, there’s much work ongoing to enable robots to work alongside humans – so-called “cobots,” or collaborative robots. German firm Franka Emika has developed a robotic arm that responds to touch and moves in such a way that it can avoid collision with a human who is also moving.
These are some of the challenges an offshore robot might face, with the addition of having to be ATEX certified and able to handle harsh environments. Offshore robots will have to be able to navigate complex structures, including going up stairs, and might be tasked with scheduled operations, occasional operations, and emergency response. Tasks could include inspections such as acoustic measurement, as well as monitoring, gas detection, thermal imaging, and even maintenance, cleaning or valve operation. The robot could also find and fight fires.
The Argonauts’ robot was ATEX certified and able to climb stairs. “We have demonstrated it’s possible,” Stryk says. “Autonomous robots can be used on oil and gas sites. The vision, towards 2021, is mainly robots with industrial strength on offshore and onshore sites.”
Lane says that subsea systems, or ocean robotics, have been operating autonomously for over 40 years because of the limitations of underwater communication (the boundaries of which today are being chipped away). “Autonomy has long been part of what marine robotics have been about,” he says, highlighting the likes of the Remus AUV, introduced in 1990.
However, systems with more functionality are now emerging, some after years of development work, such as Subsea 7’s AIV (autonomous inspection vehicle), which has drawn on research at Heriot-Watt University and its spin-out, unmanned vehicle software specialist SeeByte. The AIV has been performing subsea inspections for Shell in the North Sea, untethered, Lane says. It’s able to locate itself, can go to 3000m water depth, can venture on 40km excursions and has a 24-hour dive time, depending on the mission.
“Marine robotics can currently do mapping and tracking very well, inspecting pipelines over quite long distances, but it’s scarier when you try to dock something,” he says. Having vehicles that can dock would make tasks easier, as they’d be stationary, but it also enables recharging and data download/upload for subsea resident vehicles.
There have been projects, such as the EU ALIVE project, led by Cybernetix, which worked on docking station technology.
However, there’s also been work on whether a vehicle could dock and then either perform pre-programmed tasks, such as turning a valve, or adapt to the situation using machine learning.
The Pandora EU AUV project has also investigated how to teach a robot to turn a valve. There has also been research into stabilizing the end of a manipulator arm, instead of trying to stabilize the whole vehicle, Lane says.
The University of Girona, in Spain (also involved in Pandora), focused its research on auto-learning, valve turning, and reaction to accidents, using its Girona 500 vehicle.
“We can do these things, the next phase is to make it robust so we can take it offshore. The hard part is cognition,” i.e. vehicles being able to recognize what they’re looking at, map and navigate unmapped areas on the fly, says Lane.
The same innovations are happening in unmanned aerial vehicles, in terms of increasing autonomy and sensor payload. To date, drones have mostly been manually flown to gather images as part of inspection work. Now the focus is on automated flight and extending inspection capabilities.
William Jackson, a researcher at Strathclyde University, UK, says drones can either build a CAD model or use an existing one to detect changes in a structure over time, for example an offshore wind turbine blade. The drone would be able to autonomously navigate around the blade, using the blade as its reference. For larger areas, drones could work in an array, with the images stacked into a high-resolution image.
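The CAD-comparison idea can be illustrated with a toy sketch: compare distances measured at fixed waypoints against the values expected from a reference model, and flag waypoints where the surface has deviated. The waypoints, numbers and tolerance below are invented for illustration, not from Strathclyde's work:

```python
# Toy change detection against a reference model, as a drone inspection
# might perform it: compare measured surface distances at fixed waypoints
# with the expected values from a CAD reference. All numbers are invented.

REFERENCE_MM = [100.0, 100.0, 102.5, 101.0]  # expected distance per waypoint
TOLERANCE_MM = 2.0                           # assumed damage threshold

def detect_changes(measured_mm):
    """Return waypoint indices where the surface deviates beyond tolerance."""
    return [i for i, (ref, got) in enumerate(zip(REFERENCE_MM, measured_mm))
            if abs(ref - got) > TOLERANCE_MM]

# A dent at waypoint 1 shows up as a 6mm deviation from the reference
print(detect_changes([100.3, 106.0, 102.0, 100.9]))  # [1]
```

Repeat surveys from the same waypoints are what make this comparison meaningful, which is why repeatable, autonomous flight paths matter for data quality.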
But, sensor payloads are going further. Drones have already been flown inside vessel tanks. Earlier this year, Texo Drone Survey and Inspection (UKCS), a division of Texo DSI, said it had performed the world’s first UT (ultrasonic thickness) measurement from a UAV. This was just part of a pilot at a test site, but it would extend drone inspection capability.
Texo says the system can be used on flat or curved surfaces and has been used on offshore and onshore wind turbine structures, as well as on telecoms and maritime assets. The inspection data is combined with a photogrammetric visual overlay of the completed survey, helping to pinpoint exact measurement locations on a structure or surface to sub-10mm accuracy, the firm says. Use of pulsed eddy current from a drone is also being tested, Jackson says. This is a non-contact electromagnetic technique for metal thickness testing.
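The principle behind UT measurement is simple: a pulse travels through the wall, reflects off the back face, and the wall thickness is the sound velocity times half the round-trip time. A minimal sketch, assuming a typical longitudinal sound velocity for steel (the transit time below is an invented example):

```python
# Ultrasonic thickness (UT) testing infers wall thickness from the
# round-trip time of a sound pulse: thickness = velocity * time / 2.

STEEL_VELOCITY_M_S = 5900.0  # typical longitudinal velocity in steel

def wall_thickness_mm(transit_time_us):
    """Wall thickness in mm from a round-trip transit time in microseconds."""
    transit_time_s = transit_time_us * 1e-6
    return STEEL_VELOCITY_M_S * transit_time_s / 2 * 1000

# A 3.39 microsecond echo corresponds to roughly a 10mm steel wall
print(round(wall_thickness_mm(3.39), 1))  # 10.0
```

In practice the velocity must be calibrated to the actual material and temperature; sub-10mm positional accuracy from the photogrammetric overlay then ties each thickness reading to a specific spot on the structure.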
The next step would be getting different robotics systems to work together, Lane says. This is something happening in the subsea industry, with autonomous surface vehicles communicating and working with autonomous underwater vehicles.
With the potential seeming limitless, the OGTC and ORCA Hub are focused on what would be useful to the industry. An OGTC robotics workshop in Aberdeen in early November produced 15 potential use cases for robotics and a gap analysis identifying areas that would benefit from future ‘Call for Ideas’ exercises or new joint industry projects.
The use cases included fully autonomous aerial drones that can plan and navigate their own flight path (something drone operators are already working on); small, highly agile robots that can autonomously climb, navigate and perform inspections, with little or no human intervention and support (such as Total’s Argos Challenge robots); and a type of pipeline inspection gauge (PIG) that is autonomous, adaptable, reliable, multifunctional and capable of working in harsh environments.
Further detail was hashed out for each area of robotics that could apply offshore:
Fully autonomous drones that could plan and navigate their own flight path would remove the need for manned inspections through remote solutions and would enable repeatable activities, improving data quality (data collected from the same place at known intervals), with easily fitted and replicable sensors and the capability for multiple-sensor monitoring and data collection.
Challenges to such a scenario include regulation around the machine learning systems used for automated flight, as well as payload limitations, drone and data security.
The OGTC workshop suggested areas that could be worked on to enable this technology, including launch and recovery systems, obstacle awareness and avoidance systems, alternative power options, drone diagnostics, communications systems integration into simultaneous operations and permit to work systems, and regulatory acceptance. There were also questions around flying in areas without GPS coverage, around live plant, and adapting to weather conditions.
The land workshop set out a vision for small, highly agile robots that can autonomously navigate complex, 3D oil and gas installations while localizing themselves, finding points of interest and performing non-destructive testing, needing little human intervention and support.
Such a system would help reduce offshore manning; enable more frequent, higher-quality inspections; aid fault prediction, maintenance scheduling and deeper analysis; improve safety by reducing staff risk exposure; and better capture and make available working knowledge of facilities (instead of it being lost as older crews leave and younger people are less interested in jobs offshore).
No challenges were suggested to this scenario but possible areas where work needed to be done to make it happen included “locomotion strategy,” i.e. how platform robots move about, navigation, localization and intelligence, power management, maintenance, payload integration, and communications infrastructure (OE: December 2017).
It was also suggested that multiple coordinated robots could be used, with a mother unit providing communications and navigation. How robots would adhere to various surfaces was also discussed, as well as ATEX compliance, and power/charging stations and payload, and integration with permits to work systems.
Subsea robotics would aid inspection and surveillance, including pipeline wall thickness measurement, augmented by cloud-based analytics and machine learning to identify corrosion, pitting, scabs, etc. Such a system would be deployable anywhere and could adapt to new environments, leverage past inspection data, flag faults, reduce survey time, improve data reliability and make end-user interpretation easier.
Challenges for this vision centered on data: security, sharing, format and handling/management. Areas for more work included creating open cloud databases of previous inspection data and automated communications tied to machine learning, i.e. an application to upload data into a cloud-based system with machine learning and data analytics.
There were also questions around how to deal with limited availability of ground-truth data, power and data management for long missions, working in variable environments, and reacting to breakdowns. There was also a lot of concern about commercial models, collaboration and common data models.
Images from the OGTC, by Rory Raitt, taken at the Robotics Week opening event in Aberdeen late 2017.