Maintaining and monitoring millions of miles of pipelines no longer needs to be such an invasive and extensive process thanks to new data-analysis systems, but to what extent can the industry put its faith in big data at the price of a costly system upgrade? World Expro speaks to GE Oil & Gas’s Brad Smith about how the company’s Intelligent Pipeline Solution can give a new, efficient perspective on gathering linear asset data.


With nearly two million miles of transmission pipeline network worldwide generating data at a rate of 17TB per 30,000 miles, the prospect of effectively managing and using that wealth of information certainly makes the phrase ‘big data’ ring true. However, with the ‘industrial internet of things’ hitting its stride, the rise of data-integration solutions is steadily optimising asset management, meaning that coping with pipeline maintenance challenges such as risk potential and ageing infrastructure can now be less labour-intensive for providers.

"We’ve seen this happen in other verticals, other industries," explains Brad Smith, business leader of intelligent pipelines at GE Oil & Gas. "Technology is allowing more data to be crunched and more tools to be applied to that data, enabling us to make decisions in the pipeline space that really haven’t been there in the past."

The Intelligent Pipeline Solution (IPS) is a software package, designed by GE and Accenture, that provides a standard way to run data analytics on a grand scale by combining multiple data streams into one dashboard. This allows operators to enhance their services, mitigate risks and prepare for potential scenarios more efficiently.
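As a rough sketch of that ‘many streams, one dashboard’ idea – the source names, fields and figures below are hypothetical, not GE’s actual IPS schema – several per-segment data feeds can be folded into a single combined view:

```python
# Minimal sketch: fold several per-segment data streams into one dashboard view.
# Source names, fields and values are hypothetical, not GE's IPS schema.
from collections import defaultdict

scada_readings = [
    {"segment": "A-12", "pressure_psi": 880, "flow_mmcfd": 410},
    {"segment": "A-13", "pressure_psi": 905, "flow_mmcfd": 395},
]
inspection_records = [
    {"segment": "A-12", "last_inline_inspection": "2014-06-01", "anomalies": 3},
]

# Merge every stream into one record per pipeline segment.
dashboard = defaultdict(dict)
for stream in (scada_readings, inspection_records):
    for record in stream:
        dashboard[record["segment"]].update(record)

for segment, view in sorted(dashboard.items()):
    print(segment, view)
```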

"I think it’s very important for operators to invest in acquiring data that’s maybe not currently available to them," says Smith. "So when they get access to this data, they can make the most informed decisions possible. This will make pipelines more efficient and, most importantly, more safe."

Systems integration

IPS came about through GE’s focus on taking large subsets of data from different, seemingly unrelated systems, routing them into a single location and then applying deep domain expertise to generate a new piece of information altogether.

"For example, operators can use geographic information systems [GIS] to map all their asset data," explains Smith. "They can have work management systems that contain all the work orders and information; SAP systems; and Stata operational systems that they use in the control rooms to monitor pressures, flows and temperatures."

Smith has not, for the most part, come across operators that use all these methods to channel static and dynamic data into one place; IPS was therefore intended to work in concert with providers’ existing systems to give them a workable overlay of information analytics.
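To illustrate the kind of overlay Smith describes – purely as a sketch, with invented adapter functions and field names rather than anything from IPS itself – records exported from a GIS, a work-management system and a SCADA snapshot can be normalised into one common, milepost-keyed model:

```python
# Sketch of routing records from unrelated source systems into one common model.
# The adapters and every field name here are illustrative assumptions only.
def from_gis(row):
    # GIS export: asset attributes keyed by line and milepost.
    return {"line": row["LINE_ID"], "milepost": row["MP"],
            "kind": "asset", "detail": row["ASSET_TYPE"]}

def from_work_orders(row):
    # Work-management export: work orders against a pipeline location.
    return {"line": row["line"], "milepost": row["location_mp"],
            "kind": "work_order", "detail": row["description"]}

def from_scada(row):
    # SCADA snapshot: operating values at instrumented stations.
    return {"line": row["line"], "milepost": row["station_mp"],
            "kind": "reading", "detail": f"{row['tag']}={row['value']}"}

sources = [
    (from_gis, [{"LINE_ID": "L100", "MP": 42.3, "ASSET_TYPE": "valve"}]),
    (from_work_orders, [{"line": "L100", "location_mp": 42.5, "description": "recoat"}]),
    (from_scada, [{"line": "L100", "station_mp": 40.0, "tag": "PT-1", "value": 912}]),
]

# One unified, linearly referenced record set, ordered along the pipeline.
unified = [adapt(row) for adapt, rows in sources for row in rows]
for record in sorted(unified, key=lambda r: r["milepost"]):
    print(record)
```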

Choosing to upgrade pipeline technology obviously requires extensive financial consideration, but the consequent savings that IPS can generate are highly likely to outweigh the initial expenditure. As not every company will require the same upgrades or want to use the same kind of data, the extent of their investment often depends on the existing systems they have implemented.

"If operators have already invested in bringing their data into one place in a consistent model where they have easy access to it, we’re that much closer to putting those numbers into another tool to run analytics from," Smith explains. "So we really focus on understanding what problems operators have. That tells us what data is needed, and the financial impact of that varies. We get to understand the quality and completeness of their data, and there’s a lot of value in letting that drive what is likely to be millions of dollars’ worth of investment."

Collaborative development

Putting such a large investment to optimum use requires an understanding of linear asset data, which is somewhat different from that of fixed assets because of the necessarily different ways in which the data is collected and analysed.

In the case of fixed assets, collecting data is somewhat simpler. Commonly, offshore platforms and other oil and gas assets will carry devices with numerous sensors that constantly relay data back in real time. The numbers are then crunched and analysed to determine maintenance activities and minimise downtime.


"One operator I spoke with recently was experiencing uptime in the mid-to-upper 80th percentile, and with some maintenance and asset performance management solutions, that increased to 91%," explains Smith.

Gathering data from linear assets is a very different challenge. Across a 500-mile stretch of pipeline, the data is much more static – there are no constant real-time streams – and there is the added geospatial challenge of tying every record to a position along the route. IPS’s pipeline data optimisation lets providers tread new ground here.
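One common way to meet that geospatial challenge – offered here as a general sketch, not necessarily how IPS does it – is linear referencing: snapping an event’s coordinates onto the pipeline centreline and reporting its position as a milepost:

```python
# Sketch of linear referencing: locate an event along a pipeline centreline
# by snapping a point to the nearest vertex and reporting its milepost.
# Simplified planar geometry; a real system would use geodesic distances
# and project onto segments, not just vertices.
import math

# Centreline as (x, y) vertices, in miles: 5 mi + 6 mi of pipe.
centreline = [(0.0, 0.0), (3.0, 4.0), (3.0, 10.0)]

def mileposts(points):
    # Cumulative along-route distance at each vertex.
    mp, out = 0.0, [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        mp += math.hypot(x1 - x0, y1 - y0)
        out.append(mp)
    return out

def nearest_milepost(point, points):
    posts = mileposts(points)
    dists = [math.hypot(point[0] - x, point[1] - y) for x, y in points]
    return posts[dists.index(min(dists))]

print(nearest_milepost((3.1, 4.2), centreline))  # 5.0: event near milepost 5
```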

"Even just the data aggregation and communication is changing the way operators run their business processes, the way an integrity team collaborates with an operations team – how do they send their reports, how fast do they respond and send their work through? I think the pipeline industry is really just now starting to see this data in ways that it can actually do something with."

Smith confirms that training will be necessary for personnel required to operate the IPS software. Rather than accumulating all of a company’s requirements and building the software fully prior to release, GE has taken an approach that invests in user-centred design, bringing in expertise from consumer firms such as Google, as well as firms specialising in designing user-oriented software applications.

"Rather than just mapping the program requirements in a lab, this method yields a much deeper understanding of what the user’s day-to-day life looks like – who they interact with and the applications they use – and then uses that insight to drive how to develop the software," says Smith.

That approach is predicated on the assumption that if the software is built in an intuitive fashion – very easy to understand and adopt – employees can learn quickly and then find new ways to do their job.

"As we develop and release the software, we’re constantly engaging with providers to generate progress," explains Smith. "We showcase the product to demonstrate where we’re at so we can get immediate feedback.

"That allows much tighter collaboration – we are developing with the operator in mind, continually iterating and reacting. More important than the training itself is developing the software with the operator and recognising what role the user plays."

Room to expand

With a wealth of new data being aggregated, there are measures that pipeline operators must put into place to accommodate the extra required storage. IT departments will play a critical role in this, as well as in the way source systems are managed and maintained – these programs now use and depend on data in very different ways.

"Say you have six sources of information that each has its own set of users and objectives," says Smith. "Now combine that into one batch and start to make new business processes around it – that is a big change to how frequently data is updated and systems are refreshed. So I think our IT department will also play a critical role in the strategy and maintenance of these systems."

With a background in hardware and software, Smith knows there are always competitors. "In terms of hardware, you tend to have a well-structured market – the product is being sold, and you have people that are competing against it," he says. "In the software space, it’s much more of a grey area; you have software that can scale and do a lot of different things, so it’s key to a lot of partnership opportunities."

In order to be successful, IPS has to work with existing systems, requiring Predix, the software powering IPS, to be able to interact with systems currently in place. It also needs to provide tools that allow third parties to develop new applications. Without this innate compatibility, operators would end up relying heavily on one tool, rather than using it as a jumping-off point for their own data strategies.
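As a generic illustration of that kind of extensibility – this is not the actual Predix API, just a minimal plugin pattern – third-party applications can register themselves against a shared data interface:

```python
# Generic illustration of platform extensibility (NOT the actual Predix API):
# third-party applications register against a shared data interface.
registry = {}

def register(name):
    # Decorator that records an application under a name in the registry.
    def wrap(fn):
        registry[name] = fn
        return fn
    return wrap

@register("leak_screening")
def leak_screening(records):
    # Hypothetical third-party analytic over the platform's unified records.
    return [r for r in records if r.get("pressure_psi", 0) < 300]

@register("anomaly_summary")
def anomaly_summary(records):
    return sum(r.get("anomalies", 0) for r in records)

data = [{"segment": "A-12", "pressure_psi": 250, "anomalies": 2},
        {"segment": "A-13", "pressure_psi": 900, "anomalies": 0}]

for name, app in registry.items():
    print(name, "->", app(data))
```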

"It’s an oversimplification," explains Smith, "but this is a good way to think about it: we aren’t necessarily trying to sell a platform as much as we are what the platform enables you to do to make improved business outcomes."

Whether pipeline operators will be willing to venture such a large financial investment in solutions like IPS remains a big question, but a revolution in the efficiency, safety and uptime of linear asset management – one that has not previously been possible – is a prospect as enticing as it is now feasible.