Hackers are finding ever more sophisticated ways to breach corporate IT defences and the results can be devastating. BP’s former security capability lead Tim Harwood describes the threats faced by modern businesses and why maintaining cybersecurity is now the responsibility of every employee.


On 15 August 2012, Saudi Aramco found out what kind of damage a security breach can cause. A virus – later named Shamoon – erased data on three quarters of the company’s PCs, leaving behind an image of a burning American flag.

To stop the bug from spreading, Aramco was forced to shut down its entire corporate network, cutting off employees’ email and internet access for several days. The company was still dealing with the aftermath months later, with remote access to its internal network prohibited.

Havex is the latest piece of malware to attack the oil and gas sector. Unlike Shamoon, it has been aimed squarely at industrial control systems (ICSs), with its Russian authors targeting the energy industry and other sectors over the last two years.

Though every refinery and production well relies on ICSs such as SCADA to manage critical processes, ICS manufacturers and operators have traditionally not prioritised cybersecurity in the same way as mainstream IT.

People still tend to think that security is done by the security team because they still see it as a technical issue, rather than being every user’s responsibility.

Who do you trust?
The 2010 Stuxnet attack on Iran was the big wake-up call for ICS security, sparking widespread concern over control system vulnerability. Thankfully, widespread ICS malware attacks have never materialised, but the energy industry still needs to be on its guard.

"Companies need to realise that their control systems are at risk of attack," warns security specialist Tim Harwood, formerly of BP and the founder and managing director of security specialist HS and T Consultancy.

"People never used to go after ICSs because they were largely invisible proprietary networks: the ‘security by obscurity’ advantage."

That has changed. The web now makes it simple to find detailed information on common SCADA communication protocols such as Modbus, OPC (Open Platform Communications) or Profibus. A search quickly reveals which firewall ports they use, which commands control devices directly, and where connections to remote networks might provide a way in.
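To see how little effort that reconnaissance takes, consider the hypothetical sketch below, which probes a single host for TCP ports commonly associated with ICS protocols. The address and port list are illustrative, and freely available tools such as Nmap and Shodan do the same job at far greater scale – and should only ever be pointed at systems you are authorised to test.

```python
# Minimal sketch (illustrative only): probing one host for TCP ports commonly
# associated with ICS protocols, to show how quickly exposed services show up.
import socket

ICS_PORTS = {
    502: "Modbus/TCP",
    4840: "OPC UA",
    20000: "DNP3",
    44818: "EtherNet/IP",
}

def probe(host: str, timeout: float = 1.0) -> None:
    """Attempt a TCP connection to each port and report which ones answer."""
    for port, protocol in ICS_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            result = sock.connect_ex((host, port))  # 0 means the connection was accepted
            status = "open" if result == 0 else "closed/filtered"
            print(f"{host}:{port} ({protocol}) -> {status}")

if __name__ == "__main__":
    probe("192.0.2.10")  # documentation-range address; replace only with a host you may test
```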

"Traders in London are asking to be connected to the proprietary ICs at the refinery so we can work with this information in real time," explains Harwood. "Vendors will want remote system access to perform updates. Do you really want to create a tunnel through your firewall? You have to carefully gauge the trustworthiness of third parties."

This growing, uncontrolled access weakens control system security, with hackers often targeting smaller companies with weaker defences as a way into their main targets. For example, the criminals who stole 40 million credit card numbers from Target earlier this year initially entered via a heating and ventilation supplier’s system.

The Havex malware also takes this route. The authors of this remote access Trojan (RAT) hacked into the websites of ICS manufacturers and infected their software update installers. Once an infected installer was downloaded, the trojan spread quickly across entire systems.

Links to the corporate network are another ICS weak point. Process state data often has to pass through firewalls to corporate networks – for example, to add records to historians, the databases of historical SCADA data that can be queried or used to display trends.
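As a rough illustration of what those historian records look like – a hypothetical sketch, not any vendor’s actual API – the snippet below appends a timestamped value for a named tag to an assumed SQLite store; real historians, such as OSIsoft PI, expose their own interfaces for this.

```python
# Minimal sketch (illustrative only): the shape of a typical historian record -
# a timestamped value for a named tag - appended to a local store.
# The field names, tag name and SQLite store are assumptions for the example.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("historian_demo.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS history (ts TEXT, tag TEXT, value REAL, quality TEXT)"
)

def append_record(tag: str, value: float, quality: str = "good") -> None:
    """Append one timestamped process value, as a SCADA gateway might send it across the firewall."""
    conn.execute(
        "INSERT INTO history VALUES (?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), tag, value, quality),
    )
    conn.commit()

append_record("FURNACE_01.OUTLET_TEMP_C", 412.6)

# Querying a tag's stored values is what trend displays are built on.
for row in conn.execute(
    "SELECT ts, value FROM history WHERE tag = ? ORDER BY ts",
    ("FURNACE_01.OUTLET_TEMP_C",),
):
    print(row)
```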

Sharing information on security threats, vulnerabilities and solutions in a trusted environment is one of the best ways to ensure that the same attacks don’t succeed elsewhere.

Attack plan
Harwood advises "thinking like an attacker" to plan defences, citing Lockheed Martin’s Cyber Kill Chain framework. This lays out seven different stages in breaching security, from initial reconnaissance, delivery and installation of malware on a system to establishing full remote control and taking damaging action.

"Use your own network as an example," he says. "If I was going to attack my own network, I would do this, so what do I need to put in place to defend it? For a start, think about who you are connected to."

The classic solution is to have an ‘air gap’ buffer zone between control systems and the corporate network with a firewall on each side, often termed a ‘DMZ’ (demilitarised zone) in security parlance. Every passing packet of data is isolated and carefully inspected before being allowed through.

"A DMZ should be able to talk both ways and give you full control of exactly what goes in and out of the control system environment," explains Harwood. "You should be pulling in any data required by the ICS, rather than having it pushed on to you by the corporate networks. You also need to understand that firewalls have limits."

Plugging those other gaps isn’t easy. For example, Stuxnet’s designers gave their worm eight different spreading mechanisms. It propagates via secondary pathways rarely considered in most security designs, such as infected PLC project files, USB keys and maintenance laptops. When it finds a firewall in its way, it piggybacks on protocols that are typically allowed through to the control system, so it avoids setting off any alarms.

Havex also employs other attack channels such as spam emails and web-based exploits. Once inside a system, it scans any linked networks for devices that respond to OPC requests, gathers information on them and transmits that back to its base server for use in future attacks.

That means that, as well as external barriers, internal ICS devices and architectures must be designed to resist any malware that does get through.

As well as standard ‘hardening’ approaches, such as locking down USB ports, other options include dividing plants into security zones in order to contain infections, or using satellite-based communication for remote ICSs to avoid vulnerable public telephone systems and to permit the use of built-in encryption.

"If an attacker knows you use a certain phone number, they will go 50 digits each side of it, get a power dialler and just try every single number to see if there’s a modem, a copier/scanner or another piece of vulnerable equipment on the end of it," warns Harwood.

Security/risk
Organisational and cultural differences add to the challenge of securing ICSs. For example, a laptop OS can simply download security patches and install them before it shuts down, but that’s not an option for a PC controlling a vital part of a refinery. With something as time-sensitive as anti-virus protection, the security implications are obvious.

"You might not be able to patch it for 90 days, or even six months, until it’s offline," says Harwood.

"It’s the same in any production environment. Something like a nuclear power station is the extreme example: you can’t just switch it off because there’s some patching to do."

In many companies, operational teams handle the ICSs while the IT department runs the corporate networks. Sometimes based in different countries, they may not even speak the same language and certainly have different priorities. According to Harwood, an ‘IT/OT’ culture clash is common in any business with extensive control systems.

"One big risk for IT might be loss of data," he says, "but for ICSs, the biggest risk is loss of life. For them, the ultimate goal is ensuring the safe and reliable operation of the plant, and that affects everything from throughput to availability to data integrity."

With IT and operational teams on different sides of the same firewall, they need to get together and ensure they can communicate. That might mean talking about cybersecurity in the same way as the need to wear steel toe-capped boots onsite.

"Take out the word ‘security’ and substitute ‘risk’," advises Harwood. "That gets the message across far better, because they understand the terminology. They perform risk assessments day in, day out, so that’s familiar to them."

The human side of ICS security extends from making sure employees have the skills to operate what is often outdated software, to acting as human firewalls and sensors who can spot abnormal operations that might indicate a security breach.

To help employees decide what’s reportable and what’s just noise, management first need to set out normal parameters and set points.
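In practice that can be as simple as publishing a normal operating band for each tag and flagging anything that strays outside it, as in the illustrative sketch below (the tags and thresholds are invented):

```python
# Minimal sketch (invented tags and thresholds): compare live readings against the
# normal operating bands management has defined, so operators watching the HMI have
# a concrete rule for what is reportable and what is just noise.
NORMAL_BANDS = {
    "FURNACE_01.OUTLET_TEMP_C": (380.0, 430.0),
    "PUMP_07.DISCHARGE_BAR": (4.5, 6.0),
}

def check_reading(tag: str, value: float) -> str:
    """Return a plain-language verdict for one reading."""
    low, high = NORMAL_BANDS[tag]
    if low <= value <= high:
        return f"{tag}={value}: within normal band, no action"
    return f"{tag}={value}: OUTSIDE band ({low}-{high}) - report to operations/security"

print(check_reading("FURNACE_01.OUTLET_TEMP_C", 412.6))
print(check_reading("PUMP_07.DISCHARGE_BAR", 7.2))
```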

"People in our industry still tend to think that security is done by the security team because they still see it as a technical issue, rather than being every user’s responsibility," says Harwood. "The people watching the HMIs [human machine interfaces] become the first layer of defence."

Staff can be a critical cybersecurity weakness far beyond ICS. Employees often fail to follow security policies and procedures, whether it’s remembering not to plug USB sticks picked up at conferences into company laptops, or refraining from logging in remotely to the corporate network from a country known to be risky.

"People aren’t stupid, they are just really busy," Harwood says. "Think hard about how, for example, you increase recognition of fraudulent calls and emails, and how you roll that out. Do you do it using posters, or face to face?"

Built in, not bolted on
It can be difficult, however, to gain funding approval for this kind of training and awareness in an industry focused more than ever on reducing operating costs. Harwood notes, "It’s tough to go to the board and say you’d like to spend $300,000 on a series of awareness campaigns when the best result will be that nothing happens."

Industry collaboration is a bright spot in the ICS security story. Initiatives such as the oil and natural gas Cybersecurity Capability Maturity Model (ONG-C2M2) and LOGIIC (Linking the Oil and Gas Industry to Improve Cybersecurity) are established examples of this kind of collective approach. The latter gave birth to Achilles certification, a set of tests that system suppliers can be certified against. This builds security in from the start, rather than having energy companies bolt it on afterwards.
Understandably, businesses that have been hacked don’t want to publicise the fact, but sharing information on security threats, vulnerabilities and solutions in a trusted environment is one of the best ways to ensure that the same attacks don’t succeed elsewhere. Painful as it must have been, that’s what Saudi Aramco did after the Shamoon attack, and the industry should thank it.