3 August 2015

Hacking Critical Infrastructure: A How-To Guide

BY PATRICK TUCKER


Cyber-aided physical attacks on power plants and the like are a growing concern. A pair of experts is set to reveal how to pull them off — and how to defend against them.

How easy would it be to pull off a catastrophic cyber attack on, say, a nuclear power plant? At next week’s Black Hat and Def Con cybersecurity conferences, two security consultants will describe how bits might be used to disrupt physical infrastructure.

U.S. Cyber Command officials say this is the threat that most deeply concerns them, according to a recent Government Accountability Office report. “This is because a cyber-physical incident could result in a loss of utility service or the catastrophic destruction of utility infrastructure, such as an explosion,” the report said. Such attacks have happened before. The most famous is the 2010 Stuxnet worm, which damaged centrifuges at Iran’s Natanz nuclear enrichment plant. (It has never been positively attributed to anyone, but common suspicion holds that it was the United States, possibly with Israel.)

Scheduled to speak at the Las Vegas conferences are Jason Larsen, a principal security consultant with the firm IOActive, and Marina Krotofil, a security consultant at the European Network for Cyber Security. Larsen and Krotofil didn’t need to hack power plants to prove the exploits work; instead, Krotofil has developed a model that can be used to simulate power plant attacks. It’s credible enough that NIST uses it to find weaknesses in systems.

The idea is to help cybersecurity professionals understand what to look for and design intrusion detection software to prevent attacks from taking place. You can’t guard an asset until you know which weak spots your enemy will use to grab your prize. And when it comes to online attacks, the weak spots in U.S. infrastructure are many. Still, Larsen hopes he doesn’t get “crucified” for his presentation. When asked whether a single error or issue was common across the various installations accounted for in the model, perhaps a single unlocked back door that made power plants, chemical plants, and other pieces of infrastructure vulnerable, Larsen replied, “The answer to that is, which one?”

A hacker bent on destruction might try various methods. There are “water hammers,” attacks that destroy piping by closing valves too fast. There are three-phase attacks that cause gears to spin too quickly, too slowly, or out of sync with other vital pieces of equipment. (The so-called Aurora vulnerability is one of these.) And there are collapse attacks, in which the hacker fills a round tube or container with hot liquid, rapidly closes the lid, and waits for the liquid to cool to create a vacuum. “A lot of the round stuff we build doesn’t hold up to vacuums very well. Whole valves that you can drive trucks through can collapse like a beer can,” Larsen told Defense One.
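For a rough sense of why a too-fast valve closure is so destructive, the classic Joukowsky relation says the pressure surge from an instantaneous closure equals the fluid density times the pipe’s pressure-wave speed times the flow velocity that gets stopped. The short Python sketch below is a back-of-the-envelope illustration only; the pipe and flow numbers are assumed, not taken from Larsen’s or Krotofil’s work.

# Back-of-the-envelope water-hammer surge for an instantaneous valve closure,
# using the Joukowsky relation: delta_p = rho * a * delta_v.
# All parameter values below are illustrative assumptions, not plant data.

rho = 1000.0     # water density, kg/m^3
a = 1200.0       # pressure-wave speed in a water-filled steel pipe, m/s (typical order)
delta_v = 3.0    # flow velocity eliminated by slamming the valve shut, m/s

delta_p = rho * a * delta_v                 # surge pressure in pascals
print(f"Surge: roughly {delta_p / 1e5:.0f} bar above normal line pressure")

# Closing the valve more slowly than the pipe's critical period (2 * L / a, where L
# is the pipe length) spreads the momentum change out and keeps the surge far smaller.

Even with these modest numbers the surge comes out to tens of bar above normal line pressure, which is why a compromised controller that slams valves shut can wreck piping without anyone touching the pipe.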

Still, it remains far easier to get online access to a computer or network than it is to cause physical damage to infrastructure. Such attacks require a very specific understanding of a physical event playing out — creating a vacuum, turning a valve, rotating a piston, etc. — and specific knowledge of a particular plant or facility.

“For instance, the attacker probably needs the point database because he needs to know that Point 16 operates the oil pump and Point 17 is the light in the bathroom…Without it, it’s hard to launch an effective attack,” said Larsen.
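As a purely illustrative sketch echoing Larsen’s oil-pump-versus-bathroom-light example (the tags and structure below are invented), a point database is essentially a mapping from numeric point IDs to the physical equipment they drive; without it, writing to a point is a blind poke:

# Toy "point database": SCADA point IDs mapped to the equipment they control.
# The IDs, tags, and descriptions here are invented for illustration.
point_db = {
    16: {"tag": "OIL_PUMP_01", "desc": "main oil pump run command"},
    17: {"tag": "LIGHT_WC_02", "desc": "bathroom light"},
}

def describe_write(point_id, value, db=None):
    # With the database, a write maps to a physical action; without it, the effect is unknown.
    if db is None or point_id not in db:
        return f"write {value} to point {point_id}: physical effect unknown"
    entry = db[point_id]
    return f"write {value} to point {point_id}: drives {entry['tag']} ({entry['desc']})"

print(describe_write(16, 1))            # attacker without the point database
print(describe_write(16, 1, point_db))  # attacker, or defender, with it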

The attack on the Natanz enrichment plant is illustrative. “When Stuxnet came out, the very first version had a payload. It went over there and the effective process broke a whole bunch of stuff. But the actual creation of the payload…a lot of people had to work hard behind the scenes trying to figure out, ‘Oh, there’s a spinning apparatus. I can go damage the spinning apparatus. What information do I need to know to do that?’” asked Larsen. In general, he said, “We don’t have the roadmap for an attacker once he gets in, where he gets to the final payload” that does the damage.

Still, it’s time to start beefing up cyber defense, he said. Defenders need a comprehensive overview of plant cyber security, better sensors inside the facility, better control processes, and much better sensitivity to small abnormalities. This is what Krotofil calls process security — protecting the overall plant. Traditional IT security is insufficient, she said.

“People say, [supervisory control and data acquisition] systems are vulnerable because there is not enough traditional IT security put in place,” she said. “Well, that’s rubbish, because we just presented two attack possibilities where you can control the process at will even if it’s password-protected and encrypted. We also show that you can exercise the process at will despite all the IT security you put in place. And we can spoof process states, to make the operator believe that everything is fine.”

This sort of research can reveal the most likely vulnerabilities in a target — but turning keystrokes into physical damage requires more, says Larsen. “If you’re hitting a nuclear reactor, you really have to know what the estimates they’re using for flux and fluids are. That might not be really obvious. One of the ways to do that is tweak the process a little bit and see how it responds. If you can figure out how people would normally go about doing these little tweaks and responses to tune their cyber weapon, we can actually go look for those and develop signatures for them. We can say, ‘Oh, someone might be tweaking a process’ before someone launches a full-blown attack.”
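One way to make that concrete, as a minimal sketch under assumed thresholds rather than a description of Larsen’s actual tooling, is to baseline each process variable and flag small but sustained deviations of the kind an attacker’s reconnaissance tweaks would produce:

# Minimal, illustrative check for "someone may be tweaking the process":
# baseline a sensor reading and flag small but sustained deviations.
# Window size, sigma threshold, and the sample data are invented for this sketch.
from collections import deque
from statistics import mean, stdev

class TweakDetector:
    def __init__(self, window=60, sigma=3.0, min_run=5):
        self.history = deque(maxlen=window)  # recent readings treated as normal
        self.sigma = sigma                   # deviations beyond sigma * stdev look odd
        self.min_run = min_run               # consecutive odd samples before alerting
        self.run = 0

    def update(self, reading):
        anomalous = False
        if len(self.history) >= 10:          # wait for a minimal baseline
            mu, sd = mean(self.history), stdev(self.history)
            anomalous = sd > 0 and abs(reading - mu) > self.sigma * sd
        if anomalous:
            self.run += 1
        else:
            self.run = 0
            self.history.append(reading)     # only clean readings update the baseline
        return self.run >= self.min_run

# Usage: feed a stream of readings for one process variable.
detector = TweakDetector()
readings = [50.0, 50.1, 49.9, 50.0] * 10 + [51.5] * 6   # simulated small, deliberate tweak
for value in readings:
    if detector.update(value):
        print("possible process tweak detected at reading", value)
        break

A real deployment would tune the window, thresholds, and models to the specific plant and variable, but the basic structure (baseline, deviation test, persistence check) is one way to build the kind of signature Larsen describes.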

For policymakers, Larsen offers this advice: create a place for engineers to share data, and then butt out so they can do it. “There’s been a lot of information-sharing things that have sprung up,” including the Cybersecurity Framework the White House put out last year, he said. “What we need is information sharing between engineers at various facilities in order to improve. But sharing information is dangerous because eventually you are going to share the information for how to attack somebody else. So the programs for information sharing have started off with lofty ideas and ended up with a very conservative, to the point of not being useful, implementation because no one wants to be the guy who leaked the information that somebody used to go attack something,” he said. “On the policy decision, I would say that the government’s role should be to mandate and facilitate the information sharing, but not be a member of the information sharing.”

Of course, there are also some vulnerabilities that are easy to fix. “Putting in a pressure relief valve in place is actually way cheaper than all the cyber work you have to do,” he said.
