Cyber Attacks on U.S. Critical Infrastructure will Intensify

Did Iran really detect a planned "massive cyber attack" against its nuclear facilities, as reported by Reuters last week? And have they really “taken [the] necessary measures” to contain it? Or has their posturing been affected by the revelations in “Confront and Conceal: Obama’s Secret Wars and the Surprising Use of American Power” (Confront and Conceal), the new book by New York Times writer David E. Sanger? Furthermore, what does this have to do with ICS and SCADA security?

In a recent blog article (Stuxnet Warfare – The Gloves are Off) we discussed Mr. Sanger's book. At that time, we noted that Mr. Sanger’s statements that the U.S. and Israel were behind Stuxnet “made it difficult for the U.S. Administration to deny it was behind the Stuxnet attacks”. Indeed, the Reuters article seems to treat the attribution of Stuxnet to the U.S. and Israel as fact, indicating the impact of Sanger’s disclosures on how the world now interprets sophisticated advanced persistent threats such as Stuxnet and Flame.

Iran's president, Mahmoud Ahmadinejad, visiting the Natanz Uranium Enrichment Facility Photo: AP

In our earlier blog article we also stated that:

“This means that the gloves are off. Cyber warfare has moved from “you don’t ask and we don’t tell” to open aggression between countries.”

Open aggression is how Reuters characterizes Iran’s statement that its “arch enemies the United States and Israel, along with Britain, had planned the [massive cyber] attack”.

Scary Simulations Highlight How Hard it is to Protect ICS and SCADA Systems

Since our first blog article on “Confront and Conceal” I have read the entire book and been startled by a lot of what it says. I am going to outline here some of the remarkable points it makes that pertain to cyber warfare, and more importantly to industrial cyber security.

The book describes two cyber attack simulations that the U.S. government has conducted. One was at the cyber-emergency center in Idaho Falls1. On the edge of town a simulated chemical company was created and its equipment was connected to controllers built by Honeywell, Siemens, and other major manufacturers. Two teams were set up: an attacking “red team” and a team of defenders.

“It wasn’t a fair fight, or a lengthy one. In cyberattacks, all the advantages lie with the attacker – the element of surprise, the ability to hit multiple weak spots at once, the mystery of where the attack is coming from. A team of ‘defenders’ trying to protect the mock chemical company was quickly overwhelmed; when you walked downstairs, a small automated chemical factory appeared to be in chaos, with liquid spills occurring all the time, mixing machines shaking, black smoke pouring out for effect. The operators were unable to shut any of it off because the attackers had taken control of the electrical system too.”

In a second government simulation, a hacker turned off the lights of New York City.  Photo Courtesy Matt Apps

Another simulation, done in March 20122  was “a vivid demonstration of what it might look like if a dedicated hacker – or enemy state - decided to turn off the lights in New York City.” The attack started when a power utility worker clicked on a link in an email that appeared to be from a trusted friend. It was a “spear phishing” attack and it duped the authorized user into letting cyber invaders into the computer systems that run New York’s electric grid. Since the simulation included a heat wave, it took a while for operators to realize it was not an ordinary blackout. Then, no one could figure out where the trouble originated, which is the first step to restoring power.
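The lure in this simulation worked because the message looked like it came from a trusted friend. One simple defensive check that targets exactly this trick (a sketch of a common heuristic, not anything described in the simulation; the names and data below are invented for illustration) is to compare each incoming sender's display name against the address book and flag mismatched addresses:

```python
def flag_spoofed_senders(messages, known_contacts):
    """Flag emails whose display name matches a trusted contact but
    whose address does not -- a common spear-phishing tell.

    messages:       list of (display_name, address) pairs.
    known_contacts: dict mapping a trusted display name to the
                    address it is expected to use.
    """
    suspicious = []
    for name, addr in messages:
        expected = known_contacts.get(name)
        # Only a known name with an unexpected address is suspicious;
        # unknown senders are left for other filters to judge.
        if expected is not None and addr.lower() != expected.lower():
            suspicious.append((name, addr))
    return suspicious

# Hypothetical example: "Alice Ng" is trusted, but one message using
# her name arrives from an unfamiliar domain.
contacts = {"Alice Ng": "alice@utility.example"}
inbox = [
    ("Alice Ng", "alice@utility.example"),
    ("Alice Ng", "a.ng@mail-relay.invalid"),
    ("Unknown Vendor", "sales@vendor.example"),
]
print(flag_spoofed_senders(inbox, contacts))
```

A check this simple obviously will not stop a determined attacker, but it illustrates how little it takes to surface the kind of lure that, in the simulation, was enough to open the grid's operational systems.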

The goal of this second demonstration was to press Congress into passing a bill to require that critical infrastructure companies “bend to national standards, and national supervision, to secure their networks.”

Why Libya was not Suitable for Cyber Attack

When the U.S. was considering what role it should play in supporting the uprising in Libya, a cyber attack on Qaddafi’s air defenses was considered. However, this approach was not taken because there was not enough cyber intelligence available about the Libyan air defense systems.

Remember that in the case of Stuxnet, the malware was actively listening and learning for years before an attack was made. “Beacon” code3 was inserted into Natanz that “phoned home” to describe “the structure and rhythms of the enrichment plant… to understand how the centrifuges were connected to what are called [PLCs]”.

Earlier this year the Flame virus was discovered, and it has been called “the most powerful espionage tool ever to target countries” by the International Telecommunication Union, the United Nations agency responsible for information and communication technologies. It issued a formal warning telling member nations that Flame “could potentially be used to attack critical infrastructure”.

Kaspersky Lab’s figures show that Flame’s infection sites were spread across the Middle East, with 189 attacks in Iran, 98 incidents in the West Bank, 32 in Sudan, and 30 in Syria, plus attacks in Lebanon, Saudi Arabia and Egypt. Other reports indicate that Flame’s data collection activities specifically target AutoCAD drawings.

It seems that the U.S. could be using Flame to collect intelligence for future cyber attacks on industrial systems.

What are the Lessons for ICS and SCADA Security?

What the cyber offensive moves of the U.S. tell us is that information about control systems matters. The old thinking of “security by obscurity” is dead as a doornail now.

Second, it is clear that cyber attacks against control systems take time. The U.S. was able to attack Natanz because it had the time: it was able to quietly infect the control network with its “beacons” and run reconnaissance for years. Conversely, it could not use cyber attacks against Libya because it did not have enough time. So don’t expect attacks on SCADA to suddenly show up tomorrow – they could take years. And the longer they take, the more devastating they could be.

Bottom line - if you think your facility has not been infiltrated, you might want to look harder. If you notice any unusual behaviour on either the IT side or the automation side of things, you should do a thorough analysis of it. Be sure your evaluation considers the possibility of cyber intelligence “beacons” or a staging of minor disruptions that could lead to larger ones.
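As a rough illustration of what such an evaluation might look for, the sketch below scans a connection log for destinations contacted at machine-regular intervals, a common heuristic for spotting “phone home” beaconing. The thresholds and the log data are hypothetical, chosen only to make the idea concrete; they are not drawn from any of the incidents discussed above.

```python
from collections import defaultdict
from statistics import mean, pstdev

def find_beacon_candidates(events, min_events=6, max_jitter=0.1):
    """Flag destinations contacted at suspiciously regular intervals.

    events: iterable of (timestamp_in_seconds, destination) pairs.
    A destination is flagged when it has at least `min_events`
    connections and its inter-arrival times vary by less than
    `max_jitter` (coefficient of variation) -- traffic that looks
    machine-scheduled rather than human-driven.
    """
    by_dest = defaultdict(list)
    for ts, dest in events:
        by_dest[dest].append(ts)

    candidates = []
    for dest, times in by_dest.items():
        if len(times) < min_events:
            continue
        times.sort()
        # Gaps between consecutive contacts with this destination.
        gaps = [b - a for a, b in zip(times, times[1:])]
        avg = mean(gaps)
        if avg > 0 and pstdev(gaps) / avg < max_jitter:
            candidates.append((dest, avg))
    return candidates

# A host "phoning home" every ~300 seconds stands out against
# normal, irregular user traffic (all values invented).
log = [(i * 300.0, "203.0.113.9") for i in range(10)]
log += [(t, "intranet.example") for t in (12.0, 95.0, 400.0, 2100.0, 2600.0, 2700.0)]
print(find_beacon_candidates(log))
```

Real beacons add random jitter and blend into legitimate traffic, so a production tool would need far more than interval statistics; the point is simply that periodic reconnaissance traffic leaves patterns that routine log review can catch.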

How will your team and the executives in your company react if a cyber-reconnaissance effort is detected? What if it had progressed from beaconing to attacking? Perhaps doing some simulations yourself is not a bad idea.

Finally, Sanger’s book says that Obama and his administration have been very quiet about their cyber warfare initiatives because they did not want to spur attacks on the U.S. Based on last week’s Reuters article, the quiet period is over, and attacks, whether from nation states, hackers or criminals, will be increasing. Particularly if you are located in the U.S., now is the time to renew and possibly redouble your cyber security efforts.

What do you think of the U.S. move into cyber offense?  How does it affect your thinking for protecting your plant?

1Confront and Conceal, Kindle location 3335, Chapter 8
2Confront and Conceal Kindle location 4190, Chapter 10
3Confront and Conceal Kindle location 3115, Chapter 8

Related Content to Download

White Paper: "Using ANSI/ISA-99 Standards to Improve Control System Security"

Download this White Paper and learn about:

  • The ANSI/ISA-99 Zone and Security Model
  • A Real World Oil Refinery Example
  • Implementing Zones and Conduits with Industrial Security Appliances
  • Testing and Managing the Security Solution

Related Links

[Update Sept 27, 2012]



Comments


I have participated in the RED-BLUE exercise at INL. It is not a fair fight, not because the red (attack) capability is so dominant, but because the blue team's network starts out unprotected. This is not a realistic scenario. Even with the clear red advantage it is a challenge for them to gain access.

RE: Lights in NYC, who runs email on their control network? That's an old ruse. I'm not saying that nobody does this now, but I think you'll find it very rare, almost certainly not at ConEd. Even carrying it across the IT-to-control-system boundary on mobile media would be extremely difficult, based upon the dozens of installations I have seen. Impossible, no, but highly, highly unlikely.

I agree that the simulations, meant to impress reporters or Senators, were simplistic. But, the point is that malware attacks like Stuxnet and Flame are not simplistic - indeed they are very sophisticated, and hard to detect. Now that the U.S. and Iran are openly talking about cyber warfare, it is the advanced persistent threats (APT) that industry needs to worry about.

Also, the lesson that the attacker has all of the advantages – the element of surprise, the ability to hit multiple weak spots at once, the mystery of where the attack is coming from, is also important. How many organizations today can respond effectively to an APT with those characteristics?

I find it hard to agree with your unqualified statement: "attacker has all of the advantages". Yes, the advantages that you point to are very useful in war-time scenarios. But I suspect that it would be less advantageous during peace time. In peace time, the element of surprise of a cyber attack incident may act in a negative way and lead the victim of the attack to believe that the incident is an accident instead of a deliberate attack (at least for some time) since an accident is as likely as an attack. If this happens the utility of such an attack would be lost since it would not help the attacker in achieving his political objectives. The only way an attacker would be able to achieve his political objectives then is to come out from the dark and claim that he/his nation waged the attack.

Investing significant time, resources and money behind delivering a cyber attack and then noticing that it may not be useful in achieving a political goal is not necessarily an advantage to the attacker, I think. However, during war time an accident is more likely to be seen as an attack. Thus, a cyber attack during war time could be very useful in achieving the political objectives of the attacker even if he doesn't wish to come out of the dark and claim that he waged the attack.

Bottom line, please qualify your statement about cyber attacks favouring the aggressor. The attacker may not have all the advantages all the time.

We are looking into integrating the ICS security management program which was developed by the Chemical Sector Coordinating Council with the Department of Homeland Security. What are your thoughts on this program?

Is this the program you mean: http://www.dhs.gov/files/programs/gc_1276534935062.shtm?

While neither I nor Eric Byres know this program directly, it looks good in principle. We like the fact that it has several components, such as raising awareness, a risk-level assessment tool, and fostering cooperation amongst emergency response teams.

However, we have heard that the program covers very little on cyber security and is mainly focused on physical security.

Have any of our readers done this program? What are your thoughts on it?

Also, a resource for chemical facility security information that you might want to check out is PJ Coyle’s blog: Chemical Facility Security News http://chemical-facility-security-news.blogspot.ca/ It covers legislation in the area and also reports on vulnerabilities and other issues.

Heather:

Thanks for the mention of my blog.

I think that what the reader was asking about was not the CFATS program, which has the ICS flaws that you mentioned, but rather The Roadmap to Secure Control Systems in the Chemical Sector. It is described at http://www.dhs.gov/files/programs/gc_1276534935062.shtm, but you can only get a copy from DHS by requesting it at ChemicalSector@dhs.gov

Thanks for the clarification on the program Patrick, and the links so people can request information on it.

Does anyone have any comments on the usefulness of "The Roadmap to Secure Control Systems in the Chemical Sector?"

I participated in the RED-BLUE exercise at INL in 2011.
I was on the BLUE team. We won!
It can be done.

Good to know that the blue (defending) team can win! I imagine that it took significant effort to successfully defend the system.

Now the real life “blue” teams need to start preparing to win.

Not all of Sanger's claims can be taken at face value because some of them are technically faulty, suggesting that he accepted his informants' word without a technological reality check. His scenario for how the Stuxnet worm escaped into the wild in particular could not have happened that way. A PLC infected with the payload cannot infect a laptop, and Stuxnet was never capable of spread over the Internet, only by removable media and over LAN connections. If Sanger got that wrong, what else of his story is misinformation or deliberate disinformation?

--Prof. Larry Constantine (novelist, Lior Samson)

I agree with Larry Constantine, although I see it this way. Without an attempt at falsifiability, David Sanger's book cannot be taken as true information.
As Larry Constantine points out, the fact that Sanger got some facts wrong taints the rest of the book. And the fact that the book was published with tainted facts brings into question the social motives behind the book. He proved some statements by Sanger to be false through contradiction with true statements.
I would take this a step further by applying falsifiability to a claim or hypothesis made in a book. If a claim has no way of being proven false (falsifiability) then it has no claim to credible evidence. Any reader may assess whether the claim is falsifiable or not if they simply conduct a study of the claim. Is this claim an assertion of truth? If so, how can the assertion or statement of truth be tested?
But this is where most journalism fails. Since some of the claims which Sanger made have not been proven falsifiable in the book, further derivative articles must not repeat these claims as fact and must assert a conditional acceptance of the claim... IF this is true, then... Otherwise the lack of falsifiability of the source is masked by the social acceptance of the derivative articles and the potentially false claims of Sanger are virally communicated throughout society, giving them credibility.
The flip side of this is that when society finally does recognize that false claims have been made, not only is the source tainted, but so are the propagators of it. This means that the people who have been dedicated to protecting against cyber attacks may also be discredited. This is why it is important that reviewers of books like Sanger's be careful to apply conditional acceptance of a claim.
I've found the most devastating information attacks to be social. What better way to infiltrate the industries of a country than to discredit the agencies involved in protecting them?
Let's examine the claim that the United States and Israel engineered Stuxnet. Stuxnet was released into the wild. No attempts were made to limit this possibility. The code was not obfuscated, but in fact the opposite. To a subversive programmer Stuxnet was a toolkit to be further developed and tested, leaving any country in the world open to cyber attacks on ICS. The US and Israel are mandated to protect the security of their countries. Releasing Stuxnet to the wild contradicts these mandates. If the authors of Stuxnet were from agencies of the US or Israel, these agencies would be guilty of treason. Therefore can these agencies be reasonably expected to act in the way suggested by this claim? The only way any reasonable person can accept the assertion that the US and Israel released Stuxnet is to believe in the ability of these agencies to circumvent their own mandates without justice.
Here is the social engineering side... by accepting that Stuxnet was released into the wild by US and Israeli agencies through assertion of virally communicated information as fact, a reader has to conclude that these agencies are not subject to justice. And since these are government agencies, then so is the government believed not to be subject to justice.
Is this the outcome that we, as a society, are looking for?
-- ray shpeley, just a U of A student

Thanks for your comments, Prof. Larry Constantine and Ray Shpeley. You raise interesting points and I now know what “falsifiability” is. I don’t agree, however, that if there are technical errors in the book, it implies overall misinformation.

Sanger is a reputable political and foreign policy journalist, whose main revelations are in his area of expertise. While I agree with Mr. Shpeley’s points that not all of Sanger’s information can be treated as fact, and I could have called them out as being only conditionally valid, I do think there is a good probability that a lot of what he says about how Stuxnet was developed and deployed is true.

Whether or not you believe Sanger, to me the main point is that superpowers and dangerous states like Iran are now openly talking about cyber warfare. That means attacks on critical infrastructure are likely to increase. Hence, operators should be more attentive to that possibility and take measures to protect their information and their facility.

ConEd and other power companies may not run email on their ICS network and there may be barriers and policies to inhibit portable media being carried across the air gap, but any modern ICS that is actively maintained as up-to-date by software that is itself maintained as up-to-date on platforms that are maintained as up-to-date has indirect, intermittent, or hidden pipelines to the Internet. Ultimately, engineering work stations must connect, directly or indirectly, to ICSs and ultimately to the Internet.

Most of the industrial security community has acknowledged that the air gap is a myth, and a dangerous one at that. The demonstrable routes to hack into Con Ed and other suppliers are too numerous to name.

--Prof. Larry Constantine (novelist, Lior Samson)

Good comments Prof. Constantine. Stuxnet showed that air gaps are not effective and that there are many pathways to the control system floor.

Note: Eric Byres has just updated his article “#1 ICS and SCADA Myth – Protection by Air Gaps”, which directly addresses this topic. If you are interested in the topic of air gaps, you might want to check it out.
