To prevail against the cat, the mouse must know the cat … and read some Sun Tzu

Last year I listened to a lecture on cybersecurity which included a prediction that we may be entering an age of “unhackable” equipment. The idea was that it would be possible to apply new encryption algorithms and improve the security of hardware enough to make equipment immune to cyber-attacks (https://www.youtube.com/watch?v=kBXGRkan7rY, see 4 min. 32 sec.). This was quite surprising to hear in the context of the many reports that year of successful hacking and system intrusions from cyberspace.

Recently I heard another optimistic lecture in which it was asserted that defenders of control systems can reduce the risk of damage from an attack while ensuring acceptable recovery times (https://www.youtube.com/watch?v=t5F93NIDePQ). The impression given was that the effort needed to ensure that defense stays one step ahead could succeed if some easily employable common-sense measures were applied, for example removing a cyber-attack vulnerability by adding an additional safety valve to a critical system, or sending a technician out to a site to make a manual measurement (rather than relying on data from a remote device) to double-check that things are OK.

I found the evaluation of the real-world cyber-attacks on critical systems used to support the thesis most perplexing. Of all the examples that could have been chosen, Stuxnet and the cyber-attack on Ukraine’s power grid were selected. These are excellent examples for illustrating the kind of advanced threats we face, but I was quite surprised at how they were underestimated. This was surprising because Stuxnet has been well studied and reported on (1), while the Ukraine incident has caught the attention of the US Department of Homeland Security (2) and various security companies (3), which produced good analysis. They were in fact credible demonstrations of the capabilities states have to develop and direct malicious cyber activities at the critical infrastructure of other states. The way the examples were interpreted, however, gives cause for concern.

Discussing the Ukraine blackout in terms of 6-hour recovery times that are “not unusual for utilities” (and not going into what really caused the blackout) leaves out how unusual the incident actually was. Field technicians had to be sent out to run the substations manually not because of a blackout caused by a thunderstorm or an accidental technical malfunction, but because the firmware on the communication devices needed for the control systems to operate the substations remotely from the control center was overwritten and made inoperable. The implications for the presenters’ thesis (that defense can win against offense) of using an example in which the remote actions of a hostile actor succeeded in disconnecting substations, erasing firmware on critical devices, and erasing data on workstations were lost. There is more for defenders to think about here than just whether management will find a 1- to 6-hour outage acceptable.

The use of Stuxnet as another supporting example for the thesis is perhaps the most baffling. Yes, it is probably correct that Stuxnet, as was stated in the presentation, only resulted in delaying the victim’s program by perhaps 1-2 years. Yes, only a limited number of targeted devices seem to have been affected. However, the extensive analysis and study of the Stuxnet event tell a darker story of a new domain being used by states and other malicious actors to prepare and engage in cyber-attacks that can disrupt the operation of critical processes and systems that support modern society. Defenders face a huge and complex challenge in coming up with a defense against an offense supported by the intelligence, financial, and technical resources of a state.

Both interpretations fail to consider an important point. While the outcomes of the examples were perceived to be limited, that does not mean the attackers reached the limit of what they could do. They had access and control of the systems, and apparently stopped the attack when the objective was achieved, just as the Allies in the first Gulf War stopped military operations once the assigned objective was reached. They could have done more, but chose not to. Those responsible for Stuxnet could have destroyed more equipment, and in a more visible way, but these actions apparently were not in the plan (4). Any defensive measures taken by the defender are just additional obstacles to be overcome, which is not beyond the capability of a state-resourced attacker.

It is my opinion that technical solutions for reducing the risk to critical systems from cyber-attacks, and strategies for acceptable damage and recovery time objectives, while important, are not enough to fully meet the new threats emanating from cyberspace. In addition to mitigations at the technical level, steps must also be taken at the international security policy level to restrain the malicious cyber activities of states and of those actors they may wish to sponsor or quietly ignore. The steps taken by industry, government, and operators to defend critical infrastructure from cyber incidents will have to be measured against the financial, intelligence, and technical resources states can bring to counter them. The efforts of one sector or one company will not be enough to address the danger.

We should pay better attention to the very few examples of this activity that have surfaced above the “iceberg”, study them, and connect the dots. Care should be taken to avoid hasty comparisons (recovery time from a power outage caused by weather versus one caused by a cyber-attack) that can mislead the direction of mitigation strategies. Those in defense still need to realize that some of the attackers out there are working on assignment as part of an APT. These foes seek, using a wide range of tools and resources, to know the targeted systems and their vulnerabilities better than the defenders do (5). Until defenders come to this realization, defense will likely come in second in this competition.

As much as one would wish to be optimistic, it is too early to think of defense as the “winner” just yet. A lot of work still needs to be done to even out the odds. Coming up with instruments to make the leadership behind APTs think twice and exercise restraint before engaging in this dangerous activity should proceed in parallel with the development of technical solutions. This is more likely to lead to a winning policy for defending critical systems from cyber-attacks than just pursuing technical solutions to problems we can only fully understand after an attack has taken place.


1. http://www.langner.com/en/wp-content/uploads/2013/11/To-kill-a-centrifuge.pdf
2. https://ics-cert.us-cert.gov/alerts/IR-ALERT-H-16-056-01
3. https://ics.sans.org/media/E-ISAC_SANS_Ukraine_DUC_5.pdf
4. http://foreignpolicy.com/2013/11/19/stuxnets-secret-twin/
5. For a look at the attacker’s perspective, see the US NSA TAO presentation “Disrupting Nation-State Hackers”, https://www.usenix.org/conference/enigma2016/conference-program/presentation/joyce (see 1 min. 40-46 sec. and 3 min. 24-52 sec.).


NOTE: The views expressed within this blog entry are the author’s and do not represent the official view of any institution or organization he is affiliated with. Vytautas Butrimas has been working in information technology and security policy for over 30 years. Mr. Butrimas has participated in several NATO cybersecurity exercises, contributed to various international reports and trade journals, published numerous articles, and has been a speaker at conferences and trainings on industrial cybersecurity and policy issues. He has also conducted cyber risk studies of the control systems used in industrial operations. He collaborates with the International Society of Automation (ISA) and is a member of ISA 99 Workgroup 13, which is developing Micro Learning Modules on the ISA 62443 Industrial Automation and Control System Security Standard, and of Workgroup 14 on security profiles for substations.