Knowing about the tip of the iceberg is not good enough

A high official of the International Atomic Energy Agency (IAEA) recently announced that a disruptive cyber incident took place at a nuclear power plant some two to three years ago. (1) He was further quoted as saying, “This issue of cyber-attacks on nuclear-related facilities or activities should be taken very seriously. We never know if we know everything or if it’s the tip of the iceberg.” These words bring to mind the importance of answering the three questions at the heart of any strategy for addressing today’s cyber threats to critical infrastructure: what to protect, from what threats, and how to protect the chosen assets in the most cost-effective way? The comprehensiveness and effectiveness of a policy to address cyber risks at a nuclear or any other power plant depends on how well those questions are answered. Poorly understood and poorly answered questions lead to flawed policies, which in turn may lead to unforeseen and tragic results.

The IAEA’s concern about cyber-attacks on nuclear power plants is of course good to hear, but how well does the agency understand those key questions? Earlier this year the IAEA issued a publication to its 168 member states on protecting nuclear facilities from cyber threats, “Computer Security Incident Response Planning at Nuclear Facilities”. The guide’s stated purpose “is to assist Member States in developing *comprehensive* contingency plans for computer security incidents with the potential to impact nuclear security and/or nuclear safety” (2). When I read documents of this type, I am keen to see how those three questions are addressed.


The IAEA’s answer to the first question, “what to protect”, is presented as the “computation, communication, instrumentation and control devices that make up functional elements of the nuclear facility” (3). The guide also makes clear that industrial control systems (SCADA, DCS, PLCs) belong to the control devices category. This answer seems right on target. The answer to the second question, on cyber threats, is surprising, however, when one reads a highlighted reference to the Council of Europe Cybercrime Convention (4). In addressing the cybersecurity of control systems, the authors are using the Convention’s Information Technology (IT) security model, which is based on the priorities of Confidentiality, Integrity and Availability of computer data and systems (CIA). Why isn’t the control system security priority list of (Safety), Availability, Integrity and Confidentiality ((S)AIC) being used here? The use of this CIA-biased cybersecurity policy model, borrowed from a cybercrime convention, raises doubts about the guide’s applicability to control systems. Is it a good idea to emphasize near the beginning of the IAEA document, on page 4, that the main cyber threat to nuclear facilities comes from cybercrime? Criminally motivated cyber-attacks are important to note, but a truly “comprehensive” computer security incident response plan for a nuclear facility must consider a wider range of threat elements.

Council of Europe "Budapest" Convention on Cybercrime

For example, one of the threat elements that the IAEA guide fails to address comes from the increased connectivity and complexity of today’s control systems, a complexity that has made these systems increasingly fragile and vulnerable not only to intentional but also to unintentional cyber incidents. The lessons of the cyber-caused emergency reactor shutdowns at the Hatch and Browns Ferry nuclear power plants, for instance, seem lost. One wonders whether the authors were even aware of these well-documented events that occurred in their own industry. The Hatch incident was unintentional: someone at the plant applied an IT security policy to a single computer. The Browns Ferry shutdown was caused by a malfunction in the Ethernet-based control network that produced a denial-of-service condition (5). No cybercrimes were committed in either incident. The Hatch case was rather an example of an IT specialist applying an IT security policy to a complex control system he or she did not fully comprehend. The same flawed mindset seems present in this IAEA guide.

One more indication that the authors needed to spend more time on the three questions is their recommendation that each nuclear facility establish a Computer Security Incident Response Team (CSIRT) for dealing with cyber-related incidents. The scope of a CSIRT’s operation, however, is vaguely defined. Is it limited to the administrative side of the facility (office IT computers and servers), or does it extend to the process control systems monitoring and regulating the nuclear reactor part of the plant? The skill sets required may be similar, but dealing with a cyber incident on the production floor requires very specialized knowledge. The skills required of the CSIRT are described as being able “to respond to, analyse, and mitigate events impacting the confidentiality, integrity and availability of computer systems” (6). If this is correct, then it would be safe to assume that the CSIRT staff are not expected to have any training beyond knowledge of dealing with IT-based (Windows, Linux, Cisco and Intel) cybersecurity issues. To be fair, ICS-specific language is found later in the document, but its meaning seems lost amidst the IT-biased language that came before. For example, in Annex I on page 46 (of a 65-page document) a reference is made to NIST Special Publication 800-82, Guide to Industrial Control Systems (ICS) Security, but it comes too late to tie firmly into the IT-based statements at the beginning of the text. The misleading impression made by the reference to the cybercrime convention has already led the reader a long way down a different road.

I am reminded of the 1958 film about the tragic sinking of the Titanic, “A Night to Remember” (7). The captain for some reason did not fully take in the implications of frequent reports of icebergs in his sailing area. He seemed to think that icebergs were no threat to a ship designed to be unsinkable, and he steamed on ahead. A similar “Titanic captain” mentality may be prevailing in this guide on computer security incident response planning when it comes to the safety and security of control systems. The warnings about “icebergs” in cyberspace have been reported, yet, much like the Titanic captain’s behaviour, those warnings have prompted no corrective action in this guide. Certainly the unintentional cyber-caused shutdowns at Hatch and Browns Ferry indicate that addressing the threat is not just a matter of dealing with cybercrime and creating a CSIRT. The publicized and analyzed cyber-attack on the control systems of a uranium enrichment facility in Iran, the German Government’s report of a cyber-attack on the control systems of a German steel mill, and last year’s cyber-attack on the control and communications systems of a power distribution grid in Ukraine surely indicate that something more is needed to address the implications of these events than an IT-based security model borrowed from a cybercrime convention. One may ask how such a list of examples could fail to make an impression on the authors of this IAEA document. One answer may be a lack of the expertise and imagination required to answer the three questions. As the author of an article on threats to nuclear facilities put it, “those responsible for protecting nuclear facilities from cyber-attack are less prepared than their potential aggressors” (8).

We still have a long way to go in understanding the threats in this new environment of cyberspace, where all things technical and vital to society’s well-being dwell. Interest is high now, and users of this technology are looking for help in addressing these threats. Those who propose solutions need to take the time to do their homework. We need to go beyond a state of ignorance and avoid seeking a feeling of security in established but inappropriate models for ensuring the security of today’s complex control systems. We need to go beyond saying, as the IAEA official said this week, that “We never know if we know everything or if it’s the tip of the iceberg” (9). A multidisciplinary approach is needed, one that integrates what the IT and control system security and engineering communities know. The “left” and the “right” brain of cybersecurity need to be equally engaged. Only then will appropriate course changes take place in response to the warnings of “cyber icebergs” threatening critical infrastructure in today’s cyberspace.


3. Ibid. Page 4.

4. Ibid. Page 4.

5. Page 21.

6. Page 11.


8. Page 24.


NOTE: The views expressed within this blog entry are the author’s and do not represent the official view of any institution or organization with which he is affiliated. Vytautas Butrimas has been working in information technology and security policy for over 30 years. Mr. Butrimas has participated in several NATO cybersecurity exercises, contributed to various international reports and trade journals, published numerous articles, and has been a speaker at conferences and trainings on industrial cybersecurity and policy issues. He has also conducted cyber risk studies of the control systems used in industrial operations. He collaborates with the International Society of Automation (ISA) and is a member of ISA99 Workgroup 13, which is developing Micro Learning Modules on the ISA 62443 Industrial Automation and Control System Security Standard, and Workgroup 14 on security profiles for substations.