Thoughts on possible misconceptions over the cybersecurity of the energy sector

“The pump don’t work ’cause the vandals took the handles” – Bob Dylan

The use of high technology (information technology and telecommunications) has entered almost every aspect of our lives. Name a sector and it is there: finance, trade, energy, communications, transportation, even education and healthcare. High tech is what modern society is built upon, and any disruption could seriously affect a nation’s security, economy, and the well-being of its society. High tech in reality forms the foundation that supports our lives. However, we take it for granted that it will always be there. The lights will turn on at night, trains will arrive safely, water will run from the tap, and we feel confident that we can reach our loved ones by mobile phone or internet. When high tech fails, however, the effect is immediate, forcing you to stop what you are doing. Only then does one realize how dependent and vulnerable one is to the safe and reliable presence of these technologies and the services they support.

This sudden realization has occurred in many countries, such as Turkey (1), Holland (2) and the United States (3). Power blackouts that extend beyond regional to national electric grids can be very disruptive. Metro trains (4), trams and airports (5) can come to a halt, traffic lights stop working and people get trapped in elevators (6), while some factories lacking backup power stop production (7). Mobile and fixed telephone networks are vulnerable to disruptions as well (8). As most often happens when a massive failure in critical infrastructure is felt by millions of people, they look to those who can provide an explanation (9). What caused this? Was it an accident, or was there a malicious intention behind the event? (10) One cannot always trust the first reports about an accident, since it takes time to take in all the facts and make a judgement. Later it is usually announced, perhaps with a sense of relief, that the cause was not cyber-related but due to technical or management issues (11).

These blackouts, however, do raise important issues concerning the cybersecurity dimension of critical energy infrastructure.

Should we be relieved to hear that a blackout was not caused by a cyber-related incident? Does it mean that there is no need for further actions and that we can go back to doing our usual daily business? Is it safe to believe that technical failures occur all the time, someone fixes them and life goes on? In terms of the reliability and safety of our regional and national electric grids and national security, it may not be a good idea to be so complacent.

One common myth that surfaces when the cybersecurity of critical energy infrastructure is raised is that these “systems are not connected to the internet”. To those who find this belief credible I would refer you to Project Shine (12). From April 2012 to January 2014, two industrial control system (ICS) professionals conducted a test to verify this belief: is it true that ICS are not connected to the Internet? They applied control-system keyword searches to the SHODAN (13) Internet search engine to see whether any control system devices belonging to critical infrastructure were visible on the Internet. The results were quite surprising: over 2 million devices from over 200 countries appeared in the SHODAN searches made during Project Shine. In January 2014 Project Shine ended, not because the discovery of devices ended (the numbers in fact continued to grow), but because the researchers had proved their point: a number of industrial control systems thought to be isolated from the Internet are in fact visible to potential attackers from cyberspace (14).
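To make the idea concrete, the kind of sifting Project Shine did can be sketched in a few lines. The code below is a simplified illustration, not the researchers’ actual methodology: it filters invented scan records for a handful of well-known ICS protocol ports (Siemens S7comm on 102, Modbus/TCP on 502, DNP3 on 20000, and so on). The IP addresses and banners are made up for the example.

```python
# Illustrative sketch only: match scan records against well-known
# ICS service ports. Sample records below are invented.
ICS_PORTS = {
    102: "Siemens S7comm",
    502: "Modbus/TCP",
    20000: "DNP3",
    44818: "EtherNet/IP",
    47808: "BACnet",
}

def find_exposed_ics(records):
    """Return (ip, protocol, banner) for records on a known ICS port."""
    hits = []
    for ip, port, banner in records:
        if port in ICS_PORTS:
            hits.append((ip, ICS_PORTS[port], banner))
    return hits

sample = [
    ("198.51.100.7", 502, "Modbus device"),
    ("203.0.113.4", 443, "HTTPS web server"),
    ("192.0.2.15", 102, "S7 PLC"),
]
print(find_exposed_ics(sample))
```

The point of the sketch is how little sophistication is required: if a control system device answers on one of these ports from a public address, a search engine will index it, and anyone can find it.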

What does it mean when a device becomes visible through a search engine like SHODAN? It means that the possibility exists to intrude into a critical control system and take control. To use an example from your home, this is like leaving a window open or a door unlocked for a thief. Industrial control systems may be tough on the outside, like a crab defending itself against a predator: hard to penetrate from the outside, but once the shell is broken, the exposed inside becomes a very soft target for stealing information or causing damage. Hackers have become interested in ICS over the past few years. Black Hat conferences now offer courses on “attacking SCADA systems” (15), and amateurs have been trying out SHODAN on their own to find exposed devices on the Internet, publishing their findings on social media (16). One can only wonder what mischief could be created if a state or a state-supported proxy, with resources far greater than those of a hacker or cyber-criminal, applied them to attacking ICS.

Another myth that has its believers is that ICS are built to be safe and cannot be intentionally harmed from cyberspace: there are too many safety mechanisms and automatic reaction controls in place, which will respond to a malfunction before any damage can be caused. This myth was tested by Idaho National Labs in 2007 in the Aurora (17) experiment. Briefly, the experiment set out to determine whether an electric power generator with all its safety mechanisms in place could be attacked from cyberspace. The results revealed what is known today as the Aurora Vulnerability (18). An Aurora attack results when a circuit breaker or breakers are opened and closed, producing an out-of-phase condition that can damage alternating current (AC) equipment connected to the grid. This vulnerability has wide implications for the electrical and manufacturing sectors. It affects nearly every electricity system worldwide and potentially any rotating equipment, whether it generates power or is essential to an industrial or commercial facility (19). Mitigation for this vulnerability is available but unfortunately has not been widely implemented. One reason is perhaps cost: the mitigation is relatively inexpensive but must be applied to every electric substation in order to be effective in securing a power grid against this vulnerability. The cost of fully implementing the mitigation can add up, but it is far lower than the cost of replacing damaged generators, motors, or transformers, to which must be added the production time lost while replacing and installing new equipment (20). For an idea of how high the costs can reach, I suggest reading the Lloyd’s of London and Cambridge University study on the insurance costs resulting from a cyber-attack on 50 substations (applying a scenario of malware that takes advantage of the Aurora vulnerability) (21).
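The physics behind the vulnerability can be illustrated with a little arithmetic. Once a generator’s breaker is opened, the machine drifts out of step with the grid, and the phase angle between them grows at 360 degrees per second for every hertz of frequency difference. The figures below (a 0.5 Hz drift, a 20-degree reclose window) are illustrative assumptions, not measured values, but they show why cycling a breaker open and closed within a fraction of a second can reconnect the machine badly out of phase:

```python
# Back-of-the-envelope look at the out-of-phase condition behind an
# Aurora attack. The drift rate and reclose limit are assumed values.
GRID_HZ = 60.0
GEN_HZ = 60.5          # generator drifting after its breaker opens (assumed)
SAFE_ANGLE_DEG = 20.0  # illustrative sync-check reclose window (assumed)

def phase_slip_deg(t_seconds):
    """Phase angle between generator and grid t seconds after opening."""
    return (360.0 * (GEN_HZ - GRID_HZ) * t_seconds) % 360.0

for t in (0.05, 0.10, 0.25, 0.50):
    angle = phase_slip_deg(t)
    status = "OK" if angle <= SAFE_ANGLE_DEG else "damaging out-of-phase reclose"
    print(f"t={t:.2f}s  slip={angle:5.1f} deg  -> {status}")
```

Under these assumptions the angle passes the safe window in well under a second, which is why the attack works by toggling breakers faster than protective schemes are set up to respond.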

Another critical infrastructure cybersecurity myth is “security through obscurity” (22): the belief that one’s systems are so difficult for a hacker to understand, and so isolated, that no attacker will notice them or make the extra effort to look into them. Alas, potential perpetrators of cyber-attacks on critical energy infrastructure are on the prowl, to the peril of those stuck in the security-through-obscurity myth. Just one example of the error in this belief is the Sandworm/BlackEnergy malware. Sandworm/BE has been reported on in depth as malware, suspected to be the work of a state, seeking the locations of control systems, especially the human-machine interfaces of control systems manufactured by Siemens and General Electric (23). A massive reconnaissance of control systems is under way. Perhaps this is part of a student’s harmless data survey, or part of preparation for a cyber-attack (as perhaps later realised in the cyber-attack against part of Ukraine’s power grid late last year) (24)? This example, together with the interest in control systems expressed at recent hacker conferences, should dispel the myth of obscurity. These systems are no longer “hidden”. They have aroused a lot of interest and are considered potential targets during a conflict. There is no longer any good reason to relax in the belief that critical infrastructure is safe from the malicious activities of threat actors using advanced and highly resourced methods to direct cyber-attacks at critical systems.

Another false belief or myth is the idea that today’s critical infrastructure is resilient to cyber incidents and cyber-attacks. It is thought that critical systems designed to be reliable and safe will not be brought completely down, because of all the automated safety systems and controls put in place. The increasing use of information and communications technologies in control system environments has made it possible to integrate systems into larger and larger complexes of systems, allowing for cost reductions and increased efficiencies. However, the increased level of integration, and the resulting complexity of the systems found in national or regional electric distribution grids, can create new vulnerabilities and cause unexpected failures. Small failures in system components may be of little significance in themselves, but their effect on other connected components can lead to cascading failures, resulting in a major power blackout. A famous example is the unintentional misapplication of an IT security policy to a single computer at a nuclear power station, which led to an emergency reactor shutdown (25). The bottom line is that the increasingly capable and complex control systems found in today’s energy sector are not immune to failure caused by unintended accidents or exploitable vulnerabilities.
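The cascading dynamic can be shown with a toy model in the spirit of academic capacity-overload models; the topology and numbers here are invented purely for illustration. Five substations sit in a ring, each carrying a load near its capacity limit. When one fails, its load shifts onto its live neighbours, which may then fail in turn:

```python
# Toy cascade model (illustrative, not a real grid): when a node fails,
# its load is split among surviving neighbours; any neighbour pushed
# past capacity fails too, and the process repeats.
def cascade(load, capacity, neighbours, first_failure):
    failed = {first_failure}
    frontier = [first_failure]
    while frontier:
        node = frontier.pop()
        alive = [n for n in neighbours[node] if n not in failed]
        if not alive:
            continue  # nowhere left to shed load
        share = load[node] / len(alive)
        for n in alive:
            load[n] += share
            if load[n] > capacity[n] and n not in failed:
                failed.add(n)
                frontier.append(n)
    return failed

# Five substations in a ring, each loaded to 8 of a capacity of 10.
load = {i: 8.0 for i in range(5)}
capacity = {i: 10.0 for i in range(5)}
neighbours = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
print(sorted(cascade(load, capacity, neighbours, first_failure=0)))
```

In this contrived setup the loss of a single node takes down the entire ring: each redistribution pushes a neighbour over its limit. The failure of one insignificant component becomes a system-wide blackout, which is the pattern the paragraph above describes.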

Looking back at the question raised at the beginning of this blog and the cyber myths just discussed, it does not seem prudent to go back to business as usual after any major national power outage. Regardless of whether the outage was caused by a cyber-attack, technical failure or management error, the event should be taken as an opportunity to take a long look at the security of our increasingly vulnerable critical infrastructures. An integrated, broad-based approach to security needs to be developed, one that takes into consideration not only the requirements for reliability and safety but also the potential for unintended or intentional cyber failure. This means that the contribution of information and communication technology specialists who deal with cybersecurity needs to be part of the system risk evaluation and design process performed by ICS engineers. A wide range of risk needs to be evaluated and accounted for if we are to avoid and defend against future failures in the critical infrastructures our societies depend upon so much for their well-being.



3) Combination of Errors Led to Power Loss in San Diego.

4) As happened in Australia.



7) As happened once to a factory in Qatar.

8) As happened in a region of the U.S.

9)

10) The Northeast blackout in the US in 2003 raised similar questions; a cyber cause was initially suspected in that case too.

11) Speculation about a cyber cause for one blackout was denied by Brazilian authorities.

12) Read the full report and find out how many control system devices were found in your country at






18) Watch what an Aurora attack looks like yourself at


20) Ibid.

21) Business Blackout: The insurance implications of a cyber-attack on the US power grid.



24) Cyber-Attack Against Ukrainian Critical Infrastructure US DHS ICS-CERT Alert


NOTE: The views expressed in this blog entry are the author’s and do not represent the official view of any institution or organization with which he is affiliated. Vytautas Butrimas has been working in information technology and security policy for over 30 years. Mr. Butrimas has participated in several NATO cybersecurity exercises, contributed to various international reports and trade journals, published numerous articles, and has been a speaker at conferences and trainings on industrial cybersecurity and policy issues. He has also conducted cyber risk studies of the control systems used in industrial operations. He collaborates with the International Society of Automation (ISA) and is a member of ISA99 Workgroup 13, which is developing Micro Learning Modules on the ISA/IEC 62443 Industrial Automation and Control System Security Standard, and of Workgroup 14 on security profiles for substations.