We are not learning valuable lessons for protecting critical infrastructure.

“Being aware of what is happening in cyberspace and communicating it to policy makers is not an easy task”

On February 5th, 2021, an engineer working for a small water utility in Florida noticed the mouse pointer moving on his SCADA control screen (where have we seen this before?).  He watched in surprise as unauthorized changes were made to the programmed level of sodium hydroxide (an increase of more than 100 times) used to treat the drinking water for this small community of 15,000 people.  Luckily, he reacted quickly and brought the NaOH setting back to its normal pre-set level.

Later investigation revealed that this was the work of a cyber-intruder who had gained unauthorized access from the Internet to the utility's water treatment and control systems. To view and take control of the system from the Internet, the intruder used a remote-access application installed at the utility called TeamViewer.  Further investigation revealed enough security flaws to prompt government agencies to issue industry alerts.

Among the reported security flaws and practices that threatened the safety and availability of clean drinking water for this community: the utility's computers shared a single password for remote access through TeamViewer, ran an outdated 32-bit version of Windows 7 that no longer received security updates, and were connected directly to the Internet without firewall protection.

The security policies of the contractor that designed the SCADA system also contributed to this incident. The contractor that designed, delivered and installed the system for the water utility provided many "user friendly" features but very little security. One of these was a mouse-clickable start/stop switch button on the control screen. Another was that the contractor posted this and other useful gems of open source intelligence on its website.

It is fortunate that no one was hurt and no property was damaged.  However, this incident perhaps would not have happened had the lessons from past cyber incidents involving industrial control systems been applied.

Here are some lessons missed by the system designer and the affected utility, and perhaps by many others in the industrial and manufacturing sectors:

STUXNET:  Since 2010 the engineering and cybersecurity communities have been on notice that advanced persistent threat (APT) actors are targeting engineering systems that support critical infrastructure.  While the perpetrators in the Florida incident showed no evidence of being highly skilled or knowledgeable about the engineering side of the operations, the system designer and the utility should have considered the possibility of the operations becoming a cyber target.

Project SHINE: Demonstrated in 2014 that industrial control systems and associated devices thought to be isolated from the Internet were not.  The designer should have foreseen the possibility of penetration from the outside and placed industrial-strength firewalls on Internet-facing systems.  Strong password policies would also have mitigated the risk of this kind of attack being carried out from the inside (i.e., by a disgruntled employee or other knowledgeable insider).
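To illustrate the principle behind such a firewall, here is a minimal sketch of the deny-by-default rule an Internet-facing filter applies. This is a hypothetical illustration: the addresses, port number and function name are invented for the example and are not taken from the incident.

```python
import ipaddress

# Hypothetical allowlist: only these source networks may reach the
# remote-access port (the addresses below are reserved documentation ranges).
ALLOWED_SOURCES = [ipaddress.ip_network("203.0.113.0/24")]
REMOTE_ACCESS_PORT = 5900  # illustrative port number

def permit(src_ip: str, dst_port: int) -> bool:
    """Deny by default; allow only known sources to the protected port."""
    if dst_port != REMOTE_ACCESS_PORT:
        return False
    addr = ipaddress.ip_address(src_ip)
    return any(addr in net for net in ALLOWED_SOURCES)

print(permit("203.0.113.7", 5900))   # known source network -> True
print(permit("198.51.100.9", 5900))  # arbitrary Internet host -> False
```

A real deployment would express the same idea in the firewall's own rule language; the point is that anything not explicitly allowed is dropped.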

Ukraine regional blackout: In 2015, for the first time, control system operators watched as someone came in from the Internet, took control of the mouse and clicked open breakers at 30 substations, putting a quarter of a million people in blackout just before Christmas.  We should not think that because something happened "over there" it could not possibly happen elsewhere, even to our own operations.  Let us look, be aware, and learn from what happened to others.

Triton/Trisis: In 2017 the safety instrumented systems of a Saudi petrochemical facility were compromised as a result of a prolonged cyber-attack taking place over several months. APT actors were targeting the systems designed to protect human life, property and the environment.  This included attempts to compromise the automated response of safety equipment. It was only after the second unplanned plant shutdown that a cyber-forensic investigation was conducted and determined that the cause was a cyber-attack.  An in-house capability to monitor for anomalous behavior on the control networks and equipment could have helped in early detection and reduced the potential for damage from compromise.
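As a sketch of what such in-house monitoring could look like at its simplest, the following hypothetical example sanity-checks proposed setpoint changes against a safe operating band and a maximum step ratio. The tag name, band and ratio are invented for illustration; only the rough magnitude of the Florida change (a jump of more than 100 times the normal level) comes from the incident reports.

```python
# Hypothetical safe operating band and step limit (illustrative values only).
SAFE_RANGE = {"naoh_setpoint": (50, 200)}
MAX_STEP_RATIO = 2.0  # flag any change greater than 2x the previous value

def check_setpoint(tag, previous, proposed):
    """Return a list of alarm messages for a proposed setpoint change."""
    alarms = []
    lo, hi = SAFE_RANGE.get(tag, (float("-inf"), float("inf")))
    if not lo <= proposed <= hi:
        alarms.append(f"{tag}: {proposed} outside safe band [{lo}, {hi}]")
    if previous > 0 and proposed / previous > MAX_STEP_RATIO:
        alarms.append(f"{tag}: {proposed / previous:.0f}x jump exceeds limit")
    return alarms

# A change on the scale reported in Florida trips both checks:
print(check_setpoint("naoh_setpoint", 100, 11100))
```

Real control networks would pair checks like this with network-level anomaly detection, but even a simple rule of this kind turns a silent malicious change into an alarm.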

It is not evident from this case that either the system designer or the utility was aware of these cyber threats, which have been demonstrated against industrial control systems for more than 10 years.  From the reports that have been published it is not clear whether the designer relied on, or was even aware of, any industry standards or best practices that could and should have been applied when designing this system for the customer.  For example, the International Society of Automation has developed the ISA/IEC 62443 series of standards for industrial automation and control system security, which covers the lifecycle of a control system from design to decommissioning.

What happened at this small water utility is not an isolated event.  The target of the attack was a SCADA system, similar to those used in many industrial operations, including fuel pipelines, power generation, power distribution, and other sectors of critical infrastructure.  The targeted application, TeamViewer, is found in many other locations, as anyone using the SHODAN search engine can demonstrate.

In designing and operating critical systems that so many depend on for their economic activity, security and well-being, special care is required to stay aware of what is going on in cyberspace "out there".  In addition to awareness, a willingness to act in order to mitigate risk to people, property and the environment is required.  Reducing costs by leaving out security in order to win a contract is not in the best interest of the one providing the solution, nor is it in the best interest of the customer. It is also the responsibility of the customer to be as informed about security as the vendor should be.  All should avoid being blinded by Cybergs in developing and implementing appropriate and effective policies to protect critical infrastructure.  A team effort that is characterized by an understanding of the physical processes involved, that employs tools for their protection, and that seeks to improve the safety, reliability, performance and resilience of critical processes is required if we are to avoid more tragic repetitions of these incidents in the future.


NOTE: The views expressed within this blog entry are the author's and do not represent the official view of any institution or organization with which he is affiliated. Vytautas Butrimas has been working in information technology and security policy for over 30 years. Mr. Butrimas has participated in several NATO cybersecurity exercises, contributed to various international reports and trade journals, published numerous articles and has been a speaker at conferences and trainings on industrial cybersecurity and policy issues. He has also conducted cyber risk studies of the control systems used in industrial operations.