Securing valuable IIoT data is a business imperative, demanding reevaluation of traditional architectures and adoption of new best practices.
Connectivity & Cybersecurity Part 2 of 3: Single Plant Cybersecurity System
In the first article of this three-part series, we discussed the reasons driving manufacturers to converge their operational technology (OT) systems with their information technology (IT) resources. Hardware, software, and networking technology have made it easier than ever to achieve IT-OT convergence, allowing end users to effectively use the large volume of available and valuable industrial internet of things (IIoT) data.
But this improved level of connectivity must be carefully balanced with cybersecurity. IT systems generally have good provisions protecting them from OT assets, but the reverse is not nearly as true. In this article, we will discuss why a single plant security system is crucial for protecting all assets and information. The third article in this series will be a deeper dive into the security aspects of interconnecting so many different and distributed IIoT devices.
As a collaborative effort to bring insights from different viewpoints, this article is written from the perspective of two experienced experts: one with over 26 years in IT cybersecurity and the other with over 26 years in all aspects of OT and industrial automation, including cybersecurity.
Accessing valuable data
The global IIoT market is estimated to reach $751.3 billion by 2023, according to Market Report1. The introduction of interconnected sensors, instruments, edge devices, and more promises increased efficiency in the form of analytics to drive optimization. However, the widespread distribution of this essential data throughout operating plants requires each company to evaluate its current cybersecurity models.
Smart equipment stores a company’s intellectual property in the form of precision directions, recipes, and measurements. Intelligent devices are interconnected via networks spanning the plant. Remote access to plants has become the norm, accelerated by the pandemic.
Mobile devices with wireless connectivity are gaining traction on the plant floor because they are affordable, portable, and widely available. Wireless connectivity is also being used on devices such as autonomous mobile robots, industrial robots, asset tracking devices, and more.
With all these changing industry trends, the goal remains the same: to safely move and use operational data to drive productivity, provide contextualized data to the enterprise, and minimize disruption and downtime of operations, all in a secure manner. As access to this valuable source data improves, so does vulnerability. Traditional models for handling data from plant floor assets have favored availability and integrity over confidentiality. Can these models be adapted, or is greater change needed?
Evolution of a model
The Purdue Enterprise Reference Architecture was established in the 1990s by Theodore J. Williams in cooperation with the Purdue University Consortium for Computer Integrated Manufacturing, and it was adopted as the ISA-99 standard. The Purdue Model initially was a reference model for defining the critical infrastructure used in production lines, and it is helpful for understanding how data flows through the various layers. With time, the model has evolved to take on some aspects of a cybersecurity model as well.
The model segments devices and equipment into hierarchical functions, with each level representing a subset of systems within the manufacturing operation (Figure 1).
Figure 1: The traditional Purdue Enterprise Reference model defined production infrastructure, data flows, and eventually cybersecurity.
Each of these levels represents an area requiring access control because each level may only communicate with specific other levels. A common practice in the past was to restrict access such that information technology (IT) Levels 4 and 5 could only communicate down to Level 3, while Levels 2 and 3 could communicate up to Levels 4 and 5. Levels 0 and 1 were isolated to the operational technology (OT) network alone, which some described as the 'OT zone'.
New layers have been added between the levels for cybersecurity purposes over the past decade. For instance, the demilitarized zone (DMZ) now exists between Levels 3 and 4, and it uses firewalls and proxies to secure communications. The DMZ layer functions as a subnetwork to protect a company's exposed, outward-facing services.
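The level-to-level restrictions described above can be pictured as an allow-list policy enforced at zone boundaries. The sketch below is an illustration only: the permitted level pairs come from the text, while the function names and data structure are hypothetical stand-ins for what a firewall rule set actually does.

```python
# Toy model of Purdue Model communication restrictions as an allow-list.
# In practice, firewalls and proxies at the zone boundaries (e.g., the DMZ
# between Levels 3 and 4) enforce these rules on real network traffic.

ALLOWED_FLOWS = {
    (4, 3), (5, 3),                  # IT Levels 4/5 may communicate down to Level 3
    (2, 4), (2, 5), (3, 4), (3, 5),  # Levels 2 and 3 may communicate up to Levels 4/5
    (0, 1), (1, 0),                  # Levels 0 and 1 stay within the OT zone
    (1, 2), (2, 1), (2, 3), (3, 2),  # adjacent OT levels exchange data
}

def flow_permitted(src_level: int, dst_level: int) -> bool:
    """Return True if traffic from src_level to dst_level is allowed."""
    return (src_level, dst_level) in ALLOWED_FLOWS

print(flow_permitted(3, 4))  # True:  Level 3 may send data up to IT
print(flow_permitted(5, 1))  # False: IT may not reach directly into the OT zone
```

Note that an IIoT data path from Level 0 directly to Level 5, as discussed below, falls outside any such allow-list and therefore demands its own security provisions.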
The IIoT challenge
As manufacturing evolves and IIoT implementations spread, greater convergence of data and communication between the OT and IT levels is necessary. These industry trends are requiring the OT levels to send more and more data to IT Levels 4 and 5, in some cases going directly from Level 0 to Level 5.
Level 3 represents plant site operations, and it is increasingly becoming the area where IT and OT convergence occurs. It is also where security personnel attempt to close any air gaps existing between the IT and OT networks. An air gap is a physical isolation security measure: a network that must remain secure, such as the OT network, is kept disconnected from less trusted networks, such as the IT network or the internet. Closing the air gap improves connectivity but sacrifices cybersecurity.
Some operating companies have decided to aggregate OT data before sending it to the IT layers. However, careful selection and curation of transmitted data is needed, otherwise a massive amount of operational data could be missed. The right data must be available to gain insights for optimizing, and predicting failure of, equipment and processes.
Other companies are creating an IIoT zone where smart sensors and devices are connected to a single DMZ and firewall, bypassing the Purdue Model and going directly from the device to the cloud or Internet. This allows traditional data operations to continue, while providing an alternate streamlined path.
Certain smart equipment manufacturers are attempting to drive efficiencies by adding supplementary sensors to their devices, and then bypassing traditional automation by using the cellular network to communicate directly to the vendor’s monitoring software in the cloud. This smart equipment avoids the end user’s cybersecurity provisions altogether, forcing reliance on the equipment manufacturer to secure these cellular connections to the cloud.
Cellular networks are also used to provide access to critical devices when the normal OT network or the internet experiences an outage. This sort of backup connection or alternative access path, in one form or another, is widely used to provide support when normal pathways fail.
Protecting alternate paths
Protecting these alternate access paths, particularly the inbound path, must be a high priority in any security consideration. Any access into the manufacturing and safety zones should rely on more than just network location (coming from the Enterprise zone) or knowledge of an access phone number. Methods for protecting access into these zones range from a simple user ID/password combination for human-driven access to more complex approaches, such as public/private key pairs combined with a network access device, when devices need to communicate across the IT/OT network boundary.
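The principle behind these stronger methods is that access depends on proof of a credential, not on where the request comes from. As a sketch only: the deployments described above use public/private key pairs and a network access device, but a shared-secret HMAC challenge-response, built from the Python standard library, shows the same idea in a self-contained form. All names here are hypothetical.

```python
import hashlib
import hmac
import os

# Illustrative challenge-response for device access across the IT/OT boundary.
# The boundary device issues a random challenge; the requesting device proves
# possession of a provisioned secret, regardless of its network location.

def issue_challenge() -> bytes:
    """Boundary device sends a fresh random nonce to the requesting system."""
    return os.urandom(16)

def sign_challenge(secret: bytes, challenge: bytes) -> bytes:
    """Requesting device answers by keyed-hashing the challenge."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def verify_response(secret: bytes, challenge: bytes, response: bytes) -> bool:
    """Boundary device checks the answer with a constant-time comparison."""
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

secret = os.urandom(32)  # provisioned to the device out of band
challenge = issue_challenge()
print(verify_response(secret, challenge, sign_challenge(secret, challenge)))  # True
print(verify_response(secret, challenge, os.urandom(32)))                     # False
```

A fresh nonce per session prevents a captured response from being replayed later, which a static password sent over the same path cannot guarantee.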
The Purdue Model reflects the hierarchical nature of traditional OT networks. With industry trends leading to data distribution throughout the company’s enterprise, cybersecurity requirements must be reconsidered at every level, and even within each level, to ensure appropriate cybersecurity, given breach risk within each level (Figure 2).
Figure 2: Industry technologies and trends are impacting the traditional hierarchical Purdue Model, so cybersecurity must be reconsidered at every layer. Modern visualization, data connectivity, and analytical software like the portfolio of ADISRA products must be designed with built-in cybersecurity provisions, even when deployed at the lowest model layers.
IT/OT convergence is requiring IT security models to be implemented further down the Purdue Model levels, while preserving OT requirements for production efficiency and uptime. In response, the National Institute of Standards and Technology (NIST) has proposed a Zero Trust Architecture.
Authenticating and authorizing
The Zero Trust Architecture was initially proposed in September 2019 and then updated based on feedback in February 2020. Under this architecture, the focus shifts from securing network-based perimeters to securing users, assets, and resources. Cybersecurity provisions are no longer determined by physical or network location, but by authentication and authorization of the users, assets, and resources at the time each joins a session.
Authentication is a security process for verifying the user, asset, or resource is who they declare themselves to be.
Authorization determines to what degree the user, asset, or resource is permitted to access resources such as files, services, computer programs, and data.
A way to visualize the security problem leading to the Zero Trust Architecture is to imagine a medieval castle defense system: strong walls with controlled ground access points, plus a moat and a drawbridge for additional safety. Defenders of the castle believed these measures completed their efforts. However, consider an advanced enemy with airplanes to bomb the castle, or paratroopers to drop behind the walls. The old defenses are ineffective.
This scenario reflects the increased cybersecurity assault possibilities for IT and OT networks as companies adopted bring your own device (BYOD) policies, and also as worker remote access became even more widespread during the COVID pandemic.
Trust only when verified
Zero Trust Architecture advances the cybersecurity discussion by mandating each person or device should not be trusted just because they happen to appear on a network in the Enterprise space. Instead, their authentication and authorization must be verified.
The need for verification has led to other developments, such as the widespread adoption of multi-factor authentication for sensitive applications within the Enterprise, with Microsoft Authenticator and Okta Verify two widely used solutions. When a user logs into a sensitive application, they receive a prompt on their smartphone, and responding to the prompt completes the login; failure to respond within a set time causes the login to fail. Organizations adopting multi-factor authentication for their remote users are also moving to multi-factor authentication for all logins to sensitive applications.
While multi-factor authentication is a good solution for access by humans, the Zero Trust Architecture solution for machine-to-machine communication is not so simple. Traditionally, network access solutions relied on the location of the accessor and allowed specific systems on the Enterprise network to gain access into the OT environment. This reliance on network address is specifically called out as a problem in the Zero Trust Architecture.
Newer solutions using VPN technology, where the system requiring access into the OT space logs into a VPN device on the boundary of the OT network, can be used to provide the required access. Such access can be automated using VPN solutions from major network device manufacturers, or it can be built using free open-source solutions, such as OpenVPN.
Another standard for industrial automation and control systems is ISA/IEC 62443. This standard is growing in adoption and popularity as it is endorsed by the ISA Global Cybersecurity Alliance, a 40-member group of companies including Eaton, Johnson Controls, and Honeywell. The standard addresses cybersecurity for device-to-device communication, sensitive data protection, authentication, and authorization, along with guidelines for device vendors. ISA/IEC 62443 provides a list of cybersecurity requirements for digital transformation.
Cybersecurity to address the whole plant
Digital transformation involves companies working to gain value and competitive advantages from the IIoT, smart equipment, mobile devices, remote access to critical assets, and more. The COVID pandemic has also been a catalyst for change, forcing companies to take a more holistic approach to cybersecurity.
Suitable cybersecurity requires more than threat intelligence, best-practice backups, and disaster recovery. By incorporating cybersecurity best practices, which we will discuss in the upcoming third and final article, companies can control and monitor critical OT networks, while still providing the required level of access for IT and OT users, third parties, and applications.
As we trace the data flow of a manufacturing plant, cybersecurity must be considered at every level, and within each level. Cybersecurity cannot be a lesser priority than efficiency and uptime; instead, it should be a business imperative at the forefront of digital transformation efforts. As part of the process of making plants resilient, proper cybersecurity requires updating architectural models, along with technology investments.
All figures courtesy of ADISRA
About the Authors
Chip Coy is the senior managing consultant for NTT Data. Chip is a cybersecurity professional who has implemented, tested, or assessed information security for organizations ranging from network providers to the Olympic Games. Prior to joining NTT Data, Chip worked in cybersecurity consulting practices including IBM and Coalfire Systems. Chip was one of IBM’s original ethical hackers. Chip consulted on network security designs with organizations ranging from state governments and grid operators to mail order pharmacies. Chip holds the Certified Information Systems Security Professional (CISSP) certification from ISC2 and the Certified Information Privacy Technologist certification from IAPP.
Marcia Gadbois is the president and general manager of ADISRA. Marcia is an entrepreneur who has grown a start-up from inception to a successful liquidity event. Prior to joining ADISRA, Marcia was the President of InduSoft, which was acquired by Invensys. In the past, Marcia held a few high-tech senior management positions in the area of business development, strategy, competitive intelligence, marketing, and sales. Marcia has more than 34 years’ experience in the software industry in diverse technology areas such as artificial intelligence, operating systems, rapid application development environment, output management, databases, directory services, data recovery, middleware, and industrial automation. Marcia is a contributing author in the book, “Client/Server Programming with RPC and DCE” as well as author of many articles and whitepapers. Marcia holds a BS in Management Information System and Computer Science from Bowling Green State University and an Executive MBA from University of New Hampshire.