NIST Defines Elements Needed to Trust IoT and Cyber Physical Systems

Ensuring the trustworthiness of the Internet of Things (IoT) and Cyber Physical Systems (CPS) depends on a variety of factors, not all of them absolute, according to panelists at a National Institute of Standards and Technology (NIST) workshop last week.

“The sense of trust is not absolute. Trust does allow for failure,” said Greg Shannon, assistant director for cybersecurity strategy at the White House Office of Science and Technology Policy. “What bolsters people’s trustworthiness is that there’s a sense of accountability.”

“Everything should not have to be ultra trustworthy, because making things highly trustworthy is going to come at a cost of processes,” agreed Cynthia Irvine, distinguished professor of computer science at the Naval Postgraduate School.

The NIST CPS Framework holds that trustworthiness in IoT and CPS rests on the elements of resilience and reliability, security, safety, and privacy, working in tandem. Panels at the workshop explored these elements and how they can be achieved.

Resilience and Reliability

Presidential Policy Directive 21 defines resilience as “the ability to prepare for and adapt to changing conditions and to withstand and recover rapidly from disruptions.” These disruptions include everything from cyberattacks to natural disasters and physical attacks.

“While we focus very heavily on cyber adversaries and cyberattack, we need to be mindful of all forms of disruption,” said Deb Bodeau of MITRE.

At the workshop, reliability was defined as “the ability of a system or component to function under stated conditions for a specified period of time.”

“Greater reliability means less need for resilience,” said Pat Muoio, director of research and development at G2 Inc. She explained that reliability addresses the disruptions people know can interrupt the mission, while resilience covers the unexpected things that happen to a system.

Security

“You need a policy to explain, at least in terms of security, what the system is supposed to do,” said Irvine. She explained that, because it is difficult to define and test whether a system is secure, trust in the system comes from concrete security policies.

Steve Lipner, former partner director of program management at Microsoft, added that organizations must first adhere to a set of cyber physical best practices, such as the SANS top 20 or DSD top 35.

“In the world of IoT, I don’t know that there are equivalent, common best practices documented, but if I were going to start facing the problems there, that would be an early thing I would do,” Lipner said.

To implement these practices, and thus make cybersecurity a primary concern of an organization, Shannon said that companies have to “make cybersecurity less onerous while providing more effective defenses.”

Safety

“Safety is often an explicit objective that’s laid out in terms of the goals of the organization. Cybersecurity, per se, might not be; however, cybersecurity risks can impact all of those objectives,” said Al Wavering, chief of the intelligent systems division of the Engineering Laboratory at NIST.

For example, Wavering described a manufacturing plant in which employee safety on the floor is a priority. That employee safety can be compromised by a hacked or failing cyber physical system, making cyber considerations a key element of safety concerns. Consumer safety is also a major concern in IoT and CPS.

“Trustworthiness is very similar to airworthiness,” said Ravi Jain, an aerospace systems engineer at the Federal Aviation Administration. He described the interconnected nature of flight systems, flight communications, and passenger devices as a major safety consideration for commercial flights.

Privacy

“A lot of the principles that those of us in the privacy field talk about for all sorts of things certainly apply in the cyber-physical systems space,” said Lorrie Cranor, chief technologist at the Federal Trade Commission, professor at Carnegie Mellon, and director of the Carnegie Mellon Usable Privacy and Security Laboratory.

Primarily, Cranor addressed a person’s right to access their own data, as well as the need to know what kinds of data are being collected about them.

“In cyber physical systems, there’s a lot of data collection going on that is probably not obvious to the humans,” said Cranor. She pointed to navigation systems in cars as an example: the personal data they collect could let someone who obtained it figure out where a person lives or works.

“Privacy is generally not the first thing on the minds of the engineers building these things,” Cranor added.