Effective CISOs often have to move out of their own comfort zones to become great communicators. The ability to align terms and expectations and to separate the “signal” from the “noise” is crucial in dealing with multiple tasks within an enterprise, driving budgets, motivating employees, educating boards of directors, and more.

I have learned the hard way what works and what doesn’t work as a TEDx and Google Talk speaker. Watching people’s faces light up or glaze over during my talks has provided invaluable data on what generates engagement. I also study how others, especially tech speakers, make complex terms and ideas easier for the non-tech person to understand and engage in.

Cybersecurity’s future in reducing incident response time lies in automating the process. In other words, marking an attack, aggregating key data, identifying the actual threat, and assembling the tools and executing actions all need to happen as close to machine speed as possible.
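The stages above can be sketched as a simple pipeline. This is a minimal illustration, not a real product’s API; the `Alert` structure, function names, and the mapping of threats to actions are all assumptions made for the example:

```python
from dataclasses import dataclass, field

@dataclass
class Alert:
    # Hypothetical alert record: where it came from and what was observed.
    source: str
    indicators: list = field(default_factory=list)

def aggregate_key_data(alert):
    # Stand-in for enriching the alert from a SIEM or log store.
    return {"alert": alert, "related_events": alert.indicators}

def identify_threat(context):
    # Stand-in for classification; a real system would consult
    # threat-intel feeds or a trained model here.
    return "phishing" if "suspicious-link" in context["related_events"] else "unknown"

def execute_response(threat):
    # Map the identified threat to an automated action.
    actions = {"phishing": "quarantine-mailbox", "unknown": "escalate-to-analyst"}
    return actions[threat]

def respond(alert):
    # Mark -> aggregate -> identify -> execute, at machine speed.
    context = aggregate_key_data(alert)
    threat = identify_threat(context)
    return execute_response(threat)

print(respond(Alert("email-gateway", ["suspicious-link"])))  # quarantine-mailbox
```

Note that the fallback action escalates to a human analyst: even a mostly automated pipeline keeps people in the loop for the cases the machine cannot classify.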

Unfortunately, most companies still hand off only about 30 percent of this decision-making to AI-driven cybersecurity programs that allow it, when a minimum of 70 percent is the healthier goal.

Nevertheless, reaching that goal does not free a CSO or anyone in an organization from making key “human” decisions in the AI/cyber arena. Processes that include automated programs and algorithms are just one part of the job. The other significant factors are the management of:

There's a lot of cyber babble out there that creates AI confusion, drives poor cybersecurity decision-making and kills key safety initiatives. Part of the issue is that we assume we all speak the same language and thus are in alignment. On the other side of the coin, we sometimes feel foolish because we don’t understand the language being spoken and assume it’s our fault (see video below).

The video in this blog reinforces the concept that assumptions are part of your everyday decision-making process and that denying you make them is unproductive and potentially dangerous.

For the purposes of this blog, let’s agree that these unconscious beliefs can be quite dangerous when they go unchecked. Consider phishing and how frequently humans open an email or social media link without thinking and then provide sensitive information, assuming it’s safe, or assuming that their IoT devices will not compromise their security or lead to a hack.

In case you haven’t noticed, public airline safety announcements (PASAs) just got more interesting. People genuinely enjoy these formerly tedious messages.

Wouldn’t it be nice to see everyone in an organization engaged in the same way when immersed in complex cybersecurity awareness and policy? Perhaps we can learn something from the airlines to enhance this type of engagement.

The good news is that we can. Unfortunately, the lessons the PASA teaches are a bit different from what you might imagine. Let me explain.

I’ve studied the effectiveness of the PASA in depth, and one of my takeaways is that the communication of important messages starts as a leadership issue, not a creative one. For example, as the entertainment levels of the PASA increase, retention is still below the 50 percent level, and that level drops significantly beyond two hours. Why?

Yet we often behave in exactly the opposite way, assuming people think and act just like we do. In cybersecurity, treating a belief as a truth can lead to a project’s crash landing. Cases in point:

Assuring alignment

Your code may be 100 percent spot on, but it turns out that the code’s function isn’t in alignment with the work of the other engineers. It’s not your fault; rather, the various teams working on the project had been given different specs.

Sound implausible?

Just think of the Mars Climate Orbiter, launched in 1998, and its disastrous disintegration upon entry into the Mars atmosphere. The culprit: NASA’s navigation software expected the widespread metric unit of measurement, while its partner Lockheed Martin’s software supplied thruster calculations in the English unit of measurement.
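That kind of silent unit mismatch can be caught by carrying the unit in the type rather than passing bare numbers between teams. The wrapper below is an illustrative sketch, not anything from the actual Orbiter software:

```python
from dataclasses import dataclass

LBF_S_TO_N_S = 4.44822  # 1 pound-force second expressed in newton-seconds

@dataclass(frozen=True)
class Impulse:
    # All impulses are stored internally in metric (newton-seconds).
    newton_seconds: float

    @classmethod
    def from_pound_force_seconds(cls, value):
        # English-unit data must be converted explicitly at the boundary.
        return cls(value * LBF_S_TO_N_S)

def total_impulse(burns):
    # Accepts only Impulse values, so a bare float in English units
    # fails loudly instead of silently corrupting the sum.
    return Impulse(sum(b.newton_seconds for b in burns))

metric = Impulse(10.0)
converted = Impulse.from_pound_force_seconds(2.0)
print(round(total_impulse([metric, converted]).newton_seconds, 2))  # 18.9
```

The point is the one from the story above: the two teams’ numbers were both correct in their own units; the failure was the unchecked assumption at the interface between them.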