
'Shift Left': Codifying Intuition into Secure DevOps

Shifting left is more than a catchy phrase. It's a mindset that emphasizes the need to think about security in all phases of the software development life cycle.

Continuous delivery (CD) is becoming the cornerstone of modern software development, enabling organizations to ship new features and functionality to customers in small increments to meet market demands faster. CD is achieved by applying DevOps practices and principles (continuous integration and continuous deployment) from development through operations; there is no continuous delivery without them. By that, I mean strong communication and collaboration across teams, and automation across testing, build, and deployment pipelines. But the push for continuous delivery often presents numerous challenges for security.

While DevOps principles and practices acknowledge the need for security, many organizations struggle to find the right fit and speed for integrating security into DevOps. In a study conducted by HP Enterprise, 99% of respondents agreed that DevOps culture offers the opportunity to improve application security practices, yet only 20% said that secure systems development life cycle (SDLC) testing is done throughout their development process. (Read "Software Assurance: Thinking Back, Looking Forward.")

Security is still trying to catch up with all the innovative software being developed, tested, deployed, and delivered without slowing or bogging down the process. Security has to be intrinsic and transparent yet visible enough in the process that it is a "shared" responsibility at the heart of DevOps practices and principles. So when a product owner describes new features and functionality that need to be added to the next release, everyone thinks about ways in which those features and functionality can be designed and implemented securely to reduce exposures and vulnerabilities. This is what it means to "Shift Left."

I often hear that slogan in the industry: "Shift Left." Most of the time the term is used in reference to moving security testing into continuous integration. While I agree that security testing as part of continuous integration is important, by that point it's already too late. "Shift Left" must go far left, past continuous integration into the requirements and design phases. "Shift Left" to me is a mindset that thinks about security from the onset and is pervasive throughout the software development process. This is what it means to build "security in."
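To make the continuous-integration half of this concrete, here is a minimal, hypothetical sketch of a security gate that runs before code is merged: it scans changed files for obviously risky patterns and fails the pipeline on any finding. The two patterns and the file-set shape are illustrative only, not a real scanner or any particular tool's API.

```python
# Hypothetical "shift left" CI gate: scan a change set for risky
# patterns and fail the build before merge. Patterns are illustrative.
import re

RISK_PATTERNS = {
    "hardcoded secret": re.compile(
        r"(password|api_key)\s*=\s*['\"][^'\"]+['\"]", re.I),
    "weak hash": re.compile(r"\bmd5\b|\bsha1\b", re.I),
}

def scan_source(text: str) -> list[str]:
    """Return the names of all risky patterns found in one file."""
    return [name for name, pat in RISK_PATTERNS.items() if pat.search(text)]

def ci_gate(files: dict[str, str]) -> bool:
    """True if the change set is clean; False should fail the pipeline."""
    clean = True
    for path, text in files.items():
        for finding in scan_source(text):
            print(f"{path}: {finding}")
            clean = False
    return clean
```

The point of the sketch is placement, not sophistication: even a trivial check like this runs on every commit, long before a penetration test would, which is exactly the automation-in-the-pipeline idea the article describes, while the rest of the piece argues the mindset must start even earlier.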

When you start far left, you have the opportunity to embed the appropriate security lexicon and security considerations into the requirements phase. Starting with really solid security requirements allows organizations to codify their intuitions into design; it enables organizations to make sound design decisions up front that will help eliminate technical debt and reduce the cost to maintain software.

Codifying intuitions is a concept I got from a colleague of mine, David Molnar, a security researcher from Microsoft, in a talk about what security can learn from AI. The concept of "codifying your intuitions" was applied broadly to the security domain, but also, specifically, to defending against adversarial activity and advanced persistent threats in a talk by security researcher Taesoo Kim. I like the concept because it reinforces the need to think like an adversary and carry our intuitions and assumptions not only into security design but also into developers' daily activities. Kim gives an example in his talk about what led Jung Hoon Lee, a notable bug bounty hunter, to find vulnerabilities at the Pwn2Own 2016 hacking contest. Lee attributes the discovery of exploitable vulnerabilities to his "intuition."

The Gift of Intuition

Experience in the trenches working in security, developing software, breaking software, and protecting software forms patterns in our minds that provide a solid foundation from which we can train ourselves to recognize threats and be more aware of them. I tend to look at intuition as the revelation of the mental patterns we accumulate over time; intuition forms in the right hemisphere of the brain (the creativity region) and inspires ingenuity. As with many complex problems in cybersecurity, solutions often require some level of curiosity and creativity to decompose complexity. This same curiosity and creativity is what often motivates attackers, and it must be applied to software development to help organizations shift their "security-minded thinking" all the way left.

One of the keys to shifting left is figuring out how to codify intuitions into threat models that can be used to guide secure software development. This could be in the form of user stories and misuse/abuse cases that help organizations better understand how to securely design and implement features and functionality into software. These threat models can be used to:

Guide product teams in making good design decisions regarding security features of the systems.

Assist developers in understanding the consequences of their refactoring or development activities when implementing the design in code.

Develop situational awareness about security threats and risks that can be used to guide more targeted and efficient security testing (achieving security "at-speed") throughout the software development life cycle.
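The three uses above can be sketched in code. Below is a hedged, minimal illustration of "codifying intuition": misuse/abuse cases recorded as structured threat-model entries tied to user stories, so a design review can mechanically find stories no one has thought about adversarially yet. The STRIDE categories are standard; the story/threat linkage and field names are a hypothetical convention, not any tool's format.

```python
# Hypothetical sketch: misuse/abuse cases as structured threat-model
# entries linked to user stories. Field names are illustrative.
from dataclasses import dataclass

STRIDE = {"Spoofing", "Tampering", "Repudiation",
          "Information disclosure", "Denial of service",
          "Elevation of privilege"}

@dataclass
class Threat:
    story_id: str      # the user story this misuse case attacks
    category: str      # STRIDE category
    misuse_case: str   # how an adversary abuses the feature
    mitigation: str    # the design decision that addresses it

def uncovered_stories(stories: list[str], threats: list[Threat]) -> list[str]:
    """Stories with no valid threat-model entry yet; a design review
    could refuse sign-off until this list is empty."""
    covered = {t.story_id for t in threats if t.category in STRIDE}
    return [s for s in stories if s not in covered]
```

For example, a threat entry for story "US-12" (session handling) might record the misuse case "replay a stolen session token" with its mitigation, and `uncovered_stories` would then flag every other story in the release that still lacks an adversarial viewpoint.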

To help formalize threat modeling activities in the SDLC, organizations should consider adopting the following practices and standards:

Kevin Greene is a thought leader in the area of software security assurance. He currently serves on the advisory boards of the New Jersey Institute of Technology (NJIT) Cybersecurity Research Center and Bowie State University's Computer Science department.

Yes, this isn't a new problem. I wanted to present the issue in a way that gets developers and security folks to think differently about software security, especially with the rise of DevOps and the ongoing issues with security tools clogging up CI pipelines. Do it early and do it often is the name of the game.
Thanks for commenting and reading the article.

As I point out in the intro to my last book, Securing Systems, the first standards reference to design-time security requirements that I managed to find was NIST 800-14, from 1996! My chapter in Core Software Security describes early, at-inception engagement followed by full participation in creating the structure (architecture) that will be built. Of course, the IEEE Center for Secure Design's "Avoiding the Top 10 Security Design Flaws" reiterates the same message (I'm a co-author).

Talking about early engagement for security requirements is not new. It amazes me that we have to keep reiterating this as though it were something new. Why? Because such engagements are still too rare, unfortunately.

Still, within the four security architecture practices that I've led, we have achieved early engagement. Part of whatever success my teams and I have enjoyed has depended on security people becoming full participants in the entire development process. When security is seen as a key subject matter expert (SME), development teams are quick to integrate their security members right from inception. I call it "developer-centric security."

As I've become more involved in DevOps (as I spoke about at RSA SF '16), it's become clear to me that just as the software/system needs security architecture, so does the DevOps chain, as early as possible: same deal.

The problem with DevOps is that it often begins organically, as experiments. But at the tipping point to production and canonization, architecting DevOps with security deeply engaged holds the promise of fulfilling the "shift left" imperative.
