Government Spotlight: DevOps Accelerates Cyber Security
By Derek Weeks
A Tale of Two Quakes
In 2010, a 7.0-magnitude earthquake devastated Haiti. The quake killed an estimated 230,000 people and sparked a massive global assistance response. We all remember this tragedy. Yet, six weeks later, a far stronger earthquake (8.8 magnitude) shook Chile. That quake killed 279 people, drew far fewer news headlines, and prompted a subdued global response.
Why were so many people killed in one quake versus the other?
The answer: building codes. “It was that simple. Chile had modern building codes. Haiti didn’t,” recounted Josh Corman, director of the Atlantic Council’s Cyber Statecraft Initiative (and the former CTO at Sonatype). “It wasn’t the presence of the earthquake and it wasn’t the magnitude of the earthquake. It was the building materials and the building codes that made the difference.”
Corman continued: “When I think of the Internet of Things, I think about how vulnerable we are. I think about how many breaches we’re having. Maybe it’s not the presence of adversaries that we should be concerned about. Maybe it’s not the magnitude or the strength of the adversaries. Maybe it’s the lack of building codes for building code.” Corman was speaking at the recent ATARC DevOps Summit in Washington, DC.
A Radically Different Approach
We need radically different approaches to IT, and DevOps can deliver them by helping to introduce building codes for building code.
Of course, Corman wasn’t the only one at the Summit advocating that DevOps can help us dramatically reshape our approach to IT security. I’m happy to say more and more federal IT executives are also lining up behind this idea. They recognize that keystone DevOps principles and practices — such as tight feedback loops, better instrumentation and transparency, and pervasive use of automation and testing — are particularly well-suited to drive strong cyber hygiene throughout the development and testing process.
I heard this from federal executives like Mark Schwartz, CIO for U.S. Citizenship and Immigration Services (USCIS); Ann Dunkin, CIO at the EPA; Dr. Greg Shannon, assistant director for cybersecurity strategy at the White House Office of Science and Technology Policy (OSTP); and others who spoke at the ATARC Federal DevOps Summit.
U.S. Citizenship and Immigration Services (USCIS)
“We have put a lot of thought into security in the DevOps environment,” said Schwartz of USCIS. “I would argue that what we’re doing is much, much more secure than any other model we’ve had before.”
At USCIS, security is a concern from the first day of development, Schwartz said. “We are developers, but we understand the OWASP Top 10 vulnerabilities. We’re going to make sure we do not allow for SQL injections in our code and we’re not going to allow for buffer overflows.”
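Schwartz didn’t share USCIS code, but the OWASP habit he describes can be illustrated with a minimal, hypothetical sketch. The example below uses Python’s built-in sqlite3 module (purely as a stand-in database) to contrast a query assembled by string concatenation, which is open to SQL injection, with a parameterized query that is not. The same pattern applies to any SQL driver: pass user input as a bound parameter, never as part of the SQL text.

```python
import sqlite3  # built-in module, used here only as a stand-in database

def find_user_unsafe(conn, username):
    # Vulnerable: user input is concatenated into the SQL text, so input
    # like "x' OR '1'='1" changes the meaning of the query.
    query = "SELECT id, username FROM users WHERE username = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Parameterized query: the driver passes the value separately from the
    # SQL text, so attacker-supplied input is never interpreted as SQL.
    return conn.execute(
        "SELECT id, username FROM users WHERE username = ?", (username,)
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT)")
    conn.execute("INSERT INTO users (username) VALUES ('alice'), ('bob')")
    print(find_user_unsafe(conn, "x' OR '1'='1"))  # returns every row
    print(find_user_safe(conn, "x' OR '1'='1"))    # returns nothing
```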
Schwartz encourages other federal organizations to adopt a Rugged DevOps culture. “The idea is to build ruggedly. We have to assume as we create stuff that it is always going to be under attack. We have to build it in a way that is rugged and resilient. That’s just a responsibility while we’re creating software applications.”
White House Office of Science and Technology Policy (OSTP)
DevOps is characterized by tight, amplified feedback loops and instrumentation. These mechanisms contribute to a process in which software is continually tested for security flaws and then refined to reduce vulnerabilities. Because of this dynamic, OSTP’s Dr. Greg Shannon predicts that DevOps will be critical in advancing the White House’s goal of making software more sustainably secure in the coming years.
Shannon called out a specific White House goal (outlined in the 2016 Cybersecurity Research and Development Strategic Plan) to reduce the number of vulnerabilities in new and legacy code bases by a factor of 10. “It is development mechanisms like DevOps that gives us that opportunity,” Shannon said. “We can see how rapid feedback loops can help drive that down.”
Reducing vulnerabilities by a factor of 10 is an ambitious and necessary goal. Achieving it will require that software organizations employ tools and practices that prevent third-party components containing known vulnerabilities from entering the software assembly line. As we learned from the 2016 State of the Software Supply Chain report, a typical application is roughly 90 percent open source components and 10 percent custom source code. The report also revealed that 1 in 15 components used in building applications today has a known security vulnerability.
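What such a gate might look like varies by toolchain, but the idea can be sketched in a few lines. The hypothetical script below (the file names dependencies.txt and known_vulnerable.txt are placeholders, not the output of any specific tool) fails a CI stage when a declared component appears on a list of coordinates with known vulnerabilities; real tooling would query a vulnerability database rather than a hand-maintained file.

```python
"""Minimal, hypothetical build gate: fail the pipeline if any declared
component appears on a deny-list of known-vulnerable coordinates."""
import sys

# Placeholder inputs: "name:version" coordinates declared by the build,
# and a deny-list maintained from vulnerability advisories.
DECLARED = "dependencies.txt"
KNOWN_VULNERABLE = "known_vulnerable.txt"

def load(path):
    # One coordinate per line; blank lines and comments are ignored.
    with open(path) as f:
        return {line.strip() for line in f
                if line.strip() and not line.startswith("#")}

def main():
    flagged = load(DECLARED) & load(KNOWN_VULNERABLE)
    if flagged:
        print("Blocked components with known vulnerabilities:")
        for coord in sorted(flagged):
            print(f"  {coord}")
        sys.exit(1)  # non-zero exit fails the CI stage
    print("No known-vulnerable components declared.")

if __name__ == "__main__":
    main()
```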
Employ Supply Chain Best Practices
Corman offered another great approach to building better code: adopt proven best practices for supply chain management, such as those advocated by W. Edwards Deming, the legendary manufacturing engineer and consultant who pioneered the principles of Total Quality Management. Those best practices are:
- Use better and fewer suppliers.
- Use higher quality parts.
- Track what you use and where it is.
“Just like you have a supply chain [in the automotive industry] of brakes and rotors and tires and airbags, we also have a supply chain in software,” Corman said. “We just don’t manage it like one.”
Put another way, if you tailor your supply chain practices so your developers are selecting only the best components, you will end up with great software. Not doing so contributes to unnecessary cyber risk, waste, and rework. “Building quality in” is a key mantra of the DevOps movement, and selecting the best components is a practice that fully supports that aim.
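Deming’s third practice, tracking what you use and where it is, is the one most software teams skip, yet it is simple to illustrate. The hypothetical sketch below keeps a component-to-application inventory (the application and component names are illustrative placeholders) so that when a vulnerability advisory lands, you can immediately answer which applications are affected. In practice this inventory would be generated from build manifests or a software bill of materials, not maintained by hand.

```python
from collections import defaultdict

# Placeholder inventory: which applications use which component versions.
USAGE = [
    ("benefits-portal", "commons-collections:3.2.1"),
    ("case-tracker",    "commons-collections:3.2.1"),
    ("case-tracker",    "struts2-core:2.3.20"),
]

def where_is(component_prefix):
    """Answer the third question: what do we use, and where is it?"""
    locations = defaultdict(list)
    for app, coordinate in USAGE:
        if coordinate.startswith(component_prefix):
            locations[coordinate].append(app)
    return dict(locations)

if __name__ == "__main__":
    # When an advisory names a component, this lookup shows which
    # applications need attention.
    print(where_is("commons-collections"))
```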
RMF: Building Codes for Building Code
These approaches — building ruggedly, employing feedback to continually improve security, and employing supply chain best practices — are just some of the many ways federal organizations can drive better security into their DevOps practices.
One other I would add is this: Federal agencies need Risk Management Framework (RMF) processes that specifically address the risks posed by open source components. Federal organizations are at various maturity levels when it comes to RMF, and many programs currently lack the ability to spot and mitigate the fast-growing risks posed by components with known vulnerabilities. Stated another way, they don’t have sufficient building codes for building code.
To learn more about how Federal RMF processes can be tuned to address those risks, please visit sonatype.com/government and download our white paper, Improving RMF Practices Through Automation.
Written by Derek Weeks
Derek serves as vice president and DevOps advocate at Sonatype and is the co-founder of All Day DevOps, an online community of 65,000 IT professionals.