Wicked Good Development is dedicated to the future of open source. This space is to learn about the latest in the developer community and talk shop with open source software innovators and experts in the industry.
In today's episode we're tackling the ongoing discussion about shifting security left - or really, starting security left. What do developers need to understand about the current state of application security? How should they be involved in security decisions? What's involved in building secure code from the beginning? The episode wraps up with the questions organizations and developers should be asking themselves about their security practices.
Wicked Good Development is available wherever you find your podcasts.
Developer first, vulnerability remediation, DevSecOps, application security
Kadi
Hey everyone, this is Kadi Grigg, and welcome to our episode of Wicked Good Development. This is a space to learn about the latest in the developer community and talk shop with OSS innovators and experts in the industry.
Omar
Hola, my name is Omar, and I'll be your co-host. Today we'll be talking about DevSecOps and what it means for security to shift left.
Kadi
So today we want to explore how the industry is rethinking security in software, and what it really means for developers. Joining us today is a brilliant group from Sonatype: we have Brian Fox, CTO and co-founder; Ilkka Turunen, Field CTO; Stephen Magill, VP of Product Innovation; and Ax Sharma, security researcher, developer advocate, and tech ninja. Welcome, guys.
Brian
And Ilkka - can we have you pronounce your name for the record?
Ilkka
Yeah, you can try. It's actually Ilkka Turunen. But nobody says it like that anymore.
Brian
There we go.
Kadi
I'm just going to say Ilkka from now on.
Brian
Okay. It's okay.
Ilkka
It's a great shorthand.
Brian
Now everybody knows how to pronounce it.
Omar
So I know that we've had Brian and Ilkka on before, so we have a little bit more of their background. But Ax and Stephen, can you introduce yourselves and the lens you bring into the conversation?
Ax
Hi, I'm Ax Sharma. I've been at Sonatype for four years now. I originally started out as a developer, did a bunch of those gigs, got some experience. But I soon realized developing for yourself is different than developing eight hours a day implementing business stories. So that got me interested in cybersecurity, and some gigs, you know, managing firewalls and stuff. And this has been the perfect balance, because you get to analyze malware and vulnerabilities. It's hands-on security research. So it's been going great.
Stephen
All right, great. Yeah. So my background starts more on the research side: I did my PhD work in static analysis, looking at how to analyze programs for security and performance issues. And since then, I've progressed more and more towards being interested in the practice of development - how we can take these advanced technologies and use them to really have an impact on the security and quality of the code that developers are writing day to day.
Kadi
Great. Thank you guys for joining today. So let's dive in. As we know from our software supply chain report, which we release annually, most organizations currently have the 100-10-1 problem. For those of you at home listening, we're talking about 100 developers, 10 operations people for those 100 developers, and just one security person for all 110 of those people. This, as we know, creates a massive resource issue for any organization, large or small. And with security resources stretched that thin across teams practicing DevSecOps, security organizations can end up being seen in a negative light. What does it look like when security and development clash in the real world? Can you give us some examples of the risk of keeping this kind of dysfunctional relationship between the two groups?
Brian
I could talk for hours about that. I've seen so many versions of how this goes wrong over the years, which has really informed how we've tried to approach the problem. There's a lot of history of bad blood and mistrust built up between development and security. There have been lots of tools that have come along and been pitched as the next big answer, and when the tools - or the results of the tools - are just forced upon development in a naive kind of way, it often creates more problems than it solves. I've heard horror stories of security buying a static scanner, scanning the code base, and sending the list over to development saying, you have to fix everything - without an understanding that many of those may not be accurate problems, and that going and making all of those changes breaks more, creates actual problems where there were only theoretical problems before. You do that enough times, and of course people just start to give you the Heisman when you approach with a new idea. And it's really unfortunate, because the right answer is to have everybody work together. But I've seen so many of these cases where security kind of treats development like dogs: they need to be whipped, they need to be told what to do, they just need to be handed a list and they'll go fix it. I try to remind security teams especially that most developers actually do care about the quality of what they're doing. In a way, I think developers are not that dissimilar from artists: we want it to be good, we want to put good things out there. The challenge has historically been that companies haven't provided the tooling to give visibility into the problems. The developers are goaled on producing functionality quickly, without the visibility to make the proper choices, and then somebody comes along later with a tool that says, you got it wrong, go do it again - and you play this game of "fetch me a rock" over and over. I've seen so many practices fall down in various forms of that. And the inevitable result is that when security finds legitimate problems in a component, say, development has built up this immune reaction where they won't make a dependency change unless security can prove to them that it's exploitable. And yet when development needs to make a change to a dependency - because they want to, or because they need a new capability - there's no drama; they just make the change. So that's been my visibility into it. There are so many horror stories I could tell, but they all boil down to similar variations of that.
Ilkka
Yeah, I mean, part of the reason there's that 100-to-1 or 100-to-10 ratio difference is that most organizations are really focused on pushing software out the door, right? And who produces software? The developers. And by nature, security - maybe controversially - is all about risk assessment and risk reduction: get a sense of how much total risk is out there, and then either contain it or reduce it as much as possible. So when those teams are incented that way, one team is all about containment and reduction, and the other team is all about move as fast as you can, break things, and apologize later. Having been on the development floor where we've literally done this: you keep pushing the stuff that's priority, right? Security is never priority - in the sense that you do care about it, but it doesn't add value to the software. So what often ends up happening in immature organizations is that security becomes a mandatory activity somewhere along the line. You package your software, or you give them the link to your builds or whatever, and they run their magical tools. It goes into the wizard's tower, they brew all sorts of magical concoctions, and then they bring back a book of "here's all the sins you've ever committed in your life," and they hit you on the head and say, now go fix it. So the natural reaction is to time it so that it happens very late in the delivery cycle. So it's like, well, I'm sorry, Mr. or Mrs. Security Person, we can only fix the top-priority ones; can you sign off on the rest of the risk? We promise we'll fix it in the next sprint. I've gone through this cycle so many times, right? And everybody knows that's the game you play. Part of the reason for that is they're there to do different things, and it's a very unhealthy situation. All these human elements come into play as well, just like Brian said: why's that person so much smarter? Like, haha, I found all these holes in your software - well, why don't you come and help me fix it then, Mr. Clever Clogs? So it's a combination of all those things that leads to that sort of behavior. And it's not uncommon at all; it's very typical in places where these functions operate completely separately from one another, and typically also have separate reporting lines. That's my observation on why people end up in a situation like that.
Kadi
I mean, looking at the other side of the coin, I think Ax and Stephen might have a little bit of a different view. What are your thoughts on the heartache from a security perspective?
Stephen
Following up on what Ilkka said, and sort of getting to that topic - I think a big problem often is that security gets involved very late. And Ilkka went through the reasons why that is: when security is viewed as this onerous process that you want to avoid going through at all costs, then yeah, there's no incentive to engage earlier. But sort of paradoxically, I think things often go better if security is engaged early in the process, right? Because high-security software really starts with the design and architecture. And then throughout the software development process, you want to have a shared view of what your security goals are. So if you can work with security at the beginning to say, here's how we're going to architect the system for security, here are the standards we're going to enforce as we develop the software, here are our standards around vulnerability remediation, here are the goals we're going to set - then developers can work towards those on their own and use tooling in the pipeline to help enforce that. And then you never reach this point of trying to apply the security band-aid at the end of the process, which never goes well.
Ax
I have a bit of a longer take on this. I mean, a developer's priority and central focus is on implementing new features, right? As Brian said, they're like artists, pretty much: building a solid product and, most importantly, delivering the functionality - how to solve a given challenge. So I think incentivizing developers to code securely has to be done with minimal friction, without imposing additional burdens or roadblocks that can hamper their creativity and productivity. And when we talk about that, that's where some sort of automation comes in, right? Even if it's not a commercial SCA application, maybe even something like npm audit, or just one script that helps them work one step closer to it. And the other aspect - I know this may not sound very technical at all - is the developer's and the product's image. As we've seen time and time again, for example with Log4j, there was so much spotlight on the active exploitation that it kind of pressed developers to address it.
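For listeners who want to make Ax's point concrete: here's a minimal sketch of what that kind of "one script" gate might look like, assuming Node 18+ and npm 8+ (which puts a metadata.vulnerabilities summary in the `npm audit --json` output). The file name and severity threshold are illustrative, not prescribed.

```typescript
// audit-gate.ts - a minimal sketch of a low-friction security gate.
// Runs `npm audit --json` and fails the build if any high or critical
// advisories are found. The threshold is an illustrative choice.
import { execSync } from "node:child_process";

function countSevereAdvisories(): number {
  let output: string;
  try {
    output = execSync("npm audit --json", { encoding: "utf8" });
  } catch (err: any) {
    // npm audit exits non-zero when vulnerabilities exist; the JSON
    // report is still available on the thrown error's stdout.
    output = err.stdout ?? "{}";
  }
  const summary = JSON.parse(output)?.metadata?.vulnerabilities ?? {};
  return (summary.high ?? 0) + (summary.critical ?? 0);
}

const severe = countSevereAdvisories();
if (severe > 0) {
  console.error(`npm audit: ${severe} high/critical advisories - failing build.`);
  process.exit(1);
}
console.log("npm audit: no high/critical advisories.");
```

Wired into a CI step or a pre-push hook, a script like this surfaces problems at the moment the developer is already looking at the build, which is exactly the low-friction timing being described.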
Ilkka
Yeah. So just to latch onto what you said, Ax, and what Stephen just said: it's absolutely true. That's part of the reason why we as an industry started advocating adding Sec into DevOps, right? DevOps was the first big eye-opener in the turf war between development and operations: just ship it and throw it over the fence, they'll run it, they'll be the ones on call, and we'll be the ones to fix the bugs, you know, five to nine, that sort of timeframe. That same dysfunction existed elsewhere as well. And that's why I said "incentivize" - ultimately it has to do with what each of those functions is there to do. I really agree with what Stephen said. In software engineering there's always this platitude that gets thrown around: the earlier you find a bug, the faster and easier it is to fix. You see stats like - I think there was an IBM report some 20 years ago that said it's 100 times more expensive to fix a bug in production than in design, and 10 times cheaper to fix it during development than in production. So by a similar analogy, I often say to people: listen, what's the fastest and cheapest way of getting rid of a security vulnerability? Don't have it in the first place. That costs you zero dollars and zero hours of work. Everything else further down the line is effort that doesn't deliver on the main goal of why we're building this software, which is delivering value to our customers faster and faster. In many organizations, I feel like that very baseline truth is sometimes lost. And when the function is just to deliver software, and to deliver security because we need to deliver security, that's the real danger zone, and that's how you end up in those sorts of behaviors.
Omar
There seems to be a lot of friction. And part of what we're seeing come out more and more in discussions and articles is that people are shifting security left - y'all have already been pointing to this. What does it mean when people talk about shifting security left?
Brian
Well, I think "left" comes from the traditional SDLC diagram, where development happens on the left, and then it gets built, then tested, then integration-tested, and then production, right? So left means really moving it further into development - and not, like Stephen was saying, testing at the end and then throwing it back. That is horribly inefficient for security, just like it is for testing and bugs. It's the same learning we went through when we figured out we needed to be more agile and integrate testing into development, not just build, test, and then go back and fix. So that's what shift left is really all about. And that's kind of what I was getting at before: you need to provide the visibility to the developers at the right time. If they had hygiene statistics, for example, when they're picking between two different projects, that would influence their decision. I mean, nobody walks in and just buys a piece of electronics without comparing it to other similar things, and the company behind it and the history of its quality is an important factor. You might choose to ignore it because you really just need something cheap and you don't care how long it works, but sometimes you really do care about the quality. So we do that when we buy physical goods, but that information is often hard or impossible to obtain for software components. That's just an example of how providing information upfront can keep you from getting a poor component deeply embedded into your software, only to test later and find out it's a problem - because by then it can be a huge architectural nightmare to rip it out, especially if it's something like a framework. And it goes beyond that, into when there are problems with the version of the component you're using. There's a convenient time in development to bump a version; just before you push something into production is not that time. So again, if development had the visibility at the right time - there's a vulnerability you should maybe fix, maybe some licensing issues, maybe some quality issues - it becomes natural. And that's part of that conversation I was describing before with security: look, you don't always have to have an airtight case to convince them to make the dependency change. If you provide the information at the right time, it just looks like anything else they're doing. If there were a bug in the component that affected their functionality, they would fix it. So it's a lot about timing. And it's not just about convenience - it's a little bit about that, but it's about efficiency. At the end of the day, if it's convenient for the development team to work something in, it becomes more efficient for the organization overall and avoids all of that rework.
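As an aside for readers: Brian's "hygiene statistics at the right time" can be sketched in a few lines. This is an illustrative example, not an established scoring method - it pulls a couple of crude signals (last publish date, maintainer count) from the public npm registry metadata endpoint, and the cutoffs of 24 months and 2 maintainers are assumptions chosen purely for illustration.

```typescript
// hygiene-check.ts - an illustrative sketch of surfacing crude "hygiene"
// signals for a candidate npm package before adopting it. Real tooling
// weighs many more signals than these two.
const pkg = process.argv[2] ?? "express";

async function main() {
  // The npm registry serves package metadata at /<package-name>.
  const res = await fetch(`https://registry.npmjs.org/${pkg}`);
  if (!res.ok) throw new Error(`registry lookup failed: ${res.status}`);
  const meta: any = await res.json();

  const latest = meta["dist-tags"]?.latest;
  const lastPublish = meta.time?.[latest] ?? meta.time?.modified;
  const maintainers = meta.maintainers?.length ?? 0;
  const monthsSincePublish =
    (Date.now() - new Date(lastPublish).getTime()) / (1000 * 60 * 60 * 24 * 30);

  console.log(`${pkg}@${latest}`);
  console.log(`  maintainers:          ${maintainers}`);
  console.log(`  months since publish: ${monthsSincePublish.toFixed(1)}`);

  // Crude, illustrative flags - the cutoffs are assumptions.
  if (monthsSincePublish > 24) console.warn("  warning: possibly unmaintained");
  if (maintainers < 2) console.warn("  warning: single-maintainer project");
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```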
Ax
And just to add here: these attacks that keep happening - for example, the colors.js and faker.js debacle - are also educating newer developers. Before, they may not have pinned their dependency versions; now they will. It's just become an acquired practice, to avoid incidents like this in the future. So these attacks ultimately reinforce best practices and the importance of hygiene in coding.
Brian
By the way, Maven's been pinning versions for, what, 20 years? Just wanted to throw that out there.
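For npm users wondering what the pinning Ax and Brian mention looks like in practice, here's a minimal, hypothetical check that flags dependencies declared as semver ranges rather than exact versions. Lockfiles and `npm ci` are the fuller answer; this just illustrates the idea.

```typescript
// pin-check.ts - a minimal, illustrative sketch (not a complete tool).
// Flags dependencies in package.json declared as semver ranges
// ("^1.2.3", "~1.2.3", "1.x", ">=2", tags, URLs) instead of exact versions.
import { readFileSync } from "node:fs";

// Crude test: anything that is not a bare x.y.z version (with optional
// prerelease suffix) is treated as unpinned.
const exactVersion = /^\d+\.\d+\.\d+(?:-[0-9A-Za-z.-]+)?$/;

const pkg = JSON.parse(readFileSync("package.json", "utf8"));

for (const field of ["dependencies", "devDependencies"] as const) {
  const deps: Record<string, string> = pkg[field] ?? {};
  for (const [name, spec] of Object.entries(deps)) {
    if (!exactVersion.test(spec)) {
      console.warn(`${field}: ${name}@${spec} is not pinned to an exact version`);
    }
  }
}
```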
Kadi
Always plug and play, I love it. So Ax, I feel like you sit at an interesting intersection, being a contributor to the development community as well as a security researcher for all this malware and these vulnerabilities. What are your thoughts on how enterprises can best encourage developers to code more securely? I know Brian talked about getting that information at the right time, but when is that? What does it look like? What does it smell like? What does it feel like?
Ax
Yeah, so I would say developers are primarily focused on delivering features, right? Building a solid product, implementing functionality. So if we want to encourage them to also adopt security, it has to be done with the least amount of roadblocks and friction. And this is why I'm seeing, say, cyber attacks or active exploitation of their product - which puts the spotlight on them - I know it may not be the ideal way, but it does help reinforce the importance of security. So like Brian said: if they were looking at this as a way to prevent a bug or a vulnerability, they would just address it, as opposed to thinking of it as extra work.
Kadi
And I think it's interesting - all four of you have kind of been alluding to this - we're starting to see organizations really look to that developer-first mentality. Stephen, do you want to help with that and explain what that means?
Stephen
Yeah, I think it means having the developer responsible for as much of the security of the system as makes sense - and I think that last part is important, because there are certainly things that should live at the security-team level. And the other thing that's important is this idea of defense in depth, right? Initially, a long time ago, security was just sort of a band-aid: there was this realization that all these internet-connected systems are insecure, that security is an important property and we need to focus more on it. And you had this containment approach of firewalls and monitoring and so forth. Then, as the tools got better, it became more and more possible to also deploy tooling and automation around the construction of the software, to make sure you're doing what you can during development to enforce security. And it's not like we got rid of those other things - we still monitor systems, we still use firewalls, we still do all of that. It's just that we've layered onto it more and more, and there's become more that's possible for developers to do as the tooling has co-evolved with development processes. So as Agile rose, and DevOps became a thing, and automation became more and more a part of the development process, that provided the opportunity to pull tooling into the development process that can help developers stay on top of security. It would not be feasible to say security begins with developers if they didn't have tooling to support that process. And you mentioned this 100-to-1 ratio - that wouldn't be sustainable without the proper tooling and without the proper development processes. So it's really been an evolution over time, in terms of developer responsibility.
Omar
I'm curious - y'all seem to talk about different pieces of the puzzle in terms of shifting left. What about the people aspect? How do people come together and really make that process happen?
Ilkka
Well, one of the things spurred on by your question there, Omar, and by what Stephen just said, is actually something I heard from Shannon Lietz, who runs devsecops.org, many years ago - I was talking to her at some conference or another. And the comment I heard from her was: listen, security people don't want to be developers, and developers don't want to be security people. And in fact, when you look at the most successful DevOps implementations - kind of to Stephen's point - ops people haven't disappeared; they've just become masters of their own domain. So a lot of this, when you look at successful implementations, is less to do with making developers do more security. Because often the tooling kind of implies: hey developers, let's just stack, like, 17 PR commenters on your thing, and you'll do 17 sets of analysis on every build, and you can deal with all of that, right? But as a developer, you don't have the time to configure all of them; you don't even have the time to know what any of it means. Really successful implementations start with communication. They start with: how do you exchange information from the security researcher's brain into my work as a developer? Kind of what Brian said earlier in the conversation - context is key to what you're doing. And similarly, when there is a security vulnerability but it doesn't affect us for whatever reason - something that's obvious to see from the development floor - how do you get that info back onto the security floor? And so on. So really, to me, when I boil it down in a very reductionist way, it's that exchange of information; that's the problem everybody has to solve. And then the next step is: okay, what tools will help us gain that sort of knowledge? What are the basic questions causing us this friction? It's kind of value-streaming 101.
Stephen
Yeah, I think that's totally right. Communication is super important. And I'd say communication and culture are linked and go together: achieving security goals is often about setting up the right culture and the right communication patterns. If you have everyone on the engineering team with a shared notion of what quality means, and working together to make sure they're pointing out issues that should be addressed - on the security side, but more generally when it comes to code quality - if you can set up that sort of shared expectation, that's a big part of solving the problem.
Omar
Ax, I'm curious - I know that earlier you said you had more to say on this. It seemed like you were heading toward the solutions route: what can developers do? Can you talk more about that?
Ax
I just want to say - and I'm commenting on what I'm seeing - I see this heading in a positive direction. In the sense, for example, of this massive noblox.js campaign that Sonatype tracked last year: a bunch of malicious packages, not just one - one after another after another in npm - delivering malware and ransomware. For the first time, we saw ransomware in an open source repository. And of course, these were all typosquatted packages mimicking a legitimate package. But the maintainer of the legitimate package, Josh, got riled up - he didn't like the attacker capitalizing on his brand - and he reached out to both our research team and The Register, just spreading awareness: this is my package, there are typosquats, the repositories are not doing enough. That's one example. The other example is, say, Log4j: active exploitation happens, and everyone - I mean, developers - almost take it as their responsibility to code even more securely. So what I'm saying is, all of these incidents are reinforcing the need to develop securely and secure your supply chain. It's not an optional enhancement at this point; it's become almost like implementing a feature that people expect in their product.
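One defensive idea behind the typosquatting story can be sketched in a few lines: compare your dependency names against well-known package names by edit distance and flag near-misses. This is an illustrative sketch only; the POPULAR list below is a hypothetical stand-in for the registry-wide popularity data real tooling would use.

```typescript
// typosquat-check.ts - an illustrative sketch. Flags dependencies whose
// names are one edit away from a well-known package name, a common
// typosquatting pattern.
import { readFileSync } from "node:fs";

// Hypothetical stand-in list; real tooling would use registry data.
const POPULAR = ["react", "lodash", "express", "noblox.js", "colors", "faker"];

// Classic dynamic-programming Levenshtein edit distance.
function editDistance(a: string, b: string): number {
  const dp: number[][] = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1, // deletion
        dp[i][j - 1] + 1, // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1) // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

const pkg = JSON.parse(readFileSync("package.json", "utf8"));
for (const name of Object.keys(pkg.dependencies ?? {})) {
  for (const popular of POPULAR) {
    if (editDistance(name, popular) === 1) {
      console.warn(`"${name}" is one edit from "${popular}" - possible typosquat`);
    }
  }
}
```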
Omar
Is there anything developers can do to sort of start advocating for this more in their organization? Is that possible?
Stephen
Like, how do you instigate change from the ground up, sort of thing?
Omar
Yes, exactly.
Stephen
Yeah. I mean, one idea would be getting proactive. Developers have a lot of flexibility generally - at least to some extent, it depends on the organization - but they often have flexibility on the tooling they pull in, how they set up their build process, what they put in place in terms of tools and standards. So you can take ownership of it and then make a case: look, we have this under control, here's what we're doing - and if that's not sufficient, let's have a conversation.
Brian
Yeah, that point is huge, and that's kind of what I was getting at before. When development wants to do something, usually it can get done - especially if it's around the build pipeline and those sorts of things, it can get done without a lot of drama. I remember talking to a company who told me they needed to wait six months to get a new plugin installed on their CI server because security asked for it. And I was kind of horrified and shocked, like, really? How long does it take for development to make those changes if they want to? And they said, oh, that happens every day. Right - so again, it's that immune reaction: it only has to go through a business approvals process because somebody else asked for it. So if you're listening to this and you're on the development side, making those kinds of changes might be really simple, and you might find yourself a lot of friends on the other side. Starting to make the right moves in the right direction will buy a lot of goodwill and show that this really can be done without a lot of drama.
Ilkka
I think there's a very cliche line here: be the change that you want to see. It's no harder than that.
Kadi
Sometimes the simplest phrases are the ones that stick and hold true over time, right? So before we close this out, I think it'd be good to wrap with a couple of questions that organizations or developers should be asking themselves when they're looking to start this journey of: how do we better reduce that friction between security and development?
Stephen
Yeah, I would start with questions that go even outside the development team - getting back to this theme of communication, setting up the right channels, and shared expectations. I think it starts with looking organization-wide and asking: what are our goals from a security perspective? On the risk side, what are the major risks we're concerned about? That should feed into the security policy. Not every organization is facing the same risks, and not every organization will have the same willingness to accept various risks, so you have to figure out what's right for your business. That sets the goalposts, and then you need to work with the various teams involved to figure out the right process to put in place to make sure you can achieve those goals. So that's bringing in security as the experts on how to address those risks and how best to mitigate the various threats; it's bringing in developers in terms of what's possible with the tech stack you have; it's pulling in ops to advise on the operational architecture side of things. And then you often have DevOps and tooling teams that can be involved as well - they can say, here's how our pipelines are set up, here's the amount of centralization we have, here's a good place to introduce a control where it can have maximum impact. You have to have that conversation - and it's actually a lot of conversations - to get everyone on the same page and come up with a shared plan. But I think if you can get that right, you're really set up for success.
Brian
So, in a world that is much more hyper-aware about the software supply chain in general: how do you feel about the choices you're making, both in terms of what components you're choosing, but also how you're getting them and how you're tracking them? Do you feel that they're trustworthy, that the process is good? If these were parts going into your car, would you buy that car if they assembled the car the same way you're assembling your software and making your decisions? Knowing how that's happening underneath the hood - and I think developers have pretty good visibility into that, better than the average consumer would - how do you feel about it? Can it be done better? And if the security team comes asking for things, try to resist the immune reaction and think about what they're trying to achieve, and what you can do to solve that problem - not just the immediate problem, but to solve it generally, so they don't have to come walking over to you later. It goes both ways, and like we talked about earlier, that communication is important, that culture is important. But if you wouldn't be comfortable buying a physical good that was assembled the way you're building your software, then what else is there? If the answer is no, start figuring out how to make it better.
Ilkka
Well, riffing off of what Brian said: I've always been one of these people who gets weird enjoyment out of just picking up my laptop and going to work out of the other team's room for half a day, especially if that team is really annoying me. Because it turns out that's usually the fastest way to get a real understanding of why I'm being asked to do certain things - like, let's say, fix a bunch of CVEs that I don't see the point of. Usually people ask for a reason, but that reason might not be obvious to me from where I'm sat. So, back when we were still in physical offices, I would physically move myself there and get to the bottom of it. I think today, if you think about it from an organizational point of view, there's a famous Deming quote that says: cease dependence on mass inspection; build quality into the manufacturing process, rather than imagining that you can inspect quality into the product later on. And security, to me, when I think about it from a manufacturing perspective, is a facet of quality. A secure product that we can fix quickly delivers more value to customers than an insecure product that we have to spend a ton of time incident-managing. If we have a bunch of fireworks going off all the time, and we're spending time dealing with the technical debt or the security incidents on that product, we're not really spending time delivering value. That is a problem. So with that in mind, I think the number one thing is really to ask: why are we doing this? How does it factor into the bigger picture? And then start finding ways of splicing it in. I think that's how it all begins to hang together.
Kadi
I think with that being said, this is probably a good place to stop. Thank you to Brian, Ilkka, Stephen, and Ax for joining Omar and myself today on Wicked Good Development.
This show was co-produced by Omar Torres and Kadi Grigg and made possible by our collaborators. Let us know what you think and leave us a review on Apple Podcasts and Spotify. If you have any questions or would like to leave us a message