
Wicked Good Development Episode 29: White House unveils National Cybersecurity Strategy

On March 2, 2023, the Biden-Harris administration made a historic move with the release of the National Cybersecurity Strategy. This is the first time the US government has taken a stance on product liability with regard to software.

In this episode, Jeff Wayman, Conduit of Goodness at Sonatype, takes the mic to speak with Sonatype's Co-founder and CTO, Brian Fox. Listen in as they break down important details of this historic new strategy, its meaning, and how it impacts you and your organization.

Listen to the episode


Wicked Good Development is available wherever you find your podcasts. Visit our page on Spotify's anchor.fm


Transcript

Kadi Grigg (00:11):
Hi, my name's Kadi Grigg, and welcome to another episode of Wicked Good Development. On March 2, 2023, the White House made a historic move by releasing the National Cybersecurity Strategy. This is the first time that the U.S. government has taken a stance on product liability with regard to software. Today, as a special treat, we are sharing a breakdown of the strategy, what it means, and how it impacts you and your organization. Let's dive in.

Jeff Wayman (00:43):
Hello, thanks for joining us. My name is Jeff Wayman. I sort of lovingly refer to myself around here as the conduit of goodness and resident word nerd at Sonatype. But the real star of the show is Brian Fox, our CTO and Co-Founder. Thanks for joining us today, Brian.

Brian Fox (00:55):
Hi, everyone.

Jeff Wayman (00:57):
So, if you're in the cybersecurity space, there was some big news today. The Biden-Harris administration released the National Cybersecurity Strategy. I know, Brian, you had a chance to review it a couple weeks ago, maybe a little bit longer than that. Tell me a little bit about what that was like.

Brian Fox (01:12):
Yeah, sure. First, I think if you're in the software industry, this is big news for you, not just the security industry.

Jeff Wayman (01:18):
That's true.

Brian Fox (01:19):
Yeah, it was pretty interesting to be able to do a table read of the strategy and provide feedback directly back to its authors. And it was an interesting experience, because right around the time that this happened, I think we had just finished putting the final touches on our analysis and feedback on the European Union CRA -- the Cyber Resilience Act -- which has raised a lot of concerns within the open source community. I blogged about the potential Balkanization of open source there. The CRA has good things in it, but the specifics and the implementation could be problematic.

Brian Fox (02:05):
So, that was the mindset I had when I was thinking, "Oh no, another one of these -- better be prepared to try to pull on the reins." And as I had the opportunity to read it, I was thinking, "Wow, this is really good. It's really big. It's encompassing." The preface sets out a pretty sobering reality of where we are in terms of threat actors from hostile nation states. And then it diagnoses what we've been talking about for a long time, which is the sad state of typical security in most software these days. As I read through it, I immediately became a fan: "Yes, this is a very different way of approaching this -- one that I don't think has really been talked about much before."

Brian Fox (02:54):
It's also a little near and dear to my heart. As you well know, I like to talk about other industries and try to show how situations that we accept as normal within our software industry would be ridiculous in our food chains, in our cars and planes. And I think the strategy analyzes that in a really interesting way -- nuanced where it needs to be nuanced, and mature where it needs to be mature. I had a little bit of feedback, as you know -- I like to talk about how we need to be focusing a bit on companies being able to do the recall, not just on producing the bill of materials -- but those are implementation details. There wasn't anything in the strategy that really had me concerned or that was woefully wrong. So, I kind of became a cheerleader right from the beginning.

Jeff Wayman (03:45):
So, digging a little bit into that -- you alluded to this earlier -- it's pretty long. It's 38 pages. It's probably right-before-bed reading if you want to doze off. But there's a lot of important stuff in there, and I haven't gone through all of it yet. I know a little bit more than the average person, but I think there are pieces that affect everyone, not just people in the software security space. Explain it to me like I'm five. I'm just someone who uses web applications -- what's the big change here? Why would it be important to me?

Brian Fox (04:17):
Yeah. So, what the paper does a good job of doing is recognizing that there are certain, what they call "market forces," that are leading to outcomes that none of us are really happy with. There are some sections that analyze what happens when you're in a heavily regulated or price-fixed industry. Think about a utility that doesn't get to set prices on an open market. What does that mean? It means the bottom line gets squeezed. And in those types of markets, there can be a race to the bottom in terms of security, because it's the only way they can eke out additional profit. And then, taking a step back, the paper recognizes that compared to other industries, we have things a little bit upside-down.

Brian Fox (05:05):
Through existing contract law -- everybody kind of jokingly laughs about the click-through EULA that nobody reads -- in every piece of software, companies are disclaiming liability first and foremost. As they describe in the paper, that basically has the effect of pushing the responsibility for the consequences onto the end user. If you use any piece of software and you get hacked, you've accepted a contract that says you can't really sue the company for anything meaningful, except maybe to the extent of whatever you paid for the software. That's the end of it. And they make a point -- it's a separate chapter, but I put it in the same vein -- about holding the stewards of our data accountable. Think about large companies that have amassed tons of information about us as individuals. When there's a breach, what is your recourse?

Brian Fox (06:00):
Oftentimes there's none, because many times you don't even have a direct relationship with that company. Think credit reporting agencies, health providers, and things like that, that are behind the scenes. In practical reality, the lawsuits that come out of that often amount to just the cost of doing business, a rounding error for these companies. And I think what this strategy is basically saying is that the market forces are misaligned. Being able to put the onus on the end user isn't causing companies to come to grips with what they should be doing to produce better software. And so, by turning that around, the proposal is saying, "You can't disclaim all liability." That becomes a thing that is not valid in U.S. contract law.

Brian Fox (06:50):
But you can basically earn the right to cover yourself by following best practices -- they define these as safe harbors -- which in retrospect feels obvious to me. Of course, I've had many weeks to think about that after reading it. But that is the novel change: we want to encourage companies -- slash require them, because companies that aren't moving in this direction now are unlikely to change willingly at this point -- to do the right things. And it also recognizes -- this is the mature and nuanced part I was talking about -- that even if you do all the things, software is still imperfect. There will still be bugs, and some of those bugs will lead to vulnerabilities. So it's not entirely fair to put all of the onus back on the manufacturers if they are doing all of the generally accepted things.

Brian Fox (07:43):
Accidents will still happen, and you have to think about the strategy in that nuanced way. I think the safe harbors do that. So again, it basically turns the tables around: instead of companies saying, "I'm not taking responsibility for this software, use it at your own risk," it says, "you can't do that -- organizations must follow a defined set of best practices." That set will probably shift a little bit over time, because we can't whipsaw the industry tomorrow, but we still need to get them moving in the right direction. So, that's the summary of the part that I think has the biggest impact on the software industry as a whole.

Jeff Wayman (08:24):
And so it's broken down into three pillars. And there's this language in there that could be sort of easy to dismiss -- we've written on some of this -- around software liability, or liability for software products. If I'm a CISO or an engineering executive today, how closely should I be paying attention to that? How concerned should I be about liability that I may introduce to my organization in the areas that I control?

Brian Fox (08:52):
Well, the thing to keep in mind is that this is a strategy, not a regulation. It's a call for work that would hopefully lead to some type of policy and regulation, and ultimately Congress would have to change the laws to change the understanding of contract law in these areas. But if you take the long view and look at other industries -- again, Jeff, I know you've helped me write on these things -- other industries have gone through this as well. In the food industry, there's the example we've used of the Paisley snail case, which almost exactly mirrored this. It examined contract law for somebody who got sick drinking a beer, I think it was, that was contaminated. The law at the time said they couldn't sue the manufacturer, because they didn't directly buy the beer from the person who made it.

Brian Fox (09:42):
They bought it from a bar. In retrospect, of course, that's ridiculous. And so over time those industries changed to the point where there's now an expectation of due care, defined somewhere in the realm of what everybody expects is normal, which can be a little bit hard to pin down. But it basically turns that table around again and says, "If you're not at least doing the basics, you might be liable if something bad goes wrong." It's the same thing with our auto manufacturers. And so, while I think it's going to take time for this to fully unfold -- it's going to take time for legislation to come out of it and for the exact standards to be defined -- I just can't imagine a reality where this doesn't become the thing that sets all of that in motion. So I feel like the clock is ticking. How much longer do you want to wait to continue to ignore this problem? Because if you're doing that, you're basically betting that all of this is much ado about nothing and it's just going to go away -- which I think has literally a 0% chance of happening, because this is the course things have taken in every other industry we've seen. So, which side of history are you going to be on? I feel like it's just as simple as that.

Jeff Wayman (10:53):
Yeah, no, it's interesting. Speaking of those allusions -- we've talked a lot about the automotive industry -- I was listening to a podcast the other day about the FAA, the administration that governs air travel in the U.S. Even in relatively recent times, like the mid-to-late 90s, there were still more crashes than they wanted. And when they made safety a focus, it was never to eliminate all incidents or all danger from air travel. But with those government regulations, safety has increased to the point where it is literally safer to ride in an airplane today than to walk across the street, ride a bicycle, or get in your car. People always bring up the car analogy because there are lots of car wrecks, but there's literally no safer form of transportation. And I think there's that element today: we use quote unquote free software, social media, all these tools, credit reporting agencies in the U.S. -- we use all of those. And as a consumer or an end user of those things, you sort of just don't expect any protection. This seems really aimed at changing that, among many other things -- at least in pillar three of the paper, I think.

Brian Fox (12:04):
And there's even more context behind it. If you look at some of the papers -- and certainly Jen Easterly of CISA gave a great talk at CMU just on Monday. You can find the transcript of that online; I shared it on LinkedIn. It really tells this same story that we're talking about here: that basically this is ridiculous, and with all the other industries as precedent, we don't have to reinvent the wheel. We kind of just have to do the same thing. We understand how this is going to play out. Let's get on with it.

Jeff Wayman (12:36):
In that regard -- and I agree with you, no one should be dismissing this even though it's not legislation -- this is something I think people should lean into, because it's even more serious than the previous executive order in spelling out what needs to be done. If you're a leader on a software engineering team, or if you're a developer, what are the things you can do right now to start addressing some of these concerns? Or is this an ocean you have to boil, with a mess coming?

Brian Fox (13:11):
Well, it really depends on where you're at. But the area that we've been pushing on, with data to back it, is that 80 to 90% of modern software is third-party open source. That is quite literally the iceberg analogy. The code that your developers are writing binds these components together and adds business logic -- that's the part of the iceberg you can see. The deeper part is all this other code that is effectively outsourced to people you don't personally know or have contracts with. That's the reality of the software. And if you don't know what all that software is, and which applications it's in, then you don't have a great understanding of what's going on inside your software. We saw that when the Log4Shell vulnerability happened in Log4j: it was so difficult for so many organizations to understand whether they were using it, and where, before they could even actually fix it.

Brian Fox (14:09):
For companies that were unprepared, that triage phase kind of extends to now, because we still see close to 30% of Log4j downloads from Maven Central being the known vulnerable versions. So, I don't know where you've been for the last 16, 17 months that you didn't hear about this and make the change -- it implies that people don't know what's inside their software. That's why it's still happening. And so those are the obvious things that you can do. I like to refer to it as the 96% problem. The analysis we did back in October showed that at the time a vulnerable component is consumed, 96% of the time there's already a fix available. So, there's a lot of talk about making open source better and safer, providing tools, education -- all that stuff is gravy.

Brian Fox (15:01):
But at the end of the day, if the things are already fixed and you're still choosing the broken version, whose responsibility is that? It's not on the open source project. It's on the consumer, which is the organization. And that's again why I think turning that liability table around makes sense, because they are the ones making the decision to continue to use the broken thing, and they should be held responsible for that. So, if you're not able to do these things, you need to start, because getting control of your supply chain is a huge part of this and will solve a big chunk of the risk. It's also something you can do today. You don't have to wait for the industry to get better. You don't have to wait for the next generation of engineers and the next generation of infrastructure to be written in memory-safe languages. Those are what the paper calls "generational types of transformations." We must do those things. But as a consumer of open source today, there are very specific steps you can take that will reduce your risk immediately: managing the supply chain, understanding what those dependencies are, and starting to provide your developers with guidance to make better choices. So, that's why I think you need to be thinking about this today, because you can do something about it now.
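
As a concrete illustration of the "understand what those dependencies are" step, here is a minimal sketch of scanning a CycloneDX SBOM for known-vulnerable Log4j versions. The file name, the name match on log4j-core, and the use of 2.17.1 as the "known fixed" floor are illustrative assumptions, not part of the strategy or of any particular vendor's tooling.

```python
import json

# Log4j versions below 2.17.1 carry one or more known CVEs
# (CVE-2021-44228 "Log4Shell" and its follow-ups); 2.17.1 is
# used here as an illustrative "known fixed" floor.
FIXED_FLOOR = (2, 17, 1)

def parse_version(version):
    """Best-effort parse of a dotted numeric version like '2.14.1'."""
    parts = []
    for token in version.split("."):
        digits = "".join(ch for ch in token if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

def vulnerable_log4j(sbom_path):
    """Yield (name, version) for log4j-core entries older than the floor."""
    with open(sbom_path) as f:
        sbom = json.load(f)  # CycloneDX JSON keeps entries under "components"
    for component in sbom.get("components", []):
        if component.get("name") == "log4j-core":
            version = component.get("version", "0")
            if parse_version(version) < FIXED_FLOOR:
                yield component["name"], version

if __name__ == "__main__":
    # "app-sbom.json" is a hypothetical SBOM exported from a build.
    for name, version in vulnerable_log4j("app-sbom.json"):
        print(f"UPGRADE NEEDED: {name} {version} (< 2.17.1)")
```

Run in a loop over every application's SBOM, a check like this answers the Log4Shell triage question -- "are we using it, and where?" -- in minutes instead of weeks.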

Jeff Wayman (16:15):
So, I'm not an engineer by trade, though I certainly understand this area now. Do you think the mentality of organizations and development teams is that they are "consuming" open source when they use or download or include something in a project -- which is most of an application today? Because we use the analogy all the time of, say, the manufacturing industry or the produce industry. As a true consumer or end user of something, if I go to the store and buy a package of lettuce after a recall that I heard about, people look at me like I'm crazy if I get sick. But the same thing you just described is happening with Log4Shell.

Brian Fox (16:54):
Yeah, that's where the analogy breaks down, because many times it's not literally people picking it off the shelf. I think the equivalent is more like you signed up for a subscription of food to be delivered to you every month, and later they found out that the food was actually not healthy for you. But it's still available for sale, for reasons, and you just keep on using it, even though it's potentially known to be cancer-causing or something. I think a lot of these things end up on autopilot: an organization makes a choice to use a component, and until there's a reason for them to change -- and if they don't have visibility into this -- they just keep on using that same version. They keep shipping it over and over and over again. It's like having a giant bin of known defective airbags, and you just keep putting them in cars even after everybody knows they're no good. That's what's happening here. It's not that they're choosing to order more broken airbags. And that's why I'm saying the only place this can be fixed is at the time you're assembling the software -- when the producers are building it and shipping it.
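
One way to make that "fix it at assembly time" idea concrete is a build-time gate that fails when a resolved dependency appears on a deny-list. This is a hypothetical sketch: the file names, the flat input format, and the single deny-list entry are assumptions for illustration; real policy tooling draws on live vulnerability feeds rather than a hard-coded list.

```python
import sys

# Hypothetical deny-list of coordinates an organization has flagged.
# Real tooling would populate this from vulnerability data, not a literal.
DENY_LIST = {
    ("org.apache.logging.log4j", "log4j-core", "2.14.1"),
}

def gate(deps_path):
    """Return the resolved dependencies that appear on the deny-list.

    Assumes one 'group:artifact:version' coordinate per line, a flat
    format you might derive from a build tool's dependency report.
    """
    offenders = []
    with open(deps_path) as f:
        for line in f:
            coord = line.strip()
            if coord.count(":") != 2:
                continue  # skip blanks and anything not in g:a:v form
            group, artifact, version = coord.split(":")
            if (group, artifact, version) in DENY_LIST:
                offenders.append(coord)
    return offenders

if __name__ == "__main__":
    # "resolved-deps.txt" is a hypothetical export of the build's dependencies.
    bad = gate("resolved-deps.txt")
    for coord in bad:
        print(f"BLOCKED: {coord}")
    sys.exit(1 if bad else 0)  # nonzero exit fails the CI step
```

Wired into CI, the nonzero exit interrupts the "keep installing the same defective airbag" loop at the one point where someone can actually swap the part.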

Jeff Wayman (18:03):
We've talked a little bit about this too: this isn't legislation. It's not law. I know contract law comes up quite a bit. How far out do you think we reasonably are from seeing big changes? The executive order on this was 2021, prompted by a lot of major issues, and I think this actually came pretty quickly for a government organization. But administrations change and whatnot. Do you think they'll move quickly on this?

Brian Fox (18:34):
I think we're looking at a couple of years, maybe closer to a decade unfortunately, before things are fully phased in. Just speculating -- I've been getting asked this question quite a bit today. How do I think this unfolds? Everybody can read the last couple pages of the document, which talk about implementation and specify which government organizations are responsible for driving the details forward. But the best parallel I can think of is the NTIA effort -- the multi-stakeholder effort that ultimately led to SBOMs. That was many years ago, and it moved a little bit slower than my taste would've preferred. But that's how these things work when you're trying to do more good than harm, and you need to collect a lot of viewpoints and think through these things.

Brian Fox (19:34):
And so in my mind, there are a number of efforts that get kicked off in parallel like that. There are calls in there for IoT types of regulations and infrastructure things. There's the software liability that we've talked about. There's talk about what cyber insurance might look like in this new realm, and these types of things. So, there are a lot of discrete areas that I think will need deep dives. There are whole sections on overhauling the federal government and getting it to work better together, to work with our partners overseas, and to work with private industry. There are 15 or 20 major initiatives that probably all run in parallel out of this. Those are going to take time to play out. But again, I bring it back to what I said before: unless you really think that the world is just going to give up on this, and that this will be the first industry that didn't go down this path, what are you waiting for? Assume that this is coming and act accordingly.

Jeff Wayman (20:35):
Yeah. And if you look at the past -- we've mentioned things like GDPR, and you see the CRA and the other strategic recommendations that are happening -- this seems like an inevitability. And then again, comparing it to every other manufacturing industry -- not that software development is manufacturing, but it certainly produces goods that we want to be safe for end users, goods that protect them. Among all the other things, I think it absolutely makes sense. Anything else you want to add here that you think is important to highlight?

Brian Fox (21:05):
We have a question here that came in from LinkedIn: is it possible to impose legal liability for cybercrime upon a legitimate software company without restricting innovation from a small cutting-edge company? It's a great question. One of the things called out in Section 3.3 recognizes, again, who should carry this -- it says "responsibility must be placed on the stakeholders most capable of taking action to prevent bad outcomes." That's what we've been talking about. Not on the end users -- not on your grandmother who uses software and doesn't know what's inside it, let alone how to do anything about the vulnerabilities. Not on the end users who often bear the consequences of insecure software, nor on the open source developer of a component that is integrated into a commercial product.

Brian Fox (22:02):
And this is an important understanding that a lot of the other regulations, like the CRA, are trying to grapple with. How do we change responsibility and behaviors without harming innovation? Understand that open source, at least in the United States, kind of butts up against free speech issues. And somebody who's creating a component in their spare time, because they think it's needed and they're trying to share it with the rest of the world -- it's impossible for them to know all the places people will include it, including use cases far outside what they imagined that might lead to problems. That's why, again, the people putting those parts into products are the ones who should be assessing whether a part is fit for their purpose.

Brian Fox (22:49):
When it comes down to a small company, I want to assert that a smaller cutting-edge technology company is probably doing a lot of the right things already. They don't have to deal with a massive portfolio of aging applications that nobody works on but that still sell a lot. So, I think the biggest burden will fall on larger organizations with larger legacy portfolios, rather than on ones that are moving fast and being innovative. But if you're moving fast and quote unquote breaking things and not doing the basics, you probably still have some culpability there. It's not any different: if you have an Etsy store and you're selling stuff that blows up in people's faces, the fact that you put it in an Etsy store and not an actual department store doesn't change that. So, I think it will probably apply fairly equally, but it is most likely easier for smaller, modern companies to pick up these best practices.

Jeff Wayman (23:56):
Do you think it's just about scalability and size? Because with Log4Shell and the Log4j framework issue, I saw this firsthand: code search was where organizations went. When it came out, they were searching entire codebases for references to it, trying to find all the affected applications. That's not easily done in a day -- and work stopped, right? It's immediate, unknown technical debt.

Brian Fox (24:23):
Yeah, I just think it's a scale issue. Your smaller cutting-edge company is less likely to have decades of legacy to deal with, less likely to have abandoned projects that are still sort of in production but that nobody really knows anything about anymore. Those are where I think these challenges come about. And it was interesting -- I was at a forum recently where everybody was talking specifics about SBOMs, about all the cases where producing an SBOM is hard and expensive, and coming up with reasons why that is the case. I was not on that panel, although I had a lot to say about it. But it was interesting, as it gave me the opportunity to sit and reflect on what everybody was saying.

Brian Fox (25:09):
And at the end, my takeaway was: if you step back, and you're responsible for society, and you listen to this conversation and all of the reasons why this is hard, not a single one of those reasons is a good reason for society. We have products that we sell that nobody knows anything about. We have no dev teams working on these things anymore, yet we continue to ship them to people. I'm sorry -- I understand that that's the reality, and yet it is not a good reason to not do a better job here. That was one of the interesting takeaways. And I suspect that if you take that same critical eye to some of the inevitable pushback that will probably come from this, and ask yourself what is best for us as a society as a whole, it's probably not to go down the path of why all this is hard. That's not going to move us forward.

Jeff Wayman (26:04):
What about -- I think this has come up in conversations; I think it even came up in a meeting we had today -- where does the responsibility lie for public repositories, the code houses, in housing these vulnerable pieces? As stewards of Maven Central, I think we have some thoughts on that, but what would you like to say?

Brian Fox (26:21):
That's a question we get a lot, because it seems obvious: why don't you just burn the books and take down everything that's vulnerable? And certainly if something is outright malicious -- we haven't even touched on malicious stuff this time -- of course that gets taken down as soon as possible. Fortunately, for various reasons, that doesn't happen as much on the Central side as it does in other ecosystems. But most vulnerabilities are not globally applicable. They don't affect every single person; it depends on how these things are used. And so, if you put yourself in the role of judging for the entire world what components they're allowed to have, that's a pretty dangerous place to be, because you'd probably break more stuff than you'd actually protect. The Log4j one I think was really interesting, because it was pretty easy to exploit, it was very ubiquitous, and it led to bad things. But even the community didn't agree that the versions should be taken down. And even if it had, that doesn't solve the macro problem.

Brian Fox (27:31):
It breaks a bunch of builds for a bunch of people and fixes one vulnerability. There are, what, like a hundred thousand a year these days? I don't even know what the number of CVEs filed in the National Vulnerability Database is. We'd be treating the symptom, not the underlying cause. So it's easy to say, "Why don't you take that thing down?" But the question is: why do people keep using it? We have to solve that, because there are other vulnerabilities that are not so globally applicable that will still affect these people. And if they can't understand, with all the media shouting that's happened, why they need to deal with Log4j, then all these other ones probably have a much bigger actual cumulative impact. So that's the conversation. If you take the supermarket analogy, it's not that we're leaving tainted food on the shelves -- it's that some of this food triggers allergies in certain people.

Brian Fox (28:24):
Lots of people are allergic to peanuts, yet you can still go into the store and buy peanut butter. Now, of course, if the peanut butter had salmonella in it, that's the equivalent of something malicious, and it's going to come down. So, that's how I draw the distinction. Sometimes the analogies break down a little bit, and they don't always hold in that way.

Jeff Wayman (28:41):
They're pretty close, though, I think, especially on the manufacturing side and especially for consumers. Get in an old car without a seat belt, and you're going to feel uncomfortable -- and there's a reason for that. I don't think we see that today in the software development industry. At least broadly, the end user doesn't have anything protecting them.

Brian Fox (29:05):
Yeah, and it's interesting. I had a conversation with somebody a long time ago who made a really great point on this: when you're trying to evaluate a physical product, take a car, it's easy-ish to understand the relative build quality. Think about when you get in certain cars and close the door -- some sound really tinny, and then you get in another car and it sounds really beefy and strong. You can have at least a perception of quality there. It may not always be accurate, but you can feel it. With software, it's very difficult. You have no idea -- especially if you're dealing with a service, you have no ability to adequately assess the actual quality of the thing. It's frankly a lot more like that Paisley snail case: when you're handed a beer, how do you know what was happening inside the factory, and whether there were snails and rodents crawling in and out of those bottles? As the consumer, you have literally no ability to judge quality, except based on that company's history of producing bad things or not. But if you're the first person affected, that's not a great place to be. And I think that's a more apt analogy for what's happening in software.

Jeff Wayman (30:16):
So, I think we're at the top of the hour, maybe a little bit past. Anything else you want to add?

Brian Fox (30:20):
I think we've hammered it enough.

Jeff Wayman (30:22):
All right. Sounds good. Thanks again, Brian. Thanks, everyone. You can find a bunch of this material on the website at Sonatype.com as well. Our Launchpad has resources on liability to help you inform yourself, and we have a bunch of blog posts. I think there's a lot of content out there to help everyone.

Brian Fox (30:34):
Yep. Thanks for joining us everyone.

Kadi Grigg (30:38):
Thanks for joining us for another episode of Wicked Good Development, brought to you by Sonatype. Our show was produced by me, Kadi Grigg. If you value our open source and cybersecurity content, please share it with your friends and give us a review on Apple Podcasts or Spotify. Check out our transcripts on Sonatype's blog and reach out to us directly with any questions at wickedgooddev@sonatype.com. See you next time.


Written by Kadi Grigg

Kadi has been passionate about the DevOps / DevSecOps community since her days of working with COBOL development and mainframe solutions. At Sonatype, she collaborates with developers and security researchers and hosts Wicked Good Development, a podcast about the future of open source. When she's not working with the developer community, she loves running, traveling, and playing with her dog Milo.