Did you know Intel has 16,000 software developers? As one of the world's largest software development organizations, Intel faces no small task in securing its DevOps pipeline.

Amid increasingly complex threats to the software supply chain, how has supply chain cybersecurity evolved? SBOMs have existed in the DevOps space for a long time, but knowing what goes into our software is even more critical today, especially in the wake of the SolarWinds attack on the DevOps pipeline.

In this episode of Finite State’s podcast, “IoT: The Internet of Threats,” Intel’s Chief Solutions Architect Darren Pulsipher discusses Intel’s process for analyzing third-party software and scanning for vulnerabilities, securing its DevOps pipeline, and the pros and cons of using open-source software. He also gives his perspective on the potential impacts of Executive Order 14028 on improving the nation’s cybersecurity.

During this 31-minute episode, Darren and Eric Greenwald, Head of Cybersecurity Policy and General Counsel at Finite State, examine:

  • Intel’s process for analyzing third-party software and scanning for vulnerabilities
  • Securing the DevOps pipeline
  • Balancing value and risk in using open-source software
  • Potential impacts of Executive Order 14028 on improving the nation’s cybersecurity

All episodes of Finite State's "IoT: The Internet of Threats" podcast can be heard on Spotify, Apple Podcasts, and Google Podcasts. Listen to this episode in its entirety below:

 

Episode Guest: Darren Pulsipher, Chief Solutions Architect, Intel Corporation

Bio: Prior to his current role as Chief Solutions Architect at Intel, Darren served as an enterprise solution architect and internal consultant for software development and DevOps, and as a firmware director. Earlier in his career, he was the Founder and CEO of Yoly, Inc., a cloud-based SaaS social media aggregator and diagnostic tool. He also held positions of increasing responsibility at EMC, Ovoca, and XanGo.

Since 2020, Darren has also hosted his own podcast, Embracing Digital Transformation.

Darren earned an MBA in Technology Management from the University of Phoenix and is working toward a Ph.D. at Northcentral University.

Full Podcast Transcript:

Eric Greenwald: And now to our interview segment with Darren Pulsipher, the Chief Solution Architect for the public sector at Intel Corporation. I'll note that Darren has been with Intel for a good long time, 12 or so years, and is also a fellow podcaster. Darren, welcome to the podcast.

Darren Pulsipher: Hey Eric, thanks for having me on.

Eric Greenwald: Yeah, you bet. Darren, can you give our listeners a little bit of background on how you got to where you are today? And tell us a little bit about your podcast as well?

Darren Pulsipher: Yeah, yeah, it's interesting. I have a very weird background in that I'm technical, I have patents, and I'm highly technical in the cloud and grid space. But I've also been a CIO. So, I'm kind of a weird duck. My boss even said that in an interview recently, that Darren is an odd duck, because I can talk to executives and then I can get down and dirty with software architecture. So, it's kind of a fun position that I have now, because I do get to talk to CIOs and I also get to talk to technologists, and my podcast is a reflection of my "odd duckness." I have CIOs come on, and then I have practitioners and guys that are down deep in the guts of software come on. So, it's kind of fun.

Eric Greenwald: Well, that's great. I mean, I like to think that, on our podcast, we specialize in odd ducks, so it's nice to have you on and, to carry the analogy a little farther, birds of a feather. So, Darren, you and I have had a few conversations now, and we've talked a bit about supply chain cybersecurity. We try to attack that subject from a number of different angles, trying to give insight to security teams, particularly folks who are working in device security, on how to approach the problem. So, I wanted to start with an open-ended question about how you are seeing the evolution of supply chain cybersecurity from where you sit at Intel, whether it's a question of the evolution of the threat or the evolution of the approach that Intel, or other companies that you work with, take to dealing with the increasingly complex threat of supply chain cybersecurity.

Darren Pulsipher: You know, from my background, which is primarily in the DevOps space, or back in the day, configuration management, this is something we've been doing for a long time, but now it's gotten a lot of attention. And that is the automatic building of SBOMs. When I'm building software, knowing what's going into my software is really critical. But at the same time, primarily because of the SolarWinds attack on the DevOps pipeline, we're seeing the DevOps pipeline become a focus area of security. So, to me, that's really, really important. Even if you do a great job at securing your production environment, if your production code has backdoors and Trojan horses and all those things in it, I mean, what good is all the security you can put in your production area?
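
For readers who want a concrete picture of what an SBOM gives you, here is a minimal sketch (illustrative only, not Intel's tooling) that reads a CycloneDX-style JSON SBOM produced during a build and lists every recorded component; the file path is hypothetical.

```python
import json
from pathlib import Path


def list_components(sbom_path: str) -> None:
    """Print every component recorded in a CycloneDX-style JSON SBOM."""
    sbom = json.loads(Path(sbom_path).read_text())
    for component in sbom.get("components", []):
        name = component.get("name", "<unknown>")
        version = component.get("version", "<unknown>")
        purl = component.get("purl", "")  # package URL identifying where the component came from
        print(f"{name} {version} {purl}")


if __name__ == "__main__":
    # Hypothetical path where a build step dropped the generated SBOM.
    list_components("build/sbom.cyclonedx.json")
```

In a pipeline, a component listing like this can become the input for the vulnerability and license checks discussed later in the conversation.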


"We're seeing the DevOps pipeline become a focus area of security. If your production code has a backdoor, what good is all the security you can put in your production area?"  - Darren Pulsipher


Eric Greenwald: Yeah, that makes a ton of sense. And obviously, this is becoming a focus of a lot of different companies out there. What we're interested in exploring is this: you mentioned your own software development environment, and I can imagine that, in addition to its own internal supply chain security questions, Intel consumes a lot of devices and other systems. Have you been involved in working with Intel's vendors to try to get them to provide the same kind of transparency in their software?

Darren Pulsipher: We're just in the throes of beginning that right now. I'm no longer on the production side at Intel, but I do know that they are in the beginning stages of that: requiring SBOMs. We do a lot of scanning of the third-party software that we get; for most of it, we demand the software itself, not just binaries. And we've been doing static scans of software for a long time, decades. But the stuff that you guys have talked about, scanning binaries, that's a new thing, right, that we need to start evaluating now.

Eric Greenwald: Yeah, so what happens when you do a scan of software? I mean, is it in the procurement process? Or is it after it's already in your stack?

Darren Pulsipher: After it's already in our stack. In our whole build process, our CI/CD pipeline, one of the steps before release is that the software has to go through static analysis to check for vulnerabilities and even for licensing agreement issues. It's quite extensive to get software put into production at Intel.
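
As a rough illustration of that kind of pre-release gate (hypothetical policy values and report format, not Intel's actual pipeline), a CI/CD step might read the scanner's findings and block the release when a severity threshold or license allowlist is violated:

```python
import json
import sys
from pathlib import Path

# Illustrative policy; real thresholds and allowlists are set by the organization.
ALLOWED_LICENSES = {"Apache-2.0", "MIT", "BSD-3-Clause"}
MAX_SEVERITY = 6.9  # block anything rated High or Critical on the CVSS scale


def release_gate(report_path: str) -> bool:
    """Return True only if the (hypothetical) scan report passes both checks."""
    report = json.loads(Path(report_path).read_text())
    failures = []
    for finding in report.get("vulnerabilities", []):
        if finding.get("severity", 0.0) > MAX_SEVERITY:
            failures.append(f"vulnerability {finding.get('id')} has severity {finding['severity']}")
    for component in report.get("components", []):
        if component.get("license") not in ALLOWED_LICENSES:
            failures.append(f"license {component.get('license')} on {component.get('name')} is not approved")
    for failure in failures:
        print(f"BLOCKED: {failure}")
    return not failures


if __name__ == "__main__":
    # Exit non-zero so the CI/CD system stops the release when the gate fails.
    sys.exit(0 if release_gate("scan-report.json") else 1)
```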

Eric Greenwald: Do you find that that is a halting process because you're constantly finding things that are unexpected or unwelcome?

Darren Pulsipher: You know, surprisingly not, because our scans are pretty extensive, and anytime someone wants to put new stuff into a current product, it requires change management and things like that. So, it's a pretty locked-down process. You have to realize a lot of the software that we write is firmware stacks and, you know, lower-level stuff, so we're very cautious about what we put into our software stacks.

Eric Greenwald: So, my sense is that Intel is probably well ahead of a lot of other tech companies in the level of care that goes into that process. Is that your sense? And I don't know how much exposure you have.

Darren Pulsipher: Yeah, I have a lot of exposure to other companies as I sit out on the sales edge now, talking to our customers, and yeah, Intel really does do a lot when it comes to vulnerabilities. We've even done bug bounties on our own processors and things like that. So, we're very cautious when it comes to using open source, and we also contribute a lot to open source. Many people don't know this: Intel has 16,000 software developers, and we're probably the fifth-largest software development company in the world, which is amazing. In fact, our CEO Pat Gelsinger said recently in an interview that when he was CEO of VMware, he had fewer software developers than he does as the CEO of Intel, which is amazing when you think about it. So, we're very cognizant of the value of open source and also the vulnerabilities, and we have done, and continue to do, a lot of vulnerability scanning and continuous integration scanning of our software.

Eric Greenwald: Yeah. I have my own theory as to why Intel is different from a lot of companies in terms of the level of care: Intel is a very well-established company, it prides itself on process, and it places a ton of value in its reputation. But if you look at a lot of smaller, younger players in the tech space, they're fighting and scrapping for market share on narrow margins. So, they just don't have the resources, or the expertise, or frankly the time to put the kind of care into change management and vulnerability scans that Intel does. Is that your sense as well?

Darren Pulsipher: Absolutely, but I would have to say, if Intel started today, we would probably do things in a more automated way. We would probably do things even more securely. There are some great technologies out there today that can help lock down your development and build environments even more. I've been working with a lot of companies, and also agencies in the federal government, on how we can help them lock down their dev environments. Because there can be vulnerabilities there, and people don't understand this: when I'm building my software, even developing my software, I can very easily inject malicious libraries in the build process. So, unless I've locked down my build environments, I could be exposed. We've come up with a pretty cool reference architecture that allows us to lock those down, even to encrypt memory, and to use that in highly secure embedded software stacks, where you have to be able to say, I can guarantee everything that went in here. That's extremely important because some of these embedded systems are controlling missiles, right? They're controlling aircraft. They're controlling a heart monitor. I mean, people's lives are at stake. So, these are very critical systems that need a little extra care.
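
One small piece of locking down a build environment is refusing to build against dependencies whose contents have changed. Below is a minimal sketch under assumed names (the lockfile mapping, paths, and digest are placeholders, not Intel's reference architecture) that checks vendored artifacts against pinned SHA-256 digests before the build proceeds:

```python
import hashlib
import sys
from pathlib import Path

# Hypothetical lockfile: vendored artifact path -> expected SHA-256 digest,
# reviewed and committed alongside the source. The digest below is a placeholder.
PINNED_HASHES = {
    "vendor/libexample-1.4.2.tar.gz": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def verify_artifacts(root: str = ".") -> bool:
    """Return True only if every pinned artifact matches its recorded digest."""
    ok = True
    for rel_path, expected in PINNED_HASHES.items():
        actual = hashlib.sha256(Path(root, rel_path).read_bytes()).hexdigest()
        if actual != expected:
            print(f"REJECT {rel_path}: expected {expected}, got {actual}")
            ok = False
    return ok


if __name__ == "__main__":
    # A non-zero exit stops the build before an altered dependency is compiled in.
    sys.exit(0 if verify_artifacts() else 1)
```

A tampered or swapped dependency then fails loudly instead of silently flowing into firmware or other embedded targets.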


"Embedded systems are controlling missiles. They're controlling aircraft. They're controlling heart monitors. People's lives are at stake. These are very critical systems."  - Darren Pulsipher


Eric Greenwald: So, do you think the process that Intel is using to lock down its development systems, and to help government do the same, is something that a broad swath of companies could adopt? Is that, you know, is it something that… 

Darren Pulsipher: Oh, absolutely. All the technology is available today and easily, easily accessible. And yeah, it's even available in many of the cloud service providers; the concept of confidential computing is available in AWS, Azure, and GCP. So, we've put together some reference architectures and some examples on how to do that yourself, and we're working with a set of ecosystem partners to provide it as a service. So, you'll be able to go right into Azure or AWS and say, hey, I need a secure CI/CD pipeline, and boom, you'll get it. It's pretty, pretty cool.

Eric Greenwald: You know, what I'd love to do is to get some reference materials that we can post in our show notes for listeners who are curious to learn more about this process so that they can…

Darren Pulsipher: Absolutely, we've got some great material. And like I said, it's a recipe; it's not a turnkey service. So, you can cook it yourself any way you want to secure your own CI/CD pipeline.

Eric Greenwald: Well, what I'm interested in turning to next is how you view the emerging regulatory framework, because we're seeing at least an effort by the US government to develop standards, and this is, you know, limited in focus to government procurement of software. But I think we all know that those standards, once adopted, can have a way of filtering out into the private sector and having a broader scope and broader impact. So, I'm curious to know your perspective specifically on Executive Order 14028, and the direction it's moving, to the extent we can appreciate it from outside the government, in terms of trying to promote secure software development.

Darren Pulsipher: Well, I'm working in a standards body right now that's working on SBOMs. And it's an interesting standards body because we're not dictating the SBOM grammar and, you know, all the aspects; we're saying what you should do with an SBOM. And, to me, this is really important, right? If I know what's in my software bill of materials, I can check that against CVEs, I can check that against known vulnerabilities, and then produce a risk matrix and score things so that I know where to focus my efforts on cleaning things up. That's what we're trying to articulate right now: how to use an SBOM to help me secure my software, to meet the regulatory requirements that are out there, and to not, you know, cause a major vulnerability. And what I find fascinating, as I've delved deep into this recently, is that the vulnerabilities we think of aren't always … what's the right word? … so clear-cut. For example, I can find a vulnerability like the Log4j vulnerability, right, where, hey, now I have access to the shell, which you should not have access to. That's a software vulnerability. But there's an organizational vulnerability too, and in fact, this recently happened with some NPM packages, some Node packages, where one of the most popular packages out there (I'm not going to mention it) was maintained by one person. And the email domain that person was using came up for renewal, and someone else took it, and they did this on purpose. Luckily, the person who took it was not malicious, and he helped the maintainer get his credentials back. But the whole point is that one person can control a whole lot of software in the world, and a lot of people are dependent on that software without even really knowing who they're getting it from. And the other issue is the security of checking in and maintaining open-source software; it's still highly vulnerable. So, there's a lot of weird stuff going on that we've got to figure out.
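
To make the SBOM-to-CVE step concrete, here is a minimal sketch that matches SBOM components against a vulnerability feed and sorts the findings by severity. The feed here is a toy in-memory table for illustration (a real pipeline would query a maintained source such as the NVD or OSV), and the file path is hypothetical.

```python
import json
from pathlib import Path

# Toy vulnerability feed for illustration only. The entry mirrors the Log4j
# example mentioned above (CVE-2021-44228); a real check would pull current data.
KNOWN_VULNERABILITIES = {
    ("log4j-core", "2.14.1"): {"cve": "CVE-2021-44228", "severity": 10.0},
}


def score_sbom(sbom_path: str) -> list:
    """Match SBOM components against the feed; return findings sorted by severity."""
    sbom = json.loads(Path(sbom_path).read_text())
    findings = []
    for component in sbom.get("components", []):
        key = (component.get("name"), component.get("version"))
        vuln = KNOWN_VULNERABILITIES.get(key)
        if vuln:
            findings.append({"component": key[0], "version": key[1], **vuln})
    return sorted(findings, key=lambda f: f["severity"], reverse=True)


if __name__ == "__main__":
    for finding in score_sbom("build/sbom.cyclonedx.json"):
        print(finding)
```

Sorted output like this is the starting point for the risk matrix Darren describes, showing where to focus cleanup efforts first.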

Eric Greenwald: Yeah, I mean, it seems like every time we learn about a new vulnerability that's based upon some open-source code, we renew the conversation about the importance of putting resources into maintaining the security of open source. Because far too often, for open-source code, even popular open-source code, it's one person, one developer, who's responsible for having written it, maintaining it, and issuing patches, and, you know, no person is perfect. And when so many different systems are interdependent in the way that they are today, it's a pretty scary idea that we have these points of failure, where all you need is one mistake by one person and it can affect countless…

Darren Pulsipher: … Or it could be that the guy who was maintaining it lost his paying job. This happened last month, where a maintainer, he maintains like four or five open-source repos, said, I am sick and tired of people making money off of all my stuff, and I'm poor. He changed the code so it wouldn't work anymore and said, I'm not changing this back until I get some funding, which is sort of ransomware, but not really. Kind of ransomware. So, I was like, wow, that's pretty brave of this guy to go out there, because there are some big companies that can come down on you pretty hard. And it's kind of a weird thing, right? Because I'm a proponent of open source; I contribute quite a bit myself. But at the same time, companies need to realize that there are individuals out there writing this stuff, and they're like guys in their basement, right? I mean, is that who you're trusting with a missile guidance system?

Eric Greenwald: That's the crazy thing. You know, we were talking about SBOMs, and without the benefit of an SBOM, you just don't know: is the code that I'm relying upon for this system written by an Intel, with thousands of software developers, high standards for secure software development, and the ability to maintain it? Or is it written by one guy who might be having a bad day? And, you know, just how fragile is this? It's really interesting to see that, with the complexity of the software supply chain, comes not only the difficulty of understanding what's in all these components, but also the need for awareness of their provenance and the fact that some of them are just not developed in the way that you might think and hope that they are.

Darren Pulsipher: It's important for people that are producing products and services to at least understand that. To me, you can spend a lot of time going down the SBOM rabbit hole, but you have to at least have some kind of visibility so that you can run a risk analysis on what's going on.

Eric Greenwald: Well, I tell you, Darren, it is really good to hear that Intel has been so thorough, over such a long period of time, in generating its own visibility into the supply chain questions that it faces. And I sincerely hope that Intel is going to be a model that people will follow in other areas of the technology sector, because we absolutely need that greater transparency, greater visibility into what's in our software, and a better understanding of how much risk we face day to day. So, thank you very much for coming on the podcast.

Darren Pulsipher: Hey, thanks a lot, Eric. It's been enjoyable talking to you.

Eric Greenwald: Excellent. All right. Thanks very much. Take care.