Bob Gourley and Matt Devost, co-founders of tech research and advisory firm OODA, discuss the rise of machine learning and A.I., and how their national security...
Both A.I. and machine learning are technologies poised to change our economy, as well as the world at large. However, without important innovations in security in both the public and private sectors, these new advancements could become a nightmare for governments around the world. To understand the potential of these new systems, and the steps that need to be taken to ensure national security, we spoke with Bob Gourley and Matt Devost, co-founders of OODA LLC.
ABERMAN: What the heck is OODA?
DEVOST: OODA really is a tribute to a concept that was first put forward by a guy named Colonel John Boyd. He was a military fighter pilot who had a reputation for being unbeatable in dogfights, and that attracted the attention of Pentagon and DoD officials. They brought him to Washington, D.C., and had him figure out what it was that actually allowed him to outperform all of his peers in that space. And he came up with this concept of OODA: observe, orient, decide, act, as the decision cycle that was guiding his ability to engage in these tactical fights.
And it turns out that OODA is just an incredible framework for thinking about all sorts of issues: national security issues, technology issues, technology security issues. It's been applied as a framework over the past 20 to 30 years in all sorts of domains. So, we thought it was a great tribute to what we were trying to accomplish with the company to name it after that concept.
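For readers who want a concrete feel for the cycle Devost describes, here is a minimal, purely illustrative Python sketch of an observe-orient-decide-act loop. The function names, the toy "environment" dictionary, and the iteration count are all hypothetical stand-ins chosen for illustration; they are not from Boyd's writings or from OODA LLC.

```python
# A minimal, illustrative sketch of Boyd's observe-orient-decide-act cycle.
# All function names and data structures here are hypothetical stand-ins.

def observe(environment):
    """Collect raw information from the environment."""
    return environment.get("signals", [])

def orient(observations, prior_context):
    """Interpret observations in light of prior context and experience."""
    return {"context": prior_context, "assessment": observations}

def decide(orientation):
    """Choose an action based on the current assessment."""
    return "adjust" if orientation["assessment"] else "hold"

def act(decision, environment):
    """Carry out the decision, which in turn changes the environment."""
    environment["last_action"] = decision
    return environment

def ooda_loop(environment, iterations=3):
    """Run the cycle repeatedly; each pass feeds the next orientation."""
    context = {}
    for _ in range(iterations):
        observations = observe(environment)
        orientation = orient(observations, context)
        decision = decide(orientation)
        environment = act(decision, environment)
        context = orientation  # new information updates the next pass
    return environment

if __name__ == "__main__":
    print(ooda_loop({"signals": ["competitor moved"]}))
```

The point of the loop structure is the one Gourley makes next: the cycle never stops, and whoever re-orients on new information fastest stays ahead.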
ABERMAN: Bob Gourley, how does this apply in the private sector? I often find when I deal with CEOs and companies, particularly in cyber security, it’s more ready, fire, aim.
GOURLEY: Well, frequently, you know, you might need to explain it at first, but as soon as you do, generally a CEO really gets it. And they say, I've been doing the OODA loop my entire life, because a good, action-oriented CEO realizes that even the best strategy is only good temporarily.
Your strategy, if you don’t change it and update it based upon new information, will cause you to fall behind in the marketplace. So, the great CEOs know it’s a dynamic world out there, where you have to take in information and change based upon that dynamic, especially if you’re in a competitive environment.
ABERMAN: So, as you go around your business, and both of you have been involved in cyber technology in different ways, why did you decide to start this company, and to launch a venture fund at the same time? I mean, what's the market, what's the opportunity, why did you decide to take the entrepreneurial risk and do this?
GOURLEY: So, first of all, let me say: Matt and I have worked together for decades now. I first heard of Matt when he published a paper in 1995 called 'In the Age of Digital Terrorism, Can You Trust Your Toaster?' So, how prescient was that! He won the Sun Tzu contest at the National Defense University. I won the next year, by the way. So, we met each other, and then we started working with each other when I was in the DoD, helping raise the defenses of DoD networks.
He was running red teams that were cutting through our defenses like a hot knife through butter, and we have worked together ever since in some capacity or another. So, in many ways, we've worked together well, and we know that we have this dynamic. We know the threat is only growing, and the need for cyber security services is growing. The technology landscape is changing. We've seen the rise of cloud computing and artificial intelligence, and there's just a dire need for more work in this area.
ABERMAN: It's fascinating to me, knowing what you all have been up to. You're so emblematic of what I see as one of the biggest strengths and opportunities in our region: coming out of a national security background, and now dealing with something that frankly is a huge private sector problem. Is that why you're here, and is that where you're going? Is this basically an enterprise play, a business play, or a national security play?
DEVOST: I think it's a combination of all three. Initially, we're focusing, obviously, on enterprises, to help them address this threat. And I think our national security background gives us a great framework for understanding the cyber threat environment, but also for being able to look towards the future. One of the things that we're trying to do is get folks thinking about future issues, future risks, future threats, and developing approaches now that will help them address those. And the national security DNA of the company definitely contributes to that perspective.
ABERMAN: Now, I've been in the venture industry for many years myself. I'm interested in how you're going to manage the inherent conflict between being a consultant, being at arm's length, looking at things objectively, and yet, by the same token, as an investor, you know, wanting to invest in things before anybody else figures out that they're worth investing in. So, have you given some thought to how you're going to manage that? When you find a really cool technology, what are you going to do with it?
DEVOST: These problems are greater than any one company is going to address. So, we're going to play our role, which is going to be on the advisory side, and really trying to engage new markets. You know, we'll talk in a little bit greater detail about some of the stuff we want to do around A.I. security, but really the approach with OODA Ventures is, we encounter all of these great entrepreneurs, including folks that come out of that national security space, who have great DNA in the technology space and understand the security threat, but maybe don't know a lot about entrepreneurship.
Or we know, from the hundreds of CSOs and CTOs and CIOs that we interact with, what the enterprise pain points are, and what is not being addressed in the enterprise. And the idea with OODA Ventures is that we'll select those technologies and opportunities that we see will fit the gaps that exist in the market, and where the founders don't have the entrepreneurial experience, we can bring some of that expertise to them from an advisory perspective as well.
ABERMAN: My sense is that that is really the missing ingredient in our region right now. We have a lot of really really smart people who are dealing with highly complex problems around A.I., cybersecurity, and in particular big data. But we don’t have a lot of product people, we don’t have a lot of people with commercial experience. So that’s your hypothesis, that you can bridge that.
DEVOST: We do believe we'll be able to bridge that, that we'll find some of those entrepreneurs-in-waiting who can be encouraged to bring their expertise, technologies, and approaches to the market.
ABERMAN: So Bob, this is actually something we touched base on a couple of weeks ago: when the data breach got announced at Marriott. You know, I was really struck that a lot of our colleagues, not people necessarily in national security, but just people here in town that we know, they shrugged. You know, it’s almost like they’ve got cyber hacking malaise. Is corporate America really now involved in an arms race with hackers that they literally can’t win?
GOURLEY: You have hit on the fact that we've become numb to all these attack notifications, breach notifications. There are just millions of records stolen all the time, and so, how do you sort it out and realize what is important or not? You look at Marriott and you think, oh, great, did they steal my credit card again? But when you dive deep into that one, it is more significant and serious.
They have stolen personally identifiable information on millions of people, including passports. And who is the 'they'? All indications are it was China. So, it was groups associated with a serious national security competitor, which is now able to mix that information with other information they've been acquiring: our health and medical records, our OPM records. So, this does become a serious national security issue.
ABERMAN: You know, when you think about it, or talk about it that way, it really does bring up, to me, cybersecurity in particular. It's a national security challenge like nothing we've ever faced in this country, because previously, military force and national security were about our armed services protecting the edge. The private sector made stuff so people could fight wars. But now, the private sector is the target. You just described Marriott, and Sony with the North Koreans, and on and on. Does our national security establishment, and our private sector, do they really interact the way they should?
GOURLEY: No, no. There needs to be far more leadership, not from the government, but from industry. I'm just convinced of that. And another key one that we need to talk about is the Tribune attacks of two weeks ago. You know, newspapers across the country were either delayed or could not publish because of a sophisticated ransomware called Ryuk, which seems to be associated with North Korea. So, yet another example of commercial entities being attacked with a national security impact.
ABERMAN: So, this is something that bears a lot of close consideration, and we had better teach people not to use 'password' as their password, right? I'm shocked when I see how many people still do that. Matt, I'm going to turn to you. You mentioned A.I. I know this is going to be an area of interest for OODA, both as a consultant and as an investor. You know, one thing that's been bothering me recently is how often I'm hearing that one of the big issues with A.I. is not job displacement, it's actually the inherent problem of human bias. What does that actually mean? I hear people talk about it; what are they getting at?
DEVOST: Sure, I think that is just one of the key issues that we're going to have to solve as it relates to A.I. When A.I. programs are written, or algorithms are written, they kind of inherit the bias of the authors. So, we've seen this over and over again in the market, where machine learning or A.I. is introduced, and as it continues to build upon the learning in which it was programmed, those biases get amplified. The analogy I like to use is money in a bank account: there's compounding interest, so a little mistake may become a very big mistake compounded over time, because it's just reinforcing that bad behavior.
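To put rough numbers on that compounding analogy, here is a small, purely hypothetical Python illustration: a tiny initial skew that gets reinforced a little on every feedback cycle grows the same way compound interest does. The rates and iteration counts are invented for illustration only.

```python
# Hypothetical illustration of the compounding-interest analogy:
# a small initial bias, reinforced slightly on every pass through a
# feedback loop, compounds into a large skew over time.

def compounded_bias(initial_bias, reinforcement_rate, iterations):
    """Return the bias level after each iteration of self-reinforcement."""
    bias = initial_bias
    history = []
    for _ in range(iterations):
        bias *= (1 + reinforcement_rate)  # same form as compound interest
        history.append(bias)
    return history

if __name__ == "__main__":
    # A 1% skew reinforced by 5% per training/feedback cycle.
    trajectory = compounded_bias(initial_bias=0.01,
                                 reinforcement_rate=0.05,
                                 iterations=50)
    print(f"start: 1.0%  after 50 cycles: {trajectory[-1]:.1%}")
```

Under these made-up numbers, the "little mistake" grows by more than an order of magnitude over 50 cycles, which is the dynamic Devost is warning about.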
That's just one of the issues I think we're going to have to address: the algorithmic bias from the authors of these A.I. systems. We need to make sure that we are ensuring the integrity of the training data that is being used. We need to make sure that these A.I. and machine learning systems can't be influenced by external actors, because we increasingly see machine learning being used to process signals from external sources: social media, Facebook, et cetera. What if a Russian botnet decides that it wants to influence a stock price instead of an election? How do we make sure that we're not building algorithmic trading and machine learning platforms that are susceptible to that type of bias and manipulation?
And then lastly, I think a key security issue with A.I. is: how do we ensure the integrity of the A.I. platforms themselves? We think of A.I. as an essential technology, the way we thought of electricity as an essential technology, and then the Internet as an essential technology, because they both permeated all aspects of society and commerce and national security. A.I. and machine learning are headed in that direction as well. So, should we not sit down and deliberately think through how we secure these platforms in advance, rather than just rushing straight into the technology and repeating those mistakes? You know, are there lessons learned from the past implementation of some of these technologies that we could apply?
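One common, generic building block for the training-data integrity point raised above is to checksum datasets and refuse to train when the data on disk no longer matches what was approved. The sketch below only illustrates that general idea; it is not a description of OODA's approach, and the file paths and manifest format are placeholders.

```python
# A generic sketch of one training-data integrity control:
# record a cryptographic hash of each dataset file, then refuse to train
# if the data on disk no longer matches the recorded hash.
# File paths and the manifest format are placeholders for illustration.

import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, streaming to handle large data."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(data_dir: Path, manifest_path: Path) -> None:
    """Snapshot the current hashes of every file in the dataset directory."""
    manifest = {p.name: sha256_of(p)
                for p in sorted(data_dir.glob("*")) if p.is_file()}
    manifest_path.write_text(json.dumps(manifest, indent=2))

def verify_manifest(data_dir: Path, manifest_path: Path) -> bool:
    """Return True only if every file still matches its recorded hash."""
    manifest = json.loads(manifest_path.read_text())
    return all(
        (data_dir / name).is_file() and sha256_of(data_dir / name) == recorded
        for name, recorded in manifest.items()
    )

if __name__ == "__main__":
    data_dir = Path("training_data")                 # placeholder path
    manifest = Path("training_data.manifest.json")   # placeholder path
    if not manifest.exists():
        write_manifest(data_dir, manifest)  # first run: record the approved state
    elif not verify_manifest(data_dir, manifest):
        raise SystemExit("Training data changed since it was approved; aborting run.")
    else:
        print("Training data verified; safe to start the training run.")
```

A checksum gate like this obviously does not catch bias that was present in the approved data; it only flags tampering or drift after the fact, which is one small piece of the broader platform-integrity question Devost raises.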
ABERMAN: You know, when I talk about artificial intelligence with most people who are outside of technology, living their daily lives, what they're most fixated on is: are computers conscious? But by the time we get to the point where software is sentient, if it ever is, who cares? If we don't deal with these issues, what's going to happen to our society?
DEVOST: And the impact is going to be much more pronounced over the near term. We may reach a point where these systems have some sort of sentient ability. I'm waiting for the day when I can upload my brain to a computer, or a toaster. I threaten my children with that all the time, that I will be following them through their lives. The key, though, is that there are much more narrow implementations of these technologies being brought into every single Fortune 1000 company today, and that is the curve that we're trying to get ahead of.
We’re not saying, hey, let’s secure sentient A.I. We’re saying, let’s build security into these machine learning and narrow A.I. approaches that are being used now, because we’re starting to eliminate the humans from the process. And as we implement machine learning, we might reach a point where we don’t really even understand how the technology is working, so let’s start with a very strong security foundation at the beginning. Let’s build in the best practices that we know, let’s secure these platforms. Use that as the baseline. That is one of the big things that we’re going to be pushing for within OODA.
ABERMAN: It's feeling more and more like giving a machete to a two-year-old. You could do it, but it's a bad idea. So, A.I. is out there, and it's going to be driving our economy. Last thing that I want to touch on with you guys: it seems like we're at a crossroads right now in tech, and a lot of people are really questioning the whole business model of giving out personal data in exchange for benefits. But by the same token, as we talked about earlier, they're getting desensitized to these hacking issues. How do you think it's going to play out? What's going to happen to businesses that depend upon access to personal data in order to function?
GOURLEY: I think we're looking at changes in the regulatory environment. I think something is going to happen. Now, in our community, we all know that compliance does not equal security. Some new rule comes out, you comply with the rule, but hackers don't have to comply with the rule; they get in anyway. But we do see changes to the compliance and regulatory environment coming, and that may raise some awareness, but it's not sufficient. We know that. So we have tried to figure out: how can we steer our own thoughts, and how can we steer our writings on our web properties like CTOvision and OODAloop.com?
And we've assembled a team of experts to help us think through that very issue and many others. These are guys like Scott McNealy, the iconic businessman, co-founder and chairman of Sun Microsystems; Jeff Jonas, perhaps one of the greatest data scientists out there; and Jeff Moss, the creator of the DEF CON and Black Hat hacking conferences. They, and many others, are going to help steer us as we try to dive deeper into questions like that.
ABERMAN: Well, I really appreciate you coming in the studio today, and also I wish you the best of luck with OODA. I think it’s an enterprise that will help our region grow, and for that reason, I thank you. Bob Gourley, Matt Devost, thank you for joining us today.
DEVOST: Thank you.
GOURLEY: Thanks.