Episode 9 - Tony Sager - Growing Up In (and with) Cybersecurity

Tony Sager has helped make the internet safer for you and me, and he did it while humbly working for the National Security Agency. Tony and Ron discuss his career as a cryptologist, NSA's early work securing computers in the 1990s, the "System and Network Attack Center," and what it takes to get vendors like Microsoft, organizations like the Air Force, and security researchers to agree on what makes a computer secure. Tony continues this work today as Chief Evangelist for the Center for Internet Security. Our title comes from the byline of Tony's blog, "Sage(r) Cyber," where he publishes articles about cybersecurity and updated versions of pieces he wrote during his nearly 35-year NSA career.

Transcript:

Ron Gula: [00:00:00] Hi there. It's Ron Gula with the Gula Tech Cyber Fiction Show. Today our guest in studio is Tony Sager. Tony, how's it going?
Tony Sager: [00:00:10] Great. Thanks, Ron. Thanks for having me here.
Ron Gula: [00:00:11] Thanks so much for being here. So, it's such an honor. I like to tell people I got my start at the National Security Agency.
Tony Sager: [00:00:18] Mm-hmm [affirmative].
Ron Gula: [00:00:19] Didn't work directly for you but I was like a room or two away or something like that.
Tony Sager: [00:00:23] Sure.
Ron Gula: [00:00:23] So thanks, thanks so much for coming on the show.
Tony Sager: [00:00:25] Oh, absolutely. My pleasure.
Ron Gula: [00:00:26] Awesome. So can you tell folks how you got into cybersecurity? How you got to work as a cryptologist and all that, all that kind of good stuff?
Tony Sager: [00:00:34] Yeah. Happy to share that. It's, uh, you know, I'm, I'm at the stage now where I've thought a lot about how I got to where I am. And it was completely by accident. So I'd love to tell you I had a grand plan and could see the cyber wave coming and, you know, positioned myself beautifully. But, of course, none of that's true.
Ron Gula: [00:00:49] [laughs]
Tony Sager: [00:00:49] So I was, um, undergraduate math. And completely by accident ... So I was trying to decide. I went to a small, liberal arts college. Right? The kind of a place that no one goes to recruit. Uh, maybe 12 graduates in math in any given year. And I'm in the math faculty area, the, the offices there, the fall of my senior year, going to talk to one of the professors. And the guy happens to be there. Turns out he's an alum. And, um, we got to talking about, "Hey, what are you gonna do next?"
Uh, I'm not quite sure. I'd like to get into the Department of the Army. You know, be a mathematician modeling guns and rockets and that kind of stuff. And I'd done a little bit of that.
And he said, "Did you ever think about, uh, NSA?"
NSA? W- w- is that where they make the rockets? Or what is that?
He goes, "No- N- NSA." And, uh, he wound up giving me a phone number and then he followed up with a brochure, which I still have to this day. And, uh, it was about math- mathematicians at NSA. And, uh, it turned out this guy, whose name is lost to me, I tried to track him down some years later. Uh, he was the perfect, like, NSA employee of his day. Uh, he had an advanced degree in mathematics and was a native Russian speaker and a US citizen. And it was like, hey, State Department, CIA, NSA. You know, everyone- that was just perfect for the times, right? The mid-70s.
So I wound up taking the math test, uh, hardest math test I ever took in my life. Uh, sitting in a- by myself in a room in College Park. And did well enough, uh, not into the elite math programs but into what was called Communication Security. So, you know, and the- a recruiter calls me up middle of September 1977 and says, "Son," 'cause that's how they-
Ron Gula: [00:02:19] [laughs]
Tony Sager: [00:02:19] ... spoke back then, "Son, if you can report in two weeks, you've got a job."
I said, "Great. What is this job?"
And he said, "Well, I can tell you it's in the Communications Security intern program."
I said, "Great. What is that?"
He says, "Can't really tell you." But honest to goodness, he said, "Go to the Encyclopedia Britannica, look up an article on cryptography," and he had to spell it for me, "and you'll get an idea of what it is." And he said, "But I need an answer pretty quick 'cause this job closes out at the end of September." Which now I know was the end of the fiscal year. And he said, "What do you think?"
I said, "Well, I don't have another job. I'll see ya- I'll see ya at the end of September."
But that article was written by a guy named David Kahn. I don't know if you know that name, but he wrote a book called The Codebreakers. Right before- sort of, you know, popular, um, uh, I won't say exposé but, you know, revealing a lot of the things that happened, uh, you know, in- at the NSA and the, the, uh, kind of work that goes on there.
So I wound up at- in that intern program three years. Uh, an unbelievable opportunity, right, to sample a lot of great things that were going on. And spent some time at the Treasury Department and so forth. Um, math wonks, you know, doing analysis for US Systems. And that's where I wound up after graduation. So I was with the group, uh, you might call them today Black Hats or Penetration Testers or whatever. We called them-
Ron Gula: [00:03:32] Oh, we're gonna- we're gonna get into that.
Tony Sager: [00:03:33] Yeah. Well, we called it Security Evaluation. So the mathematicians who looked at the security of US algorithms and crypto- and remember, those days were- almost everything was about confidentiality, right? And custom cryptography and, you know, had to be approved by NSA and so forth. And I was just hooked. I mean, just loved that kind of analysis and being around such smart people.
But the real, you know, uh, path for today, that led me to today, was the PC revolution, the personal computer revolution [laughing].
Ron Gula: [00:04:01] [laughs]
Tony Sager: [00:04:01] And so this, this idea of kind of commodity, you know, consumer-priced computing that you'd have on your desk. And a friend of mine, um, uh, Bob, was starting a little group to look at, uh, the, um, emergence of this kind of technology, microprocessors and personal computers, and the impact it might have on security. And he said, "Do you wanna come join me? I'm s- starting a little group."
And the real hook for me was I would get an Apple II [laughing], Apple II Plus, actually-
Ron Gula: [00:04:27] [laughs]
Tony Sager: [00:04:27] ... to have on my desk.
Ron Gula: [00:04:28] It's hot stuff in the 80s, right?
Tony Sager: [00:04:29] Oh, well, you know, uh, I go back to when computing was like giant room, right? Somebody guards the door. You hand over a deck of cards. You- if you're really nice to the guy you can get two runs a day. [laughs] You know? I always mistyped a JCL card, if you remember what those are-
Ron Gula: [00:04:42] [laughs]
Tony Sager: [00:04:42] So it was like the most tedious stuff. I don't know why would anybody do this? But this idea of computing being accessible. You could understand it, right? It was something that consumers could deal with. And so, so I made that career move. And, and a number of people told me that was- you know, I'm gonna ruin a- my math career at NSA. But that let me ride the wave. So it became, you know, we started to do things in software that were unthinkable. I mean, absolutely the mantra of the day software can't be trusted to do high security things, right? 'Cause it could have a bug. It could change. Right? You know, it's, it's the ch- you know, it's just if you really cared about something, you built it in custom hardware 'cause you could test it, you could design it, you could over design it, you could duplicate it, right? You could, uh, essentially model every possible failure mode and see did that affect the outcome?
So, but, you know, one of my lessons through all this is that economics always wins, right? You know, you got something that's commodity priced, reprogrammable, flexible. And so that stuff just inevitably started to sneak into every high security thing. So, so that, that notion just sort of led me down this path of, you know, the emergence of computing as both a technology issue as a- well as a social issue.
Ron Gula: [00:05:53] So could you talk for a little bit- I think a lot of people think of the NSA and they think of spying on the Russians or the adversaries or terrorists, right?
Tony Sager: [00:06:02] Sure.
Ron Gula: [00:06:02] But, but your career is more defensive. So wh- what's the role of the NSA on the defensive side?
Tony Sager: [00:06:07] Yeah. My, my career, uh, will probably never be repeated for a- for a variety of reasons [laughing].
Ron Gula: [00:06:11] [laughs]
Tony Sager: [00:06:11] But, you know, think of NSA, big machine and, you know, what gets in the press, right? You know, the, uh, uh, all those issues. Uh, but it's sort of a 90% signals intelligence. You know, an incredibly important, incredibly tough, uh, problem and important for the nation. 10% defense. So by accident, I wound up starting in defense and I never left, which is really, really unusual there. And, you know, it wasn't because I was a rebel or, you know, whatever. It just kinda worked out that way. You know, defense really appeals to me.
And so this notion of the role of NSA in defense ... So back then it was very focused, you know, on cryptography, on, uh, government systems. You know, and kind of things that we built. And, uh, and usually it's part of a life cycle, right? So the, the role that folks like me had was to be a part of the development life cycle. And, you know, you remember your military days, right? I mean, it would take 10 years plus to build a new radio for the US Army, right?
And it didn't feel like it at the time, but by today's standards, we had lots of money, lots of time, lots of control of the environment. Right? So you could say, only cleared contractors, only certain companies, parts can only be sourced a certain way. We get to reinspect the software. We get to put all kinds of requirements on it. The- you're the contractor, you gotta write all this documentation to prove you did the right thing. Et cetera, et cetera. You know, so, really a complicated government run life cycle.
And then the opportunity for folks like me to be there and say, hey, before the bad guys really- before they put this in the field, let's pay people to pretend to be the bad guy and see what they find, right? And how do we- can we fix it, you know? And so we, we, we could afford to do that. Again, luxury of time and control and resources.
But a lot of- it's a, you know, there's a sort of historical reason. So why you put essentially offense together with defense? Well that, that, that really originates I think from the, the notions of cryptography, right? If you really wanna design cryptography, you need to understand how to attack cryptography. And, you know, long before my time, uh, one way to do that is to sort of put folks together. You know, to make sure that you understand the state of the art of the attacks so you can defend against them and vice versa. Right? You wanna understand defenses so that you know how to attack them. And so this idea of this sort of closed society around offense and defense.
But the, the, uh, the sort of logistics of it were 90%, 10%. Right? 90% of the money and the people and the time, and in particular, executive attention. Right? So the director of NSA was always a career intelligence officer. And so, um, you know ... And even in terms of, um, authority and funding, as you know, big, big things in government, that- those were from different buckets, right? So different sort of- uh, the Pentagon sort of overseeing the defensive mission. The intelligence budget overseeing the, the second mission.
But it's, you know, as times changed towards the, the middle and end of my career, it became really clear that people needed to understand both. And there was a lot of motion of people back and forth and cross-training and, uh, joint development programs and things like that. So, uh, it just worked out for me, you know, in a particular way to stay focused on defense. But I was, you know, eventually got to sort of run the vulnerability finding for defense, right? So that naturally brought me together with my counterparts in SIGINT who were running vulnerability finding, except against other nations.
Ron Gula: [00:09:23] And, and targets and whatnot.
Tony Sager: [00:09:24] [crosstalk 00:09:24]
Ron Gula: [00:09:24] So, with the cryptography example where you have the offense and the defensive people put together, so now you're working on the new computers, right? You got your Z80 with your-
Tony Sager: [00:09:33] [laughs]
Ron Gula: [00:09:34] ... what seven inch floppy disks, right?
Tony Sager: [00:09:35] Oh my gosh.
Ron Gula: [00:09:35] You know, all, all that kind of good stuff. But then people have to start figuring out what a trusted system is. So were you involved with the creation of like the Orange Book series? All those i- the rainbow books, all that different kind of stuff?
Tony Sager: [00:09:47] Yeah, no, not at all. [laughs]
Ron Gula: [00:09:48] Not at all. But you were there when it happened, right?
Tony Sager: [00:09:50] I was. Yeah. So kind of an observer more than-
Ron Gula: [00:09:52] Mm-hmm [affirmative].
Tony Sager: [00:09:52] ... so the start up of what was called the National Computer Security Center, right? So this i- this recognition that, um, this computer security is a big deal. And it's about sort of commodity IT, right? And how do we improve the security of it? So the- again, before my time, there was a, a notion of standing th- this up and the, the, the kind of acknowledge- I mean, there were a lot of great minds who've worked computer security for a long, long period of time. But the notion of standing up a, a national center that brought together government and industry to develop kind of the basic modeling or, you know, what, what, what does it mean to have security, right? How do I design? What are the architectures that make sense and so forth?
So, I was- I stayed in the analytic group at the time that was focused on the communications security problem. And- but there were a lot of people back and forth, you know, during that time, working the computer security things, uh, moving back into com-sec and sort of back and forth again. And, and com-sec wasn't static, right? So it, it was evolving along a, a path that got broader-
Ron Gula: [00:10:51] And for, for folks here from cyber, com-sec is communications security, protecting things like a cell phone or radio transmission, even a- even a phone call, from interception, right?
Tony Sager: [00:11:00] Right. And so it was- it really started from those sort of built up, right? And what you saw was the evolution of both technology and then the security problems that come with it.
But the, the focus of the NCSC, the National Computer Security Center, you know, produced the classic works of the day, right? And you referenced, uh, you know, the, the rainbow series, right-
Ron Gula: [00:11:18] Mm-hmm [affirmative].
Tony Sager: [00:11:19] ... these colored books each, each of which dealt with a different topic. And some of the best minds in industry and government, right, created those. And so a lot of great work. Uh, at some point there was a, you know, a reason to essentially, um, re- re-organize the way that was done. And so the elements of that, uh, wound up being sort of absorbed into the mainstream of NSA. And, uh, so that led to some other restructuring things. And so I became sort of part of the receiving end of that, right? Where all those- where did those folks wind up? Well, they wound up in, in organizations that I was a part of.
Ron Gula: [00:11:49] And, and a lot of that reorganization was because you had some experience with these assessments, right? These penetration tests.
Tony Sager: [00:11:56] Right.
Ron Gula: [00:11:56] So how did- how did you feel that you were sending guys like me out in the field and I think it ha- ha- had been going on a little bit longer before I got there-
Tony Sager: [00:12:02] Right.
Ron Gula: [00:12:02] ... in the 90s. But you're sending people out in the field. And, you know, they're doing their work. But you started to see some patterns, you know, from-
Tony Sager: [00:12:10] Right.
Ron Gula: [00:12:10] ... from our customers out there.
Tony Sager: [00:12:11] Yeah. That was, uh, you know, again, I grew up in a world where the, the notion was, if you got the mathematics right, then everything would be okay. And, um, of course that's not true. You know, that is, you do have to have great mathematics, right? You, you can't afford sort of foundational flaws in cryptography, for example in randomization or initialization. But, you know, it became like, well, that, that, that real life stuff, you know, that's for other people. But more and more, so folks like me who drifted sort of out of the pure math area into computer science, that became, hey, you know, the software could actually undo the, the good intentions, right, of the mathematics. Or the way we built the hardware could lead to all kinds of bad side effects. So we needed to start looking at that too.
So that started to branch out. And then it sort of, well, you know, after we build it, it gets fielded. Oh, and people operate it. And they make errors. They don't do things the way we thought they would do things, right? There are procedural things that will happen. So this notion of getting out to, to see these sort of things. And it, it started in an ad hoc way, you know, that we would find people who could sort of work the operational testing stuff. And we, okay, let's get the team together and go, go look at this.
Uh, but as it grew, it became really clear that this is vital, right? So you started to see the things you talked about. How do I codify this? How do I find people, train people, develop them, build organizations? A lot of this, I would say, it's not a direct line, but if you remember the term Eligible Receiver 97, right? So the big wake-up call in the Defense Department that said, wait a minute, you know, this is a big deal and it affects everybody, right? So it was a major exercise where cyber components, you know, happened to be homed at NSA, played a major role and could disrupt, you know, the, the absolute fundamentals of a major, you know, exercise.
And there's a couple ways to deal with the results of that from an executive level, right? You could say, oh, you know, that's, that's not fair. That's not gonna happen. We'll just push that aside. Or you could say, uh oh, if this happened during this exercise, it's gonna happen in real life and it could happen to others. And it really became kind of a rallying cry, I think, across the department. Which led to, within NSA, the, uh, codification right? The, the formalization of things like red teams and blue teams and, you know, the sort of testing went from, you know, sort of, uh, bits and pieces in different organizations to something that was really a- had a name, had a place, had a, you know, budget and a, and, and an authority. And, uh, so a lot of things kind of developed from there.
So I, I, I came into that-
Ron Gula: [00:14:34] If I may-
Tony Sager: [00:14:35] Please.
Ron Gula: [00:14:35] The term red team.
Tony Sager: [00:14:37] Yeah.
Ron Gula: [00:14:37] Who came up with that?
Tony Sager: [00:14:38] You know, I don't know. Um, that's a good question. There are a lot of these terms that are a little bit lost to history. Black hat. The penetration testing. Some of them ... I know a little bit of the history of, uh, uh, this kind of analysis for, um, um, nuclear weapons.
Ron Gula: [00:14:50] Mm-hmm [affirmative].
Tony Sager: [00:14:50] So the- Black hat was the preferred term there, right? You're playing the black hat role. And it later kind of got grabbed by the industry and events and so forth. So red teams, that's a good question. It was always a DOD term-
Ron Gula: [00:15:00] Yeah. I always thought they- the red team was the Russians. You know?
Tony Sager: [00:15:02] Yes. That's right.
Ron Gula: [00:15:02] So blue team, red team. Blue's US. Red team's the bad- bad people, right?
Tony Sager: [00:15:05] I'd, I'd assume that that-
Ron Gula: [00:15:06] [crosstalk 00:15:07] And I think it just pulled in from there.
Tony Sager: [00:15:07] Right.
Ron Gula: [00:15:08] Yeah.
Tony Sager: [00:15:08] But it's so, uh, it's so embedded. It's like saying Kleenex or something, right? You sort of-
Ron Gula: [00:15:12] Yeah.
Tony Sager: [00:15:12] ... forget where it came from. So, but this idea ... So I, I became involved, uh, at a, I'll say front line supervisory level, you know, in the, in the red teams and blue teams, uh, either providing people or conducting such things. And, uh, you know, over time what, what really came to, to trouble me or to, to really, uh, uh, cause me to think about the model, was this- seeing the same results over and over again. And it wasn't until I moved up into management that I really appreciated that.
So I usually talk about my career in three phases, uh, they're not linear but conceptual. But the first third is learning the craft of finding vulnerabilities, right? Mathematics, computer science, implementation software. The middle third was about moving into management. And so running organizations that did that. And then the last third was about trying to figure out why, why do we keep seeing the same problems over and over again? What's going wrong here? Why, why can't we learn from this?
And if you remember those days, and some of this is before your time even, Ron, you know, there was nothing more carefully controlled than the report from an NSA red team. Right? The only people that got to see it were the participants, right, the team. And then, you know, outbriefed to the executives, right, to the general officer, to the commanders, to the IT people [inaudible 00:16:26]. And it turned out, as I looked at this, and I moved up to management, I realized no one's looking at all of these things and figuring out what am I seeing in common? Is the root cause bad DOD policy? Bad purchasing? Bad technology? And why are we seeing the same thing in every setting? And why, why are we only telling the owner of that system, why isn't the department seeing this at a higher level to say, you know, I have a systemic problem? Everybody is failing on this part of it.
So this notion of, you know ... A- and I, I actually came up with a conceptual model that I used with my workforce for years. And, and I happened to have a kid. You know, uh, seventh or eighth grade, you know, learning basic science, right? And scientific method and sampling and things like that. And I said, you know, really what we do when we do a penetration test, a red team, a blue team, yeah, we wanna help that customer when we find something. Give them- But what we're really doing is we're really sampling the environment, right? We're taking one sample. And the purpose of sampling is so I can draw conclusions about the population from which I'm sampling, right? The DOD. DOD's networks.
And so unless we combine the samples, right, we take them in a uniform way. We combine them. We have separate people who think about what am I learning from these samples, we never make any progress. I mean, that took me some time to sort of figure all that through. But I actually, literally, pulled out a handout, the one my kids had from seventh grade, you know, about science and sampling, and, you know, drawing conclusions. I said, you know, we've, we've forgotten some of the lessons that we knew as kids. And this idea of ...
And I get it, you know, in the early days, right, it was oh my gosh, you- your testing revealed a vulnerability in the live DOD system? We couldn't possibly tell anybody else that! Right? And it became this, this, uh, really closed, you know, knowledge. And there was a, a, an incident, I can't tell you the details, but it was ... So at then- at one point I was running the blue team. Or they were one of my groups. And the red team had found the problem that was a big deal. Got briefed at very top levels of the DOD. And they kind of came back in through channels and the blue team was, uh, tasked to go fix it. I had to go out there and find and fix.
And that- and this is no exaggeration, I went to the Navy captain that was the chief of the red team, right, and said, "Hey, uh, you know, my team is tasked to go fix this problem. I'm- I don't know what you guys are briefing but hey, you know, my guys have gotta go fix this. Can we get an outbrief?"
And he said, and he wasn't a nasty person, he was a good person-
Ron Gula: [00:18:44] [laughs]
Tony Sager: [00:18:44] ... "No can do." I'll never forget. "No can do."
I said, "Really? Why is that?"
"Well, the senior at the Pentagon, who's in charge of this thing said, 'Close hold.'"
I said, "Can I get invited to the meeting? I can't believe the guy's an idiot, right? He wants this problem fixed."
Well, I actually found that person later [laughing], who happened to eventually wind up on the NSA Advisory Board. And I told him this story. And he roared laughing at-
Ron Gula: [00:19:05] [laughs]
Tony Sager: [00:19:05] ... the idea. But it wasn't that person- the Navy captain was not evil, right? He was under orders. And this notion of this, oh my gosh, vulnerability in a live DOD system. Well what I always said, this, this over-secrecy actually does not cripple the bad guy, it cripples the good guy, right? 'Cause we don't see these problems in common. We don't identify root cause. We don't figure out, you know, how do I fix many things at once? Not by chasing whack-a-mole but by changing DOD policy or purchasing or training or whatever the, the root cause happens to-
Ron Gula: [00:19:38] So I, I had two comments there. So one-
Tony Sager: [00:19:40] Mm-hmm [affirmative].
Ron Gula: [00:19:40] ... so, you know, I leave NSA. I eventually start Tenable Network Security. And, you know, we're selling vulnerability scanners.
Tony Sager: [00:19:46] Sure.
Ron Gula: [00:19:46] And so in the early 2000s, it was common to have like a DOD auditor whose job was to fly all around to every Air Force base-
Tony Sager: [00:19:52] Okay [laughs].
Ron Gula: [00:19:52] ... and, and do the vulnerability scan and bring it back, right?
Tony Sager: [00:19:55] Mm-hmm [affirmative].
Ron Gula: [00:19:55] So that's an improvement. They, they, they went from the NSA doing it to actually doing it in the DOD, which is good. But then it still took like another almost 10 years for the DOD to be like, "We want a permanent continuous monitoring system to do-
Tony Sager: [00:20:07] Right.
Ron Gula: [00:20:08] ... what you're talking about in real time so we can get that ... " And, and Tenable was also involved in that. But it, it, it just took so long to kind of get them-
Tony Sager: [00:20:15] [crosstalk 00:20:16]
Ron Gula: [00:20:16] ... convinced.
Tony Sager: [00:20:16] And you're exactly right. I mean, one of the lessons, right, is the power of the system-
Ron Gula: [00:20:19] Mm-hmm [affirmative].
Tony Sager: [00:20:20] ... or bureaucracies. That, uh, you know, one former, uh, Director of NSA, and I'm tracking this down to make sure he actually said it, but it was basically, "I'm convinced large organizations prefer to fail rather than change." You know, and there's something to that. You know, people forget. I often say that the, the real purpose, the sort of real functional purpose of, uh, of red teams in the early days, was drama.
Ron Gula: [00:20:45] [laughs]
Tony Sager: [00:20:45] Right? Remember these days? You know, I have to convince the decision makers they have a problem. So I need a dramatic event, right? You know? But I looked at it and I said, you know, you don't actually learn enough information from a typical red team to drive a real improvement program. And it's okay. There's a need for drama. But, you know, any executive that's not paying attention to this stuff today, really oughta- needs to find other work. Right? We don't need drama, we need information.
Ron Gula: [00:21:09] Yeah.
Tony Sager: [00:21:10] And so it's really hard to change- N- I- one other quick story if you don't mind. A large agency I will not name, uh, for a variety of reasons, um, you know, so the, the, the, uh ... The CISO calls me in. Hey, I got this, uh, uh, intelligence agency requirement. I gotta do penetration testing on a hundred systems. A hundred systems. Could you guys help me out?
Now I think I was running the blue team- or at the- it was one of my groups at the time. I said, look ... a hundred-plus systems. Oh, and you gotta do this every three years. Okay. Y- you know, we could do three or four a year. And I'll, I'll talk to my- I'll talk to my friends on the red team. I'll get them to do two or three a year. We're not even gonna make a dent in this list.
And I said, "I'll put my lunch money on the table right now. I could write 80% of the report right here, right now, without putting a single human being to work." And I said, "So, but the reason you can't take that list that I give you and just get to work on it is because of the requirement, right? You think you gotta meet that requirement. And you're afraid that some security wonk like me is gonna come up and say, 'You're not being responsible 'cause you didn't get tested.' I will walk with you," and I made this offer straight up, "into any meeting with any auditor, any IG, to say it's more important for us to do this 80% right now than do another test."
And he, he loved the idea. But at the end of the day, they wound up getting a big bag of money to bring in one of the beltway contractors to do the penetration.
Ron Gula: [00:22:36] I'm sure they did a good job.
Tony Sager: [00:22:38] I'm sure it was wonderful.
Ron Gula: [00:22:38] [laughs]
Tony Sager: [00:22:39] But, but it wasn't he was a bad person, right? It was the power of the system. His entire performance rating was based on can I get through those umpteen hundred systems penetration tested and, and approved in this period of time? And so it's, it's astounding how much momentum there is in that stuff.
Ron Gula: [00:22:56] So, so let's talk about vulnerabilities a little bit.
Tony Sager: [00:22:58] Sure.
Ron Gula: [00:22:58] And then we'll talk about, like, you know, how we can manage them and, and, and-
Tony Sager: [00:23:01] Okay.
Ron Gula: [00:23:02] ... be more proactive about them. So, you know, so based on what you said, what's a vulnerability these days, and then maybe what's a vulnerability that you actually worry about?
Tony Sager: [00:23:11] Yeah, I do have, it, it, uh, NSA, because it's full of math wonks, or was at the time. You know, there's a whole definition and hierarchy, and you know, and so forth, all that, right, vu- vulnerabilities. And so we would distinguish between sort of flaws, vulnerabilities, um, attacks and so forth. And there's, there's reasons to be kinda picky about that, right? Because as you know, as an industry, people throw terms around just kinda willy-nilly. And so, it's, um, it tends to cloud what the issue might be. So, you might say, well, I, I looked at the software, and this was a real learning for me, right, 'cause I didn't come at this as a math wonk. And I, you know, I grew up, I spent a c- at least a few years developing systems, primarily embedded things in assembly language, so that's my background. I mean, you know, when you could say it'll fit inside an 8K EEPROM, right?
So it's, that's, that's dinosaur days, but, you know, as you looked at software, it was pretty clear you, you could teach people the, the language. You could teach them the control structures, but if they didn't understand how hardware actually runs, you couldn't really understand all the security implications, right? And then when you looked at the software running on a har- on a processor, and when you looked at system context, it could be entirely different as to whether something you found as a flaw in the software was in fact something you had to worry about. So, it's important to say, "Okay, I found something in the software that makes me uncomfortable." I found, you know, and then you might say, "What, you know, what," and I look at this thing, uh, it, it makes me uncomfortable because things like this have led to some sort of problem in the past. Okay.
Oh, but when I look at this, you know, in context, well, there's a huge difference if this piece of software is running in this sort of environment or that kind. Oh, and if this environment is outward facing, you know, that's a huge difference versus if it's, you know, uh, back in a server, you know, uh, DMZed kind of environment there. And so the, the step from, you know, sort of a, a thing that makes you uncomfortable as an analyst, to something that is actually exploitable, is huge. In fact, a lot of the red team work was about, you know, sort of the operationalization, uh, the weaponization, right? It's non-trivial, so you'll see people point out flaws all the time that may or may not, when you actually look at them, represent a real risk, right? And it may depend on the context. So, it's important to have some sort of sense of what that looks like.
Uh, but the, the sort of lesson, to your question, is that vulnerabilities can sorta appear anywhere, right, in the way humans operate, in flaws in software, and flaws in hardware were a big thing when I was coming up too, right? I had to, uh, kinda follow that level, again this is very low-level embedded systems, but hey, much to my surprise, I didn't realize this, uh, microprocessors can have quirks, they can have bugs, right? They can have, uh, and back then it was actually kind of high art in the old days, if you remember, 6502 mi- proc- microprocessors had some quirky things that people would use as like programming techniques, 'cause it was like kinda clever the way it wrapped around the, you know, the pages and that kind of thing. So, so understanding that sort of stuff is really important.
So, but so this notion of vulnerability, you have to I think be specific about it and try to say, "Well, on the scale of, you know, how much do I really need to worry, uh, where, where does this, um, really fall?" And a lot of, there's so much noise in the industry about these things, you know, because people are all rushing to find the thing that gets them attention, and you know, that, that they can talk about and get into the press on. So, you have to be, I think, careful and thoughtful about these things now. You know, careful and thoughtful doesn't always match, you know, the way a market works, but that's I think really important.
Ron Gula: [00:26:40] So, so I'm going to ask you as a, as, as a mathematician to boil this down, right?
Tony Sager: [00:26:44] [laughs] Okay.
Ron Gula: [00:26:45] So we have probably more lines of code out there-
Tony Sager: [00:26:47] Uh-huh [affirmative].
Ron Gula: [00:26:47] ... more computers out there than ever before.
Tony Sager: [00:26:49] Mm-hmm [affirmative].
Ron Gula: [00:26:50] So, and we see more vulnerabilities that, than, than ever before.
Tony Sager: [00:26:54] Yeah.
Ron Gula: [00:26:54] But are we getting better at writing secure code, or are we getting worse at writing secure code?
Tony Sager: [00:27:03] Yeah, this, that is a great question, and I don't have an easy answer for it. But I would say yes, that the explosion, right, of software, and complexity, and sources of software, right, it comes from sorta every, everybody. Uh, that makes the problem much, much worse. And so, uh, you know, on balance it's likely that we're sorta worse off, and you know, vulnerabilities kind of flow and bad guys go kinda right where the money is. So as, you know, in the early days, the focus, uh, of the, the really, um, you know, the world wide web and all those issues, right, the security of the operating system, then the browser, right, then all these, you know, web-enabled apps, mobile code. I remember that term, right, that was a big thing in the, you know, the s- the, that is, the thing that runs is dynamically assembled at run time, right? So there was nothing to examine statically earlier, you know? And for me it became really clear early, you know, the incredibly tight limits of human inspection of software, it's just not, you know, we ran, actually ran experiments to this effect back in the day, right? Could I, if you let me write a piece of software and give it to you, can you find, if, if I have had control of this, right, can you find that thing that, uh, you know, would, would cause a negative security-
Ron Gula: [00:28:14] Sounds, sounds suspiciously like SolarWinds, Tony. I-
Tony Sager: [00:28:17] Well-
Ron Gula: [00:28:17] ... I don't know.
Tony Sager: [00:28:19] Yeah, those, well, those experiments convinced me, right, that, uh, you can't count on human inspection.
Ron Gula: [00:28:24] Mm-hmm [affirmative].
Tony Sager: [00:28:24] You can't count on testing either, uh, right, that is there's only so many things you can observe. [crosstalk 00:28:29]-
Ron Gula: [00:28:29] You can't count on open source and that the community's gonna find it, either.
Tony Sager: [00:28:32] Well, there was a, a, you know, again, the, the early naïve days, you know, 10,000 eyeballs on every piece of code, and the reality is it's two guys in a garage that really-
Ron Gula: [00:28:39] Mm-hmm [affirmative].
Tony Sager: [00:28:39] ... care about this thing, right? And so, uh, yeah, a whole separate issue is, you know, a lot of our high secur- the, the infrastructure that we are counting on was actually kind of built on the cheap, right? And great people, you know, have taken a reference implementation, and suddenly it's become a production implementation, right, that, that no one's ever examined again. And so, there's plenty of examples over the years, as you know, of sort of things that have been hidden for many years in really critical pieces of, of the infrastructure. So, so it says, then, that, you know, um, there's clearly a much better understanding of how to build security into the development life cycle. Uh, and some companies have done a great job with that, and others have not, right? And there's other reasons. And then the, the focus today of this sort of, uh, you know, rapid development, DevOps, you know, this sort of, okay, uh, the market pressures, all my vendor friends, even back in the 80s, yeah, hey, second to market is last to market. You know, that's our model, right, you know?
And so, yeah, you get it, uh, yeah, it's buggy, but too bad. If we don't put it out there, the market speaks. You know, that's part of the lesson, where economics always wins. And so we, you know, people that grew up in security, you think, well, everyone should care about security because we care about security, but the reality is that's not true. And the same is true, by the way, with all the issues of public health and safety, and civil engineering and all that, right, you know? That is, you can't ask the population to become experts or to care at the same level, and so the knowledge of experts really needs to be built in through most of the, uh, you know, building codes, infrastructure, licensing of professionals, that, that sort of thing. You know, that's the way we deal with risk as a society, right? And we don't expect perfection, but we have a right to expect, you know, a, some level of, of confidence that's like-
Ron Gula: [00:30:19] Re- reasonable, reasonableness, right?
Tony Sager: [00:30:22] Yeah, I think that's-
Ron Gula: [00:30:23] [laughs]
Tony Sager: [00:30:23] ... that's a, a, a huge theme. And it's actually, it makes a lot of folks of my generation uncomfortable, right? We're, we, uh, I, I think security is a classic case, cybersecurity is a classic case of perfection is the enemy of the good.
Ron Gula: [00:30:34] Mm-hmm [affirmative].
Tony Sager: [00:30:34] Right, so that is in the pursuit of that sort of 80s-era, mathematically verified, kernelized, you know, beautifully designed operating system that doesn't happen to run PowerPoint, by the way, uh, you know, there, some great ideas were in there. But if you don't acknowledge user preferences, that sort of capability that the market demands, you know, you see what happens, right? People go with what gives them the function that-
Ron Gula: [00:31:01] Yeah-
Tony Sager: [00:31:01] ... they look for-
Ron Gula: [00:31:01] ... so-
Tony Sager: [00:31:01] ... in support systems.
Ron Gula: [00:31:02] So, let's talk about that, so this concept of good enough security, best practices.
Tony Sager: [00:31:07] Yeah.
Ron Gula: [00:31:08] So you were involved with the NSA best practices guide, right?
Tony Sager: [00:31:11] Right.
Ron Gula: [00:31:11] So instead of sending out, you know, pros from Dover-
Tony Sager: [00:31:14] Mm-hmm [affirmative].
Ron Gula: [00:31:14] ... doing the, doing the hero audit-
Tony Sager: [00:31:16] Right.
Ron Gula: [00:31:16] ... you know, high five and here's a report. Let's be proactive.
Tony Sager: [00:31:19] Right.
Ron Gula: [00:31:20] And let's have frameworks. Let's have guides to, to do this. And I think right around the same time, SANS was kinda going in a different direction, right? They were talking about the top 20 vulnerabilities. Like-
Tony Sager: [00:31:32] Right.
Ron Gula: [00:31:32] ... you fix these 20 vulnerabilities-
Tony Sager: [00:31:34] Mm-hmm [affirmative].
Ron Gula: [00:31:34] ... and you're good, right? So you were kind of pioneering because the, the, the message that the commercial world was taking was, oh, I gotta like secure FTP, secure email, secure some file sharing, and I'm, I'm, I'm good to go.
Tony Sager: [00:31:45] Right.
Ron Gula: [00:31:46] The NSA guides were really the forerunner of Ce- Center for Internet Security, the, the, the, the critical controls, so what was that, what was that, um, uh, what's the-
Tony Sager: [00:31:55] Yeah-
Ron Gula: [00:31:55] ... backstory there, as that came to light?
Tony Sager: [00:31:57] Sure, yeah, I mean, I, I, I take no credit for the origin of the NSA's security guides.
Ron Gula: [00:32:01] Mm-hmm [affirmative].
Tony Sager: [00:32:01] But I sort of wound up managing the, the, that kind of activity. And, and at that time, history is a little fuzzy to me, but sorta late 90s, you know, was when NSA started to put together these security guides. And it turns out no one had a grand plan, by the way, as near as I can tell.
Ron Gula: [00:32:15] Mm-hmm [affirmative].
Tony Sager: [00:32:15] That is, we had blue teams, right, post-Eligible Receiver. We had red teams. And, um, this idea of sorta codifying what would be the recommended practice, how should, what-
Ron Gula: [00:32:27] Mm-hmm [affirmative].
Tony Sager: [00:32:27] ... should my registry keys be set to? What should my file permissions look like? It turns out, as, as near as I can tell, that was originally written down as a training guide for our own people, right? How do I, what should everyone look for?
Ron Gula: [00:32:40] How do I harden Windows NT?
Tony Sager: [00:32:42] Yeah, and that was-
Ron Gula: [00:32:43] Right?
Tony Sager: [00:32:43] ... the idea. And there were other sorta pieces around in the industry. SANS had some things they called step by step-
Ron Gula: [00:32:48] Mm-hmm [affirmative].
Tony Sager: [00:32:48] ... guides back in the day, and there were few things. There was a guy na- there was a thing called, uh, Bastille Linux back-
Ron Gula: [00:32:53] Mm-hmm [affirmative].
Tony Sager: [00:32:53] ... in the day. Yeah, and kinda Jay Beale's, the sort of first [inaudible 00:32:57] guide there, but there were pieces, there were early [inaudible 00:33:00] things, there was a little bit of stuff from this. But for NSA it became this sort of n- and the, the name was always security guides. And it originally started as sort of a training thing or a handout thing, something you leave behind after you leave the customer, or you'd say, "We, we don't have time to come see you, but here's what we would test for." This is a huge part of that. And it started to take on a life of its own, and then, um, I worked in one of the next-door groups that wound up moving, you know, up to the... at that time there was a group called the System and Network Attack Center, right, great acronym, the SNAC-
Ron Gula: [00:33:27] Sounds snappy, right?
Tony Sager: [00:33:28] Yeah, the SNAC-
Ron Gula: [00:33:29] Snacky.
Tony Sager: [00:33:30] And, um, oh, the, yeah, the, the whole controversy was be- uh, believe it or not, System and Network Attack Center. Oh, my god, we couldn't possibly put attack in the name of an NSA organization. That, uh, believe it or not, that's, that was the level of discussion in that sort of time.
Ron Gula: [00:33:43] System and Network Assessment Center.
Tony Sager: [00:33:45] Yeah, something like that. But it was a, you know, sort of a bold step at the time. So, I moved up, bec- became the deputy chief of that, and the first job the chief gave me is look at this security guide thing and figure out whether we ought to be doing it. There seems to be a lot of friction there and, you know, pieces floating around. And I looked at it, and I, I think I knew what the answer was, but like, you know, I looked at sort of what was in the market and what was going on, and what NSA said, and so forth. And I said, "You know what? We could make a huge difference if we really got organized around this." And it, again, it, it had become, had taken on a bit of a life of its own. And eventually, at least two of the DISA STIGs were actually NSA documents, by permission, with their cover on it. You know, the, 'cause their model was to pay someone to do, to write that.
Ron Gula: [00:34:24] And for folks listening, what's, who's DISA and what's a STIG?
Tony Sager: [00:34:27] Oh, uh, sorry.
Ron Gula: [00:34:27] Yeah, it's all right.
Tony Sager: [00:34:27] The, think of DISA as the big, uh, sort of the IT service provider for the Defense Department-
Ron Gula: [00:34:30] Mm-hmm [affirmative].
Tony Sager: [00:34:30] ... Defense Information Systems Agency. And a STIG, uh, roughly, uh, uh, the equivalent to a, uh, CIS benchmark, right? A-
Ron Gula: [00:34:38] Mm-hmm [affirmative].
Tony Sager: [00:34:38] ... a component level sep- you know, individual security guide. You should set this registry key to this-
Ron Gula: [00:34:44] Mm-hmm [affirmative].
Tony Sager: [00:34:44] ... value, and so forth. So very low level, very component-oriented kinda stuff. So, for NSA then it became a thing, but I, I deliberately kept it as, you know, we didn't set up a group to write those. I, I wanted it to be, this is an outgrowth of security testing, right? This is the analysis that we do, and we happen to codify it. And the reason is, 'cause again, I, I live in bureaucracies, right? And I got, I had offers from the Pentagon. They'd put up a pile of money and blah, blah. I said no. The problem is I don't wanna fight with DISA for authority. I wanna support them. I don't wanna fight over this, I wanna, you know, be a partner with them.
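For a concrete picture of what a component-level guide like a STIG or a CIS Benchmark boils down to in practice, here is a minimal sketch, in Python, of the kind of check such a guide implies. The registry path, value name, and expected data below are illustrative placeholders of the kind of item these guides contain, not a quoted STIG or CIS Benchmark recommendation, and the script assumes it is running on Windows.

```python
# Minimal, illustrative sketch of a benchmark-style configuration check.
# Windows-only; the key path, value name, and expected data are examples
# of the *kind* of item a STIG or CIS Benchmark contains, not a quoted
# recommendation from either.
import winreg

CHECKS = [
    # (registry path under HKLM, value name, recommended data)
    (r"SYSTEM\CurrentControlSet\Control\Lsa", "LimitBlankPasswordUse", 1),
]

def check(path, name, expected):
    """Return True if the registry value matches the recommended setting."""
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
            actual, _ = winreg.QueryValueEx(key, name)
    except FileNotFoundError:
        return False  # a missing key or value is treated as non-compliant
    return actual == expected

if __name__ == "__main__":
    for path, name, expected in CHECKS:
        status = "PASS" if check(path, name, expected) else "FAIL"
        print(f"{status}  HKLM\\{path}\\{name} (expected {expected})")
```

A real benchmark is essentially a long list of such checks, each with a rationale and a recommended value, which is why tooling that can evaluate them automatically mattered so much.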
Ron Gula: [00:35:20] And, and when you say fight, you're literally doing things like, at the time maybe password length. You might have an-
Tony Sager: [00:35:25] Right.
Ron Gula: [00:35:25] ... opinion, maybe it's 12 characters, maybe it's 9 characters. Well, some auditor at the Pentagon is gonna say, well, if we go from 9 to 12, that's $500 million we have to spend fixing it.
Tony Sager: [00:35:35] Yeah.
Ron Gula: [00:35:36] Right?
Tony Sager: [00:35:36] It, it is a, it was, and, you know, there, there was a whole kind of a thing back then. I called it the special snowflake school of security, right?
Ron Gula: [00:35:42] Mm-hmm [affirmative].
Tony Sager: [00:35:43] We're all special snowflakes. Oh, you guys are doing that in the Army? We couldn't possibly do that in the Air Force, you know? Uh, how much of that do you remember, right? Everyone is special, and every, uh, auditor gets to decide for themselves what's important, every assessor, IG, and that's just nuts. So, what happened? And then, so, you know, we, we released the NSA security guides to the public, by the way, in June of 2001. But some time around then, there was a guy named, uh, Paul Bartock, the late Paul Bartock, was just a, an amazing force for us, you know, for good back then. Um, Paul leans into my office, you know. I, uh, I always knew when Paul, he would, I would hear a knock on the doorframe and he'd lean in and say, "Hey, Tony. Um, you know, our Windows NT stuff is really good, but Windows 2000 is coming, and we're going from, uh, I forget, a couple hundred registry settings to a couple thousand. I don't know that we can keep up the same level of work on 2000 as we did," and by the way, NT isn't going away any time soon, right, in the Defense Department.
And I said to him, "I tell you what. You go make me a list of everybody in the US government that sponsors a Windows NT security guide. Let's see if they wanna work together. See if I can talk them into it." And the count was, uh, is a bit lost to history 'cause I never wrote this down, but I think it was 14. There were 14 different, uh, Windows NT security guides, paid for by you, the taxpayer, by the way, right? You know, uh, and all good people, Navy, [inaudible 00:37:04]-
Ron Gula: [00:37:03] Mm-hmm [affirmative], mm-hmm [affirmative].
Tony Sager: [00:37:04] ... the Army had a gold master or something, I think it was called. You know, all these, there were two from NSA. Two-
Ron Gula: [00:37:09] Two.
Tony Sager: [00:37:10] ... different ones from NSA. One from the R&D group and one-
Ron Gula: [00:37:12] Yeah.
Tony Sager: [00:37:12] ... from the blue team. So, I, uh, "Paul, let's split the list. You call half, I'll call half, see if they want to work together." And thi- I called this the great experiment. By the end of the day, the list evaporated. People got reorganized, they were no longer funded to do it, they had adopted ours, they're, you know, they no longer had the mission, whatever. And who was left standing, you know, the usual, so DISA, uh, NIST, um, [inaudible 00:37:34] offered to me a MITRE point, you know, a, uh, [inaudible 00:37:36] effort, but they had no resources otherwise. So, you know what, I think we learned a lesson here, right? And by the way, and we did go back to examine this, if you look at the content of all those, how much was, uh, how much was the same recommendation, all of them were in the high 90%. So we all spent, right, separate energy, separate money, you know, separate resources to reach essentially the same conclusion.
Now, at some point, I, I have to go back and recreate the history, but, um, NIST wound up hosting a meeting at their, um, at their site there [inaudible 00:38:08]. And, uh, so I started the meeting with this, uh, and we, um, the, uh, early folks from CIS came, DISA was represented. We had the Microsoft security folks, we had SANS with their step-by-step guides, uh, us, you know, all the, all those players. And I started the meeting with this, uh, and there was this fellow, uh, [inaudible 00:38:27] at NIST, I don't know if you know him here, but the guy's a, a work machine. He had literally taken everyone's draft guide for Windows 2000, built this gigantic spreadsheet of, you know, NSA said this, NIST said that, you know [inaudible 00:38:40]. And we were, we had planned an entire work d- d- work day to do nothing but go through that, point by point by point.
So I said, I started the meeting with this. If we, okay, so we, we'll do this. If at the end of the day we get into the 90% agreement range, and we all agree, we'll, we'll aim to produce one thing. And if we, NSA, believe and fall on our sword that there's some special requirements for intel, intel, the intelligence community, I'm gonna, I volunteer, we take that as appendix A. DISA, will you take DOD, uh, you know, whatever special requirements they believe, take appendix B, NIST, rest of government, you know, SANS, CIS, et cetera. Yeah, at the end of the day, I had [inaudible 00:39:22] couple thousand items. The number of differences was about two handfuls, maybe a dozen. And half of them we couldn't resolve 'cause we just didn't have enough information at the time. So to me, so I closed this with the lesson of, okay, now look at this, right? We're, we're gonna say the same thing, essentially. Let's focus on what we know in common and manage the differences in some other way, appendix A, B, and C. And that's become, uh, like a bedrock life principle for me, right? That, you see that in CIS. Let's focus on what we have in common. And yeah, we do have differences, but if we start from this, we're all special and unique, we never make any progress.
Ron Gula: [00:39:59] Right.
Tony Sager: [00:39:59] So that was a, kind of a driver for me. And again, these sort of, uh, I call them experiments, right? We actually looked at what people said, tried to compare it, and looked at the difference and thought, you know, we're not, we're not getting our money's worth in the differences.
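The comparison exercise Tony describes, lining everyone's recommendations up side by side and measuring how much they agree, is easy to picture as a small script. The sketch below is purely illustrative: the guide names and settings are made-up placeholders, and real guides would first need to be normalized to a common setting-name scheme before a comparison like this means anything.

```python
# Minimal, illustrative sketch of comparing security guides for overlap.
# Guide names and settings below are made-up placeholders; real guides
# (NSA guides, DISA STIGs, CIS Benchmarks) would need to be normalized
# to a shared setting-name scheme before this comparison is meaningful.

# Each guide maps a setting name to its recommended value.
guides = {
    "agency_a": {"min_password_length": 12, "audit_logons": True, "guest_account": "disabled"},
    "agency_b": {"min_password_length": 12, "audit_logons": True, "guest_account": "disabled"},
    "agency_c": {"min_password_length": 9,  "audit_logons": True, "guest_account": "disabled"},
}

# Settings that every guide addresses.
common = set.intersection(*(set(g) for g in guides.values()))

# Of those, the settings where every guide recommends the same value.
agreed = {s for s in common if len({g[s] for g in guides.values()}) == 1}

pct = 100 * len(agreed) / len(common)
print(f"{len(agreed)} of {len(common)} shared settings agree ({pct:.0f}%)")
for s in sorted(common - agreed):
    print("difference:", s, {name: g[s] for name, g in guides.items()})
```

Run on the placeholder data it reports the agreement percentage and lists the handful of settings where the guides diverge, which is exactly the "manage the differences in an appendix" view Tony describes.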
Ron Gula: [00:40:12] So, that's such a good story because, you know, a lot of people who are outside of cyber, they ask, "What do I nu- do to be secure?" You know, and it's almost like, you know, I'm Catholic, what do I need to do to be a good Catholic, right? Well, you can follow the 10 commandments, but-
Tony Sager: [00:40:25] Right.
Ron Gula: [00:40:25] ... then it kinda tails off there like what are you supposed to do, right?
Tony Sager: [00:40:28] Right.
Ron Gula: [00:40:28] And, and cyber security is the same, the same thing. So-
Tony Sager: [00:40:32] Yeah.
Ron Gula: [00:40:33] How, how do you look at cybersecurity now? Right, so you came through, we called it-
Tony Sager: [00:40:37] Right.
Ron Gula: [00:40:37] ... information security-
Tony Sager: [00:40:38] Right, mm-hmm [affirmative].
Ron Gula: [00:40:39] ... what the computer network operations-
Tony Sager: [00:40:41] Oh, yeah-
Ron Gula: [00:40:41] ... a couple classified terms we can't talk about-
Tony Sager: [00:40:43] Sure.
Ron Gula: [00:40:43] ... and stuff like that. But when people say, "I'm in cyber," do you go, "Do you mean cybersecurity?" Like how do you, how do you look at our industry right now, how we've named it?
Tony Sager: [00:40:52] Yeah, it's a little chaotic, to be honest, Ron. And there's sorta two parallel paths here, right? There's the, there's the, um, call it the meaningful technical path, and then there's the bureaucratic and marketing path, okay? So again, COMSEC, and for us it was, uh, information security, computer security, and then the term assurance came in at some point, right? So, and that was a big step for, for NSA, you know? I, I know- I know who actually proposed that term and led the, led the, uh, the change there. And what it was recognizing was something important. That is, security, right, we, we always thought of as a destination, we're gonna get there. We will be secure. That doesn't seem to c- that doesn't look like it's gonna happen in my life-
Ron Gula: [00:41:30] I don't, I don't think we've got there yet.
Tony Sager: [00:41:31] Right. But the notion of assurance was about confidence. It's not am I at the right stage or not. It's what level of confidence do I have in my information and my operations. And that's, uh, conceptually a big, big step, right, to say, "I don't expect perfection." You don't expect perfection, you, you hope for it, right? When you, uh, book a ticket on a commercial airplane, or when you go in for surgery, right? Now, you, you look at-
Ron Gula: [00:41:55] Um, I, I kinda hope for perfection.
Tony Sager: [00:41:57] Oh, absolutely. But you recognize-
Ron Gula: [00:41:59] Yeah-
Tony Sager: [00:41:59] ... right, they're, they're human-
Ron Gula: [00:42:00] ... Pe- mistakes can happen.
Tony Sager: [00:42:00] ... beings.
Ron Gula: [00:42:00] Yes, yes.
Tony Sager: [00:42:01] But you know that that person is no- that's operating on you, right, has, you know, has been certified, you know, is capable of demonstrating knowledge. There's a record made, you know, and they, you know, they, they have, if they failed enough they're no longer in the profession, a commercial airline pilot, right? Uh, the airline can't hire a person who isn't certified, who isn't retrained every year, recertified. Uh, the tri- you know, and there are breakdowns in the system. There's no question. But the key is can you make a rational, reasonable decision on that highly risky thing, to fly in a plane or to go into a, an operating room? Yeah, you can. And you can do some homework, right? You can check the credentialing of that person, right, if you're, if you're diligent. You can look at the Better Business Bureau. You can look at complaints lodged against that hospital or that airline and so forth. And you can, you can adjust your behavior in a rational, not perfect, but a rational way.
If the system is broken down, right, it turns out the airline failed in their hiring screening or whatever, and you have, you have the ability to go, to look at the courts for, uh, you know, for some relief, or to deal with some of the licensing issues, right? So, again, we can't expect perfection, but we can do much better in terms of raising the confidence. So, assurance was a big step. Uh, you know, there were other paths for me, you know, software security and all this stuff. And the bureaucratic path I mentioned was, uh, I call it the, uh, saluting the taller flag, or the bureaucratic Jedi mind trick. Oh, Ron, you care about communic- uh, communications security. That's great. I care about information security. My charter, my mission, therefore my budget, should be bigger than yours. Oh, you, you tell them you care about information security. No, no, no, I care about assurance.
That kind of, you know... so it's sorta like it's a race to rename yourself. But there's also the logical path, right, which is more productivity, more commercialization, more, uh, complicated business arrangements means that the problem is in fact more complicated. So, watching the bureaucratic naming, you know, I have a chart kind of to that effect, right, it also recognizes the problem has gotten bigger. And so we need to think of solutions in a bigger space. Now, the huge step of this cyber thing, right, uh, you know, what happens is you get to a peak where you, you produce the most general purpose, but also most meaningless name. So, cy- I always say, and I, sometimes I violate this rule myself, I have to admit, cyber with nothing after it is not very helpful.
In other words, at minimum you have to talk about things like am I focused on defense, right? Is it about, you know, all these other issues, and you need some way to address that. Uh, you know, and I remember when cyber command was starting up, and I, I mean this respectfully to all my friends at [inaudible 00:44:40]. We had thou- literally thousands of cyber warriors who could barely spell IT roaming the halls. And, uh, you know, got it, but this is a, this is a new domain here, right? We need, we need some literacy in these issues. So it's talking about cyber defense, you know, about, uh, exploitation, about attack, about all these different issues. And so it becomes this, you know, so everyone gets to declare their role without really being very specific about it. So that's the confusion, or the marketing confusion that comes with that too. So, I mean, the problem is, is unquestionably bigger and more complicated, but it's also not helpful to sort of, uh, brush it over with a term that doesn't have any real definition to it.
Ron Gula: [00:45:20] Do you, do you think the Department of Defense and the NSA got defense right? Right, so you have, you have SCIFs, right? You have rooms you can go into, you can't bring your phone, no signals leak out, right? You've got cryptography everywhere, you've got all of this kinda information that's, that's designed. Do you think the commercial world's starting to figure out that the only way to keep China and Russia from hacking into them and stealing their secrets is to kinda adopt those kinda standards?
Tony Sager: [00:45:46] Yeah, that's a, that's a tough question. You don't give me too many easy ones. Appreciate that. You know, the, um, you have to look at the history of some of these things, right? A lot of government defense is based upon... if you're following, in the last few years, any of the more popular TV series-
Ron Gula: [00:46:03] Right.
Tony Sager: [00:46:03] ... that- that start to show spycraft. Spycraft is- is one that's sort of out now. Ah, yeah, people do that stuff, right? I mean, uh, but my wife once asked me, like, "Uh, don't get me in trouble, but I don't get this whole espionage, counter-espionage thing, you know, can- can you tell me how to think of it?" I said, "Oh, there- there's one simple rule to me, and the- the phrase I use, which I've used in talks before: 'When nations compete, everybody cheats.'"
You know, just assume, [laughs] right, that this is a complicated, ugly game. And so, all kinds of things happen, and we will spend, right, large amounts of national treasure, any nation will, to learn about the adversary, someone we see as a rival or an adversary in economics, and politics, and technology. And so, uh, that doesn't mean you have to be paranoid, but a certain amount of paranoia is a healthy way to think of this here. But it's also true that when you look at the- the way that the Government has managed it, um, you know, there's a tendency in cyber security to say, "It is really complicated, and so, the- the things I can control, I'm gonna control them to a fine point."
Like, password complexity, right, it's one of the few knobs and switches that we can actually control. Oh, and it [inaudible 00:47:17] uh, 12 characters, plus this level of comple- well, we say 14. Wait a minute! You know, and you start to build up sort of, um... Yo- you know, and that was one of the things we discovered in that meeting I talked about, right, where we reconciled all these things, and- and some of these things popped out in discussion. "Well, you- you guys say time out so many minutes after umpteen bad passwords, and you guys say this many minutes and you guys say... Who- wh- why do you say that?" "Uh, well we say that because we've always said that, or we said it because, 'Hey, NIST said this, and- and so we want... whatever they say, we say more.'"
I mean, it was no more complicated than that. And in my early, early, early days, I'm back in the '70s, how were they keeping the intern busy? I would do all these calculations, you know, with a 300 baud dial-up modem. If I could throw so many password attempts at it, it takes the system so many seconds to respond, so how many... You know, and so, those were the kind of, you know, fairly casual studies that were driving these. Okay, then the password has to be this long, because it'll slow down the processor, you know, to- to try all these candidate passwords, and blah, blah, blah.
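[Editor's note: a minimal sketch, in Python, of the kind of back-of-the-envelope calculation Tony describes, estimating how long a brute-force password search would take when a slow line and the system's response time cap the guess rate. Every figure below is an illustrative assumption, not a number from the original studies.]

```python
# Back-of-the-envelope brute-force estimate, in the spirit of the 1970s-era
# studies described above. All numbers here are illustrative assumptions,
# not figures from the original NSA work.

def brute_force_years(alphabet_size: int, length: int, guesses_per_second: float) -> float:
    """Worst-case time, in years, to try every password of the given length."""
    keyspace = alphabet_size ** length        # total candidate passwords
    seconds = keyspace / guesses_per_second   # time to exhaust the keyspace
    return seconds / (365 * 24 * 3600)

# Assume a slow dial-up line plus the system's response delay limits an
# attacker to roughly one guess per second (hypothetical rate).
for length in (4, 6, 8):
    years = brute_force_years(alphabet_size=26, length=length, guesses_per_second=1.0)
    print(f"{length}-character lowercase password: ~{years:,.2f} years worst case")
```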
And I said, "Some of that is still creeping around in our security guidance." In other words, if nobody has a rationale for it, pick the middle one, I don't care, I'll sign off on it, I don't care, okay. But because there's this folklore right, if we have to... So, there- I'm- I'm sorry to go on, but it was a bi- a little bit of this, we look at the things we can control. You know, for- and so, there's a tendency to over-build sometimes right, because security people are a little bit paranoid, and they're naturally conservative. Right, they don't want the system to fail. But again, and when you look at this as a mass market problem, uh, you're not going to see that kind of expenditure.
Right, you're not going to see it, because economics just don't work. And so, you need to be more selective in the way that we think of this- this problem, right. Some things do deserve more attention than others. And it- it shouldn't be driven because the company's willing to spend all that. And so, I think also, a national debate that's kind of raging, and I think will come up in this new administration, you know, how much do we expect the typical commercial company to protect themselves? Right, how much is it reasonable for them to sort of take the approach that a Government agency might, right? The economics probably just aren't there. Right, it makes no sense for that.
And what is the responsibility of the Federal Government to provide some level of safety and you know, quality of the software, and [crosstalk 00:49:34].
Ron Gula: [00:49:33] So, what you're saying is that you endorse Hack-back for- for companies? No, man... [laughs]
Tony Sager: [00:49:37] [laughs] yeah-
Ron Gula: [00:49:38] We'll- we'll talk about that in a second.
Tony Sager: [00:49:38] ... sure.
Ron Gula: [00:49:38] So, um, so you have these guides-
Tony Sager: [00:49:42] Mm-hmm [affirmative].
Ron Gula: [00:49:42] ... you are- and you've kind of seen, you know, the transition from information security to- to- to cyber.
Tony Sager: [00:49:47] Mm-hmm [affirmative].
Ron Gula: [00:49:48] But now you've got these guides. And these guides for something like a Windows 95, Windows 2000, and even the modern ones you have-
Tony Sager: [00:49:54] Mm-hmm [affirmative].
Ron Gula: [00:49:54] ... they're 50 pages, 100 pages.
Tony Sager: [00:49:57] Oh yeah, absolutely, yeah.
Ron Gula: [00:49:57] So- so, two questions. So, what happened, how do we automate that, so I don't have to read it and then-
Tony Sager: [00:50:02] Great.
Ron Gula: [00:50:02] ... and then even though y'all i- interpret it and agree and stuff, I might read it and then-
Tony Sager: [00:50:06] Yeah.
Ron Gula: [00:50:06] ... reinterpret it for what I need to do. So- so, h- what happened with automating that? And then-
Tony Sager: [00:50:11] Yeah.
Ron Gula: [00:50:11] ... if we're even gonna automate and make the [inaudible 00:50:14], why- why isn't Microsoft just shipping these things and- and you know, Ubuntu shipping these things based on these things anyway?
Tony Sager: [00:50:19] Sure. Boy those are- those are great. Yeah, so for me, I- I mentioned that we were releasing let's say a security guide to the public in 2001.
Ron Gula: [00:50:26] Mm-hmm [affirmative].
Tony Sager: [00:50:27] And, uh, so, you know, the response was great. And by the way, it was a- it's a great illustration of culture change, right. Because, um, at least one manager I worked for... And- and I- you know, I- I convinced, you know, several levels of bosses this was the right thing. It was a Paul Bartock idea, by the way. He leans in my door and says, "We need to release this stuff to the public." And my jaw just dropped and I said, "Oh my gosh, of course." So, I worked, you know, and I- I was blessed to work with some, uh, amazing people, up to the director of our COMSEC organization at the time, who supported the idea knowing, you know, this would be a big- big thing.
And, um, gr- a great reception from the public, but at the last minute at least one of the, you know, kind of dyed-in-the-wool math folks that I'd worked for, great- great guy... But you know, his- his upbringing was, you know, we control the knowledge of cryptography. And the math model at that time was journal quality, peer reviewed, you know, this is- this could be looked at 100 years from now and we should get this right. And this is of course operational data, right, it's gonna change because you learn something new in the next month.
So, then he came up to me, really, um, very sincerely, good- great person, "Tony, we're gonna release this tomorrow, you know. Suppose somebody finds a mistake in it, what are we gonna do?" And I said... I- I literally bit my tongue. You know, because- because our views were just so different, right. You know, based on upbringing and time, a- a generation. And I said, "Well, I'll tell you what. Number one, it's- it's probably our mistake. It's probably somebody has knowledge that we didn't have or, you know, would point out some condition we hadn't thought of. We'll do two things. First, I'm gonna correct it.
Second, I'm gonna walk down to the NSA Gift Shop, buy them a coffee mug and send it to them. And we will get more credit, right, by participating. We- we took our shot- best shot and put it out there, and we're- we're willing to listen.
Ron Gula: [00:52:17] How many- how many coffee mugs did you send out?
Tony Sager: [00:52:19] Actually, only one. [laughs]
Ron Gula: [00:52:20] Only one?
Tony Sager: [00:52:20] But it was- and by the way-
Ron Gula: [00:52:21] Yeah.
Tony Sager: [00:52:21] ... we do that mor- more regularly now at CIS. But the idea, what I was trying to illustrate to him, right, was that no one expects perfection here. And we- you know, the- the goal is, as I said, you don't want NSA in charge, you want NSA to be a participant right, an active technical participant in this. So, this- this- this notion of putting them out there, uh, became important. And so, that went really well, and then I- you started to see a- a- a market pop up around- around that time. Of folks that say, "Hey, I've got, uh..." I forget, one of the early ones was, uh, do you remember a company Bindview?
Ron Gula: [00:52:51] Mm-hmm [affirmative].
Tony Sager: [00:52:52] Okay, so Bindview was one of the first that I bumped into anyway. Oh, uh, "You know, we- we help you manage your network. Oh, and by the way, uh, we can- we have a module that you can buy that will measure compliance against a [inaudible 00:53:02] and then let's say a security guide, early CIS Benchmark." Oh, so I tracked that company down, I think at RSA 2003, or something like that. In fact, actually in 2003 I made it a quest. I had never gone to RSA before that. I used to send, uh, you know, some great people out there. Do- do you know the name Neil? Neil would write this legendary trip report about what I saw at RSA.
And I always said, "As a taxpayer, I can't justify s- paying to go out there. I- I should spend the afternoon, instead reading Neil's trip report." But at one point, I thought, you know, there's so many industry things to do out there. And so, I went out... Yeah, well, how do I- how do I not feel guilty about being out here? I walked the trade floor for two days plus. Wor- wor- I wore myself out. Every company that had some kind of thing like that, we measure, I would talk to them til I could get to some, you know, CTO or some technical person. And say, "Now te- tell me exactly how you do that." "Oh, well we work with a..."
"No, no, tell me exactly how you do it." "Oh, well, we pay a team of people, they- they look around. They say, "Oh, this is update to this. We download it, we read it, we poke it into our script language, our hard-coded scanner or whatever." [laughs] "Oh my gosh, is that like, value ad for you? "Are you kidding me, I've got to pay a whole team of people to do that. Oh and by the way, you guys don't write very well." Right, your point. [laughs] "So, I've got to interpret what you meant." "Oh wow." And so somewhere in that, you know, year or two period I said, "We- you know, this doesn't make any sense, right. We're- we're taking- the way I view it... And my- I know, um, a little bit of economics too.
The value... And I'm gonna say it this way, the value created by the Pauls and all those kinds of people, right, uh, the analysis that we now codify, because we leave it up to human interpretation, and manual cut and pasting and all that stuff, we are essentially throwing away the value that was created, right. I have no confidence in the output of that at the end. Therefore, we need some machinable way to move that value created into- you know, into the customer environment. So, I started to, you know, as I talked to these companies, "Wow, you know, if the Government had a... Or sorry. If there were an open standard, right, a universal language of security guidance, would you support that?" And the answer was almost always, "Yes-
Ron Gula: [00:55:15] But-
Tony Sager: [00:55:15] ... if we get a vote on it-
Ron Gula: [00:55:16] ... [laughs]
Tony Sager: [00:55:17] ... right." I said, "Great. How about if you help. Because I can guarantee you I can take 10 Government people and go off and write something that will be intellectually beautiful and you'll hate it, because it won't be implementable." And that- you know, that was one of the things. And then, uh, I think a year or two later, uh, NIST came to my doorstep with, what was then, they called S-Cap. They sort of bundled together, remember they were- at that time there were pieces of data, CBE, uh, the- the names of vulnerabilities. Oval, the test for the presence of a vulnerability, uh, and- and all the early ones that went with that. Right, the- the, um, scoring system and so forth.
And they cleverly bundled it all together into a- into a program. But my- my contribution was about the- uh, the conceptual idea. Right, this idea of preserving value, uh, not asking the vendors to do these error-prone manual steps, so I could raise the confidence for the DOD buyer, primarily, right. And then I started to look around, uh, Ron, at that point, I said... And- and this was like a big moment for me. You know, I- I'm a little slow, but I, uh, usually catch on. I said, "Oh, this is interesting, the name of a vulnerability, the name of a piece of software, a unique name, the name of a configuration item. Who else uses- who else uses that kind of data?"
And it started to lead me around to knock on the doors of companies like, uh, um, uh, software licensing companies. You know, there was one that was big at the time, Belarc. You know, the idea was they're not a security company, but they actually know all the same things, they use the same pieces of data. And I approached, you know, Belarc out of the blue. I mean, I literally downloaded one of their white papers. Got a call from, um, Suman Chen, if you know Suman and Gary, and that started a multi-year friendship. And sort of moved them into this space. There was another guy, Wyatt Starnes, you know, one of the original Tripwire guys who had, uh, SignaCert, I think it was. He just took this and ran with it.
And the idea was, you know, s- uh, not to think of it as a security problem, but who else uses this data? And where we could all benefit from some common naming or standardization or something like that. So-
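[Editor's note: to make the "universal language of security guidance" idea concrete, here is a minimal Python sketch of a configuration benchmark expressed as machine-readable data and checked automatically, loosely in the spirit of SCAP/OVAL-style content. The rule IDs, settings, and values are hypothetical illustrations, not actual CIS Benchmark or SCAP data.]

```python
# Minimal illustration of machine-readable security guidance: benchmark rules
# expressed as data, checked automatically against a host's observed settings.
# All rule IDs and values below are invented for illustration only.

BENCHMARK_RULES = [
    {"id": "example-1.1", "setting": "password_min_length",       "expect": 14,    "op": "ge"},
    {"id": "example-1.2", "setting": "account_lockout_threshold", "expect": 10,    "op": "le"},
    {"id": "example-2.1", "setting": "telnet_enabled",            "expect": False, "op": "eq"},
]

# Stand-in for configuration data a scanner would collect from a real host.
observed_config = {
    "password_min_length": 12,
    "account_lockout_threshold": 5,
    "telnet_enabled": False,
}

OPS = {"eq": lambda a, b: a == b, "ge": lambda a, b: a >= b, "le": lambda a, b: a <= b}

def check(rules, config):
    """Evaluate every rule; a missing setting counts as a failure."""
    results = []
    for rule in rules:
        actual = config.get(rule["setting"])
        ok = actual is not None and OPS[rule["op"]](actual, rule["expect"])
        results.append((rule["id"], "PASS" if ok else "FAIL"))
    return results

for rule_id, status in check(BENCHMARK_RULES, observed_config):
    print(rule_id, status)
```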
Ron Gula: [00:57:17] And that's a forerunner of basically allow-listing.
Tony Sager: [00:57:20] ... Yeah.
Ron Gula: [00:57:20] So, the- they were looking at, you know, known good files, known good configurations.
Tony Sager: [00:57:24] Excellent, yeah, that's right.
Ron Gula: [00:57:25] And- and that gives you some assurance, right-
Tony Sager: [00:57:27] Yeah.
Ron Gula: [00:57:27] ... it's not perfect, but it's- it's- it's quite good.
Tony Sager: [00:57:28] And there were- there were a lot of threads that... Some of them, which went to dead ends, but otherwise... You know, the idea was... So, I started to talk to the threat reporters inside NSA. "Hey, when you write a report of an incident, when you refer to the software that was exploited, [laughs] you know, if you could start using standard names for it, and standard... Oh, and by the way, you know, it was configured this way, and that allowed the exploit. You know, if you'd use a standard term for that, we could go find that later. Right, as opposed to the sort of narrative way."
And- and the industry... Um, others ran with those ideas. You know, in the emergence of things like, uh, STIX and TAXII, and these other, you know, again, more- more modern formats for that. So, again, I'm not claiming any, uh, ownership here-
Ron Gula: [00:58:10] You're- you're very- you're very humble.
Tony Sager: [00:58:12] ... No, no, but it was just all these ideas started to swirl in one place, you know. And once you start to get tuned into it, then you find others that are like, "Oh, yeah, yeah, okay." And it really broadened my thinking at that time. So, it was really a- an important time, early 2003.
Ron Gula: [00:58:26] So, you've got this body of work-
Tony Sager: [00:58:27] Yeah.
Ron Gula: [00:58:27] ... where it just says, how do I harden-
Tony Sager: [00:58:29] Yeah.
Ron Gula: [00:58:29] ... UNIX, Solaris, Windows, whatnot. And then you've got- now you've got a body of work that you can automate the assessment of that.
Tony Sager: [00:58:35] Right.
Ron Gula: [00:58:36] Any progress on getting people who are building these things to lock them down so that we don't have to?
Tony Sager: [00:58:42] Yeah. And that- that is a really key- key part of this, Ron. I mean, think of this idea, you know, we- we've grown up in an industry where the- uh, the- the model of security is, "Build it yourself. Go buy a thing, go get a security guide, go buy a tool, and compose it yourself." And you know, um, I think this is a fair statement and we- we can discuss it. Most of our economy will never be able to do that, right. I know you care a lot about the plight of small businesses, right. They'll never hire a security person, they can't afford expensive security tools.
You- you cannot- it's not reasonable for us as a society to think that that's a sort of mass-market solution to the problem, so it has to be built in. And I will say that, you know, we've all- we've all grown up in this, right. So, the early days, and I- and I'm not gonna pick on anybody here, at Microsoft, you know, um, in their early days there was a lot of friction with the Government. And I had one of the top security people at Microsoft, I mean, just pull me aside, when the NSA security guides first came out.
And he gave me, uh, literally, the finger... Not the finger, [laughs] but the finger pointing, "You Government guys don't get it. Every time you tighten up security..." I mean, this is the quote. "Every time you tighten up security, you cost me money." "Why is that?" "Because software breaks. I've got to pay more people to answer the phone. I've got to train people, I've got to, you know, trade my system and..." And I said, "If we work together, you'll save money." He said, "How is that possible?" I said, "Because right now the DOD is every desktop for itself, right, they're all configured randomly. You cannot tell me that that's a cheap support cost."
If we knew that a DOD desktop, and this at- you know, at some level, not- not universally, became true, was one of six variations, right. By the way, the first level of breakage analysis will be done by folks like DISA and NSA. And then we will care about command and control applications, and mission-unique applications. Right, and we'll know what runs or doesn't run on the hardened DOD systems, and we can turn that into a spec for integrators or software companies. Right, we should be telling them, "Don't just give me Windows out of the box. It's gonna run on a DOD system and it looks like one of these three, right. And you don't get to use all the ports and services that you want."
So, this idea of you've got- you have to break the cycle, right. And you have to say... So, he said, "You cost me money because systems break." But the following question is, "Well, why do they break?" "Well, because everyone that writes software, writes it sort of with, uh, the expectation that they own the machine."
Ron Gula: [01:01:09] Did- did the, um... I'll just go at-
Tony Sager: [01:01:11] Yeah.
Ron Gula: [01:01:12] ... talk about the Air Force, all right?
Tony Sager: [01:01:13] Sure.
Ron Gula: [01:01:13] Didn't they pioneer this kind of concept-
Tony Sager: [01:01:15] Absolutely, I- I give great credit for it.
Ron Gula: [01:01:16] ... and measure some reduction-
Tony Sager: [01:01:17] Yeah.
Ron Gula: [01:01:17] ... in IT costs because of it, right?
Tony Sager: [01:01:19] Right. It was the- the first, uh, sort of large scale, maybe the- the largest, and maybe only, [laughs] of that scale. It was really the Air Force and a guy named John Gilligan, who happens to be the CEO of CIS right now, who looked at the whole problem, right. And- and just to- to close the Microsoft thing, we came a long way with Microsoft. We went from this sort of friction to really cooperation. Now, we can talk about that later. But, so, John Gilligan was the CIO of the, uh, Air Force at the time. And he was quoted in the Press, and I tracked him down to make sure... And, you know, the Red Team used to do regular work for them, so, I- you know, we knew each other at some level.
But this quote was, "I'm spending more to recover from bad software, than I am to buy it." And what, uh, I would interpret that for the lay person is, he was looking at the whole life cycle, right. So, and again, big systems kind of work like this. So, the- the acquiring of stuff was different than the operational management of stuff. Different budgets, different leaders, but when you looked at it... So, you buy cheap upfront, right. Hey, wide open, commodity pricing, throw it to my integrators, or at the big companies. They now push it into DOD systems or ship it with DOD boxes and all that kind of stuff. Oh, and now bad guys are attacking it.
And- and poor, you know, overworked 20-year-old tech school graduates are trying like heck to configure this thing so it works. And by the way, commanders yell at me because the email's not running, right. And so, you've got this kind of crazy quilt of- of a life cycle here. And so, he looked at this whole thing and said, "This is crazy." So, he was able to combine the- the improvements in security really with acquisition, right, with purchasing. With, how do I, as a CIO, go from a random, unknown number of contract vehicles to a relatively small number? How do I put requirements in upfront that allow me then to save money on the tail end, right?
The- and so, we became the security advisor to that, and once this got started, uh, basically I wound up, uh, offering two NSA people for as much time as they wanted to fly down there, I think it was Gunter, Alabama. And what the Air Force did, I mean, it was a tremendous leadership thing, right, it was really... And I- I talk them up all the time. To make this happen, leaders have to take charge. And so, they would have these big conferences, where all the operators, right, you know... in the DOD, right, operators win. Right, operators are in charge, because that's- that's the deal, that's- that's the mission.
But they would all show up, and there would be this discussion of, you know, the- the mission critical applications that they had. And they would go through, I mean, literally point by point right, "If we tighten the configuration to this, who might be affected?" And there was this ongoing... And I- I wanted, you know, our security people in there sort of, every time. [laughs] And that became the basis that became known as the Air Force Desktop, right. And so, the idea was, it wasn't just defining in terms of security, it was working with operators, it was changing the way we acquired stuff, it was working with the- um, um, the integrators, right.
In terms of, "Okay, when you deliver a DOD system, it'll be pre- pre-configured." It- there was even a point in time, I think where Dell was the primary supplier, right. Uh, there was actually a line item you could buy, a DOD configured, you know, uh, Windows o- based operating system. Or, and there was Solaris for servers and so forth. So, the idea was to look at this as a- an entire life-cycle activity. And- and so, um, you know, that's by far the largest scale kind of experiment of that type. What happened was, that became the basis for the- what was then known as the, uh, Federal Desktop. I've heard the exact, uh, name, but the first name Karen Evans at OMB.
And this idea of, would we do this at the Federal level, right? Uh, it had mixed success, yo- the- I mean, systems are very hard to change, but the- their concept is, you know, something you have to deal with. They say, "It's unfair to ask everyone to solve it, it's insane to ask, you know, the most, um, underpaid, under-trained part of your workforce to save you, right. The- the tech school graduates are just doing their best to keep things running, it's bad strategy." So, how do you look at this as a- as a whole thing? Uh, I will say also, more recently, uh, at CIS, you know, I- I have nothing but good things to say about our relationship with companies like Microsoft, right.
And we're actively working now to kind of build in what we independently create. But we're- you know, we're not... Uh, a lot of the early models also, people forget, were really done in conflict with the vendors. I'm going to cartoon just a bit, like CVE, right, the naming of vulnerabilities. "Hey, we can't trust those Micro..." I'm gonna cartoon. "We can't trust those people at Microsoft, they never admit they have a vulnerability. We have to have an entirely separate industry process to figure that out and vote, and give it a name and a number."
But within a few years, it became, "You know what, Microsoft participated, they started Patch Tuesday, regular security bulletins." Uh, CVE winds up giving them a block of numbers. If they say it, hey, we might as well do something about it, right. Then Red Hat and some of the other big vendors became active participants. So, we went from, again, "We can't trust the vendors," you know, the Government and industry, to, "We need to find a way to cooperate and sort of- sort of build this in there." So, we- we've come a long way.
We've had some high water marks, um, and we'll have some that we're announcing pretty soon at CIS, where we're- we're, you know, working with the vendor, not compromising our independence. But really, hey, at the end of the day, what they deliver is what's gonna run. And so, we need to work with them on that.
Ron Gula: [01:06:24] So, w- we talked about vulnerabilities-
Tony Sager: [01:06:25] Mm-hmm [affirmative].
Ron Gula: [01:06:25] ... and fixing vulnerabilities.
Tony Sager: [01:06:26] Yeah.
Ron Gula: [01:06:26] We talked about you know, configuring systems-
Tony Sager: [01:06:30] Yeah.
Ron Gula: [01:06:30] ... like, good passwords and stuff.
Tony Sager: [01:06:31] Yeah.
Ron Gula: [01:06:32] But now, let's talk about Center For Internet Security, but also talk about this concept of the higher level controls. Like, what are these things that we can do... It's- it sometimes is password management-
Tony Sager: [01:06:43] Yep. Yeah, so-
Ron Gula: [01:06:43] ... right, but there- these controls, there's a lot of- a lot of interesting things there. So-
Tony Sager: [01:06:46] ... Yeah.
Ron Gula: [01:06:46] ... so, what is CIS and what are the controls?
Tony Sager: [01:06:49] Sure. Yeah, there's a, um, a bit of an origin story from, uh, from NSA days, around all that time of the Federal Desktop and all that kind of stuff too. And, uh, so, again, I was- I was hitting this point of realizing, oh my gosh, we're seeing the same things over and over again. And that- and also, the release of the stuff to the public, led me out into the public to start giving talks. And people would start flooding me with questions, right, because at that time, an NSA guy coming out was fairly un- un- uh, unusual.
And I would get these questions, I- you know, I- I should have been smarter, but I- I would get these questions that I hadn't really thought about. "That's great stuff, Tony, what do I do first?" "Well, I mean, you've got to do this stuff, it's great, then this..." "No, what- what do I do first? My boss has a limited attention span, I only have a budget for one thing. I've only got one person on staff, right, what would you recommend I do in the first quarter?" Oh my gosh, and I realized, I was never responsible for fixing this stuff. Those are the questions of a person who really owns the- the solution, right, who has to deal with all these realities. Wow, okay.
And so, um, so th- this sort of swirl of questions was just really eye-opening for me. And at- at some point, I mean, and John Gilligan, our CEO, was one of the threads, right, he calls in our group after one of the briefings, you know, on what the Red Team had done to the Air Force. And said, "Tha- that's great, but what do I do? What do I do first?" You know, he asked a similar question, right. "We're tired of getting our teeth kicked in by your folks, what do we do?" I said, "We- we need to do something differently here."
And the- the origins of the controls, uh, literally I- I called five friends into a room. One of them was Paul Bartock, uh, also the tech director of the- uh, Red Team at the time, Paul, and the Blue Team. One of the folks from [inaudible 01:08:25], right, one of the people who attack other countries for a living, uh, a technologist, someone who kind of deals with the Internet, the NSA enterprise, and a threat reporter, threat kind of person, right, who watches what bad guys do, you know, for a living. And the- the way I started... You know, this is- this is my thing, started the meeting.
People were really struggling with how to get started. Security people, you put them in the room, they want to solve world hunger and peace in our time. What- ca- "No one leaves the room," this is the way I said it. "No one leaves the room til we all agree on a small number of things that all of our friends should do. Not to solve the problem, but to lay the foundation. To get going, to get started, right. And small to me, means five to seven-
Now, you know, we- we can create an infinite list, I know that. But the discipline is, how do I help people who don't know how to get started? And that afternoon turned into it. I mean, I had no idea. Um, a two-page letter went to the Pentagon, the joint staff, the CIO, the Air Force. Based on our experience, if you don't know where to begin, start here.
Ron Gula: [01:09:24] And, and what were those things?
Tony Sager: [01:09:26] They were, they're basically the controls, a simpler version of it, right? Patch, know what you have, I mean, increase your visibility... uh, the letter is long lost, I mean. One of the problems with when you retire from NSA is, I used to say, when you walk out the door, you're out of the loop. So I don't have access to a lot of, you know, things that-
Ron Gula: [01:09:42] You couldn't just save that to your Dropbox and share it with a friend.
Tony Sager: [01:09:44] No. I did not, no.
Ron Gula: [01:09:45] [laughs]
Tony Sager: [01:09:45] Um, but, but it's around, but the idea really resonated. And it was yet another, you know, Paul peaks his head in my doorframe and says, “Uh, Tony, uh, Alan Paller of SANS got ahold of that letter. And he wants to know, can you build a community project use only. I said, ut tell him, I gotta check with our lawyers. And two, he has my permission, as long as we get to know, SANS is not capable of doing a small thing, right.
They only do big things, right? So it turns into, uh, you know, events, posters, you know, and it went from 10 things to 20 things, more comprehensive. Uh, and they originally ran it under the umbrella of, uh, the Center for Strategic and International Studies, right? So John Gilligan was, was brought in to kind of run the project. Uh, nominally run by this think tank, and SANS took it over for care, care and feeding afterwards. So most people would kind of know that project as a SANS thing. And it had a life of its own, you know, we were peripherally involved or, you know, continued to stay involved, but I never thought much more about that, to be honest, for a while. I retired in 2012.
Uh, went to work for SANS, uh, not in the teaching, but in special projects. So I had some personal reasons why I really wanted to work from home at that, at that time. And, um, Alan asked me, "Would you take a look at this thing again and see what we ought to do about it?" I said to Alan, the right answer is really, uh, people respect SANS, but it's also, you're also a great money machine, right?
You're one of the biggest for-profit activities in the business. If you really believe in the idea, it belongs in a non-profit or independent place. And he said, "Would you go with it if so? You don't have to." I said, "I'm, I'm in for the ride. You know, I'm, I'm committed to this." So that brought me together with a lady named Jane Lute, who was the number two at Homeland Security, who was looking for something to do after government. And we had a tiny nonprofit company that took this over, and then we merged with the Center for Internet Security in 2015. So now it's kind of in its natural home, you know, it's really where it belongs, uh, combined with the Benchmarks, right?
So the controls... you know, the Benchmarks are, how do I configure a device, a thing, you know, kind of the low-level operator knobs and switches. The controls have always tried to aim for what I would call a sweet spot. So if you look at the, the world of security frameworks and standards, right, they go from the microscopic, very discrete, very, you know... the Australian, remember the original Australian Top 4 and Top 35-
Ron Gula: [01:12:29] ASD.
Tony Sager: [01:12:29] Yeah. You know, turn on application whitelisting, you know, uh, very, uh... so people, lots of people like that because it tells them exactly what to do. Lots of people hate that because it tells them exactly what to do. They say, well, that doesn't work in my environment, or that kind of tool doesn't work for me, it's out of my price range, whatever. And then you go from there to the cosmic, right. You know, these sort of security frameworks: do good, and write me a paper that says you did good, or have a firewall and tell me you have a firewall. But it's so broad, it's, it's just the full range. So we, we aim for this prescriptive, but as- as best we can, technology- and vendor-independent, and, you know, we don't always get it right. But we do pretty well there.
Ron Gula: [01:13:12] And, and you have three levels.
Tony Sager: [01:13:13] Yeah. We have three.
Ron Gula: [01:13:14] To get people started. What are those three levels?
Tony Sager: [01:13:16] Well, we call them implementation groups. And so, any list has a certain, um, uh, clarity and goodness to it. But it also has a certain danger, right. People say, oh, it's a list, and number one I should do first, number two I should do second. Yeah. And they get stalled at some point, and that, that's not the way they were written, but that's, that's how they get interpreted. And so you can think of maybe the NIST 800-53 catalog, right, the granddaddy of sort of everything you could do. We're certainly a subset of that. And lots of others are too. But even that, again, especially for small to medium enterprises, is pretty daunting, right, the NIST stuff.
I mean, hundreds of beautifully written, intellectually, you know, well, well done pages are out of reach for lots of companies. So, but even with our stuff, we found people struggle to get started, right, and where to focus. And especially as we looked at modern things, ransomware was kind of the, the swinger for us, which said, you know, you better get to your backup stuff pretty quickly, and you can't wait until, I think at that time it was number 10. If you, if you're working your way through one through five and six and seven and you're stalled and you never get to 10, that's not good either. So we, we have, you know, these 20 headings with lots of individual items underneath them.
And so, um, you know, a couple of years ago, a little less than a couple of years ago, uh, we had a group go through and say, look, take a horizontal view across that, right. For a low-resource, sort of mass-market, everybody's-got-the-same-level-of-risk kind of view, go through and pick the things that basically everybody ought to do, right? Independent of a fancy risk assessment. They don't need, uh, you know, uh, a consultant to tell them this. Is there something that we could put in here that really meets that? And, and I've been consumed with, and, uh, this term gets used a lot in the industry, hygiene.
So I looked back, and the first time I said it in public in a speech was in 2004. Some people think I created it; I don't think so.
Ron Gula: [01:15:12] At Black Hat, I think, right?
Tony Sager: [01:15:14] Yeah. I stole it from someone else, I'm pretty sure. But, but the idea was, at the time I was discussing, is there an equivalent, right? So there's a number of things we do, especially in public health: we brush our teeth, we floss, we hope, uh, wash your hands, get the shots, we hope, uh, avoid certain kinds of locations or, you know, take precautions. And the fancy way to talk about that stuff is, uh, those are behaviors, but they're really translations from science, right? Someone has studied the transmission vectors of a disease and said that washing your hands is not a cure, but it's a tremendously important-
And well-defined and inexpensive step to interrupt a key transmission path, right? And so just do it. And if the science changes, you need an antibacterial soap instead, or, uh, you know, a different sort of formulation would make more sense, and okay, then we can change the behaviors, right? Once the science changes, the behavior can change. So that was the idea I've been searching for, kind of the equivalent of that, right? And trying to get past this special snowflake thinking, oh, you couldn't possibly do anything in security without hiring a really expensive consultant to come in and do his fancy risk assessment. And I knew that that was really out of reach for a lot of folks.
So this idea of, is there a basic cyber hygiene set? So anyway, I used the term, and I hunted around, I couldn't find a good definition of it, frankly, right? It's the kind of thing people say, you know, cyber hygiene is really important, for example, you should patch. Okay, great. Is that a definition? Well, no, it's an example, right? And for example, you should know where all your hardware assets are, identify all your... So for us, we define this implementation group one as basic cyber hygiene, right. That, that is our definition. People can argue with it, but it's also the on-ramp, right. Just do this. Don't spend a lot of time thinking about it.
You can build a program around it, you can build expectations around it. Now we matched that with implementation group two, right, which is sort of the next step. And then three, which is sort of all else. Uh, we're building a more technical, defendable rationale under the hood for that. But that first step was really about an on-ramp, right? What's the basic thing that we ought to expect? Again, same thing for you as a citizen. When you walk into a restaurant, you expect, you know, not perfect, but a certain level of cleanliness, a certain level of hygiene, right. A certain level of, of behavior by the employees when they handle food and all that kind of stuff. And some of that's codified in regulation and, you know, some of it you just expect.
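[Editor's note: a tiny Python sketch of the implementation group idea described above: each safeguard carries a tag for the lowest group that should adopt it, and filtering on that tag yields the IG1 "basic cyber hygiene" on-ramp. The safeguard names and tags are invented examples, not the actual CIS Controls.]

```python
# Toy model of CIS-style implementation groups: safeguards tagged with the
# lowest group expected to adopt them; groups are cumulative. The entries
# below are made-up examples, not real CIS Controls safeguards.

SAFEGUARDS = [
    {"name": "maintain a hardware asset inventory",  "ig": 1},
    {"name": "apply security patches automatically", "ig": 1},
    {"name": "keep tested offline backups",          "ig": 1},
    {"name": "centralize security event logging",    "ig": 2},
    {"name": "run regular penetration tests",        "ig": 3},
]

def safeguards_for_group(group: int) -> list[str]:
    """Everything an enterprise in this implementation group should do."""
    return [s["name"] for s in SAFEGUARDS if s["ig"] <= group]

print("IG1 (basic cyber hygiene):", safeguards_for_group(1))
print("IG2 adds:", [s for s in safeguards_for_group(2) if s not in safeguards_for_group(1)])
```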
Ron Gula: [01:17:39] And the, the last thing you mentioned, let's talk about that.
Tony Sager: [01:17:41] Sure.
Ron Gula: [01:17:42] And then we'll, we'll kind of finish up here. Cause this has been, this has been awesome. You're going to try to expand the controls with some proof looking at things like MITRE ATT&CK.
Tony Sager: [01:17:53] Right.
Ron Gula: [01:17:53] So what is MITRE ATT&CK and is that relevant to this discussion?
Tony Sager: [01:17:58] Yeah, I think that's right. Yeah. We, we are on this multi-year campaign, right, to bring more data, more rigor and transparency. So control lists, right? Any list that anybody's ever done basically starts like this: get a bunch of smart people together, like the origin of the controls, argue, write it down, right, publish it. And people look at it and go, oh, that's, that's a bunch of smart people, or, I have to pay attention because they're regulating me or whatever. And the worry is, that's just not good enough for the future of this, right? Different people might give different answers, uh, the loudest voice in the room... you know what this industry is like, full of really brilliant, but also really opinionated people, right?
So there's a lot of arguing and not always logic that pops out of it. So the idea was, can we, can we do better than that? So, um, kind of where we are now, we call it the community defense model, and we're taking advantage of some industry trends, right. And one of them is this MITRE ATT&CK model. And so think of it as a, uh, technically rigorous way to describe what, they don't call it this, but I call it the atomic elements of an attack, right. The individual steps that are part of what MITRE terms a tactic. So the tactic is to gain a foothold, for example; well, what are the individual techniques that could be used to get in, right? So with great detail and lots of brains involved, they've been sort of building this structure that looks across the universe of attacks and tries to identify the sort of common elements of, of tactics.
Ron Gula: [01:19:26] So like the SolarWinds attack?
Tony Sager: [01:19:27] Yeah.
Ron Gula: [01:19:28] That type of implant coming through a trust relationship that's in there.
Tony Sager: [01:19:30] Right. Exactly. And so you, so you, you want to... and the, this is, you know, most people aren't going to look at this, right? This is highly detailed practitioner stuff. Uh, in some sense, a lot of folks in the industry, you know, have used prior models, things like the Lockheed Martin Kill Chain, right, which is a little higher level of abstraction, you know, to talk about the life cycle of an adversary attack, the Mandiant APT1 model, things like that. They're, they're all useful, right? And we've used them also. I've written papers around them to say, you know, you, you're not going to provide perfect defense against everything.
Therefore, the, the knowledge of what attackers do allows you to make good choices about where you arrange your defenses, right? And, and by the way, I don't want to have one level of defense, I want to have multiple levels of defense. So it allows you to think about that. Think of the MITRE ATT&CK Framework as really, uh, sort of a more detailed, atomic-element refinement of that. And it's gotten a lot of play in the industry, right? People are adopting it.
Ron Gula: [01:20:27] Oh, I'm a fan.
Tony Sager: [01:20:28] Yeah.
Ron Gula: [01:20:28] A lot of people are fans.
Tony Sager: [01:20:29] Makes a lot of sense. So we have, that is kind of one element, right? We don't need to create something; the industry has coalesced. And the other is, we have a long-standing relationship with the folks that produce the Verizon data breach report. So that's kind of the godfather of these: hey, um, based upon our business model and data contributed by lots of folks, uh, we have analyzed, over the last year, all the incidents, categorized them, summarized them, looked for commonalities and so forth. So we've been working with them since about 2013 to map into the controls. Um, and what we did, the CDM sort of looks at this ATT&CK framework, looks at our recommendations, and tries to place them there. If we recommend this thing, right, this very discrete thing in the controls, where does that provide you value in that MITRE model, right? It's not everywhere. It's very specific points. And so we, we, we map that.
Ron Gula: [01:21:18] It's almost like a jiu-jitsu, right? The attacker does this-
Tony Sager: [01:21:21] Yeah, exactly.
Ron Gula: [01:21:22] ...the defendant does this.
Tony Sager: [01:21:23] And so you, you can't do that exercise at the high level, because you're completely missing all the technical nuances here. Uh, and MITRE, the MITRE model has a thing for mitigations, and we worked through those, but that's not as mature, frankly, as the rest of the MITRE ATT&CK model. And, you know, it just makes more sense in terms of driving defensive programs to use our controls and map them into that. So, so we have a model, right, to, to talk about the value of any defensive step against this industry-standard framework. But now we have data. So we have the MITRE, uh, sorry, the Verizon data breach report, which, again, we're working with, and they have very good data scientists.
They have a whole collection model and a frame and a structure to manage that. So you look at that and you go, well, from there, can we identify a set of attacks that we think everybody should care about? And so there's a little nuance to it. Basically we identified from that report the top five types of attacks that everybody should care about. And then we look at that and we say, these attacks work in this way, right? Or this way or this way. And we call that a pattern. So now we are able to say, for every safeguard, against these patterns of the top five attacks, where do you get value from what we recommend?
All right. So we're looking for that match. And so it's, it's been fascinating. So we're kind of backing into this, right. You know, logically we ought to pick our recommendations based upon the model, but that's not where we were in history. So, so we looked at this, and it turned out, uh, and there's a paper published on this, our hygiene set, implementation group one, against those top five classes of attacks, you get amazing coverage, like 100%, okay, in multiple places. It's not perfection, right? This is, uh, this is an analytical exercise here, but it's, it's pointing the way to what we think is, is really valuable.
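[Editor's note: a minimal Python sketch of the kind of mapping exercise described above for the Community Defense Model: each safeguard maps to the ATT&CK-style techniques it mitigates, each attack pattern is a set of techniques, and coverage is the fraction of a pattern's techniques touched by the chosen safeguards. The safeguard names, technique IDs, and mappings are made-up illustrations, not actual CIS or MITRE data.]

```python
# Toy coverage analysis: what fraction of the techniques in each attack
# "pattern" is mitigated by at least one chosen safeguard? All mappings are
# invented for illustration; the real CIS Community Defense Model uses
# actual CIS safeguards and MITRE ATT&CK technique IDs.

SAFEGUARD_MITIGATES = {
    "software_inventory": {"T1059", "T1204"},
    "patch_management":   {"T1190", "T1203"},
    "data_backup":        {"T1486"},            # e.g. ransomware encryption
    "mfa":                {"T1078", "T1110"},
}

ATTACK_PATTERNS = {
    "ransomware":       {"T1190", "T1059", "T1486"},
    "credential_abuse": {"T1110", "T1078"},
}

def coverage(chosen_safeguards):
    """Per-pattern fraction of techniques mitigated by the chosen safeguards."""
    mitigated = set()
    for safeguard in chosen_safeguards:
        mitigated |= SAFEGUARD_MITIGATES[safeguard]
    return {
        name: len(techniques & mitigated) / len(techniques)
        for name, techniques in ATTACK_PATTERNS.items()
    }

hygiene_set = ["software_inventory", "patch_management", "data_backup", "mfa"]
for pattern, fraction in coverage(hygiene_set).items():
    print(f"{pattern}: {fraction:.0%} of techniques covered")
```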
Ron Gula: [01:23:23] That's to say, it shows that the people who argued over this knew what they were talking about.
Tony Sager: [01:23:27] Yeah, it does. It, it-
Ron Gula: [01:23:27] But having data to prove it is even more useful.
Tony Sager: [01:23:31] Right. And, and again, we just... For us, right, this is just something we put into the public. And, and I've always been very conscious of, you want to produce something like this that is sort of data set agnostic. You can bring the DOD data set. You know, the model is there. You don't need us. You could use the same general approach.
Ron Gula: [01:23:48] And, and the more models that are out there and frameworks, the more interesting things people can use, like I've heard some examples where some organizations, they can identify the techniques that certain attackers use?
Tony Sager: [01:24:00] Right. That's right.
Ron Gula: [01:24:01] And then they can map those techniques to, if I'm going to mitigate them, which Center for Internet Security controls do I need to mitigate that? So that's, that's a mature industry.
Tony Sager: [01:24:08] Yeah.
Ron Gula: [01:24:09] It took us 25 years to get there, but maybe 30 years, right?
Tony Sager: [01:24:12] I think, I think that's right. Yeah, I think there's a, there's a need to have a rational basis for kind of mass market recommendations, right. That are built into frameworks and regulation and that kind of stuff. But, you know, for those that face really high risk and that have the resources to think about these kinds of things, you know, they're facing really expensive decisions, right. And to have some rational basis to say, you know, the intelligence tells us this and therefore this is the most effective action. The other thing that comes up a lot though, right. And defense and offense are not one-to-one, right?
One thing you do in defense is not against just one thing; it's, it's one-to-several, often. But you can also say, you know, should we panic every time there's a new type of attacker tradecraft? Well, it turns out a lot of times the things we've already done actually provide you, you know, excellent if not complete protection against new things, right? And so people will want to get in a panic over every zero day. And remember I talked earlier about, you know, the difference between knowing a flaw and, is it really something I could build into an attack? It turns out a lot of sort of good practice things limit the attack surface, bound the attacker once he's got a foothold, and don't let him move on.
So you want to have an intelligent way to say, I learned something new, but how do I assess the actual risk from that? And therefore having a model lets you kind of continue to assess the value of your security program, change it if you need to. But it turns out, in my experience, you know, do a good set of things and you are actually in really good shape, right? And so you don't have to panic over every new thing; you just need to re-examine your model.
Ron Gula: [01:25:41] That's, that's excellent. So let's, let's close out. So, so you're, you're on LinkedIn, right?
Tony Sager: [01:25:46] Yeah.
Ron Gula: [01:25:46] Center for Internet Security has a lot of, a lot of good quick start guides. But, um, you know, you had this conversation and this sort of mentorship with the industry while you were inside NSA.
Tony Sager: [01:25:56] Yes.
Ron Gula: [01:25:56] And you've recently come out, you've been out what, six years, uh, a little bit longer, 10?
Tony Sager: [01:26:01] Eight, eight years.
Ron Gula: [01:26:01] Eight years. [crosstalk 01:26:02] All right, I was off a little bit.
Tony Sager: [01:26:03] That was eight years.
Ron Gula: [01:26:04] So, um, but, but it's hard to take the work you did inside NSA and then just publish it, right?
Tony Sager: [01:26:09] Right.
Ron Gula: [01:26:09] But you've recently started to bring some things out.
Tony Sager: [01:26:11] Yes, I have.
Ron Gula: [01:26:12] So let's close out with that.
Tony Sager: [01:26:13] Oh, sure. Yeah. Uh, yeah, www.sagercyber.org, [laughs] uh, one word. And, um, you know, it was my, uh, habit when I was still inside NSA... we had an internal blogging system, a journal, in NSA. And, you know, as my management career went along, I was a pretty frequent contributor there for several years towards the end of my career. And for me, it was a way to, uh, present ideas, uh, messages to my workforce, talk about leadership, you know, in general and, uh, you know, entertain also. And I always had the intent to do something with that someday. And I, I basically wrote unclassified things inside that closed society. And I, um, you know, for a variety of reasons, I just never got around to doing anything with it.
It turned out, by accident, a friend of mine had scooped up my writings and run them through the declassification process and public affairs process to see if they could be released. And so it turns out he once sent me a big file of a bunch of stuff I wrote that was, you know, in theory already approved for release. But, you know, I'm closer to the end of my career than the beginning. And one of the things I did when I was, uh, even writing internally at the NSA was occasionally print something out and bring it home to the kids, you know, completely unclassified. But, you know, here's, here's what dad does. Here's what dad thinks. And here's kind of whatever. And I regretted not doing more of that.
And I also said, you know, I've been blessed with this career to know so many amazing people and be kind of, you know, I have my own set of, uh, in the room where it happened kind of stories, right. Things, things that were, uh, part of the shaping of what we now call cyber. And I think I feel a certain responsibility for that. And so, yeah, just starting last fall, um, you know, I, I contacted public affairs about some of this older material and, uh, and asked, you know, asked them for support. I mean, I don't, I have a lifetime obligation, right? In terms of what I can put into the public.
But most of what I have to say is, is really about people and ideas and not about classified things at all. So, so I'm now in the process of, you know, of writing new things and releasing, you know, modifications to older things.
Ron Gula: [01:28:16] Are you like 10% through the archive? Are you 100% through?
Tony Sager: [01:28:19] I, I don't know. Uh, there's only like a dozen articles or 15 out there right now.
Ron Gula: [01:28:23] Okay.
Tony Sager: [01:28:23] I'm probably just touching the surface.
Ron Gula: [01:28:25] Good. Good. Good.
Tony Sager: [01:28:25] So it's a question of sort of how much, and it's a, non-linear wander, right? As things kind of hit me, I start to write about them and, and I have a stock of old things to, to put out there. But, you know, they're, um, they're both acknowledgements of sort of ideas and like the story I told you, like how, you know, I had a, uh, you know, some role in this early security automation days, right. How did I get there? What, what was I thinking? I, whenever I tell a story, people are fascinated, right? And so maybe it's worth sharing, you know, that's kind of idea.
Ron Gula: [01:28:52] Well, I, I appreciate you sharing all that with us.
Tony Sager: [01:28:55] Sure.
Ron Gula: [01:28:55] I would encourage you to keep sharing both on the blog.
Tony Sager: [01:28:58] Yeah.
Ron Gula: [01:28:58] You should probably write a book. Matthew McConaughey could play you in the movie.
Tony Sager: [01:29:01] [laughs]
Ron Gula: [01:29:02] You know, we gotta, we gotta have better role models for, for NSA people-
Tony Sager: [01:29:05] Oh, that, that's true.
Ron Gula: [01:29:05] ...in Hollywood. So it's, it's good. So-
Tony Sager: [01:29:07] Yeah.
Ron Gula: [01:29:08] ...thank you, Tony Sager.
Tony Sager: [01:29:09] Sure.
Ron Gula: [01:29:09] And, um, thank you for joining in to this episode of the Gula Tech Cyber Fiction Show...