The Digital Threat to Democracy
This discussion brings together leading voices from politics, national security and technology to examine threats and strategies for safeguarding our democratic institutions.
Cyber and digital attacks on campaigns and elections are a threat to our democracy and affect people of all political stripes. Over the past two years, nearly every election on both sides of the Atlantic has been affected by cyber-attacks. Foreign actors could target any political party at any time, and deterrence starts with strong cyber defense and public education.
The bipartisan initiative Defending Digital Democracy is co-sponsored by the Belfer Center for Science and International Affairs, the Institute of Politics, and the Shorenstein Center on Media, Politics, and Public Policy at Harvard Kennedy School.
Eric Rosenbach (moderator)
Co-Director, Belfer Center for Science and International Affairs;
Co-Director, Defending Digital Democracy Project, Belfer Center;
Chief of Staff, U.S. Department of Defense (2015-2017);
Assistant Secretary of Defense, U.S. Department of Defense (2014-2015)
Heather Adkins
Manager, Information Security, Google
Molly McKew
Foreign Policy and Strategy Consultant;
Information Warfare Expert and Writer
Robby Mook
Senior Fellow, Defending Digital Democracy Project;
Campaign Manager, Hillary Clinton Presidential Campaign
Debbie Plunkett
Principal, Plunkett Associates LLC;
Former Director, National Security Agency's Information Assurance Directorate
Matt Rhoades
Senior Fellow, Defending Digital Democracy Project, Belfer Center;
Campaign Manager, Romney-Ryan 2012 Presidential Campaign
Clint Watts
Senior Fellow, Center for Cyber and Homeland Security;
Fellow, Foreign Policy Research Institute
ERIC ROSENBACH: Okay, good evening everyone. Thank you for coming out to the Forum this evening. Tonight we have a great group of guests at the Kennedy School to talk about the digital threat to democracy, or, as you'll probably more often hear it called, the threat to our digital democracy. And you'll see that the group we have gathered here today literally are the leading experts in a very, very complicated field. So we're lucky that they were all willing to come up here and talk about all of the things that make up a digital democracy and the threat to it right now. You all know what that is. Those are things related to the First Amendment, free speech, to the Fourth Amendment, to privacy, to technology, and the systems that underpin that. And, most importantly, our electoral system of democratic governance, which is what was under attack in 2016 and earlier. And we're going to talk a little bit about that.
So I’d like to very briefly introduce myself. My name is Eric Rosenbach. I'm the Co-Director of the Belfer Center for Science and International Affairs. I've been here in Cambridge about six weeks now. I just moved up from D.C. In a past life, I was the Department of Defense Chief of Staff and Assistant Secretary of Defense, where I was in charge of all things cyber at the Pentagon. And to me, this is an issue that also really resonates very strongly in my heart and in my gut, and in particular because the last year I was there in the Obama administration, I saw a lot of the bad things unfolding and the bad guys going after our democracy. And it felt bad.
So once I was out, I thought to myself, well, I'm going back to the Kennedy School. And there's still a role for people outside of government in trying to address this issue, because it still is an issue. And my deepest fear, actually, is that all of the bad guys around the world saw what happened in 2016, saw that bad guys were going after our democratic system of government. And they're all now just rubbing their hands together, getting ready to go back after the US.
And so with that as an introduction, I want to introduce you to a cast of characters here. And all of them are doing something about this. So, different than in some parts of Harvard. They're not only great thinkers and strategic thinkers, but they're literally doing practical things to try to address this issue.
So I'd first like to start right here. And I'm going to start with Molly McKew. Molly is very interesting, not only because she writes for Politico, but because she has been an information warfare expert for a long time, to the degree that she's also a special advisor to several governments around the world who deal with this issue in a very personal way. She just got back from some of the Baltic States, and she's going to talk a little bit more about that.
Clint Watts is a Fellow at George Washington University's Center for Cyber and Homeland Security. But, you know, he knows something about this in a very operational way, too. Because he's a former special agent from the FBI, where he was working on cyber issues, he knows how this works operationally. And in a past life he was also an Army Ranger and knows how to keep people safe.
Then we have here Heather Adkins, who is the Director of Information Security and Privacy at Google. Heather, like Debbie, who I’ll introduce later, is literally a pioneer in the field of cyber security. So Heather was the first person at Google, 15 ½ years ago, working on cyber security, if you can imagine. So she’s seen a lot over the years in the way things have evolved. And she knows how to run cyber security operations and knows a lot about both the technical and information side of things.
Next we have Robby Mook, who is a star in his own way, and very popular around the Kennedy School. I see little groups of people following him now. You know, he’s got like a cult. Robby is most well known because he was Hillary Clinton’s campaign manager. And he is really one of the hardest working, nicest guys I've gotten to know. We’ll tell you a little bit about that story later.
Sitting next to him, believe it or not, is a real live Republican. [laughter] Yeah. And he's a real Republican, he's not a Republican in name only. This is Matt Rhoades. He was Mitt Romney's campaign manager. And so the story about how and why he's willing to work with Robby to protect the digital democracy is something we'll talk more about. But again, Matt is a great guy, a very effective guy, and one who's helping the project that we've started navigate the complicated politics of all this.
And last but certainly not least is Debbie Plunkett, who, at one point, was one of the most senior people in the National Security Agency. And if you want to talk about someone who’s seen the bad guys go after your network, Debbie was responsible for all of the information security and cyber security for the National Security Agency. And at that time, the Department of Defense. So I think it goes without saying that, if you're a bad guy, you're doing everything you can to get into the National Security Agency and DoD. But thanks to Debbie, they almost never did it.
[laughter]
If I said never, I’d be lying. And anyone who really knows cyber knows that you never say never.
So here is the way we'd like to progress. This is a very complicated issue, right. It's some politics. It's some technology. It's some bad guys, and understanding the way that bad actors like the Russians, the Chinese, and the Iranians act. And then, it's a little bit of the private sector, too. So I don't have to tell any of the Kennedy School students. But this is a really complicated mix when you have all of those things together. If it were just a technology issue, it'd probably be complicated enough. But it's the reason that we have all of these people here.
So I'm going to start with the politics and talk to Robby and Matt. Move onto the technical perspective, where we’re going to ask Debbie about some of her ideas about how you can defend digital democracy. We’re going to talk to Molly and Clint about info ops, the Russians in particular, their observations, and some of the things that they have been doing recently. And then finish with Heather talking about the perspective from Google, the role of the private sector. And then, of course, in the Forum tradition, open up to some questions from you all.
Okay. So let's start with you, Robby. You're fresh off the Hillary Clinton campaign. Obviously, a little painful. And there are a lot of things that you can do when you've been the campaign manager for a big campaign like that. Some of them involve making lots of money. Others involve going to a deserted island, burying your head in the sand, and pretending the whole thing never happened. With at least those two options on the table, of all the things you could be doing, why decide to work with me, and Matt in particular, on a project to defend the digital democracy? Talk to us about that.
ROBBY MOOK: Yeah. Well first, in full disclosure, I did spend a lot of time on a beach after election day. So I sort of worked that through my system. But look, aside from the fantastic people to work with, the people on this stage, I was charged up. I was very upset about what happened. I was concerned in particular about two things. One is, our campaign and a lot of Democratic campaigns were really vulnerable to these kinds of attacks. And I didn't see anybody in particular talking about what we're going to do to help the campaigns. And so it was important to me to fill that gap.
And second of all, I was really concerned about how partisan the issue was becoming, because you know, I’d known that in 2008, both parties had been attacked. And Matt and I were linked up through a common friend, and we were kind of—You know, it’s a small club that we’re in of ex-campaign managers, very exclusive.
MATT RHOADES: You don’t want to be in it. [laughter]
ROBBY MOOK: That’s accurate. [laughter] And we were talking about this. And I think first of all, I learned that Romney’s campaign had been attacked in 2012. Matt really understood what a problem this was. But I think we also had alignment around this issue, that this couldn’t become Democratic or Republican. Otherwise, we’re basically going to be inviting foreign powers to attack our adversaries. And that wasn’t okay.
And then look, the last piece I just want to underscore that you said, and what's been great about the space you’ve given us to do this project, is we’re about getting practical things done. So at the end of the day, we want to know that we've made some campaigns more secure, that we've done more to help the professionals, you know, scope out the bad guys, and keep them out of our campaigns.
ERIC ROSENBACH: So it’s kind of a funny story, but I was talking to my friends, other ex-Obama appointees, and talking about this issue, and how much it was just really bugging me. And I was losing sleep about it. And they said, “Oh, you should talk to Robby Mook.” And, you know, no blow to Robby’s ego, I didn’t know who that was. You know, I've been buried in the Pentagon for the last seven years, working 20 hours a day, seven days a week. So I was like, “Robby Mook, who’s that?” And they said, “Are you a political Trump? That’s Hillary Clinton’s campaign manager.”
ROBBY MOOK: He’s on the beach somewhere.
ERIC ROSENBACH: And I started talking to Robby. And then he told me about this guy, Matt Rhoades, a real live Republican, who also thought this was an important issue. Matt, tell us about your thought process. Why did you want to get involved in this?
MATT RHOADES: Well first off, I just want to thank the Kennedy School for hosting us, and Eric, and the Belfer Center, for providing us a platform to elevate this issue. And you know, Robby kind of stole my thunder, but this did happen to the Romney campaign in 2012. It just didn't get as much publicity because it wasn't done in a very public way. But in the fall of 2011, we were notified by the government that our campaign had been hacked by the Chinese. And how it impacted our campaign was, you know, we had to use vital, precious primary campaign hard dollars, money that we wanted to use to win New Hampshire and Iowa, to upgrade our security, our cyber security. So it had a direct impact. That was money that we didn't get to use when we needed it, when Newt Gingrich caught on fire in Iowa, or Rick Santorum caught on fire five different times later on, during the course of the primary. So I was well aware that this is a bipartisan issue, one that doesn't just face Democrats. It's a challenge to conservatives as well.
ERIC ROSENBACH: Matt, your campaign was hacked. Robby’s was hacked too. We want to spend a lot of the time tonight not just relitigating the past. When you think about the future, what's one of the things you worry most about?
MATT RHOADES: Yeah, and I'm excited, and Debbie is going to get into some of the things that we're working on. But here's what I worry about the most, from a campaign standpoint. You know, I don't worry as much about the Presidential campaigns in 2020. That's not to say they aren't targets. I worry about the next Barack Obama or George W. Bush, two candidates you could see coming a mile away, and having some hacker decide that they want to change the course of history by hacking into some rising star's email and misconstruing something that they wrote, or targeting someone close to them, and having that candidate just stop on the launching pad. That's what worries me when it comes to these campaigns.
On top of that, from an election standpoint, you know, I really worry about some Secretary of State site getting hacked on election night. And we live in a very, very polarized world right now. And whether a Democrat or a Republican came out on the short end of the stick of a hack on election night, even if that Secretary of State was able to verify the results with hard ballots, hard copy ballots, I just think, in the polarized world that we live in, no one would believe the results. And it would create absolute chaos. And ultimately, in my opinion, that’s what these hackers and outside foreign entities are trying to do the most, they're trying to create chaos. They're trying to get something for the media to cover. They're trying to get guys like Robby and I to hate each other so much, and be even angrier than we already are, and not even be able to sit next to each other at events like this. And that’s what really scares me the most.
ERIC ROSENBACH: All right. Robby, we've been talking about this project that some of us here are working on. Tell us a little bit about that. What's the idea of trying to get a group outside of government working on a hard issue like this?
ROBBY MOOK: Yeah. Well, I think in the Rosenbach School of Thought, we are trying to be very practical and outcome-oriented. I think it’s very easy to put out a white paper and say, “This is what the government should do.” And our government is kind of struggling to do things quickly nowadays. So what we've been trying to do is focus on all the resources that exist in the private sector, that I don’t want to speak for anybody here in the private sector, but I think are really eager to help with this problem, and get those connected with the campaigns.
So first of all, we’re trying to create a playbook, and Debbie has been really active with this, that can walk a campaign through, you know, here are the basic things you can do. Not to get yourself completely secure, as you said, but get pretty darn close. And put that in language that someone like me or Matt can actually understand.
MATT RHOADES: And we’ve done that.
ROBBY MOOK: Yeah. Like really like second grade, maybe lower. No, but this is actually a very practical issue. You know, when we were worried about this on our campaign, people would come in and say, “Well, you need to secure your campaign.” You have absolutely no framework for understanding what that even means. And then the other thing that I've learned doing this, Matt and I have learned, is the private sector has done a lot to organize itself to better share information about threats that are out there, and confront their adversaries. So we want to bring some of those best practices into the political space, so the parties can take advantage of those best practices.
MATT RHOADES: I love when Robby starts talking about the private sector and how great it is. [laughter]
ROBBY MOOK: This is all an elaborate scheme.
MATT RHOADES: So one thing you have to realize, too, these campaigns, they're not all Presidential campaigns. A lot of these down-ballot campaigns, they're a bunch of young people who come together, bring their laptops from home, and someone gets assigned to be the digital director/IT director. And they're told, “You need to create a secure system.” And it’s a ripe target for anyone to try to hack into. And hopefully we can do some things—Debbie, we’re all counting on you—something practical that can help these campaigns.
ERIC ROSENBACH: I just wanted to mention again, the fact that you see these two guys sitting next to each other and working on this together, it’s a pretty rare thing nowadays. And it’s not easy for either of them, quite frankly. They're taking a lot of personal risk putting themselves out there right now, in an environment that, when you're talking about cyber security, maybe the Russians, other bad guys, they get a lot of flak for this. So I'm very appreciative, because, you know, moving the politics is one important aspect of this. And they’ve really been doing a great job. So we’re happy about that. Thank you guys.
Next, you know, so Debbie Plunkett, back in her previous life, I got to know and see in real-life action. And she really is one of a handful of people who I most admire for everything she's done. And as I told you, when you're in charge of defending NSA's networks, that's like a pretty hard job. So we called Debbie up. We said, "Hey Debbie, we have this, you know, funny project. What do you think?" And she didn't even think twice. She was like, "I'm there. I'm on it, absolutely." And she's been working really hard ever since then.
She's the lead for the playbook where, as you heard, we're going to try to give some practical advice. She's working along with Dmitri Alperovitch from CrowdStrike, another good, hard-working guy. But talk to us, Debbie, a little bit from your perspective, about some of the things that campaign staffs could do to raise the bar on their security.
DEBBIE PLUNKETT: Sure. So again, thanks for the opportunity to—certainly to be here, but be a part of this project, because you know, for me, personally, I really felt like it hit at the core of—not to get hokey—me as an American. You know, I want to know that our democracy is protected, that our electoral processes are pure. And so to the extent that I could be a part of that, I think it’s really important that we all join forces and protect it.
So for the campaigns, you know, the weakest link, not just in campaigns but writ large, is people. And so the first thing that I think we have to do is to have some type of training for campaign personnel as they are coming into a campaign. It doesn't need to be complicated. It does need to be frequent, to frequently remind those who might not normally have been exposed to appropriate security practices of things that they should and shouldn't do on campaign time and campaign resources. So the very first thing is to address the human factor.
Next I’d say a campaign really needs to think hard about what they need to protect. And what comes to mind are, you know, donor lists, or strategies and plans. And you know, Matt and Robby can talk to what's most important much better than I can. But the things that need to be protected need to be protected in the appropriate way. So using the cloud for security, and using virtual private networks as needed, to make sure that you're able to protect those assets that need the most protection.
And then authentication and identity management. Have strong passwords. Use two-factor authentication, so that you can at least raise the bar, increase the security of the infrastructure writ large, change the landscape, and make it much more difficult for someone who'd want to get in.
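The two-factor authentication Plunkett recommends is most often rolled out with time-based one-time passwords (TOTP), the six-digit codes generated by authenticator apps. As a minimal sketch, assuming the open-source pyotp library (the account name and issuer below are hypothetical), enrollment and verification look roughly like this:

```python
# Minimal TOTP two-factor sketch using pyotp (any RFC 6238 library
# works the same way). All names below are hypothetical.
import pyotp

# Enrollment: generate a per-user secret once and store it server-side.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# The user loads the secret into an authenticator app, usually by
# scanning a QR code built from this provisioning URI.
print(totp.provisioning_uri(name="staffer@example-campaign.org",
                            issuer_name="Campaign HQ"))

# Login: after the password checks out, verify the current code.
# valid_window=1 tolerates one 30-second step of clock drift.
code = input("Code from your authenticator app: ")
print("verified" if totp.verify(code, valid_window=1) else "rejected")
```

The point of the second factor is that a phished password alone no longer gets an attacker in; the code changes every 30 seconds and never travels with the password.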
ERIC ROSENBACH: Okay, that’s great. Thank you. Now we’ve been talking a lot about kind of like the pure cyber security angle. But there are a lot of other things that have been going on over the last several years, and last year in particular, that are about more than just pure cyber security. And so you’ve been looking at some of the election processes themselves, and the things affiliated with that. What are some other things that you're concerned about, and you think we should keep an eye out for?
DEBBIE PLUNKETT: Well, you know, the elections are largely run by states. And different states have different procedures. Some are more attuned to security than others. I think, to some extent, that's an advantage from a security perspective, because you don't have one method that's used across all states. Of course the flipside is that, at some point, almost all of the state election data is networked, is put on some type of electronic device or electronic means. And that's something that's of concern, because once it gets there, you have the potential for alteration, for destruction, for anything that might change both the results, as well as, maybe more importantly, the perception of the results, in a way that would cause the populace to not trust the outcome. And I think that's probably the biggest concern that I can think of.
ERIC ROSENBACH: Yeah, right. So we've heard the political perspective, a little bit about the technical perspective, and like in the class that I teach on these issues, I say, you always have to think about policy, politics, technology, but also the threat. And so Molly and Clint, they really have been up close and personal to the bad guys in different ways. Molly, in particular, you know, she literally just got back from Estonia. And you can tell, she’s a pretty brave person, because when you're going out there, and you're essentially advising other people about how to protect themselves or mitigate the risk of Russian attacks, usually the Russians go right after you. So Molly probably can tell you some really good stories about that.
But Molly also, as I explained, writes for Politico. And earlier this year she wrote that hacking the US voting systems would be kind of complicated. I think there are different ways to think about that. But just like Debbie mentioned, sometimes complexity is the best form of security, when you have all kinds of different systems. And at the Department of Defense, at one point, we counted, we had 30,000 different sub-networks. Which, on the one hand, made me want to pull my hair out; on the other hand, made me feel good, because it's pretty hard to figure out. But you also wrote that it's far simpler than hacking voting machines just to hack people. What did you mean, Molly, when you said that?
MOLLY McKEW: You know, it's sort of a good transition into this, but it's exactly what Debbie was saying. And that's that the weak link is people, in any of these systems. And, you know, when you're thinking about sort of the hard security side, cyber, and hacking, and leaks, that's kind of one separate issue. But when you're talking about information, that's a very different space. And that's not really something you can cyber secure or find a great firewall for.
And I think in this last election, suddenly there was attention to the ways that information moves and finds a life of its own, no matter where it is generated from. And that information is a tremendous tool for chaos, which is very much the goal of the Kremlin when looking at its policy toward the United States, and the West writ large.
And when you have information as a tremendous tool for chaos moving across social media, which is an accelerated means for moving information like none of us have ever seen before, it gives information, really, a viral speed in terms of what it can do. And then you have information on social media backed by incredibly powerful data sets that are sculpted in very specific ways, using psychometric targeting. And then you have campaigns coming out, like the Trump campaign, bragging about how they have almost individually targeted content toward people. Whether or not they actually did that is a different story.
But it is possible to sculpt these different landscapes of information. And I think the way you heard the Trump campaign talking about it after the election, they very much said, “There were three different groups of Americans that we were targeting heavily, to keep them from voting. Black people, young women, other women.” There was another group, I can't remember. But they said it very specifically. They were running voter suppression campaigns to keep groups of people from voting that they believed were not going to be helpful to their ultimate voting outcome.
And the technology they're using is not something specific to them. It's developed by private sector companies. It's for sale. Anybody can buy it. When I'm in the Baltics, when I'm in countries that are looking for different technology tools to monitor, track, and fight Russian information warfare, most of what they're using is stuff they bought from guys, Canadians, American engineers, whatever. And I'm fine with the Lithuanian information ops guys having that. I want them to have that. I'm fine with the Estonians having it. I'm not fine with the Russians having it. I'm not fine with some guy sitting somewhere who bought it off the shelf having it.
And I think we're really behind on understanding the power that social media and algorithms have given to the spread of information in new ways, and how that affects us. The core of a lot of this is just marketing. There's no special sauce. You'll hear all these data companies talking about their magical algorithms or whatever. It's basically very simple psychology, marketing techniques. But convincing you of an idea is not that much harder than convincing you that you want a pair of shoes. And we're really far behind on understanding that.
And, if you look at what was happening during, and I'm sorry for the long answer, but Comey's sort of infamous spring testimony this year, which I'm sure many of you saw news on, he mentioned the beginning of summer 2015 as sort of the uptick, the period of time when you saw Russian information operations accelerating in the US information space.
Now what happened in the beginning of summer, 2015? There was this bizarre chemical factory hoax that the Russians sort of put on social media about—I think it was Alabama. But, you know, some random chemical factory in the south. They put out the story. There was some leak. There was some terrible thing that was going to happen. But it was a test to see how you can mobilize people, how you can create fear and panic. So it’s not just information, it’s not just convincing you about a thing. It’s to provoke specific response.
And if you look at that as sort of the starting point of a longer landscape, coming up through the election in the United States, as the beginning of a point of public opinion: in that time period, just on three key issues, the Republican base shifted 35 points in its views on free trade, 20 points in its views of Vladimir Putin, and about 30 points in its views of the ability of the media to act as a restraint on political leaders. And in the same period, there was no shift among Democrats.
So you have three issues that basically say a more isolated America, more in favor of Russia, leaning more toward authoritarian tendencies and leaders, on which there was a 20 to 40 point shift in US public opinion, within a specific political party, in a specific period of time. So when people try to come out and say, "Yeah, the Russians did stuff during the elections with information. And it happened to parallel all the stuff that some of the Trump campaign people were doing. But it had no impact on the outcome of the vote," that's absolutely a false proposition.
And I think that's where, when you're on the Hill talking to people, if you're talking to the Senate, to the Congress, there's still this hang-up, where there's like cyber, and then this other thing that no one wants to look at. And even if we cannot prove that our election infrastructure was physically hacked, which I think we don't really know, because no one wants to look at it, this hacking of the information space, which is about influencing how people think, happened. And we know it happened. Guys like Clint have documented it really well. And we have yet to discuss this in an articulate way, figure out what the response needs to be, or tell Americans about it in a clear sense. There's no star chamber handling this. There are people like me and Clint, and the private sector, who want to contribute to trying to find solutions. But we're not your government. And we need our government talking to you clearly about this, the same way some of the Baltic governments and others do. But we don't have a strategic center to this effort yet. And we're leaving our population vulnerable to attack by a foreign adversary. Sorry about that.
ERIC ROSENBACH: So there are a couple things. I wouldn't be a Kennedy School professor if I didn't explain the difference between causation and correlation. And so one of the things we're also, just like Molly said, really working hard on is trying to help people understand all the facts associated with this. And, like you said, have a candid conversation about it. So the fact that the Russians acted against the United States to try to influence the election, I think, is a pretty well-established fact. There are, I think, objective intelligence reports that say that.
Whether or not that's what tipped the scales is where, in my class, I would say, you have to decide whether that's correlation or cause, and almost always, showing causation is a very difficult thing. So that's just one thing to keep in mind, I think, because there are a lot of intervening variables, to use wonk speak, that may have impacted whether or not the election was ultimately influenced by that.
MOLLY McKEW: Of course.
ERIC ROSENBACH: What I think really is important to try to understand—and Clint, you’ve been working on this for a long time too, is the Russian mindset and what they're going after. And so you, like Molly, you’ve been studying the Russians and a lot of other bad guys, not just them, in terms of what they're trying to accomplish. Talk us through that part. Like ultimately, if it’s the Russians, it could very well be the North Koreans next time. Why are they doing this? What's their ultimate interest?
CLINT WATTS: Yeah, I think the biggest thing, I'll give an example. At the first Charlottesville night march protest that we saw, one of the first lines they chanted was, "Russia is our friend." Now I grew up in Missouri, and we played war in the cornfields. And that's how I ended up in the Army. And we wanted to kill the communists. I mean Rocky IV was all about this, right? Imagine, back in 1981, a group of men showing up at a protest and chanting, "Russia is our friend."
At the second Charlottesville protest, there was a guy with a Bashar al-Assad barrel bomb factory shirt, chanting that Syria, you know, has it right, those sorts of things. Russia's goal is two parts. One, it's a strategy of devolution, to break up the unity of all unions, wherever they're at. And that goes from the local level, I mean I think you've seen the Calexit and Texit stuff that's come out just the past week, with a lot more detailed analysis, up to the two primary adversaries for the Russians, which are NATO and the European Union. If they can break up those, that allows them to go one-on-one with any country. And then they have strength diplomatically, in terms of information, military, even economically with their gas and oil reserves.
Beyond that, it's to push their agenda, their foreign policy view. And they have already won. It is over, folks. They have won. In three years, the greatest influence campaign in the history of mankind has just been pulled off. They now have influence over a nationalist, not globalist, agenda that stretches from Russia through Germany through France through the United States. They have influence over audience segments that agree with them. They're anti-immigration; they're anti-refugee. They are pro-nationalism; they are anti-globalist. These are the themes that you saw during the campaign, whether you're a Democrat or a Republican; that was the targeted theme. Didn't matter if you were Marco Rubio on one side or Hillary Clinton on the other.
The other thing that we really need to understand is, it is anti-democracy through and through. We might have hated the Soviets back when I was out in the cornfields running around playing war. But at least they believed in something. They had an ideology they believed in. What does the Russian regime believe in? It is a kleptocracy. It is against human rights. It is a country that is now controlling information. WikiLeaks, which is essentially a proxy of the Russian government and talks about transparency, is promoting a country that has zero transparency. There will probably be 100 percent internet surveillance there by the end of the summer. The VPN ban is the new item on Putin's legislative agenda.
So I think what we need to understand is, we are in an era of national security in a world of audiences. It is not defined by the borders of our country. And we have lost that in the United States at this point. There are audience segments that don't believe in the system which we have in this country right now. They don't believe that elections are true. And they actually don't think everyone should have the right to vote. They also don't believe that we should have unity. And, if you look at just the public opinion polling, they are okay with democracy maybe being replaced with other things. They don't know what that is, but examination and evidence isn't always their strong suit.
But when you look at it, the things that they're advocating and believe in cut against what we fought the Cold War for, what we fought for as a country, what we were founded on. And so, if we no longer have our values, and I think this is our biggest problem over the longer term, especially in this digital era: what does the United States become if we're not united? Gerrymandering and this digital disinformation match up really, really well. We live in two countries right now. We have some districts where Republicans fight Republicans for seats, other districts where Democrats fight Democrats for seats.
ERIC ROSENBACH: Okay, just on cyber or infowars, we’re going to open up a whole ‘nother can of worms here.
CLINT WATTS: Please let me finish this. This is important. They look at cyber not in terms of hacks, but in terms of influence and information. It is one bubble. People come together, and they all share and use the same information sources. Hacking is for novices; influence is for masters. And the Russians have mastered that in the information space. And anyone with enough resources can master it as well. It was a two-part failure of our government going into this election. We didn't understand that hacking powered influence; that was what they were doing the hacking for. And we never thought it would happen to us. And it happened all across Europe. So I know that we're trying to segregate this out as cyber. But there is no difference when 80 percent of your news comes off of social media feeds anymore. We are in a bubble, two bubbles.
ERIC ROSENBACH: We’re going to talk a little bit more about why this is something bigger than just the US. But Molly had a two-finger just on that.
MOLLY McKEW: A follow-up to what Clint just said, which I think is a really important point about gerrymandering, essentially. I think right after the US election this year, there were all of these stories about how divided we've become: Democrats only watch these news sources, Republicans only watch those news sources. And there are all these great, you know, media charts of the Twitterverse, of the red vote and the blue vote, that are totally separate.
And I think that the core of that was, we as Americans have chosen this. And I think the piece that we’re missing, and not that people haven't been choosing this, is that this is also being done to us by the way that information is moving, by algorithms that show you what they think you want to see, by the targeting of information through data and other means. And I think that’s the piece that really needs to be discussed, is that this is not just people choosing to see only what they want to see, it is the way that the internet is now giving us information, is helping to create these divides.
ERIC ROSENBACH: Matt.
MATT RHOADES: I just want to add, you know, my boss, my former boss, in 2012, on the debate stage with Barack Obama, said that Russia was the greatest national security threat we faced. He was right then, people didn't necessarily agree then, but he was right. He's right now. That said, when you look at the election results, I'm going to have to respectfully disagree on a few points. One being, if you look back at the exit polling data, Hillary Clinton's unfaves were at 58 percent on election day, right. To put that in perspective, in the 2012 campaign, on election day, Barack Obama was at 44 percent and Mitt Romney was at 45. Fifty-eight percent is an amazing threshold.
That said, and I certainly agree with the point that Eric made, and I said it earlier, there were outside entities that tried to create chaos in our country. Mitt was right in 2012. But that's not why her unfaves got to 58 percent. You can go back and start to look at Hillary Clinton long before she hired this guy. She went on a book tour back in 2013, and you could see her unfaves slip and slip and slip. And so, to just point to an outside foreign entity to try to explain why Hillary Clinton lost, I just disagree with that.
On top of that, I was involved in the 2012 and the 2008 primary. And some of the populist issues that you're talking about, trust me, they were burning quite brightly and quite hot all the way back in 2007. And trust me, Mitt winning the Republican nomination in 2012 wasn’t as easy as I think some people thought it was. So I just want to put things in perspective and be bipartisan in how we talk about 2016. [laughter]
CLINT WATTS: Can I make one quick addition?
ERIC ROSENBACH: Yeah, of course. I'm going to go right back at you too, don’t worry. [laughter]
CLINT WATTS: I’ll just give this real quick. I don’t care about Donald Trump or Hillary Clinton, okay. I worked in the US government. So we worked over, under, and around politicians to get things done. So my point for you is, that if you believe in this country and its systems of democracy, we've got to get out of our bubbles. We don’t have debate right now, really. And this is why digital is important and why hacking is overrated. It’s about how information is maneuvered and used in terms of influence.
I was trained on this because I was working counterterrorism, and this is what we did. We would go over and do an assessment. I don't need a giant gonculator to influence you. I can do it with a laptop and this amazing product called Microsoft Excel. You might have heard of it. And I can do all the analysis right there. This doesn't take any sort of tricks. Once you get the core down, and I think this is the key about influence, the data that's available to do influence now gives you such a huge advantage that those with resources, those with a desire and a motivation, those without rule of law or any limitations on their intelligence services, can maneuver on any audience that they want. And that could be a corporation or a marketing sort of organization. That could be a political campaign.
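To make Watts's point concrete, the kind of audience analysis he describes really does fit in a spreadsheet. Here is a hedged sketch in Python, with pandas standing in for the Excel workflow he mentions; the data, account names, and column names are entirely hypothetical:

```python
# Sketch of spreadsheet-grade audience segmentation; all data and
# column names are hypothetical. pandas stands in for Excel here.
import pandas as pd

# A table of public posts: which account posted, on what theme, and
# how much engagement the post drew.
posts = pd.DataFrame({
    "account":    ["a1", "a2", "a1", "a3", "a2", "a3"],
    "theme":      ["immigration", "trade", "immigration",
                   "media-trust", "trade", "immigration"],
    "engagement": [120, 45, 310, 80, 150, 95],
})

# Pivot to see which themes resonate with which accounts; the
# high-engagement cells mark the audience segments an operator
# would focus on.
segments = posts.pivot_table(index="account", columns="theme",
                             values="engagement", aggfunc="sum",
                             fill_value=0)
print(segments)
print(segments.idxmax(axis=1))  # most responsive theme per account
```

Nothing here is beyond a pivot table, which is exactly the "no special sauce" argument: the advantage comes from the data, not the tooling.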
The playbook is out there. And so anywhere there's democracy right now, if I was an authoritarian regime going against a democracy, I would use this same system on whatever social media or information platform is out there. And so I guess I didn't want to get into the Republican/Democrat thing, because I just don't care a whole lot. My job ultimately is to go after things that I think threaten democracy, whether it be terrorists or, most recently, the Russians. And this is not that advanced. Anyone can target an audience with influence, very simply.
MOLLY McKEW: And to Matt's point, just really quickly, and I think this is really important: it doesn't matter how good the Russians are, how good the Chinese are, how good anybody is, they can't create this. They can make it worse. The places where Russian influence operations are the most successful are places where the divides already exist. In the Baltic States, for example, yes, there is a Russian-speaking population that is slightly separate from the locals. You can make that worse. Yes, there are people who hold populist views in the United States, or who believe in white supremacist views, or whatever else. You can inflame those, give them a bigger platform, make it worse. They're not very good at creating these divides, but they can really use them in effective ways. So they use what's here.
ERIC ROSENBACH: And to expand a little bit on the point that Clint made, I think we're not talking specifically just about hacking or info ops; it's that very potent mix of the two together, which is very typical of the hybrid warfare that the Russians use. You hack into a real target, get some real info, mix it with fake information, put the two together, and it's an even more potent mix.
So I'll take one issue with you, Clint: you said that the Russians won. You know, I'm an old Army guy like you. And I'm not sure I'm ever going to admit that the Russians really have beaten us in anything, and definitely not that they've won. And so I'm going to use your words against you a little bit, because you wrote earlier this year that Putin lost in France but still has a chance in Germany. So tell us about France, just real briefly. What did the French do that was different than the US government, in terms of reacting to it? And again, just to underscore the point that they've made, this is not just a US issue; this is an issue for democracies. And so we want to think about that, and that there are countries, not just the Russians, going after democracies other than the US too. But that case of France in particular is really interesting. And I know that you worked on that. So help us understand that a little bit and how it worked out better.
CLINT WATTS: Yeah, so it's two parts. It's cultural and it's structural. And so let's talk cultural first. It's much harder to compromise a Frenchman than an American. They just don't get upset, right. [laughter] People have mistresses, no big deal. Actually, if you don't have a mistress, it would be odd, you know, if you're running for office. You know, Le Pen actually shows up with Putin at events. She doesn't try to hide it. And so they are not as affected by compromise.
The other part of it is, they don't get their information digitally as much as Americans do. It's about half as much. Now that'll change over time. But they only absorb their news about half as much digitally, which makes it much harder to do influence. This is why active measures didn't work in the Cold War. You'd have to set up a propaganda newspaper and run agents and make payments and all those sorts of things.
ERIC ROSENBACH: Active measures. That’s the old school term for Russian and Soviet info ops.
CLINT WATTS: Yeah. So that part is much tougher to do in the European context. They also had the luxury of coming after our election rather than before. Every time you run the playbook, it becomes easier to defend against. But structurally, we do things in America very dumb compared to some other countries, in terms of our elections. One, theirs is a much shorter run-up. We now run elections every four years for four years. You know, we're already talking about 2020. So like he mentioned before, we could have a candidate stand up right now and get knocked down in 60 days by a hack. We provide a huge ramp for them to roll out their influence.
The other thing is the distance from the primaries to the general election. If you noticed, they had a runoff, with many candidates. So if you're Russia, if you take out your top adversary, most of his votes are going to do what? Fall to whoever the replacement is. And there's only a two-week timeframe. That's a very tough window in which to influence an electorate. And what the French also do is a media blackout, which is probably good for everyone's information diet in general, particularly in the US space. But you know, in the 48 hours before the vote, they don't take in that information.
They also used, you know, some tricks: fake email accounts, setting up dump sites, putting out false information mixed in with true information, which is all effective. But that is a drain on the political campaigns; it's taking up their time and their capital. Ultimately, though, they were very wise to what was going to happen, or what could potentially happen. And they also didn't get shook by it. You know, they stayed with traditional media outlets. They went with their newspapers. They went with friends and family. They have actual discussions, rather than virtual ones and shouting fests on Facebook with family members. You know, it's a different political scene. And I think it'll be similar to what we see in Germany, which is coming up in less than 10 days.
ROBBY MOOK: -- [00:45:05] [off microphone.]
ERIC ROSENBACH: [laughter] Okay. Robby had to throw another jab in there.
CLINT WATTS: I mean I should have mentioned that, I mean the fact that we only have eight states that decide an election makes it much easier for targeting.
ERIC ROSENBACH: Yeah. Another thing that the French did, related specifically to defending against information type attacks, is when they found out that they were under attack, they went public with it. So they said, “Look. Here is information that has been hacked. Here is what the Russians,” and they named them explicitly, “are trying to do. Here is what they say. Here is what's the truth.” And they then let people understand what was going on.
So I think one of the things, at least, to take out of this, is a sophistication and understanding what the Russians or any other bad guy is trying to do. And making hard decisions about public attribution of that is something that has a deterrent effect, and also mitigates a lot of the potential impact, too. And so that’s one of the things I think that you can take from the French election, is the Germans now, they're thinking a lot about this too, right. They have an election coming up pretty soon. And they are the target of some of this effort.
So now we're going to turn to Heather and talk about the private sector aspect of this. You've heard Clint and some others talk about the private sector role in this. Heather is not here to talk about all of the private sector, nor to defend all of it. But the role that the private sector plays in all this, I think, is really important. They build our infrastructure, election infrastructure; you know, they aggregate news; all of the things that you all know a lot about.
Heather, talk to us about the private sector perspective on this. And you can't speak for everyone, so it could just be for Google. What is Google, the private sector, thinking about this? What are some of the things you're doing in the aftermath of the recent elections, to either get better, or address some of the things that have come your way?
HEATHER ADKINS: Well let me say, first, that I think I would not be alone in the private sector to say that I cannot believe I am here talking about this topic. Most of us who got into this field did so through sciences and not through the sort of political or economic perspective. I think the reflection upon what we have built is therefore not as robust as we would like it to be. And I'm going to tie the comments that Molly made and the comments that Deborah made together.
In the physical space, you have your five senses that tell you when you're in danger, and when you should feel fear. You do not have a sixth sense to tell you that online. There is no digital sense, there is no digital fear. This manifests itself with cyber security issues and with info operations. And if you think about it, the way we have to solve it is very similar. We have to give people technology that allows them to sense what's going on and to make an educated decision about what to do.
Now I'm very, very lucky to work for Google. We have hundreds and hundreds of people who work on cyber security issues. And we have built for ourselves incredibly robust infrastructure. And one of the things that we started doing, as of March, was to give that to campaigns and to NGOs and to elections monitoring people. And we call this the Protect Your Election project. And these are simple tools. They are free. And they should help combat some of these problems, such as: how do you, as a small campaign without funds, and without experts, protect your email? Well, if you're using Gmail, we will help protect you. And we will give you two-factor authentication. And we will give you tools to protect you against phishing, such as Password Alert. And we will even protect the website on which you put your information with Project Shield, which prevents denial of service attacks. And we have used this effectively in places like Kenya and The Netherlands, to keep information online for people.
And I think the other aspect where we really need that digital sense is for information. And so recently, in April, we announced that we will start, to the best of our ability, to apply a fact-check label to a piece of information that appears to be political in nature. Because this gives people the opportunity to learn for themselves what the real facts of the situation might be. So the example we give is a claim that there are 27 million enslaved people in the world. And you might believe that. You might not. But if you put that search query into Google, we would try to also provide a place where you can do the fact checking for yourself. And that label should actually trigger your digital sense to think, "Wow, I wonder if I should check this fact. I think maybe I should. And so I'll try to find some extra information for myself."
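For context on the plumbing behind the fact-check label Adkins describes: the labels in search results are driven by fact-checking publishers embedding schema.org ClaimReview markup in their pages. A minimal sketch of that markup, generated from Python, with every name, URL, and rating value hypothetical:

```python
# Sketch of schema.org ClaimReview markup, the structure behind
# fact-check labels in search results. All names and URLs below
# are hypothetical.
import json

claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://factchecker.example/27-million-enslaved",
    "claimReviewed": "There are 27 million enslaved people in the world.",
    "author": {"@type": "Organization", "name": "Example Fact Checker"},
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 3,     # on a 1 (false) to 5 (true) scale
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "Partly true",
    },
}

# Publishers embed this JSON-LD on the page so crawlers can surface
# the verdict next to the claim.
print('<script type="application/ld+json">')
print(json.dumps(claim_review, indent=2))
print("</script>")
```

The design choice matters: the search engine does not adjudicate the claim itself; it surfaces a structured verdict published by an identified fact-checking organization.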
So I think that the human element, going back to what Deborah said, is incredibly important here, because we’re trying to protect, what, seven billion people on the planet. And we need to educate them. And sometimes the best time to do that is in the moment, when they're actually making that decision and that action for themselves.
ERIC ROSENBACH: This, I know, is a hard question. But we saw Facebook, I think just last week; they came out and essentially admitted that, in doing further analysis, they found that a lot of paid ads had been paid for by a foreign government, the Russians in particular. And a lot of the internet is set up, nowadays, so that firms that are built to make a profit should make a profit. But that can skew the way in which information flows. And so my question is this. How hard is it, when you're working at a firm like Google, or a Facebook, or a Twitter, to match up what you know is the right thing to do against what shareholders expect you to do, things that may cut against your commercial interests? How difficult a struggle is that for the senior people at Google or other places?
HEATHER ADKINS: Well, I would not care to speak on their behalf. But let me say that the only reason you probably use Google is because it’s reliable, it’s fast, and we provide you things that are relevant and interesting to you. So if we suddenly changed our ethos to not do that, you might not use our product. And thus, we wouldn’t make the kind of money we do. The open free web is incredibly important. And we believe that. And we try to preserve that, try to make that the basis of what we do. We are no stranger to fraud on ads. We have had actors in the cyber space try to use ads to deliver malware. We have people trying to make an economic gain off of ads through click fraud and spam.
So we are no strangers to this idea that you can mix revenue alongside malicious intent. And so we try, to the best of our ability, to use technology, coupled with people, to solve these problems. We are not 100 percent perfect. There are no silver bullets in this game. We are still very much learning what kinds of technologies and what kinds of strategies work here. But we think there are some very interesting ones, and we have commissioned studies. And we are trying hard to push on the forefront of this.
ERIC ROSENBACH: It's been, I think, interesting and kind of refreshing for me. Just like you can see Matt and Robby wanting to band together to do something about this, I think the folks from the tech sector really have kind of stepped up to this challenge, too, recognizing in some way that they also have a role to play for democracy, and doing something that is important. And it's not easy, you know. Just like it's not easy for Matt to stare down all the red-meat-eating Republicans who think it's crazy to even talk to Robby Mook, it's not easy for the tech sector to admit that they may have inadvertently had something to do with things that have happened in the past, too. So I think that's good. There's a long way for everyone to go. But that's important.
So this is the point at the forum in which you have now a lot of information. You have a lot of questions. And one of the best things about being in the forum is you have really smart people who can ask really interesting, if not sometimes crazy, questions to real experts. So here is the way we’ll go. You all can line up. You see there's one, two, three, four microphones. And remember, when we say, “Ask a question,” if you have a general point at first, no more than two sentences, followed by an interrogative, which ends with a question mark. That’s the general definition of a question. And then, what I would like in this case, just tell us who you are, and where you're from, so we have a sense of that. And then, specific question for one or two specific people, just so we can keep things orderly here.
So the person first up is this gentleman right here. Sir, go ahead.
ALEXI: Hi, good evening. My name is Alexi [00:54:50], graduate of Harvard Kennedy School, Russian. [laughter]
ERIC ROSENBACH: You know, I was going to go out on a limb.
ALEXI: So a troll, as you would call me; calling someone a troll is usually an attempt to deny them a voice, and usually you call people like me trolls or just bad guys, okay. I'll just be a bad guy tonight. [applause] So everything that we know happened during the 2016 elections is that there was an attempt at collusion in the Democratic Party, in order to sideline a popular but outside-the-system candidate, Bernie Sanders, by the friends and family of, like, the Clinton family. And everything that we know, we know because it was somehow revealed, because it was made public by someone.
So now I'm just surprised: why all this farce? It was a real issue, actually. This event undermined democracy. And this event undermined belief in democracy in the United States. This event, not an attempt to, like, punish someone who brought this bad news, be it the Russians or someone else, I don't know.
So basically, my question to you is: why do you set priorities this way? Why do you not care about, and not see, that your democracy is really under threat in this way? Why do you try to shift this attention to someone who brings the bad news?
ERIC ROSENBACH: Okay. Who do you think would be the best person to answer your question?
CLINT WATTS: I’ll take it.
ERIC ROSENBACH: Actually, since we're talking about shifting attention to the bad news: Matt, do you mind taking that one on? What do you think about that? Matt and Robby? Go ahead, Robby.
MATT RHOADES: You're a Bernie Sanders supporter?
ALEXI: I don’t care. [laughter]
MATT RHOADES: You don’t care?
ALEXI: I just want one thing—that you leave Russia alone. That’s all I want. [laughter]
MATT RHOADES: One thing, and this is the point—maybe I didn’t make it as eloquently as I should have—but it’s too simple to just pin the blame on Russia for the chaos that was created. Hillary Clinton’s unfavorables—and you can go back, and you can chart them—started to rise to astronomical levels, to where, by the time she got to run against Bernie Sanders in the primary, you would find in some Democratic primary states that a majority of Democrats found her to be dishonest and untrustworthy. That’s just a fact, all right. I have my personal opinions. I've shared some of them tonight. But I just, you know, we need to step back and have a real sense of what is real, and go back and look at Hillary’s numbers over time. And it started early, like I said, well before Robby was even named campaign manager.
ERIC ROSENBACH: Robby?
ROBBY MOOK: Yeah. Look, I disagree with the premise of your question, for two reasons. One is, the primary campaign—I think sometimes people don’t understand this. It’s run by Secretaries of State, except in caucus states, which Bernie Sanders won overwhelmingly. Those caucuses are run by the parties. And we only won two of them. We won Iowa and Nevada. We worked really hard to win those. We were really proud of winning them. But Bernie Sanders won overwhelmingly in every contest that was run by the parties. Everything else was run by Secretaries of State.
And second, one of the things that I was proudest of in that primary campaign was the work that I did with Bernie Sanders’ team. I voted for Bernie Sanders three times. I'm from Vermont. So I've actually voted for Bernie Sanders more than most people. And it was three separate elections, by the way. [laughter] [applause] And I was proud to do that. And I'm incredibly proud of the work that I did with Sanders’ team during the Convention in particular. It was really hard. And they're good people.
And so I think that there are people out there that are trying to, you know, make it seem like we were really far apart. We were actually really close together. I thought the platform that we created was really something to be proud of. And so for those two reasons, I just disagree with the premise of your question. I'm glad you asked it. But I think what matters now is the future, that we have a lot—you know, we’re going to have a lot of new candidates. And I'm going to be voting in that primary too. So we’ll see who I choose.
ERIC ROSENBACH: Thank you for your question. One other thing I want to say: I'm guilty of calling people bad guys all the time. I once backpacked through Russia for four months. Amazingly friendly, warm people, who had me stay in their houses. But between President Putin and some of the things he’s done, there's a strand of bad guys in Russia, just like there are in a lot of countries around the world.
ALEXI: I have been called, like, a troll every day, thanks to you.
ERIC ROSENBACH: No, well, I apologize to you then. I wouldn’t mean it about someone like you. We can talk after this, too. Okay. Yes, sir.
SAUL: Hi. I'm Saul Tannenbaum[?]. I'm a neighbor. And thank you for opening this to the public. And thank you for the really interesting discussion. I want to ask about a gap I heard in this talk. The focus started on practical things. And you're putting together a playbook for campaigns, you know, which is security 101, which, I mean, matters. But the whole other aspect that was talked about, information warfare, influence operations, etcetera, doesn’t seem, at least at this point, to have anything formed that’s anywhere close to practical. I mean, we heard certain suggestions, like Facebook should make every political ad that it serves up available to the public, so people know what's being done. I presume the same could be said for Google, etcetera. Are you folks thinking about the same practical problems in that sphere? And can you share at least preliminary ideas?
ERIC ROSENBACH: I definitely can't take credit for it, because it’s a project that Clint’s running that is not a Harvard thing. But he’s working on something great called the Hamilton Project, that deals with that. So I’d like him to talk about that part.
CLINT WATTS: Sure. I think there's probably two things with that. So thanks to a sponsor in this room, actually, we started this dashboard called Hamilton 68, for Alexander Hamilton, who in the 68th Federalist Paper noted that we are vulnerable as a nation to foreign meddling in our elections, essentially, based on the way we’re structured, and to that sort of coercion, whether it be political or financial, whatever it might be.
So we had been watching these influence campaigns over about a two-and-a-half-year period leading up to the election. And we had documented that. But we were trying to figure out, how do you inform the public of it? And it was also a neat tool for us to sort of understand what the Russian position is. And if we get a chance, maybe we’ll go back to that point about why I'm particularly upset about the Russians.
But the idea is, if you want to understand what their influence narrative is, you start with what they're actually putting out. So our dashboard actually shows, first, the state-sponsored propaganda that is put out by Russian news outlets. And then the second part is these personas that we had watched for many years, that we had assessed tend to routinely promote or amplify it. And a lot of times, those are the social bots that you see.
And the part about social bots that’s important is, they allow you to amplify your message to such an extent that you can change the way the media environment operates. And you cause mainstream media to actually react to the story. So it gives you an outsized influence. And I reference artillery because I came out of the Army world: it allows you to take information and shoot it like an artillery barrage, in a very timely manner, to game social media systems.
So the first part was awareness. And with our dashboard, what we tried to show is, there's two parts to influence. The first thing you do is you infiltrate the audience. And you do that by mimicking them. You use their own organic content to amplify divisions inside that electorate: social issues, religious issues, financial issues, whatever it might be. Whatever divides people up and gets them fighting amongst each other. So you amplify that. Then, once you have that audience paying attention to you, that’s when you start to influence. And that’s really what we watched over a couple-year period. So that’s our first thing. And we’re working on other versions of the dashboard.
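To make the mechanics concrete, here is a minimal sketch of the kind of aggregation such a dashboard might perform. This is purely illustrative and not Hamilton 68's actual methodology: the monitored persona list, the post format, and the use of shared links and hashtags as the amplification signal are all assumptions invented for this example.

```python
# Toy sketch of influence-dashboard aggregation. Illustrative only:
# the persona list, post format, and "top links/hashtags" signal are
# assumptions, not the actual Hamilton 68 methodology.
from collections import Counter
from datetime import datetime, timedelta

MONITORED = {"persona_a", "persona_b", "persona_c"}  # hypothetical personas

# Hypothetical pre-collected posts: (account, timestamp, text).
posts = [
    ("persona_a", datetime(2017, 10, 16, 9, 0),
     "Read this #Divide http://example.com/story1"),
    ("persona_b", datetime(2017, 10, 16, 9, 5),
     "Unbelievable http://example.com/story1 #Divide"),
    ("persona_c", datetime(2017, 10, 16, 9, 7),
     "#Divide everywhere http://example.com/story2"),
]

def trending(posts, now, window_hours=48):
    """Count links and hashtags pushed by monitored accounts in the window."""
    cutoff = now - timedelta(hours=window_hours)
    links, tags = Counter(), Counter()
    for account, ts, text in posts:
        if account not in MONITORED or ts < cutoff:
            continue
        for token in text.split():
            if token.startswith("http"):
                links[token] += 1
            elif token.startswith("#"):
                tags[token] += 1
    return links.most_common(10), tags.most_common(10)

top_links, top_tags = trending(posts, now=datetime(2017, 10, 17))
print("Most-amplified links:", top_links)
print("Most-amplified hashtags:", top_tags)
```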
The second part, which Google is already kind of doing, but which I think needs to go further and be more expansive, is nutrition labels for information. Essentially: do you know what you're consuming? Because if you do, you’ve got no one to blame but yourself. And it’s not about whether it’s propaganda or opinion or anything. It’s about how much fact is in the information being put out. You rate those outlets, like Consumer Reports did for products, over 17 or 18 variables. You rate their information over a month. We could call it mainstream media sweeps. And you then get an icon on your Facebook feed, Twitter feed, whatever it might be, when that news article comes up. And it says, “This has been rated by Information Consumer Reports, version number three: this much true, this much false.” If you want this widget, you can keep it on. We’re not making you have it. You can opt in or opt out.
And it becomes like Snopes, essentially, across all social media platforms, where if you choose to read junk information, and you get mind-fat, then you’ve got no one to blame but yourself. You're not suppressing free speech. Anybody can continue to write. And you're not, you know, suppressing freedom of the press.
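As a rough sketch of how such a "nutrition label" might be computed: rate an outlet on a set of fact-oriented variables over a month, combine them into a weighted score, and map the score to the icon a feed widget would show. The variables, weights, and thresholds below are invented for illustration; Watts mentions 17 or 18 variables without enumerating them.

```python
# Minimal sketch of the "nutrition label for information" idea.
# Variables, weights, and thresholds are invented for illustration.

RATING_WEIGHTS = {              # hypothetical fact-oriented variables
    "claims_verified": 0.4,     # share of checked claims that held up
    "sources_cited": 0.2,       # share of stories citing primary sources
    "corrections_issued": 0.2,  # corrects errors when found
    "headline_accuracy": 0.2,   # headline matches the story body
}

def outlet_score(monthly_ratings: dict) -> float:
    """Weighted average of per-variable ratings (each in 0.0-1.0)."""
    return sum(RATING_WEIGHTS[k] * v for k, v in monthly_ratings.items())

def label(score: float) -> str:
    """Map a score to the icon a feed widget might show, opt-in only."""
    if score >= 0.8:
        return "mostly factual"
    if score >= 0.5:
        return "mixed"
    return "low factual content"

# Example: one outlet's ratings for the month (the "media sweeps" period).
ratings = {"claims_verified": 0.9, "sources_cited": 0.7,
           "corrections_issued": 0.8, "headline_accuracy": 0.6}
s = outlet_score(ratings)
print(f"score={s:.2f} -> label: {label(s)}")
```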
And I think that’s the other part that’s there: we need to empower citizens to make their own choices about information. And that’s where my problem comes in with Russia. They hacked into thousands of Americans' accounts. I have been targeted by a foreign government. They have stolen people’s information. They have shaped the information environment. You don’t know what was stolen, and you don’t know how many Americans, many of whom are in the US government right now, people I know personally, have been hacked by a foreign adversary. They hit a NATO commander. They stole private secrets.
Let me give you a parallel example. What if someone had broken into the house of the Chairman of the Joint Chiefs, Colin Powell, stolen files out of it, taken them to a newspaper, and published them during the Cold War? We would be talking about armed conflict. But because it happened in cyber, we say, “Oh, no big deal that you violated our privacy, that you attacked our military service members with malware implanted and embedded.” That is my issue. It’s not with the Russian people. They have their version of democracy as well. But we’re talking about crimes committed against Americans. We’re talking about the foundations of our country and what we supposedly believe in.
The revelation about Bernie Sanders is that people found out how the party system actually works. Another way we could help Americans out is to give them a civics education, because they don’t know how a bill becomes a law. [applause] They don’t understand superdelegates. You know, they don’t understand these things. And so we can help provide them an education. My problem is with the Russian government, and I'm going to address this point, even if you try and cut me off—
ERIC ROSENBACH: --okay. Just remember, we’re here for everyone else. We just want to make sure we can get lots of questions in.
CLINT WATTS: Right, right. But this is my issue. My issue is with the Putin regime. And essentially, they not only launched an information attack, it’s a form of warfare. It is winning through the force of politics rather than the politics of force. And so we need to understand that it is a threat to our Democratic governance. That’s what my issue is with it.
ERIC ROSENBACH: And thank God we’ve got a School of Government here in Cambridge, Mass. that’s about more than just physics; it’s about information warfare and shaping the next generation of leaders who take all this on, too. Okay. Sir, I'm so sorry. Normally I’d give you a follow-up, but there are a whole bunch of people waiting. Yes, sir. Up here in the green. Go ahead.
LEWIS: Hi. Thank you each for coming and spending your time with us today. My name is Lewis. I'm a Cambridge resident and MIT alumnus. My question is similar to the previous gentleman’s. It’s about legislation about information dissemination. Services like Google and Facebook count their users in the billions. And as nice as programs like that Hamilton dashboard, or other ways to rate the truthfulness of your news, are, consider the headlines that we saw last week about Facebook selling hundreds of thousands of dollars of ads to foreign adversaries, or other troubling headlines earlier this summer about how, searching for “Is the Holocaust,” Google’s autocomplete would suggest “real,” and several of the top results were about the Holocaust not being real. All of this is happening way faster than any one person or entity can regulate. How does our government start to create legislation or regulation for these American businesses that are propagating this around the world?
ERIC ROSENBACH: Okay. Who do you think is the best to answer that?
LEWIS: More folks either making the policy or the folks representing the private sector. No one specific.
ERIC ROSENBACH: Oh, okay. Heather. Real quickly, one sentence, the question itself.
LEWIS: How do you regulate misinformation in the billions?
HEATHER ADKINS: Well, I think it’s an interesting question. I think if we have trouble classifying what the problem is, it’s going to be very difficult to regulate it. As you can see, we have some difficulty even classifying the problem. And I find, and we see this in the cyber security space, that it’s really difficult to regulate something you can't describe and don’t know the solution to.
And so I think maybe it’s a bit early to talk about regulation, if we ever want to talk about it at all. If you don’t trust us as an information source, we will cease to be a useful information source to you. And I think that that is actually a better motivator for us to find technology and people solutions that solve this problem, versus regulation when we don’t have any solutions yet.
Imagine the time when we first had cars, the early 1900s. Had we regulated car safety before we had invented the seatbelt or the airbag or the roll bar, imagine what that car might have looked like. It might not have actually had four wheels. It might have had none. And you might have had to use a horse instead. So I think we have to think about these things very thoroughly before we start talking about regulation.
ERIC ROSENBACH: Robby, real quick. What do you think are the odds of legislation trying to regulate information? What are the odds of that passing on the Hill?
ROBBY MOOK: Yeah, no. It’s interesting. As you were asking the question, I was sort of thinking this through. I think they're very low. But I also think, almost to your point, I think the answer can't necessarily come from putting a guard in place to stop the flow. I mean I think information, whether it’s true or not, it’s like water. It’s just going to find a way out. And so to some extent, I think this is cultural. And we’re going to need to figure out a way to educate people.
It’s funny. When I was in college, actually, I was told never to cite the internet. Never. [laughter] And I wonder if potentially we’re going to go back, that pendulum is going to swing back the other way again, where people are going to be much more careful. I'm not saying that’s the only answer, and I'm not saying just sit back and wait and let the culture adapt. But I just don’t know that we will get a 100 percent solution by trying to stop information; it finds a way out. I think somehow we have to figure out, as a culture, how to make ourselves more resilient as well.
ERIC ROSENBACH: Go ahead, Molly.
MOLLY McKEW: I'm just going to plug myself a little bit. On Thursday morning, I’ll be testifying before the US Helsinki Commission on these issues, on what we do about Russian information warfare, at 9:30. It’ll be in the Senate, but you can watch it online. Because it is this complicated. It’s like four parts: private sector, government, civil society, and citizens. And everybody has a role to play in what comes next. And it is fluid and adaptive. But we need to figure out how to put those pieces together much faster than waiting to see if the market will bear a better solution. So yeah.
ERIC ROSENBACH: Okay, thank you. Yes, sir, right here. Just real quick intro, and who you are. One sentence or two, and then question.
JONATHAN: Hi. My name is Jonathan Avery. I'm a 1L at Harvard Law School, coming from Washington, D.C., where the last year or so has been especially disappointing. I am addressing my question primarily to Molly McKew and Heather Adkins, but I’d love a response from anyone on the panel. So earlier, one of you made the comment that it would be no harder for me to sell you an idea than it would be to convince you to buy a new pair of shoes. And that made me think of a wealth of recent research in psychology and neuroscience suggesting, on one hand, that people’s political views are fairly pliable, and that presenting them information in ways that are conducive to emotions involving anger, disgust, and fear can seriously, if temporarily, alter their political inclinations. On the other hand, there seems to be a lot of information suggesting that people are often intransigent, and that it can be really difficult to change or even understand their political intuitions, which often tie to basic communal or identity-based motivations.
And the problem there seems to be that a lot of how information affects people’s political intuitions could be out of the control of the people who are trying to structure the flow of information or, in certain ways, its presentation. For example, rating the quality of news online through a platform like Google would probably be useful for many people. But I suspect there is a considerable part of the population for whom that would just confirm their antiestablishment intuitions.
So how much of the broader underlying psychological context that this information plays into, do you think can be affected by the information industry? How much of it might come down to surrounding social, cultural, educational structure?
ERIC ROSENBACH: Okay, Molly, why don’t you go ahead first. Then we’ll go to Heather.
MOLLY McKEW: It’s both. And it’s all of those things. There is an emotional component, and there is sort of the confirmation bias component. And you can use them for different things. The easiest thing is to radicalize people within their own set of views, to reinforce everything you believe continually, until you are so cut off from alternative views, like the red and blue bubbles on Twitter, that they just never cross over. You never see anything on the other side.
The emotional factor is where you can change views. And I was working on this project in the Baltic States on Russian-language media. We were doing some analysis of Russian state media propaganda targeting Baltic Russians, versus the locally generated Russian-language content. And if you put the two stories in front of people with no labels (I think my favorite focus group was a group of college- and graduate-student-level kids working in journalism and media, really well informed, really smart), they would look at them, and they would say, “Look, I know this is the propaganda story. But I like it. It’s more emotional. It’s really compelling. I want to read it to the end. This other thing, it’s all balanced. I don’t care. I read the first paragraph, and then I flip my phone to the next thing.”
And you can use both of those things in different ways. That’s where the emotionalism comes in. There are sort of two pieces of really good propaganda: the narrative, which is why things are the way they are, sort of answering the big questions of the world, and the storytelling, which is the specific vehicle for how you get people to believe the narrative. And that’s where emotion is really important, for the longer-form efforts. But confirmation bias is much easier to use.
HEATHER ADKINS: We actually, as part of our looking at how to solve this problem, commissioned a study. And they looked at about 14,000 people. And unfortunately, I don’t remember the reference off the top of my head, but it was published earlier this year. In particular, we wanted to understand if people really live in a bubble online. And there is some data to suggest that, while there may be a bubble, people also do seek information outside it.
So I think having the platforms be open and free, and able to surface a variety of views, is very important. I think when you look at platforms like Twitter, where there is a voting component to it, you're already seeing some of that. I'm a Twitter user. And I follow both Hillary Clinton and Donald Trump. And I look at both sides. And I think, actually, a lot more people do that than we realize, or admit to.
And so I think it’s important to recognize that the studies you pointed out are very important. It’s also important to realize that it might not be the majority of people. It might be a microcosm whose workings we’re still learning to understand. I think your idea around creating these sorts of platforms that allow us to vote for our favorite news, or news that we think is important, is good. One of the changes we made to the auto-suggest feature in Google, actually, is to make it really easy to give us that feedback, so that we can make those adjustments in real time.
ERIC ROSENBACH: All right, thank you. Yes, ma'am, go ahead please.
SARAH: Hi. My name is Sarah Angel. And I'm a senior at the College. I'm from Arlington, Virginia, originally. And my question goes to something Mr. Watts said earlier about limitations on American intelligence services that foreign intelligence services may not have. I was wondering where those limitations come from: whether it’s from US domestic policy, international agreements, or some kind of moral line that we draw for ourselves, where we will only go so far in spying on or seeing what other countries are doing, in what kind of information we collect, and where we stop ourselves when we think that we’re going too far. And I'm just wondering where that limitation comes from, and how that separates the US from foreign intelligence services.
ERIC ROSENBACH: Okay, Clint, you want to take that one? I can explain also.
CLINT WATTS: Two things. One is the law. And those laws come from times when the US government has overstepped, generally: the Church Committee, you know, whatever it might be. Those provisions are in there because we’re not comfortable with it. And part of that is influence in the US audience space. This has always been a challenge, and one of the reasons why the US sucks at influence is that we are always very cautious not to be influencing our domestic audiences. In a digital world, it’s very hard to know where the domestic audience ends and the international audience begins.
I think the second sort of component of it, and why we don’t do it, is the New York Times test, which is: who will stand up, you know, when these programs are put in place, and defend them? And we have a good system for that. And I don’t mean to be facetious with it, but you know, ultimately, inside government agencies, there is always a great idea: “Hey, you know what we could do?” And then somebody goes, “I'm not signing up for that.” Because we know that isn't in line with our values, ultimately.
I have never seen it, and I pray to God our country never goes and hacks 10,000 people’s accounts overseas and dumps all their personal information out on the internet. If we ever did something like that, I think we as a country would hate ourselves for it. And we've seen that. I mean, even with the Edward Snowden debate, depending on what side you're on, we’re very uncomfortable with it.
So I think it’s two parts. You know, one is legislation we've already done. But the other part is: do we want to be signing up for violating people’s privacy, or for destructive malware attacks? And we’re not comfortable with that. And I think it’s a good thing. We got rid of the US Information Agency after the Cold War, and we don’t really have it anymore. That’s part of the reason why we’re so vulnerable to this, and why we missed it. But, at the same point, I hope we don’t try to follow the pattern of our adversaries. The US government is slow to respond on this, and the question is whether they should be the ones to counter this sort of disinformation this time.
ERIC ROSENBACH: So two quick points. First, in the most recent defense authorization bill, there was a provision that granted new authority, and several hundred million dollars, for the United States to do more active information-type campaigns, which was a modification of existing law. Because I think there's a recognition of this.
Second of all, you know, when I was in the Department of Defense working on a lot of cyber and info ops things, you do not want the Department of Defense doing covert action-type things that you know may end up in the New York Times. And so by law, what we had to do, even when trying to counter terrorist propaganda, is disclose that it was the Department of Defense that was providing you this information, which, on its face, immediately means it’s going to be discounted by anyone you're trying to influence, because you had to explicitly say that. So there are ways in the law, essentially under the covert action provisions for the CIA, that you can do it. But, like Clint said, this is a very clumsy process. We don’t do it very often. And there's a general level of risk aversion. So thank you. That was a great question.
Okay. Last question, then we’re going to head out. Was there a last one over here? No? Okay. Yes, ma'am. Thank you.
SOPHIA: Hi. My name is Sophia. I'm a student here at HKS. My question is on the benefit of public engagement in such discussions of information warfare and public digital security. I think it was Molly who mentioned that you see a 20 to 30 percent opinion shift on political issues when there is a malicious rumor being spread. And I wonder if you observe a reversal of that trend if people are provided with some sort of information, or engage in such discussions? Thank you.
MOLLY McKEW: On the counter side of this, there's actually not good data. You were mentioning some of these programs that are being run. A lot of it is contracted out, so the Defense Department doesn’t have to put its name on it. And there's a $500 million CENTCOM project, for example, to counter violent extremist narratives online, which, as far as I can tell, has been flushing money down the toilet, and cannot prove it has kept one Jihadi from being recruited in the X amount of time it has run, whatever crap contractor has been running that project. It’s been written about in numerous places, and I think it’s worth taking a look at.
But I think the problem with this is, because of some of the psychological issues here and other things, it is comparatively easy to use social media and online media to harden beliefs and radicalize people, however you want to define that: for action, for political views, for love of shoes, whatever it might be. But it is far, far more difficult to deradicalize people with the same tools. It requires a much deeper understanding of the purpose of information, of how people absorb it, of how it affects their emotions and their sense of identity, and everything else. And I have yet to see a secret-sauce version of deradicalization. And we really need to focus on that much more intently.
ERIC ROSENBACH: Okay. So that brings us to the end of the evening. I was always taught that at the end of any class where you're trying to help people learn, you try to summarize a couple of key points. So here are three things I think you can take away from all of this, and the good questions.
First of all, it’s the information age. It’s not just the United States; all democracies around the world are struggling with how people synthesize information and how it influences important things. And it’s extremely important for democracies around the world to figure out how you get to a state of ease with that, and with how much information can influence things.
Second of all, I think you see from a lot of different examples that trust is a really important part of a democracy. And when I say a bad guy, I mean, to my Russian friend, the Russian government, the FSB, the GRU, and those who are actively trying to undermine our democracy, not the Russian people. If they're able to erode trust in government, or a diplomatic—or excuse me, a democratic process, that’s really important. We ourselves, as a country, can do something to rebolster that trust, and you see that when you have guys like Robby and Matt up here together, working on it.
Third, something that’s a little bit broader, that we didn’t talk about much: I think it’s really important that we as a country, and other democracies, send a signal that you can't do this to the United States, and that we push back with a visible response, that there's some form of deterrence that prevents, or at least dissuades, half of the bad guys in the world from going after our democracy. And there are different ways that you do that. Some of them are defensive; some of them are about resilience, showing that the country itself is too strong, that you can't impact it. And there are some things that we didn’t talk about here that are a little more aggressive and forward-leaning.
So those are three points I’ll leave you with. First of all, thank you all for coming. More importantly, thank you to all of our guests for being here and enduring all this. [applause] Okay. Thank you all. Now you may exit stage left.
END