Q&A with Peter Singer

The Blade Runner Rule.

How would you define resilience when it comes to cyberwarfare, particularly disinformation campaigns that try to influence how we see the world?

Resilience is what the great strategist of online warfare Taylor Swift described as the ability to “shake it off.” More seriously, resilience is the ability to weather the storm, either by making attacks less effective or by recovering so quickly as to nullify their goal. That is what makes it so crucial; it creates what is more technically known as “deterrence by denial.” We think of deterrence as coming from Cold War-style retaliation, a fear of punishment, but it can also come from the attacker deciding not to attack because they don’t think it will work.

This holds true whether you are talking about a traditional cyber threat that involves the hacking of a network or what we call “LikeWar,” the hacking of people on the network. Both have such low barriers to entry that we’re not going to be able to deter them all through a fear of punishment alone. We’ve got to build a more resilient ecosystem to protect against these harms.

You’ve said that when it comes to fighting cyber offensives, “unfortunately, the 2016 U.S. election is the example every other nation looks to for what they don’t want to have happen. We’re the worst-case scenario.” What would have been the right and appropriate response by the federal government to foreign interference in the last presidential election?

About a year after the 2016 election, I testified before Congress and outlined 30 nonpartisan actions we could take to make the United States more secure in cyberspace. Sadly, many of the representatives were unwilling to look me in the eye as I testified about what was, essentially, an attack on American democracy. And that is issue number one: getting past our sense of denial, which only aids and abets an enemy. After we do that, there is a wide array of actions that can be taken, too many to list here, but they break down into the roles of government, the private sector, and the individual. Each has a role to play, just as they would on issues ranging from public health to natural disaster preparedness.

For the government, a key starting point is the creation of a strategy that cuts across government agencies and addresses multiple threats. Since 2016, there have been varied actions, but there hasn’t been any unifying vision or plan. For example, in the fall of 2018 we had an improved strategy and action taken by NSA/Cyber Command against Russian disinformation, which was missing in 2016. But neither the State Department nor the Department of Education has the same level of preparedness. Nor can we frame this issue as Russia-related only.

What do today’s 2020 presidential candidates need to understand about information cyberwarfare in order to protect, as much as possible, against its negative effects?

They can and will be targeted. The key is to have a plan beforehand to defend themselves and their campaigns. There are certain best practices that should be implemented; indeed, many of them are already used by their peers in the 30-plus national elections in other countries that have been attacked since the 2016 U.S. election. A key issue is not simply to feed the fire of disinformation, but to figure out when to respond and when not to (so as not to aid its virality). Then, when they do respond, the idea is to put the attacker on the back foot by making it not about a simple fact-check (which won’t work), but about exposing the attacker’s networks and their history. Make them the story, not you.

You have expressed concern about the rising use of AI by all parties, explaining, “there are those using AI to blur [the] lines between what’s real and what’s not, and then, in turn, the tech companies are all developing AI tools to help police content.” How can society continue to develop this technology while reining in its dangerous potential uses?

What you’re referencing here is the use of AI to manipulate imagery and the like, popularly known as “Deep Fakes.” This will be used for entertainment, but it will also be weaponized. I don’t think we can or should ban it, due to the First Amendment and the potential for good it does have, but I do think we can better prepare for it, from developing better tech and policy to rapidly identify and attribute its creation, to implementing what I jokingly call “The Blade Runner Rule.” In other words, you may have a right to create something that is fake, but then I have the right to know when I’m interacting with something artificial. Think of this as the Deep Fake version of the little blue check-mark on Twitter that is used to verify accounts like “@realdonaldtrump.”

Successful businesses like Amazon, Nike, and Toyota have been drawn into information wars. Amazon has responded by paying a dozen or so employees to defend their employer on Twitter after a string of reports detailing bad working conditions at the company’s fulfillment centers. Is this the lesson that companies should take away from cyberattacks: Embrace the tactics of the information war that surrounds them?

Like it or not, we will see more of these “LikeWar”-style tactics used by companies, activists, and others. It’s just part of being online now. Going back to the question of verification, though, it is a risky move for companies to engage in ways that are deliberately deceptive. You can win at LikeWar without being a liar. The key is to understand that it is the virality—not the truth in and of itself—that matters most online.

How can the average person trust that the information they’re consuming online is accurate and truthful? After all, you’ve detailed that “on Twitter, some 80 percent of the accounts that spread the most misinformation during the 2016 election are still online today, pushing upward of a million tweets a day.” Is there any verification process that we should run articles, facts, and news through?

You shouldn’t just trust it. You have to be an active consumer of information, as you are constantly surrounded by attempts to shape your views and your clicks and shares, which in turn drive the views of your friends and family.

This raises the demand not only for providing the tools of digital literacy, an area where the United States is far behind democracies like the Baltics, but also for building the personal and societal ethic behind hygiene. I teach my kids (and they are taught in government-funded schools): “Cover your mouth when you cough.” This is not because it protects them, but because they have a responsibility to protect everyone else they connect with. It is the same online. And yet, when it comes to Internet toxicity in all its forms (hate, conspiracy theories, disinformation, etc.), we are comfortable with an environment in which people are unapologetically doing the equivalent of coughing in our faces. We have to change that if we want both a healthy Internet and a healthy democracy.

Peter Singer

Peter Singer is a strategist at New America. He has been named by the Smithsonian as one of the nation’s 100 leading innovators, by Defense News as one of the 100 most influential people in defense issues, by Foreign Policy to its Top 100 Global Thinkers list, and as an official “Mad Scientist” for the U.S. Army’s Training and Doctrine Command. His award-winning books include Corporate Warriors: The Rise of the Privatized Military Industry; Children at War; Wired for War: The Robotics Revolution and Conflict in the 21st Century; Cybersecurity and Cyberwar: What Everyone Needs to Know; and Ghost Fleet: A Novel of the Next World War, a techno-thriller crossed with nonfiction research that has been endorsed by people ranging from the Chairman of the Joint Chiefs to the co-inventor of the Internet to the writer of HBO’s Game of Thrones. His latest book is LikeWar (HMH, 2018), which explores how social media has changed war and politics, and how war and politics have changed social media. It was named an Amazon book of the year and a New York Times “new and notable” selection, and Booklist wrote of it: “LikeWar should be required reading for everyone living in a democracy and all who aspire to.”

His past work includes serving at the Office of the Secretary of Defense and at Harvard University, as an editor at Popular Science magazine, and as the founding director of the Center for 21st Century Security and Intelligence at Brookings, where he was the youngest person named senior fellow in its 100-year history.
