Photo caption: House Speaker Nancy Pelosi of California tears her copy of President Donald Trump's State of the Union address after he delivered it to a joint session of Congress.
Last week Speaker Nancy Pelosi famously ripped up her copy of the President’s State of the Union address on camera after he finished delivering it.
Later, the President retweeted (and pinned) a video based on that moment. The video had been edited to make it appear as if the Speaker had been ripping up pages throughout the speech, reacting contemptuously to each American credited by name, such as Tuskegee Airman Charles McGee.
As a starting point for thinking about this, it helps to know that the video isn’t legally actionable. It’s political expression that could be said to rearrange the video’s sequence in order to make a point: that ripping up the speech at the end was, in effect, ripping up every topic the speech had covered.
And to show it in a video conveys a message far more powerful than just saying it — something First Amendment values protect and celebrate, at least if people aren’t mistakenly thinking it is real.
So a first question is whether sites should even consider taking action against content that is otherwise legal. I believe they should, and they clearly do too — for example, their terms of service prohibit types of nudity that the First Amendment protects.
But Facebook’s and Twitter’s current policies don’t, and shouldn’t, result in a takedown of the video here, even as it’s important for social media sites that have massive reach to make and enforce policies concerning manipulated content, rather than abdicating all responsibility. The platforms are clearly still figuring it out — each has recently updated its policies with the 2020 election no doubt in mind. (Twitter’s can be found here, and Facebook’s here. The Facebook policy is explained further in this blog entry.)
Facebook’s policy is narrowly drawn around manipulations that make their target appear to say words he or she never said, so it wouldn’t apply here, since the Speaker isn’t shown saying anything at all. Maybe Facebook’s policy should be broader, but even if it’s changed to cover actions as well as words, there should still reasonably remain, as there are now, certain exceptions for expression that isn’t real but makes a point.
Twitter’s policy is different from Facebook’s — it concerns media that have been “deceptively altered or fabricated” — but for something to be removed, it has to result in “threats to physical safety or other serious harm.”
Importantly, both Facebook and Twitter leave open the possibility of labeling videos like these as manipulated rather than removing them, and that might rightly apply here. Even something that to most people clearly appears to be satire or point-making, rather than offered to deceive, can be taken seriously by others — Poe’s Law — and it might be helpful to label accordingly, so long as that is done consistently.
So while it shouldn’t be a free-for-all, taking down a video like this, one that most people would read as taking creative license rather than as outright deception, would cause at least as many problems as it solves, so it shouldn’t be taken down. But it wouldn’t be nearly as intrusive to label it as manipulated, since it is. In any case, it’s entirely possible that for the Speaker’s purposes, a public debate around the President’s actions here is as useful as any action by the platforms.
One thing that gives me pause: Claire Wardle, Danielle Citron, and others have pointed out that video is especially visceral and compelling. The rise of social media and the ability to make deep fakes are still in their infancy. We don’t know what they’re doing to us, or to society. We need to.
Meanwhile, there remain clear instances of outright disinformation that shouldn’t be left to stand unchallenged. Today, for example, the President retweeted a bald, unsupported claim about other politicians’ sons doing business in Ukraine.
The writeup from @PolitiFact persuasively argues that the claim, broadcast to 72 million followers, is likely false and reckless; if so, it would probably also be, in context, defamatory.
That’s a real problem, and one where the platforms offering the outsized megaphones have a role to play. Claire Wardle and others are developing expertise around fact-checking and labeling that don’t inadvertently reinforce the falsehoods they debunk. And the choices aren’t just delete-or-not and label-or-not, but also promote-or-demote — something platforms already naturally decide. So there’s no “neutral” position here.
Finally: Apart from what the “right” policy should be across all political discourse, there’s the question of who should enforce it. It is dicey for the platforms to be the sole judges and enforcers of policy as if it were simply a customer service issue. That doesn’t match the gravity of the decisions called for here, and it makes the platforms too powerful. Facebook has turned to third-party fact-checking with mixed results so far, and I’ve called for users themselves, perhaps even high school students as part of their graded schoolwork, to do the policy review.
Ultimately these should be decisions undertaken reflectively by exactly the kinds of people to whom the content is targeted.