Analysis & Opinions

Russian digital attacks pose a threat to democratic elections

| July 25, 2018

Authors: Johan Sigholm and Gabriel Cederberg


The recently disclosed indictment of 12 Russian intelligence officers gives us a fascinating insight into the complex and coordinated cyberattacks that the Russian state is capable of executing. The indictment reveals how hackers within the Russian military intelligence service GRU, in a well-organized and methodical manner, conducted cyberattacks against specific targets during the last US presidential election. Both human error and vulnerabilities in computer software were exploited to gain access to IT systems belonging to political campaign organizations as well as various software companies.

In light of Russia's conduct in recent years, there are indications that it is now targeting this fall's national elections in Sweden. Defending our digital democracy thus demands constant alertness to possible new offensive approaches.

During last week's Helsinki summit, President Trump was unwilling to hold Russia responsible for the recent offensive digital campaigns targeting democratic elections. The fact remains that Russia, on several occasions, has demonstrated the capability as well as intent to carry out advanced influence operations. Intelligence reports from multiple sources point out Russia as the actor behind cyberattacks and attempts to influence elections and referendums in France, the United Kingdom, Germany, and Spain. The operations have been well-organized, with tactics borrowed from the military domain.

The mass production of fake news has taken place in so-called “troll factories,” with the content distributed primarily through social and alternative media channels. Through cyberattacks targeting individuals, organizations, and technology companies, sensitive material has been stolen, some of which has subsequently been published as “leaks” mixed with falsified documents. In the short term, the goal has been to embarrass, confuse, or tie up resources. In the longer term, the aim is to reinforce domestic conflicts, reduce confidence in politicians and traditional news media, and foster an aggressive and polemical debate climate.

Our research shows that a plausible next step for forces aiming to influence us through the cyber domain is to turn to emerging digital technologies, such as artificial intelligence. One area where this technology has made rapid progress is advanced video processing. In the recently presented research project “Deep Video Portraits,” a team of German scientists has developed software that can make a person portrayed in a video say and do arbitrary things. Facial expressions and shadows fall naturally, and the manipulated video is almost impossible to distinguish from the original. The researchers demonstrate their program through a video sequence in which Theresa May’s lips and face are made to move at the researchers’ command. The underlying technology is called “deep learning,” a method by which computers progressively “learn” to better solve a given task through deep analysis of large data sets.

The researchers predict that within a few years you will not be able to trust the authenticity of any image or video content. An apparent risk is that in the future, especially at sensitive times such as during elections or conflicts, we will see an increased flow of digital information manipulated by actors to serve their own agendas. Stopping the dissemination of false information and cyber propaganda in digital channels is difficult, and there is no single solution to the problem.

Nevertheless, the spread can be mitigated through collective action by several stakeholders, such as social media platforms, researchers, traditional news media, government agencies, and educators. As individual citizens, we must also be mindful that messages addressed to us may be false, and thus be more critical of sources. A negative consequence could be that we will more easily dismiss true information as false.

Our research at the Harvard Kennedy School examines how the cyber domain is used in conflicts and how democratic processes can be protected against influence through digital channels. A concrete outcome is the Cybersecurity Campaign Playbook, a recently published document that offers extensive recommendations and guidelines for how political campaign organizations as well as election authorities can defend against digital attacks. We believe that this publicly available material, which is also available in a European version, could benefit Swedish parties in the upcoming elections. The recommendations address human behavior as well as technical procedures, and are intended for a target group that need not be experts.

Upholding adequate cybersecurity for the coming elections is a key issue, and one that will likely remain so for the foreseeable future. The Russian conduct in cyberspace that we have witnessed is cause for concern. We therefore believe it is important that many parts of society jointly contribute to counteracting threats against fundamental democratic norms.

In a society that is increasingly driven by technology, and where progress is rapid, increased awareness of digital influence is required, at government agencies as well as among the population at large. Our hope is that our research can contribute to this, so that this fall’s elections will be decided by well-informed voters and no one else.


*Translated from Swedish.*
