As Germany prepares for its general election on February 23, 2025, concerns are mounting over the digital threats that could disrupt the democratic process. The country is grappling with a growing number of cyberattacks, disinformation campaigns, and the weaponization of artificial intelligence (AI) to influence voters ahead of the vote.
The stakes are high as Germany elects a new parliament, and authorities are focused on safeguarding the integrity of the vote. Cybersecurity experts and intelligence agencies warn of possible “hack-and-leak” operations, in which sensitive data is stolen, manipulated, and released to undermine candidates or political parties. Such operations, which can severely damage the credibility of politicians, are already a source of concern within Germany’s political and intelligence circles.
Rising Threats: Foreign and Domestic Actors
Germany’s domestic intelligence agency, the Federal Office for the Protection of the Constitution (BfV), has issued several warnings about the growing risk of external interference. Russia, in particular, is seen as having a strong interest in swaying the outcome in its favor, given its geopolitical ambitions and its ongoing war in Ukraine. Domestic actors, ranging from far-right groups to extremist parties, are also emerging as significant threats.
Claudia Plattner, President of Germany’s cybersecurity agency, the Federal Office for Information Security (BSI), has emphasized that “forces inside and outside Germany” are targeting the election process. These actors are leveraging digital platforms, such as WhatsApp, Telegram, and social media sites like TikTok, to bypass traditional media and spread their messages, often skewing public perception and amplifying polarizing narratives.
The Role of AI and Digital Infrastructure
One of the most concerning developments in the lead-up to the election is the use of generative AI to produce misleading or biased content. These AI tools allow for the rapid creation of posts, images, videos, and other media, which can be strategically designed to influence voters’ views.
In Germany, the far-right Alternative for Germany (AfD) party has been particularly aggressive in utilizing AI-generated content to push its agenda. Researcher Katja Muñoz from the German Council on Foreign Relations notes that while this AI-driven content is not necessarily false, it is designed to reinforce existing beliefs and ideologies, often misleading the audience. For example, a 78-second AI-generated video released by the AfD featured racially charged imagery aimed at stirring division among voters.
The AfD’s digital strategy includes a complex network of social media accounts that interact with one another to manipulate platform algorithms, ensuring that their messages receive broader visibility. This “alternative digital infrastructure” has given the AfD a significant head start over other parties in reaching voters, particularly on platforms with less stringent content moderation policies.
Lessons from Romania and the Risk of Social Media Manipulation
The political impact of digital disinformation is not hypothetical. In Romania’s 2024 presidential election, a Russian-linked disinformation campaign was tied to the surprise first-round victory of far-right candidate Calin Georgescu. Romania’s Constitutional Court annulled the results after finding that coordinated, Russian-backed social media campaigns on platforms such as TikTok and Telegram had artificially boosted Georgescu’s profile. Experts, including Josef Lentsch of the Political Tech Summit, warn that similar tactics could be employed in Germany, underscoring the vulnerability of democratic processes to digital manipulation.
Countermeasures: Cybersecurity and Public Awareness
To counter these digital threats, Germany’s authorities are stepping up efforts to secure the election process. The BfV has established a task force dedicated to monitoring cyberattacks, while the BSI is offering online seminars to political candidates and parties to strengthen their cybersecurity practices. These efforts aim to reduce the risk of successful attacks that could compromise the integrity of the vote.
Given the compressed timeline before the election, experts stress the need for greater cooperation between state authorities, political actors, and civil society. As Lentsch points out, “It is all the more important that civil society, political actors, and state authorities engage in dialogue” to mitigate the risks posed by cyber threats and digital disinformation.
In addition, experts like Muñoz advocate for raising public awareness about the manipulation of public opinion. By educating the public on how AI-generated content and disinformation campaigns work, authorities can help voters recognize and resist attempts to sway their decisions.
Conclusion
As Germany heads toward its critical election in February 2025, the country faces unprecedented digital threats aimed at disrupting its democratic processes. From cyberattacks and disinformation to AI-driven propaganda, the risks to the integrity of the election are significant. However, through strengthened cybersecurity measures, public awareness campaigns, and vigilance from authorities, Germany hopes to safeguard its democratic values and ensure a fair and transparent election.
This emerging challenge highlights the broader global trend of digital manipulation in elections, a trend that other democracies must also address to protect the future of democratic governance.