EU Commission issues guidelines for addressing digital risks to elections


The European Commission issued guidelines on Tuesday (26 March) under the Digital Services Act, outlining measures to mitigate digital risks to election integrity.

The European Parliament elections are approaching in June, and ten European countries are gearing up for presidential and parliamentary elections this year.

The EU’s Digital Services Act (DSA), which became fully applicable on 17 February, is horizontal legislation regulating how online actors must deal with illegal and harmful content online.

Last year, the EU executive announced the first batch of very large online platforms (VLOPs), and very large search engines (VLOSEs), which have been updated since then. The lists include platforms such as social media networks Instagram and TikTok, search engines, such as Google Search and Bing, and retailers like AliExpress and Zalando.

Social media platforms included on the lists can influence elections, for example through AI-generated content such as deepfakes, or through the spread of disinformation.

The Commission has now issued its guidelines for the designated VLOPs and VLOSEs, recommending mitigation measures and best practices to be implemented before, during, and after electoral events.

Platforms that choose not to follow the guidelines must demonstrate that their own measures are equally effective. The Commission plans a stress test at the end of April.


Risk assessment

The guidelines include establishing internal teams with sufficient resources, and drawing on available analysis and information about local, context-specific risks and about how users interact with their services to search for and acquire information.

They also suggest applying election-specific risk reduction measures, customised for each election period and location. This includes promoting official electoral information, launching media literacy programs, adjusting recommender systems to empower users, and reducing harmful content’s monetisation and virality.

During a press briefing on Tuesday, a Commission official noted that a dedicated risk assessment for each EU election is a requirement under the DSA and that platforms “have to check whether they have enough content moderators on the ground, and that they have local knowledge because every election is different.”

Local experts do not have to be in the country, but they must have sufficient knowledge of the dynamics of each specific platform, the official emphasised.

The official also noted that the Commission is investigating X for potential non-compliance with the DSA, expressing concerns that X may not have enough content moderators to address all risks adequately.

Political ads

According to the guidelines, political advertisements should be clearly identified as such, anticipating the upcoming regulation on political advertising, which aims to increase transparency in political campaigns, especially online, and to prevent manipulation of the kind seen in the Cambridge Analytica scandal.

In November last year, as Euractiv reported, EU co-legislators reached a deal on the file, including the targeting of online ads and the role of a new European public repository.



The Commission suggests that VLOPs and VLOSEs whose services might be used to generate or spread generative AI content should evaluate and address AI-related risks.

This includes labelling AI-generated content like deepfakes, adjusting terms of service accordingly, and ensuring appropriate enforcement measures are in place.

Meta and TikTok, to which the DSA applies, have already announced that they will require AI-generated content to be labelled, especially in light of the upcoming EU elections.

The European Parliament also uses Meta’s Instagram and TikTok for campaigning for the elections.

TikTok also shared its third report under the EU Code of Practice on Disinformation on Tuesday, outlining efforts to promote trustworthy information before the EU elections.



The Commission encourages collaboration with EU and national authorities, independent experts, and civil society organisations to facilitate the exchange of information before, during, and after elections. This aims to enable the implementation of effective mitigation measures, particularly concerning Foreign Information Manipulation and Interference (FIMI), disinformation, and cybersecurity.

FIMI refers to deliberate and often covert efforts by foreign entities to manipulate information or influence public opinion in another country for political, strategic, and security reasons. This can involve spreading misleading propaganda campaigns or interfering in democratic processes, such as elections.

The Commission also recommends implementing specific measures during the elections, including an incident response mechanism: a predefined plan to quickly address unexpected events and minimise their impact on the outcome or voter turnout.


Post-election period

According to the guidelines, the effectiveness of the implemented measures should be evaluated after elections. VLOPs and VLOSEs should publicly release non-confidential versions of these reviews, allowing public feedback on the risk mitigation strategies.

The guidelines include public input and collaboration with Digital Services Coordinators (DSCs), emphasising third-party scrutiny to safeguard fundamental rights in mitigation measures.

DSCs, mandated by the DSA, bridge regulatory authorities and online platforms.

However, eight member states have yet to appoint them despite the 17 February deadline.
