Integrity of Elections: European Commission publishes guidelines for ‘Very Large Online Platforms and Search Engines’


The European Commission has published guidelines for ‘Very Large Online Platforms and Search Engines’ to ensure that they comply with their obligations under the Digital Services Act to mitigate online risks that may affect the integrity of elections.

The publication comes in a busy year for elections, with 64 countries going to the polls worldwide, plus the European Parliament elections in June (for which specific measures are outlined). The guidelines call on Very Large Online Platforms and Search Engines to reinforce their internal processes to identify and mitigate ‘local context-specific’ risks that might stem from information on elections shared or accessed on their services. This might include “information on political parties, party programmes, manifestos or other material, or related information, to organise events such as demonstrations or rallies, campaigning, fundraising, or other related political activities”.

The guidelines also set out a series of specific mitigation measures for the relevant companies to implement, including:

- facilitating access to official information on the electoral process;
- investing in and implementing media literacy initiatives;
- introducing measures to provide users with more contextual information through fact-checking labels, prompts, and labelling of official accounts;
- ensuring that recommender systems promote media diversity and pluralism;
- labelling any political advertising in a “clear, salient and unambiguous manner” and otherwise complying with the recent regulation on the transparency and targeting of political advertising (commented upon here);
- ensuring that influencers are able to declare that their content contains political advertising; and
- introducing policies to demonetise content featuring disinformation.

There is also a section dedicated to measures addressing generative AI. The guidelines suggest that Very Large Online Platforms and Search Engines should put in place effective measures “tailored to risks related to both the creation and dissemination of generative AI content”, such as labelling deepfakes or using watermarks. They also recommend that services cooperate with relevant third parties (such as national authorities, experts and civil society organisations) to exchange information before, during and after elections, and introduce incident response mechanisms during the electoral period itself.

Commenting on the guidelines, Margrethe Vestager, Executive Vice-President for a Europe Fit for the Digital Age, said: “We adopted the Digital Services Act to make sure technologies serve people, and the societies that we live in. Ahead of crucial European elections, this includes obligations for platforms to protect users from risks related to electoral processes – like manipulation, or disinformation. Today’s guidelines provide concrete recommendations for platforms to put this obligation into practice.”

To read the guidelines in full, click here.