EU Digital Services Act: consultation on election integrity guidelines

Under Article 34(1)(c) of the EU Digital Services Act (“DSA”), Very Large Online Platforms (“VLOPs”) and Very Large Online Search Engines (“VLOSEs”), i.e. those with more than 45 million average monthly active users in the EU, must carry out annual assessments of systemic risks stemming from the design or functioning of their services, or from the use made of them, including in relation to “any actual or foreseeable negative effects on civic discourse and electoral processes.” VLOPs and VLOSEs must then put in place reasonable, proportionate and effective mitigation measures tailored to the risks identified. The Commission may issue guidelines on the risk mitigation measures that VLOPs and VLOSEs are required to adopt; any such guidelines must be subject to a public consultation.

On 8 February 2024, the Commission published a consultation on Guidelines for VLOPs and VLOSEs on the Mitigation of Systemic Risks for Electoral Processes, with a view to addressing the various types of online content and practices that can pose a risk to election integrity (e.g. hate speech, foreign information manipulation and interference, disinformation, and content created using generative AI). The guidelines are particularly timely, as several elections are due to take place within the EU this year, including the European Parliament elections (although the guidelines point out that they will remain valid even after those elections have taken place).

The guidelines set out the mitigation measures that the Commission proposes VLOPs and VLOSEs should adopt in order to comply with this aspect of the DSA, taking into consideration existing guidance and regulation such as the 2022 Strengthened Code of Practice on Disinformation, the forthcoming Regulation on the Transparency and Targeting of Political Advertising, and the AI Act (previously covered by Wiggin here, here and here). The proposed measures, which are detailed and wide-ranging, include: providing access to information on the electoral process (e.g. where and how to vote); running media literacy campaigns to foster critical thinking and build resistance to misinformation and disinformation; providing contextual information, such as fact-checking labels and indications of verified and official accounts; analysing and appropriately moderating the virality of problematic content; enabling influencer transparency; clearly labelling political advertising and demonetising disinformation; and putting in place procedures to detect and disrupt inauthentic manipulation of the service, such as terms and conditions prohibiting the use of inauthentic accounts and botnets.

The guidelines also propose a number of measures relating to generative AI. For services that can be used to create generative AI content, these include ensuring that AI-generated content is clearly distinguishable by users (e.g. through watermarking); for services that can be used to disseminate such content, they include clearly labelling deep fakes and coordinating with fact-checkers to avoid amplifying disinformation that originates on other platforms, such as false depictions of election irregularities.

The guidelines also address the potential need for additional internal procedures or teams during an electoral period, recommending that such measures be in place and functioning from one to six months before an electoral period begins and continue until at least one month after it ends. The guidelines also set out further specific guidance for the elections to the European Parliament taking place from 6 to 9 June 2024.

For more information and to respond to the consultation, which closes on 7 March 2024, click here.