EU Commission presents new rules for political ads

The European Commission launched a proposal to regulate political advertising, introducing transparency obligations for marketers and strict limits on the use of sensitive personal information.

The proposal was published on Thursday (25 November) to protect the electoral process and democratic debate from manipulation and interference. The intention is to have it in place by spring 2023, one year before the next European Parliament elections.

“New technologies should be tools for emancipation, not for manipulation. This ambitious proposal will bring an unprecedented level of transparency to political campaigning and limit the opaque targeting techniques,” said Commission Vice-President Vera Jourová.

Online political advertising has been facing increasing scrutiny since the Cambridge Analytica scandal in 2018. The revelations shed light on data-harvesting practices during the US presidential election in 2016, which allegedly swung decisive votes in favour of Donald Trump through microtargeting techniques.

The EU executive seeks to restrict such techniques that, in the context of political campaigns, it considers can negatively affect freedom of opinion and information, fundamental preconditions for exercising voting rights.

Four European associations representing the broadcasting industry jointly welcomed the proposal, noting that the strict regulation they are subject to for political advertising should also apply online.

“This regulatory asymmetry is not only harmful to the democratic process, but it also contributes to growing imbalance between online intermediaries and media services,” the statement reads.

Transparency and reporting

The scope of the regulation includes content sponsored by political actors and any issue-specific advert that aims to influence a legislative process or vote.

In a public statement, the campaign group European Partnership for Democracy criticised the vagueness of the definition, saying it could lead platforms to set up a list of criteria that would automatically categorise an ad as political.

That might result in restricting the fundraising activities of NGOs working on migration, climate, or LGBTQ+ rights while failing to address foreign interference campaigns, the statement argued.

Based on the new rules, online platforms such as Google and Facebook would prominently display a sponsor’s name. A transparency notice should also explain why someone sees an ad, who paid for it, how much it cost, and its purpose.

The platforms will need to enable users to report violations via easily accessible and user-friendly tools. The online service would then have to inform the relevant users of the actions taken and the outcome.

Upon request, advertisers would have to share information on political ads with relevant authorities or other ‘interested parties’, including researchers, journalists, NGOs, political bodies authorised under national law, and international observers.

Marketers will also have to disclose the revenues generated through political advertising in their annual financial statements.

The Commission can add, modify, or remove elements from the transparency obligations later via delegated acts.

Sensitive information

The legislative proposal introduces a general prohibition on using sensitive information to target political ads, except with users’ explicit consent. These limitations cover data about race, political opinions, religious beliefs, sexual orientation, health conditions, and trade union affiliation.

The ban does not apply to trade unions or organisations of a specific religious or political nature, which will still be able to reach out to their members.

When sensitive data is processed for targeted advertising under these exceptions, the regulation introduces additional safeguards, notably requiring organisations to explain their internal policy for these targeting techniques and to maintain records of the targeting parameters and the source of the personal data used.

Moreover, marketers should be able to explain the logic behind the targeting, a transparency measure meant to limit the use of AI-powered automated tools whose workings often go beyond the understanding of their own developers.

Enforcement

Enforcement will be carried out at the national level by competent authorities appointed by EU governments. Data protection authorities will oversee the processing of personal data. The fines are also to be decided by EU countries, on the condition that they are proportionate, effective and dissuasive.

The competent authorities are likely to be those in charge of enforcing the Digital Services Act (DSA), horizontal legislation intended to introduce transparency and accountability rules for online platforms.

The two proposals are closely aligned, as the DSA prescribes that very large online platforms, namely those with more than 45 million users in the EU, must provide ad repositories accessible to researchers for external review.

Limitations to the data processing of sensitive information have also been introduced in the European Parliament version of the DSA’s sister proposal, the Digital Markets Act (DMA).

