Europe embarks on a patchy road to auditing online platforms’ algorithms

Under the EU’s new digital rulebook, online platforms must let auditors look under the hood and grade their algorithms. But several questions remain around this unprecedented task.

The Digital Services Act (DSA) will introduce a specific regime for very large online platforms as of August. These systemic platforms will have to analyse potential societal risks, such as the spread of disinformation, and put in place risk-mitigating measures.

These risk analyses and mitigation measures will be subject to independent audits assessing whether they are adequate to comply with the DSA. The auditing process will therefore be critical in shaping how platforms interpret and adjust to the new EU rules.

The European Commission has published a draft delegated act defining the methodology for these independent audits. But feedback from auditing firms, tech companies, and civil society highlights several open questions in this unexplored territory.

Auditors

In their feedback, auditing firms have consistently highlighted the lack of industry standards against which to audit algorithms and have asked the Commission for additional guidance on the “reasonable level of assurance” required to comply with the DSA’s various obligations.

“The scientific framework is not in place. We don’t have the answers to what this law aims to achieve. How do you define a systemic risk? How do you measure it?” Adriana Iamnitchi, a professor of computational social sciences at Maastricht University, told EURACTIV.

The consulting giant PwC wrote in its response that “making the auditor the judge of what constitutes compliance will likely result in variability and potentially create disagreements amongst stakeholders as to who has set the bar at the right level and whether different entities are being treated fairly.”

PwC’s feedback overlaps with that of Deloitte and EY, the large auditing firms set to dominate this market. As algorithm audits are relatively new and technically complex, only a handful of companies have the expertise for the task or the deep pockets to acquire the necessary talent.

Platforms

Inconsistency and lack of comparability are somewhat expected, given the different types of platforms that will be subject to the audits, ranging from social media to search engines.

The tech companies consider the delegated act overly prescriptive, arguing that it fails to account for the diversity of use cases and does not require auditors to be proportionate in their assessments.

“The drawback of imposing an (overly) prescriptive standard is that it limits the choice of auditors and incentivises auditors to strictly follow the letter of the law, rather than its spirit,” reads Wikimedia’s submission.

One point on which auditors and platforms seem to agree is that some flexibility should be envisaged for the first year, given the complexity of the DSA and the novelty of this setting.

However, a more fundamental argument of the digital players is that the audit industry might not have the expertise to assess the platforms’ inner workings.

“This shows us the paradox we are in. A lot of opaque, data-driven companies have led to complexities beyond the reach even of the people whose job is to study them, namely academics,” Catalina Goanta, an associate professor of law and technology at Utrecht University, told EURACTIV.

Civil society

At the core of the feedback from civil society representatives is one question: Who will audit the auditors?

In a joint response, the non-profits AlgorithmWatch and AI Forensics pointed out that auditing firms might have an incentive to be lenient in their assessments to attract and retain contracts.

This ‘audit-washing’ risk is aggravated by the absence of objective standards. Meanwhile, the audited companies will have ample discretion to redact information from the reports as confidential, keeping it out of the published versions.

For civil society, the best way to keep auditors and platforms in check is to give vetted researchers access to the complete versions of the audit reports. Under the DSA, vetted researchers can request data from platforms, but their procedural role in the auditing mechanism is still up in the air.

More generally, civil society seems sceptical as to whether big auditing firms are well placed to assess systemic risks, such as the effects social media might have on democratic processes. And there are already actors trying to fill this void.

One is Algorithm Audit, an NGO whose mission is to assess the criteria for algorithmic audits from an ethical standpoint, indicating the pros and cons in specific circumstances. Its methodology is called ‘algoprudence’, a portmanteau of ‘algorithm’ and ‘jurisprudence’.

“There will be a collective learning process that will take three to five years,” said Algorithm Audit’s co-founder Jurriaan Parie, adding that much will depend on how the Commission and its new centre on algorithmic transparency engage with auditors to establish best practices.

“It’s a process. It will not be perfect initially, but we have to start somewhere. The question is who will be paying attention to this,” professor Iamnitchi concluded.

[Edited by Zoran Radosavljevic]

Source: Euractiv.com
