Europe’s Big Tech Law Is Approved. Now Comes the Hard Part


The gold standard for online content governance in the EU, the Digital Services Act, is now a reality after the European Parliament overwhelmingly passed the legislation earlier this week. The last hurdle, a simple formality, is for the European Council of Ministers to approve the text in September.

The good news is that the historic legislation includes some of the broadest platform transparency and accountability obligations to date. It will give users real control over, and insight into, the content they engage with, and will offer protections against some of the most pervasive and harmful aspects of our online spaces.

Now the focus turns to implementing this vast law, as the European Commission begins to develop its enforcement mechanisms in earnest. The proposed regime is a complex structure in which responsibilities are shared between the European Commission and national regulators, known here as Digital Services Coordinators (DSCs). It will depend heavily on the creation of new roles, the expansion of existing responsibilities, and smooth cross-border cooperation. What is clear is that, at present, the institutional capacity to enforce this legislation effectively simply does not exist.

In an early preview, the Commission has offered some insight into how it proposes to overcome the most obvious implementation challenges, such as how it plans to monitor very large online platforms and how it will try to avoid the problems that have plagued the GDPR, such as uncoordinated national regulators and selective enforcement. But its proposal only raises new questions. A large number of new staff will have to be hired, and a new European Centre for Algorithmic Transparency will need to attract top-level data scientists and experts to help meet the law's expansive new obligations on algorithmic transparency and data access. The Commission's preliminary vision is to organize its regulatory responsibilities by thematic area, including a societal issues team that will be responsible for overseeing some of the new due diligence obligations. The lack of resources here is a cause for concern, and ultimately risks turning these hard-won obligations into empty box-ticking exercises.

A critical example is platforms' obligation to conduct assessments addressing the systemic risks of their services. This is a complex process that must take into account all of the fundamental rights protected by the EU Charter. To do so, tech companies will need to develop human rights impact assessments (HRIAs), an evaluation process designed to identify and mitigate potential human rights risks stemming from a service or business, in this case a platform, something civil society urged them to do throughout the negotiations. However, it will be up to the board, made up of the DSCs and chaired by the Commission, to annually assess the most prominent systemic risks identified and to outline best practices for mitigation measures. As someone who has helped develop and evaluate HRIAs, I know this will not be an easy task, even with independent auditors and researchers feeding into the process.

If they are to have an impact, these assessments must establish comprehensive baselines, concrete impact analyses, evaluation procedures, and stakeholder engagement strategies. The best HRIAs incorporate a gender-sensitive approach and pay specific attention to systemic risks that disproportionately affect historically marginalized communities.

This is the most concrete way to ensure that all potential rights violations are accounted for.

Fortunately, international human rights frameworks, such as the United Nations Guiding Principles on Business and Human Rights, provide guidance on how best to conduct such assessments. Still, the provision's success will depend on how platforms interpret and invest in these assessments, and even more on how the Commission and national regulators enforce the obligations. At current capacity, the institutions' ability to develop guidelines and best practices and to evaluate mitigation strategies is nowhere near the scale the DSA will require.


