
EU: Put Fundamental Rights at Top of Digital Regulation

(Brussels) – The European Union’s draft rules to regulate internet platforms have the potential to better protect human rights online, Human Rights Watch said today.

But the European Parliament should be more ambitious in holding technology companies accountable for human rights harm stemming from their practices and introduce stronger safeguards against government abuse.

“The draft regulation includes key measures to set standards for transparency and provide remedies for users, which the EU Parliament should support,” said Deborah Brown, senior digital rights researcher and advocate at Human Rights Watch. “However, the proposal should end tech companies’ abusive surveillance and profiling of people and amend vague provisions that invite government overreach.”

The European Parliament is expected to vote on the EU Digital Services Act (DSA) during the week of January 17, 2022.

The draft proposal preserves conditional liability for online platforms and prohibits the general monitoring of users, imposed by law or otherwise, which are cornerstones for protecting freedom of expression online. Conditional liability prevents incentives for platforms to over-remove legitimate online speech to avoid the risk of liability. It also introduces important measures to increase platforms’ transparency, requiring companies to explain to users how they moderate content, disclose whether and how automated tools are used and the number of content moderators for each official EU language, and provide access to data for researchers, including from nongovernmental organizations. But the proposal falls short in some key respects and needs to be strengthened.

Potential for expanding government censorship online: The regulation is premised on the principle that “What’s illegal offline is illegal online.” It defers to existing EU and national laws on what constitutes illegal content and essentially transposes that standard to online speech. However, some EU member states have laws that restrict expression that is protected under international human rights law.

The regulation would effectively put an EU stamp of approval on applying those abusive standards online. It also provides for removing content based on orders not only from judicial authorities, but from “administrative authorities” that are not subject to any requirement of independence, contrary to EU law.

The compromise text includes references to EU human rights instruments, but it still risks creating new powers that facilitate removing internationally protected expression from the internet by requiring platforms to remove content for allegedly violating abusive national laws without judicial approval.

Failure to ban surveillance-based targeted advertising: While the draft regulation includes measures to increase transparency in online advertising and would enable people to opt out of content being recommended to them on the basis of profiling of their online behavior, it does not address the surveillance-based advertising business model that dominates today’s digital environment.

The premise underlying the online advertising ecosystem is that everything people say or do online can be captured, turned into data, profiled, and mined to maximize attention and engagement on platforms while selling targeted advertisements. This business model relies on pervasive tracking and profiling of users that intrudes on their privacy and feeds algorithms that promote and amplify divisive and sensationalist content. Studies show that such content earns more engagement and, in turn, profit for companies.

The pervasive surveillance upon which this model is built is fundamentally incompatible with human rights.

Narrow mandate for assessing rights risks: The draft regulation would obligate very large online platforms to carry out systemic risk assessments covering the dissemination of illegal content, and actual or foreseeable risk of some human rights harm stemming from the design, algorithms, intrinsic characteristics, functioning, and use of their services in the EU, which they are then required to mitigate. Under the proposal, very large online platforms would be subjected to third-party audits by “organisations which have been recognized and vetted by the Commission” to assess the platforms’ compliance with the regulation.

Increased independent scrutiny of companies is helpful, but the proposal falls short of the kind of effective human rights due diligence that international standards require. Under the UN Guiding Principles on Business and Human Rights, companies should conduct human rights due diligence that includes identifying, preventing, ceasing, mitigating, remediating, and accounting for potential and actual adverse impacts on human rights. But the proposed regulation risks taking a more limited approach by assuming that all risks can be mitigated and by not explicitly requiring companies to use the full range of remediation tools.

The narrowly predefined risks included in the current proposal also exclude the broader range of economic and social rights that platforms’ practices affect – for example, the right of content moderators, whose work is necessary for companies to comply with the obligations set out by the regulation, to a healthy and safe workplace. Content moderators are typically low-wage workers who experience psychological trauma and emotional harm from reviewing disturbing subject matter, poor working conditions, and a lack of support and training.

These shortcomings make it all the more pressing for the European Union to develop robust general rules governing mandatory human rights and environmental due diligence obligations for companies, in line with the blueprint envisioned by the European Parliament in 2021.

Potential for setting bad global precedents that are ripe for abuse: As the Digital Services Act Human Rights Alliance has emphasized, this regulation will have far-reaching consequences beyond the EU both because of its potential to inspire legislation in other regions and because it can set standards that companies may apply globally.

The draft includes problematic provisions that are ripe for abuse. For example, it requires service providers established outside the EU to designate a legal or natural person as their legal representative in one of the member states where they offer services. This legal representative can be held liable for noncompliance. This is akin to the “hostage” clause that a number of governments, including Turkey, Indonesia, Russia, and India, have put into place, which incentivizes companies to comply with overbroad orders and subjects companies and their staff to legal threats and intimidation, making it more difficult for companies to resist abusive or improper government requests. Any requirement on platforms to establish contact points or legal representatives in the EU should avoid creating a risk of personal liability for that representative.

The regulation would also require the European Commission to publish the names, addresses, and email addresses of all “trusted flaggers.” Transparency in this area is positive, but disclosing this information can put civil society groups that act as trusted flaggers at risk, especially groups that flag content from government actors. It could also set a dangerous precedent for requiring companies to disclose the identity of trusted flaggers in repressive contexts. Adding an exception for information that could put trusted flaggers at risk would address this issue.

Proposals that would require strict, short time frames for removal of content have been rejected so far. When faced with short review periods and the risk of steep fines, companies have little incentive to err on the side of free expression. This approach also typically fails to provide judicial oversight or remedy. Germany’s flawed social media law, NetzDG, has already inspired similar laws regulating online content in more than 13 countries around the world, including in more repressive contexts. Including similar provisions in this regulation would further normalize and encourage such measures and should be avoided.

Lack of independence for oversight and enforcement: The draft envisages shared responsibility for enforcement between the national-level Digital Services Coordinators, the European Board for Digital Services, and the Commission.

The coordinators will have an oversight role at the national level, with both investigative and enforcement authority, including authority typically reserved for judicial authorities. However, the draft regulation does not require full independence of these bodies, only that they function independently.

The draft regulation also grants the Commission supervision, investigation, enforcement, and monitoring responsibilities with respect to the obligations of the very large online platforms, without requiring independent judicial review. In giving itself a supervisory role, the Commission, an executive body, blurs the separation of powers that is essential for providing for checks and balances between EU institutions. European lawmakers should work to find a solution that ensures both structural and functional independent oversight under the regulation.

The EU Parliament should consolidate progress made on the DSA and support amendments that safeguard against both government and corporate abuse of human rights online, Human Rights Watch said.

“The DSA is an opportunity for the EU to take more decisive action to protect people from violations and abuses stemming from online content, as well as from surveillance and censorship online, through meaningful and rights-respecting regulation of internet platforms,” Brown said. “The EU should amend the regulation to ensure that it meets this challenge and puts human rights ahead of the profits of technology companies.”
