
The global vs. local dilemma: Crafting cross-border solutions for online safety

On 25 August 2023, a regulatory hammer came down on the 19 largest online platforms and search engines operating in Europe. Since then, major technology companies have been required to comply with the stringent provisions of the European Union’s new Digital Services Act (DSA), sweeping legislation that governs online speech and conduct.

Following this groundbreaking law, the UK has completed passage of its comprehensive Online Safety Bill, which in the coming year will impose risk-based obligations on platforms that overlap with the DSA’s requirements.

Countries with legislation either approved or under review include:

Australia: Online Safety Act 2021, implemented January 2022.

Brazil: Fake News Bill, introduced April 2023.

Canada: Digital Charter Implementation Act, currently in review.

India: Digital India Act, passed March 2023.

United States: SAFE TECH Act and Kids Online Safety Act, introduced February 2023 and May 2023 respectively.

Singapore: Online Safety Act, implemented November 2022.

United Kingdom: Online Safety Bill, passed September 2023.


Fragmenting digital governance regimes

An assortment of digital governance rules has also been enacted or is pending in a host of countries, among them Australia, Brazil, the United States, India and Canada. These recently enacted and proposed regulations aim to restrict both illegal and harmful-but-legal online content and behaviour within their respective jurisdictions.

However, this fragmented approach presents a dilemma. Although the internet is global, the laws governing it are local. The many perils arising from online content and behavioural abuses – cyberbullying, disinformation, deepfakes, gender harassment, child sexual abuse material, election manipulation and more – do not stop at borders, while the laws governing them do.

We can and must tackle this global/local dichotomy and increase alignment of democracies, while respecting jurisdictional sovereignty and fundamental rights. Both the World Economic Forum’s Global Coalition on Digital Safety and the Annenberg Public Policy Center’s Modularity Project agree – and they offer practical pathways to achieve this.

Although the actual risks of harm are the same across borders, the proposed legal frameworks and obligations differ widely, creating confusion, duplication, contradiction and expense for all. This regulatory quagmire also acts as a deterrent for small businesses seeking to launch global online services.

Exacerbating the issue, technology continues to race ahead – reshaping how we work, live and play, while regulation lags behind.

Take ChatGPT – the generative artificial intelligence app that splashed onto the scene on 30 November 2022. Just two months later, it boasted over 100 million active users, massively impacting digital tech policy. In reaction, the US Congress rushed to hold hearings. The European Parliament, then wrapping up negotiations on the Artificial Intelligence Act, scrambled to tack on specific provisions for generative AI, lest the regulation be out of date before the ink had dried.

Multistakeholder approaches to online safety

Now is the time for multistakeholder, multinational solutions.

Democracies need not wait for the negotiation of treaties or other labyrinthine diplomatic accords to improve online safety across borders while respecting fundamental rights and jurisdictional sovereignty. Platforms and civil society can partner with multiple governments right now to create practical, common rules of the road and systems to operate them.

The idea is called modularity. It is a co-regulatory approach in which multiple stakeholders work with governments to design and carry out specific functions, codes of conduct or protocols that are shared across jurisdictions to address common problems. Participating governments recognize the module’s output as consistent with their respective laws. Enforcement would remain a government function.

Other industries have deployed global modular governance systems for decades. For example, the International Accounting Standards Board, an independent, non-governmental non-profit board, sets international accounting rules for corporations that are recognized worldwide. Securities regulators from different countries serve on an advisory council, but have no authority over the accounting rules.

A tool gaining popularity in the digital governance toolbox is a requirement that platforms identify, assess, mitigate and disclose the potential risks from their online services. The DSA, Singapore’s Online Safety Act, the UK’s Online Safety Bill and legislative proposals in Brazil, the US and Taiwan all include some variation of a risk assessment requirement.

A module could be set up as a multinational risk assessment standards board. Formed by multiple stakeholders, the board would set out best practices for conducting risk assessments as well as standards and protocols for independent auditors seeking to review platform self-assessments.

The Coalition on Digital Safety

The Forum’s Coalition on Digital Safety is a concrete example of such coordination with its recently released framework for conducting risk assessments. The framework offers practical guidance on assessing the risks from online platforms and services of all types and sizes, regardless of jurisdiction. Coalition members included regulators from multiple governments, as well as industry experts, academics and civil society.

Right now, the European Commission is drafting detailed delegated rules to implement the DSA provisions requiring very large platforms to conduct risk assessments. At the same time, Ofcom, the United Kingdom’s regulator, is preparing to advise platforms on compliance with the Online Safety Bill. There is a way to satisfy key implementation challenges facing both laws (and others) with the same processes and standards.

When drafting implementation rules, policymakers should enable multistakeholder modules to perform well-defined tasks. And modules should be structured to enable experts or regulators from participating countries to join.

By adopting common, cross-border protocols, codes and systems, regulators can reduce the time and expense of producing customized rules without giving up their fundamental sovereignty. Crucially, it will help align democratic governments on digital governance, avoiding further geographic splintering of the internet.

It will also streamline the regulatory landscape, making it easier for smaller platforms to thrive. And when the next ‘ChatGPT’ app suddenly goes viral, it will be far easier to update stakeholder modules than to amend government regulations.

It won’t be easy to craft, approve and manage the first modules. The EU understandably takes pride in the ‘Brussels Effect’ of its expanding suite of digital regulations and may be reluctant to cede any drafting authority or oversight over digital rules, even though enforcement would remain within its remit.

But when regulators from several countries agree to work together with multistakeholders on practical solutions to common problems, it will set a course for greater international harmony and effective digital safety and governance outcomes.

It is urgent and timely for stakeholders to act now, steering policymakers towards a multinational modular approach to avert a regulatory quagmire in online safety. Multistakeholder organizations like the Forum’s Coalition are well-equipped to work with regulators toward that end.

Source: World Economic Forum
