Similar issues, different approaches
Whether it’s the US and French presidential elections or Brexit, every major election has its share of disinformation. In fact, disinformation now goes hand in hand with all major debates, which has galvanised civil society, specialised teams within the media, and technology companies to debunk false information and offer tools for prevention and monitoring.
However, these efforts cannot respond to all the dangers presented by online harassment, terrorist propaganda, child pornography and illegal content in general. While education is part of the solution in terms of introducing citizens to techniques that help them assess the quality of information, legislation also plays a fundamental role. The state is the only entity with the necessary weight to regulate large platforms.
“In Europe, the bird will fly according to our European rules”
With the Digital Services Act (DSA), the European Commission has become the reference point for regulating very large online platforms and search engines. It is responsible for monitoring them and has wide-ranging powers of investigation: it can, for example, request access to their databases or algorithms. In cases of non-compliance, it can impose fines of up to 6% of global turnover.
Furthermore, the Commission can formulate guidelines to regulate platforms, from the transparency of their activities to the design of their interfaces. It has also shown its assertiveness: when Elon Musk made his takeover of Twitter official in October 2022 with the slightly provocative message “the bird is freed”, Thierry Breton, European Commissioner for the Internal Market, replied: “In Europe, the bird will fly according to our European rules”.
The Online Safety Bill, a Frankenstein law?
Meanwhile, months after the DSA began to take effect, the UK still appears to be mired in tinkering with its own draft regulation known as the Online Safety Bill (OSB). The UK’s attempts to regulate the internet began in earnest four Prime Ministers ago when the Conservative election manifesto in June 2017 promised that "the rules online should mirror those that govern our lives offline". The Bill did not progress, however, perhaps due to more pressing matters surrounding Britain's withdrawal from the European Union.
It was under PM Boris Johnson in 2019 that a version of the legislation began to make its way through parliament. The stated aim was to protect the most vulnerable, including by tackling “legal but harmful content”: the bill would have made it illegal to send a message causing “psychological harm amounting to at least serious distress”. But this approach met with numerous challenges from politicians, technology players and defenders of free speech.
The new draft, presented to the House of Commons on 5 December 2022, removed the “legal but harmful” clause but has been criticised as watered down and nebulous when it comes to how digital giants will be held to account. And it is far from finalised: the bill reached the House of Lords for further debate on 1 February 2023.
For the Guardian's Chris Stokel-Walker, the UK's Online Safety Bill is "a beacon of mediocrity" that "has gone from its original intention - to focus on online abuse and harassment - to a clear call for 'free speech'." He argues that the bill has become a Frankenstein-like legislative monster, in part due to the chaotic recent history of British politics, and points out that Europe managed to introduce "logical, intelligent and robust regulation" quickly, a slap in the face to Brexiters who criticised the slowness of European bureaucracy. Finally, he contrasts the direct dialogue between Thierry Breton and Elon Musk with a letter from UK business secretary Grant Shapps to the tech mogul that was completely ignored.
The EU has largely bypassed a potentially paralysing political quarrel by focusing first and foremost on purely illegal content (e.g. child pornography, terrorist propaganda). But the DSA also aims to tackle legal but harmful content, with measures that include mandatory risk assessments and audits for tech giants so that they can be held accountable for potential wrongdoing, such as massive disinformation campaigns. This approach also has its critics, who say the measures will be onerous to implement and will open the perception of acceptable online speech to debate.
When it comes to enforcing the rules, the European Commission will deal directly with platforms, with involvement from stakeholders such as the permanent taskforce of fact-checkers, source-raters and anti-disinformation companies, including Newsback, which monitors and assesses the EU Code of Practice on Disinformation to ensure that it becomes an effective tool.
In the UK, telecoms regulator Ofcom will enforce the OSB. It already has wide-ranging powers within the broadcast sector and over video-sharing platforms, and has a statutory duty to represent the interests of citizens.
Whatever the legislative future of the OSB or the results of the DSA, the two entities are facing the same issues and the same enemies. Those who seek to destabilise democracies by spreading disinformation and propaganda, as well as criminals who disseminate illegal material, are finding ways to access a digital world that knows no borders. Neither the EU nor the UK can risk leaving any paths open for these people to exploit.
21/02/2023, Delphine Gatignol, Director of Newsback