The UK is speeding up the application of powers that could see tech CEOs sent to prison if their businesses fail to comply with incoming safety-focused Internet content legislation, the government confirmed today.
The latest revisions to the draft legislation include a radically reduced timeframe for being able to apply criminal liability powers against senior tech execs who fail to cooperate with information requests from the regulator — down to just two months after the legislation gets passed. (And since the government enjoys a large majority in the House of Commons, the incoming Online Safety regulation — already years in the making — could become law this year.)
While the draft bill, which was published in May 2021, has already seen a string of revisions — with more announced today — the core plan has remained fairly constant: The government is introducing a dedicated framework to control how social media companies and other content-focused platforms must respond to types of problem content (not only illegal content, in some cases). This will include a regime of Codes of Practice overseen by the media and comms regulator, Ofcom, in a vastly expanded role, with hefty powers to fine rule-breakers up to 10% of their global annual turnover.
As the bill’s name suggests, the government’s focus is on a very broad ‘de-risking’ of Internet platforms — which means the bill aims to tackle not just explicitly illegal stuff (such as terrorism or CSAM) but aims to set rules for how the largest Internet platforms need to approach ‘legal but harmful’ online content, such as trolling.
Child safety campaigners especially have been pressing for years for tech firms to be forced to purge toxic content.
The government has gradually and then rapidly embraced this populist cause — saying its stated aim for the bill is to make the UK the safest place in the world to go online and loudly banging a child protection drum.
But it has also conceded that there are huge challenges to effective regulation of such a sprawling arena.
The revised draft Bill will be introduced in Parliament on Thursday — kicking off a wider, cross-party debate of what remains a controversial yet populist plan to introduce a ‘duty of care’ on social media companies and other user-generated-content-carrying platforms. Albeit one which enjoys broad (but not universal) support among UK lawmakers.
Commenting on the introduction of the bill to Parliament in a statement, digital secretary Nadine Dorries said:
“The internet has transformed our lives for the better. It’s connected us and empowered us. But on the other side, tech firms haven’t been held to account when harm, abuse and criminal behavior have run riot on their platforms. Instead they have been left to mark their own homework.
“We don’t give it a second’s thought when we buckle our seat belts to protect ourselves when driving. Given all the risks online, it’s only sensible we ensure similar basic protections for the digital age. If we fail to act, we risk sacrificing the wellbeing and innocence of countless generations of children to the power of unchecked algorithms.
“Since taking on the job I have listened to people in politics, wider society and industry and strengthened the Bill, so that we can achieve our central aim: to make the UK the safest place to go online.”
It’s fair to say there is broad backing inside the UK parliament for cracking the whip over tech platforms when it comes to content rules (MPs sure haven’t forgotten how Facebook’s founder snubbed earlier content questions).
Even so, there is diversity of opinion and dispute on the detail of how best to do that. So it will be interesting to see how parliamentarians respond to the draft as it goes through the scrutiny process in the coming months.
Plenty in and around the UK’s Online Safety proposal still remains unclear, though — not least how well (or poorly) the regime will work in practice. And what its multifaceted requirements will mean for in-scope digital businesses, large and small.
The detail of what exactly will fall into the fuzzier ‘legal but harmful’ content bucket, for example, will be set out in secondary legislation to be agreed by MPs — the latter being another new stipulation the government has announced today, arguing this will avoid the risk of tech giants becoming de facto speech police, which was one early criticism of the plan.
In what looks like a bid to play down further potential for controversy, the government’s press release couches the aims of the bill in very vanilla terms — saying it’s intended to ensure platforms “uphold their stated terms and conditions” (and who could argue with that?) — as well as arguing these are merely “balanced and proportionate” measures (and powers?) that will finally force tech giants to sit up, take notice and effectively tackle illegal and abusive speech. (Or else, well, their CEO might find themselves banged up in jail…!)
Unsurprisingly, digital rights groups have been quick to seize on this implicitly contradictory messaging — reiterating warnings that the legislation represents a massively chilling attack on freedom of expression. The Open Rights Group (ORG) wasted no time in likening the threat of prison for social media execs to powers being exercised by Vladimir Putin in Russia.
“Powers to imprison social media executives should be compared with Putin’s similar threats a matter of weeks ago,” said ORG’s executive director, Jim Killock, in a statement responding to DCMS’ latest revisions.
“The fact that the Bill keeps changing its content after four years of debate should tell everyone that it is a mess, and likely to be a bitter disappointment in practice,” he added.
“The Bill still contains powers for Ministers to decide what legal content platforms must try to remove. Parliamentary rubber stamps for Ministerial say-so’s will still compromise the independence of the regulator. It would mean state sanctioned censorship of legal content.”
The government’s response to criticism of the potential impact on freedom of speech includes touting requirements in the bill for social media firms to “protect journalism” and “democratic political debate”, as its press release puts it — although it’s rather less clear how (or whether) platforms can actually do that.
Instead DCMS reiterates that “news content” (hmm, does that cover anyone online who claims to be a journalist?) has been given a carve out — emphasizing that this particular definition-stretching category is “completely exempt from any regulation under the bill”. (So, well, ‘compliance’ already sounds hella messy*.)
On the headline-grabbing criminal liability risk for senior tech execs — likely a populist measure which the government is probably hoping helps drum up public support to drown out objecting expert voices like ORG’s — the secretary of state for digital, Nadine Dorries, had already signaled during parliamentary committee hearings last fall that she wanted to accelerate the application of criminal liability powers. (Memorably, she wasted no time brandishing the threat of faster jail time at Meta’s senior execs — saying they should focus on safety and forget about the metaverse.)
The original draft of the bill, which predated Dorries’ tenure heading up the digital brief, had deferred the power for at least two years. But that timeframe was criticized by child safety campaigners — who warned that unless the law has real teeth it will be ineffective, as platforms will just be able to ignore it. (And a pressing risk of jail time for senior tech executives, such as Meta’s Nick Clegg, a former deputy PM of the UK, could certainly concentrate certain C-suite minds on compliance.)
The speedier jail time power is by no means the first substantial revision of the draft bill, either. As Killock points out there has been a whole banquet of ‘revisions’ at this point — manifested, in recent weeks, as the Department for Digital, Culture, Media and Sport (DCMS) putting out a running drip-feed of announcements that it’s further expanding the scope of the bill and amping up its power.
This has included bringing scam ads and porn websites into scope (in the latter case to force them to use age verification technologies); expanding the list of criminal content added to the face of the bill and introducing new criminal offenses — including cyberflashing; and setting out measures to tackle anonymous trolling by leaning on platforms to squeeze freedom of reach.
Two parliamentary committees which scrutinized the original proposal last year went on to warn of major flaws — and urged a series of changes — recommendations that DCMS has said it has taken on board in making these revisions.
There are even more extras today: Including more new offences (information-related ones) being added to the bill — to make in-scope companies’ senior managers criminally liable for destroying evidence; failing to attend or providing false information in interviews with Ofcom; and for obstructing the regulator when it enters company offices.
DCMS notes that it’s breaching these offences that could see senior execs of major platforms face up to two years in prison or a fine.
Another addition, related to what the government describes as “proactive technology” — aka tools for content moderation, user profiling and behavior identification that are intended to “protect users” — arrives in the form of extra provisions being added to allow Ofcom to “set expectations for the use of these proactive technologies in codes of practice and force companies to use better and more effective tools, should this be necessary.”
“Companies will need to demonstrate they are using the right tools to address harms, they are transparent, and any technologies they develop meet standards of accuracy and effectiveness required by the regulator,” it adds, also stipulating that Ofcom will not be able to recommend these tools are applied on private messaging or legal but harmful content.
Platforms will also now be required to report CSAM content that they detect on their platforms directly to the National Crime Agency, in another change that replaces an existing voluntary reporting regime and which DCMS says “reflects the government’s commitment to tackling this horrific crime”.
“Reports to the National Crime Agency will need to meet a set of clear standards to ensure law enforcement receives the high quality information it needs to safeguard children, pursue offenders and limit lifelong re-victimisation by preventing the ongoing recirculation of illegal content,” it also specifies, adding: “In-scope companies will need to demonstrate existing reporting obligations outside the UK to be exempt from this requirement, which will avoid duplication of companies’ efforts.”
That the government has made so many revisions to what it likes to brand “world-leading” legislation, even before formal parliamentary debate kicks off, suggests accusations that the proposal is both overblown and half-baked will be hard to shake.
MPs may also identify a lack of coherence being costumed in populist conviction and spy an opportunity to grandstand and press for their own personal pet hates to be rolled into the mix too (as one former minister of state has warned) — with the risk that a bill born lumpy ends up even more unwieldy and laden with impossible asks.
*A line in DCMS’ own press release appears to concede at least one looming mess — and/or the need for even more revisions/measures to be added — noting: “Ministers will also continue to consider how to ensure platforms do not remove content from recognized media outlets.”