Another UK parliamentary committee has weighed in on the government’s controversial plan to regulate internet content with a broad-brush focus on ‘safety’.
The Digital, Culture, Media and Sport (DCMS) Committee warned in a detailed report today that it has “urgent concerns” that the bill “does not adequately protect freedom of expression, nor is it clear and robust enough to deal with the various types of illegal and harmful content on user-to-user and search services”.
Among the committee’s numerous concerns is how loosely the bill defines different types of harm, such as illegal content – and harm designations generally – with MPs drawing attention to the government’s failure to include more detail in the bill itself. That omission makes the bill’s impact harder to judge, since key components (such as Codes of Practice) will follow via secondary legislation and so are not yet on the table.
This general imprecision, combined with the complexities of the chosen “duty of care” approach – which the report notes in fact breaks down into several specific duties (in relation to illegal content; content that poses a risk to children; and, for a subset of high-risk user-to-user services, content that poses a risk to adults) – means the proposed framework may not be able to deliver the intended “comprehensive safety regime”, in the committee’s view.
The bill also creates risks for freedom of expression, according to the committee – which recommends the government incorporate a balancing test for the regulator, Ofcom, to assess whether platforms have “properly balanced their freedom of expression obligations with their decision-making”.
The risk of platforms responding to sweeping and ill-defined responsibilities around wide swaths of content by over-removing speech – with a resulting chilling effect on freedom of expression in the UK – is one of the many criticisms levelled at the bill that the committee appears to be picking up on.
The committee suggests the government reformulate the definitions of harmful content and the relevant safety duties to bring the bill in line with international human rights law – the aim being to guard against the risk of over-removal by providing “minimum standards against which a provider’s actions, systems and processes to deal with harm, including automated or algorithmic content moderation, must be judged”.
Even on child safety – a core focus that UK ministers have repeatedly attached to the legislation – the committee flags “weaknesses” in the draft bill which, it says, mean the proposed regime “does not adequately map onto the reality of the problem”.
They urge the government to go further in this area, calling for the bill to be expanded to cover “technically legal” practices such as breadcrumbing (i.e. “where criminals deliberately subvert the boundaries of criminal activity and content removal by a service provider”) – citing witness testimony suggesting that the practice, while not illegal, “nevertheless forms part of the sequence of online CSEA [child sexual exploitation and abuse]”.
Likewise, the committee suggests the bill needs to go further to protect women and girls from forms of online violence and abuse targeted specifically at them (such as tech-enabled “nudification” of women and deepfake pornography).
On Ofcom’s platform investigation powers, the committee argues these need further strengthening – calling for amendments to give the regulator the power to “conduct confidential audits or verification of a service’s systems to assess their operation and results in practice”, and to “request generic information on how content is disseminated through a service”, with lawmakers further suggesting the bill should spell out in more detail the types of data Ofcom may request from platforms (presumably to guard against platforms seeking to evade effective oversight).
However, on enforcement, the committee’s concerns run in the other direction: it is worried about a lack of clarity over how Ofcom’s very substantial powers (as they should be) can be used against platforms.
It recommends a number of tweaks, such as making it clear that these powers only apply to in-scope services.
Lawmakers also call for a rework of so-called “technology notices” – which will allow the regulator to mandate the use of new technologies (after “persistent and prevalent” failures of the duty of care) – saying the scope and application of that power must be defined “more rigidly”, with more practical information provided on the steps required to bring providers into compliance, and more detail on how Ofcom will test whether use of the power is proportionate.
Here, the committee flags the potential for business disruption. It also suggests the government take the time to assess whether these powers are “appropriately future-proofed, given the advent of technologies like VPNs and DNS over HTTPS”.
Other recommendations in the report include a call for the bill to contain more clarity on the issue of redress and judicial review.
The committee also cautions the government against creating a dedicated joint committee to oversee online safety and digital regulation, arguing that parliamentary scrutiny is “best served by existing, independent and cross-party select committees and evidenced by the work we have done and will continue to do in this area”.
It remains to be seen how much attention the government pays to the committee’s recommendations – although Secretary of State for Digital Nadine Dorries has previously suggested she is open to accepting parliamentary feedback on the broad package of legislation.
The DCMS Committee’s report follows earlier recommendations – made in December – by a joint parliamentary committee set up to scrutinise the bill, which likewise warned that the draft risks falling short of the government’s safety goals.
The government published the draft Online Safety Bill in May 2021 – setting out a long-trailed plan to impose a duty of care on internet platforms with the aim of protecting users from a range of harms: from content that is already illegal (such as terrorist propaganda, child sexual abuse material and hate speech) through to more broadly problematic but not necessarily illegal content, such as bullying or material promoting eating disorders or suicide (which can pose disproportionate risks to younger users of social media platforms).
Speaking to the joint committee back in November, Dorries predicted the legislation will usher in a systemic shift in internet culture – telling lawmakers it will create a “huge, huge” change in how internet platforms operate.
The bill, which is still making its way through parliament, targets a wide range of internet platforms and provides for safety-focused governance standards to be enforced through regulatory codes of practice, overseen by Ofcom in an expanded role – including powers to issue substantial penalties for violations.
The sweeping scope of the regulation – the intent that the law target not just illegal content spreading online but also material falling into a greyer area, where restrictions risk impinging on freedom of expression – means the proposal has drawn heavy criticism from digital and civil rights groups, as well as from businesses worried about liability and the compliance burden.
At the same time, the government has been intensifying its attacks on platforms’ use of end-to-end encryption – with rhetoric that seeks to imply robust security is a barrier to catching pedophiles (see, for example, the government-backed NoPlaceToHide PR campaign recently revealed to be trying to turn the public against E2E encryption). Critics are thus also concerned that ministers are trying to subvert internet security and privacy by recasting good practice as a barrier to a ‘child safety’ objective enforced through mass digital surveillance.
On that front, the Home Office has also in recent months been spending a chunk of taxpayer money to encourage the development of technologies that could be applied to E2EE systems to detect child sexual abuse material – which it suggests could offer a middle ground between robust security and law enforcement’s data-access demands.
Critics of the bill, however, argue that wielding a trumped-up claim of child ‘protection’ as a populist lever to push for the removal of the strongest security and privacy protections for all internet users – while simultaneously encouraging a cottage industry of commercial providers to spring up selling ‘child protection’ surveillance services – is far closer to gaslighting than protection.
Stepping back, there is also plenty of concern about the risk of the UK over-regulating its digital economy – and about the bill becoming a parliamentary “hobby horse” for all manner of online grievances, as one former minister of state put it. Complex and ill-defined content regulation could end up as a disproportionate burden on UK startups versus tech giants like Facebook, whose self-serving approach to content moderation fuelled calls for internet regulation in the first place – as well as being extremely damaging to the human rights of internet users in the UK.