Dawes named new Head of Ofcom as the regulator adds the policing of social media to its remit

Updated 11am, 12 February 2020
Dame Melanie Dawes has been appointed the new head of Ofcom as the UK telecoms and media regulator takes on the remit of policing social media platforms for illegal and harmful content such as violence, terrorism, cyberbullying and child abuse. Lord Burns will step down as Chair to enable a new Chair to be in place by the end of this year.
The move is an interim measure while the government prepares its ‘Online Harms Bill’. Ofcom’s remit will be wide-reaching, affecting firms hosting user-generated content such as comments, forums and video sharing, which will be expected to remove offending content “quickly” and to “minimise the risks” of it being published.
DCMS Secretary of State Nicky Morgan said: “With Ofcom at the helm of a proportionate and strong regulatory regime, we have an incredible opportunity to lead the world in building a thriving digital economy, driven by ground-breaking technology, that is trusted by and protects everyone in the UK”.
Home Secretary Priti Patel said: “It is incumbent on tech firms to balance issues of privacy and technological advances with child protection. That’s why it is right that we have a strong regulator to ensure social media firms fulfil their vital responsibility to vulnerable users”.
The government’s initial response states:

  • Platforms will need to ensure that illegal content is removed quickly and will have to minimise the risk of it appearing, with particularly robust action on terrorist content and online child sexual abuse.
  • Ofcom will be given a clear responsibility to protect users’ rights online. This will include paying due regard to safeguarding free speech, defending the role of the press, promoting tech innovation and ensuring businesses do not face disproportionate burdens.
  • The regulations will not stop adults from accessing or posting legal content that some may find offensive. Instead, companies will be required to state explicitly what content and behaviour are acceptable on their sites, in clear and accessible terms and conditions, and to enforce these effectively, consistently and transparently.
  • The regulation will only apply to companies that allow the sharing of user-generated content – for example, through comments, forums or video sharing. Fewer than 5% of UK businesses will be in scope. Business-to-business services which pose a low risk to the general public will not be in scope. A business simply having a social media presence does not necessarily mean it will be in scope.
  • It will be up to Ofcom to monitor new and emerging online dangers and take appropriate enforcement action.

While the proposed legislation will focus on Internet giants such as Facebook, Snapchat, Twitter, YouTube and TikTok and their duty of care, it will also pull into the net a large number of smaller firms that may find compliance onerous. With hundreds of thousands, if not millions, of sites to police, it seems likely that Ofcom will be forced to focus only on the biggest.
Germany has already taken this approach with its NetzDG law (2018). This applies to platforms with more than 2 million registered German users and imposes a duty to remove manifestly illegal content within 24 hours. Fines of up to €5 million can be levied on those who do not comply.
Australia has gone further, imposing severe penalties (up to 10% of global turnover) and jail sentences for executives of companies that fail to comply with its Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019.
It is not yet known what penalties, if any, Ofcom will be able to impose. In the White Paper issued last year, the government originally proposed fines, criminal liability similar to that introduced in Australia, and a requirement for ISPs to block access to sites that repeatedly fail to comply.
More controversially, in this White Paper the government said that it would hold firms liable for “behaviours which are harmful but not necessarily illegal”. This led to a backlash from commentators, who argued that such a move introduced censorship by the back door, since someone would have to decide, arbitrarily, what constituted a “harmful” act.
Still to be resolved are issues such as how Ofcom will police sites outside the UK – where the user may be British but the content and site reside abroad. This becomes even more complex if users employ VPNs, which disguise the origin of traffic.
Omnisperience’s view
The most worrying aspect of the proposed law is the possibility that its coverage could be extended from illegal content to the more nebulous concept of “harmful” content. This would open everything up to interpretation and court action to determine – potentially case by case – what constitutes harm. It would be seen as politically motivated state censorship and would be impossible to enforce on US sites because of the US’s free speech protections. The government appears to have backed away from this and is focusing on illegal content – specifically saying it will ensure Ofcom protects citizens’ rights to free speech.
The geographical remit of such legislation is also concerning. Giving UK-resident sites a duty of care to remove illegal content is one thing; trying to police the wider internet (sites that are not UK resident) would be a colossal undertaking, even if assisted by technology such as AI. It seems inevitable that site-blocking would have to occur, which relies on the UK’s service provider community to administer it. That in turn raises the question of who will pay for such a service.
Online content continues to be a highly politicised and heavily lobbied topic, but it should be remembered that the UK has a history of backsliding on such issues. Its equally controversial ‘porn block’, part of the Digital Economy Act (DEA) 2017, which would have forced porn sites to install age verification for all UK visitors, was effectively abandoned in October 2019 by Culture Secretary Nicky Morgan amid concerns about privacy and security. That story is not quite over yet, though: several of the vendors (AgeChecked, VeriMe, AVYourself and AVSecure) that stood to benefit from the regulations have lodged an application for judicial review in the High Court, alleging that Morgan acted outside her powers.