The UK’s media watchdog Ofcom is to be given the power to regulate social media companies, holding them to account for harmful content such as violence or child abuse.
Digital, Culture, Media and Sport secretary Nicky Morgan is introducing measures based on a government white paper launched in April last year, which called for fines, site blocks and the prosecution of senior management at companies that fail to protect their users.
Sites such as Facebook, YouTube, Snapchat and Twitter would have a ‘duty of care’ requiring them to remove harmful material such as terrorist content, child abuse, revenge porn and fake news. However, the plan will also affect any companies that allow the sharing of user-generated content – for example, through comments, forums or video sharing.
“With Ofcom at the helm of a proportionate and strong regulatory regime, we have an incredible opportunity to lead the world in building a thriving digital economy, driven by groundbreaking technology, that is trusted by and protects everyone in the UK,” says Morgan.
“We will give the regulator the powers it needs to lead the fight for an internet that remains vibrant and open but with the protections, accountability and transparency people deserve.”
The plan is believed to be a temporary one, with a new ‘online harms regulator’ to be appointed following further legislation.
The move has been widely welcomed – but there are concerns about how effectively it will be implemented.
“For many years I have been the target of personal attacks online, mainly by anonymous accounts but also by those who happily use their own identity safe in the knowledge that nothing will be done. I am far from alone in being subjected to such abuse,” comments DUP MP Carla Lockhart.
“Key to the success of this move will be the sanctions that Ofcom will be able to apply. A token slap on the wrist will not be enough to protect users and it is vital that regulators are empowered with the ability to impose serious fines and penalties on social media companies.”
And there are serious questions over Ofcom’s capacity to take on the task – even temporarily. The watchdog already has responsibility for TV and radio, but the online world is vastly larger. It’s not clear where Ofcom would get the resources to monitor, potentially, every tweet, Facebook post or YouTube video available in the UK.
Indeed, its remit will also include “paying due regard to safeguard free speech, defending the role of the press, promoting tech innovation and ensuring businesses do not face disproportionate burdens”.
Conversely, if the regulator is genuinely given the resources for the job, that’s a massive concentration of power in one organization – especially when its remit covers such broad concepts as online bullying.
Indeed, the Internet Association – which represents major internet firms including Facebook, Twitter and Google – has warned that such measures could “hurt the British tech sector, worsen the quality of internet services for ordinary consumers, undermine privacy, and produce a chilling effect on freedom of speech”.