Facebook, Instagram and Google-owned YouTube could face millions in penalties in Britain — up to 5 per cent of their revenue, along with restriction or suspension of their services — if they fail to remove harmful content from their platforms. According to a CNET report late Monday, the British government has decided to appoint the telecom and broadcasting regulator Ofcom to ensure social media platforms are tackling the spread of content that promotes violence, child abuse or pornography.
Ofcom will be given powers to investigate and fine social platforms for sharing or live-streaming “harmful” videos, and the watchdog will take charge of policing social media from September 19. “These new rules are an important first step in regulating video-sharing online, and we’ll work closely with the government to implement them,” an Ofcom spokesperson said in a statement.
The British government became serious about reining in social media platforms after the 2017 suicide of 14-year-old Molly Russell, who had been using Instagram to view self-harm imagery.
In April, the British government announced it would appoint an independent regulator to keep social media companies in check. The government’s “online harms” white paper aims to make online platforms liable for protecting their users, especially children. “Online platforms can be a tool for abuse and bullying, and they can be used to undermine our democratic values and debate.
“The impact of harmful content and activity can be particularly damaging for children, and there are growing concerns about the potential impact on their mental health and well-being,” the white paper reads. It proposes a mandatory “duty of care” requiring social media platforms to take reasonable steps to protect their users from a range of harms, and states that the government will establish this as a new statutory duty to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services.