Technology companies such as Instagram, Facebook and Twitter are set to face a statutory duty to protect UK users against a broad range of harmful content or risk “heavy” fines.
Plans for an industry-funded regulator, which would enforce rules on removing online content that encourages terrorism and child sexual exploitation and abuse, are part of a push by Prime Minister Theresa May’s government to hold the companies accountable. Enforcement powers could include blocking access to sites and imposing liability on individual company managers.
The Department for Digital, Culture, Media & Sport laid out the proposals as it opened a 12-week consultation on the measures on Monday.
“Online companies must start taking responsibility for their platforms, and help restore public trust in this technology,” May said. “For too long these companies have not done enough to protect users, especially children and young people, from harmful content.”
The plans weren’t universally welcomed. The Institute of Economic Affairs, a pro-market research group, labelled them “draconian” and more likely to do harm than good by holding back innovation. Giving the government power to dictate what content is appropriate sets a dangerous precedent, director-general Mark Littlewood said.
Broad reach
The proposed laws will apply to any company that allows users to share or find user-generated content or interact with each other online, such as social media platforms, file hosting sites, public discussion forums, messaging services and search engines.
Other proposals outlined by the government include:
- Ensuring companies respond to user complaints and act on them quickly;
- Codes of practice, which could include requirements to work with fact checkers to minimise the spread of misleading or harmful disinformation, particularly during elections;
- Annual transparency reports on harmful content and companies’ action to address it;
- A framework to help companies incorporate safety features in new products; and
- A strategy to educate users on how to recognise and deal with malicious behaviour online.
Damian Collins, a Conservative who chairs the Digital, Culture, Media and Sport Committee, cited the terrorist attack in New Zealand in which 50 Muslims were killed and video of the assault was live-streamed online.
“A regulator should have the power to investigate how content of that atrocity was shared and why more was not done to stop it sooner,” he said. — Reported by Lucy Meakin and Kitty Donaldson, (c) 2019 Bloomberg LP