(UK) The Online Safety Act received Royal Assent on Thursday 26 October, heralding a new era of internet safety and choice by placing world-first legal duties on social media platforms.
The new laws take a zero-tolerance approach to protecting children from online harm, while empowering adults with more choices over what they see online. This follows rigorous scrutiny and extensive debate within both the House of Commons and the House of Lords.
The Act places legal responsibility on tech companies to prevent and rapidly remove illegal content, such as terrorist material and revenge pornography. They will also have to stop children seeing material that is harmful to them, such as bullying, content promoting self-harm and eating disorders, and pornography.
If they fail to comply with the rules, they will face significant fines that could reach billions of pounds, and if they don’t take steps required by Ofcom to protect children, their bosses could even face prison.
Technology Secretary Michelle Donelan said:
Today will go down as an historic moment that ensures the online safety of British society not only now, but for decades to come.
I am immensely proud of the work that has gone into the Online Safety Act from its very inception to it becoming law today. The Bill protects free speech, empowers adults and will ensure that platforms remove illegal content.
At the heart of this Bill, however, is the protection of children. I would like to thank the campaigners, parliamentarians, survivors of abuse and charities that have worked tirelessly, not only to get this Act over the finishing line, but to ensure that it will make the UK the safest place to be online in the world.
The Act takes a zero-tolerance approach to protecting children by making sure the buck stops with social media platforms for content they host. It does this by making sure they:
- remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm
- prevent children from accessing harmful and age-inappropriate content, including pornographic content, content that promotes, encourages or provides instructions for suicide, self-harm or eating disorders, content depicting or encouraging serious violence, and bullying content
- enforce age limits and use age-checking measures on platforms where content harmful to children is published
- ensure social media platforms are more transparent about the risks and dangers posed to children on their sites, including by publishing risk assessments
- provide parents and children with clear and accessible ways to report problems online when they do arise
Home Secretary Suella Braverman said:
This landmark law sends a clear message to criminals – whether it’s on our streets, behind closed doors or in far-flung corners of the internet, there will be no hiding place for their vile crimes.
The Online Safety Act’s strongest protections are for children. Social media companies will be held to account for the appalling scale of child sexual abuse occurring on their platforms and our children will be safer.
We are determined to combat the evil of child sexual exploitation wherever it is found, and this Act is a big step forward.
Lord Chancellor and Secretary of State for Justice, Alex Chalk said:
No-one should be afraid of what they or their children might see online, so our reforms will make the internet a safer place for everyone.
Trolls who encourage serious self-harm, cyberflash or share intimate images without consent now face the very real prospect of time behind bars, helping protect women and girls who are disproportionately impacted by these cowardly crimes.
In addition to protecting children, the Act also empowers adults to have better control of what they see online. It provides 3 layers of protection for internet users which will:
- make sure illegal content is removed
- enforce the promises social media platforms make to users when they sign up, through terms and conditions
- offer users the option to filter out content, such as online abuse, that they do not want to see
If social media platforms do not comply with these rules, Ofcom could fine them up to £18 million or 10% of their global annual revenue, whichever is greater – meaning fines handed down to the biggest platforms could reach billions of pounds.
The government also strengthened provisions to address violence against women and girls. Through the Act, it will be easier to convict someone who shares intimate images without consent and new laws will further criminalise the non-consensual sharing of intimate deepfakes.
The change in the law also makes it easier to charge abusers who share intimate images and to put more offenders behind bars. Criminals found guilty of this base offence will face up to six months in prison, but those who threaten to share such images, or share them with the intent to cause distress, alarm or humiliation, or to obtain sexual gratification, could face up to two years behind bars.
NSPCC Chief Executive, Sir Peter Wanless said:
Having an Online Safety Act on the statute book is a watershed moment and will mean that children up and down the UK are fundamentally safer in their everyday lives.
Thanks to the incredible campaigning of abuse survivors and young people and the dedicated hard work of Parliamentarians and Ministers, tech companies will be legally compelled to protect children from sexual abuse and avoidable harm.
The NSPCC will continue to ensure there is a rigorous focus on children by everyone involved in regulation. Companies should be acting now, because the ultimate penalties for failure will be eye-watering fines and, crucially, criminal sanctions.
Dame Melanie Dawes, Ofcom Chief Executive, said:
These new laws give Ofcom the power to start making a difference in creating a safer life online for children and adults in the UK. We’ve already trained and hired expert teams with experience across the online sector, and today we’re setting out a clear timeline for holding tech firms to account.
Ofcom is not a censor, and our new powers are not about taking content down. Our job is to tackle the root causes of harm. We will set new standards online, making sure sites and apps are safer by design. Importantly, we’ll also take full account of people’s rights to privacy and freedom of expression.
We know a safer life online cannot be achieved overnight; but Ofcom is ready to meet the scale and urgency of the challenge.
In anticipation of the Bill coming into force, many social media companies have already started making changes. TikTok has implemented stronger age verification on its platform, while Snapchat has started removing the accounts of underage users.
While the Bill has travelled through Parliament, the government has worked closely with Ofcom to ensure protections will be implemented as quickly as possible once the Act received Royal Assent.
From today, Ofcom will immediately begin work on tackling illegal content, with a consultation process launching on 9 November 2023. It will then take a phased approach to bringing the Online Safety Act into force, prioritising the enforcement of rules against the most harmful content as soon as possible.
The majority of the Act’s provisions will commence in two months’ time. However, the government has commenced key provisions early to establish Ofcom as the online safety regulator from today, allowing it to begin key preparatory work, such as consultations, so that protections can be implemented as quickly as possible.
Via Gov.uk