This is the best summary I could come up with:
The Online Safety Bill has taken years to agree and will force firms to remove illegal content and protect children from some legal but harmful material.
The bill has had a lengthy and contentious journey to becoming law, beginning six years ago when the government committed to the idea of improving internet safety.
The idea that inspired the bill was relatively simple, scribbled down on the back of a sandwich packet by two experts, Prof Lorna Woods of the University of Essex and William Perrin of the charitable foundation Carnegie UK.
Dame Melanie Dawes, chief executive of Ofcom, called the bill’s passage through parliament “a major milestone in the mission to create a safer life online for children and adults in the UK.”
“Very soon after the Bill receives Royal Assent, we’ll consult on the first set of standards that we’ll expect tech firms to meet in tackling illegal online harms, including child sexual exploitation, fraud and terrorism,” she added.
There is a lot staked on the success of the bill - not only the safety of children and adults, but also the UK’s ambitions as a tech hub and possibly, if things go wrong, continued access to popular online services.
The original article contains 785 words, the summary contains 201 words. Saved 74%. I’m a bot and I’m open source!
Kinda left out the important bits, quoted below:
Platforms will also need to show they are committed to removing illegal content including:
child sexual abuse
controlling or coercive behaviour
extreme sexual violence
illegal immigration and people smuggling
promoting or facilitating suicide
promoting self-harm
animal cruelty
selling illegal drugs or weapons
terrorism
New offences have also been included in the bill, including cyber-flashing and the sharing of “deepfake” pornography.
And the bill includes measures to make it easier for bereaved parents to obtain information about their children from tech firms.
Online safety campaigner Ian Russell has told the BBC the test of the bill will be whether it prevents the kind of suicide and self-harm content his daughter Molly saw on sites such as Instagram and Pinterest before she took her own life.
Digital rights campaigners the Open Rights Group said the bill posed “a huge threat to freedom of expression with tech companies expected to decide what is and isn’t legal, and then censor content before it’s even been published”.
Lawyer Graham Smith, author of a book on internet law, said the bill had well-meaning aims, but in the end it contained much that was problematic.
“If the road to hell is paved with good intentions, this is a motorway,” he told the BBC.
He said it was “a deeply misconceived piece of legislation”, and the threat it posed to legitimate speech was likely to be “exposed in the courts”.
And popular messaging services such as WhatsApp and Signal have threatened to refuse to comply with powers in the bill that would force them to examine the contents of encrypted messages for child abuse material.
Wikipedia has also said it can’t comply with some of the requirements of the bill.
After Royal Assent, the baton will pass to the communications regulator, Ofcom, which will be largely responsible for enforcing the bill.
It will draw up codes of conduct that will provide guidance on how to comply with the new rules.
Those that fail to comply can face fines of up to £18m and, in some cases, executives could face imprisonment.
I guess I’m an old fuddy-duddy taking crazy pills because nothing in this seems bad to me. Hell, quite a few parents have had their kids commit suicide after viewing suicide content online, this would literally save lives. And the tech companies should take some responsibility for what’s on their platforms.
This seems like the digital equivalent of burning books. Rather than controlling what people can read, shouldn’t we be doing more about the underlying reasons that mental health has taken a dive, such as the cost of living, climate change, the cost of further education and, you know, giving people a reason to feel optimistic about the future?
Dude, it’s social media sites being more responsible for what they host: child rape, suicide, animals being stomped to death. Like, you get that, right?
They still have their encrypted stuff; privacy is mostly intact. All this is doing is forcing the shitty stuff that’s being posted there to be more forcibly removed. Nobody is “burning your books” by holding Meta more responsible.
That was the original intent - that sole thing. Stop kids accessing harmful content. It’s now morphed into a legislative tool for mass surveillance.
Citation from a non-biased source badly needed.
*ends up linking an article that counters nearly everything he said was bad about this bill but then smugly continues on posting as if it didn’t*
Yeah you’re totally grounded in reality and not emotionally invested in this. Carry on, b.
Read the Bill?
If you want a brief overview, lawyer and legal author Graham Smith spoke to the BBC about it, all of which was taken from his pocket guide.
Or, The Verge just published a general overview.
The bill…imposes strict requirements on large social platforms to remove illegal content.
Oh no!
Additionally, the Online Safety Bill mandates new age-checking measures to prevent underage children from seeing harmful content.
That’s awful!
It also pushes large social media platforms to become more transparent about the dangers they pose to children, while also giving parents and kids the ability to report issues online. Potential penalties are also harsh: up to 10 percent of a company’s global annual revenue.
Won’t somebody think of the corporations!
the bill could also put encrypted messaging services, like WhatsApp, at risk. Under the terms of the bill, encrypted messaging apps would be obligated to check users’ messages for child sexual abuse material.
Absolutely disgusting overreach!
Signal president Meredith Whittaker, meanwhile, issued tentative praise for the ongoing conversation around the bill. “While it’s not everything we wanted, we are more optimistic than we were when we began engaging with the UK government. It matters that the government came out publicly, clearly acknowledging that there is no technology that can safely and privately scan everyone’s communications,” Whittaker said.
Now the president of Signal is onboard for some reason?!? They must have been a privacy poser this whole time!
…… yeah thanks for linking that article, it really cleared things up on the imminent danger that policing the internet, for the first time with consequences, will hold for us all. Jesus Christ, there might be less death, violence, gore, CSAM, and hate on The Internet for once, absolutely appalling. /MASSIVEFUCKING-S
I think it’s one of those things where the intent is good, but the implementation will cause issues. Another risk is the laws being abused under the guise of protection. At the same time, it’s an important issue to try to address.
Encrypted messaging, for example: it’s impossible to have secure, end-to-end encrypted messaging while also scanning the contents for issues. The best you could do is local (client-side) scanning, but that won’t be effective at all (it’ll block legitimate content and let through harmful stuff).
If you get rid of encrypted messaging, that will make a lot of day-to-day work impossible, and it would harm those who need the protection of encrypted messages (e.g. journalists, whistleblowers, those under totalitarian/authoritarian governments).
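To make the scanning-versus-encryption point concrete, here is a minimal, hypothetical Python sketch (it assumes the third-party cryptography package and is not modelled on any real messenger or on anything in the bill): because only the endpoints hold the key, the relay server never sees anything but ciphertext, so any content scanning would have to run on the sender's or recipient's device.

```python
# Toy illustration only, not a real protocol. With end-to-end encryption the
# key lives only on the endpoints, so a relay server cannot read (or scan) the
# messages it forwards; any scanning would have to happen on the device itself.
from cryptography.fernet import Fernet  # assumed third-party dependency

shared_key = Fernet.generate_key()   # known to sender and recipient only
endpoint = Fernet(shared_key)

def client_send(plaintext: str) -> bytes:
    # Any "client-side scanning" would have to run here, on the plaintext,
    # because this is the last point at which the content is readable.
    return endpoint.encrypt(plaintext.encode())

def server_relay(ciphertext: bytes) -> bytes:
    # The server can store and forward the bytes, but without the key it
    # cannot inspect what they say.
    return ciphertext

def client_receive(ciphertext: bytes) -> str:
    return endpoint.decrypt(ciphertext).decode()

print(client_receive(server_relay(client_send("hello"))))  # prints: hello
```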
This seems to be misinformation being spread around? I don’t live in the UK so I can only go by what I research on the internet, and it doesn’t seem to do anything to end end-to-end encryption. (That was fun to type!)
There will still be apps and platforms you can use encrypted, social media included. They just want ways to access the encrypted information on harmful social media sites, as a way to enforce the safety standards, which makes perfect sense. It’s social media, not the DoD.
People can move over to Signal or use actual apps meant for encryption. Facebook should 100% be able to see what is going on and being said on their platforms; you have no expectation of privacy there, my guy. Same for all social media. It’s a publicly facing service so it needs to be guarded and monitored same as any other, and it’s well past time we started holding the platforms responsible.
Maybe once they start facing fines for not only allowing but actively pushing, through their algorithms, nothing but horrible and hateful content, they’ll do a better job of moderating their environments.
The apps you’re talking about are the ones being targeted - encrypted chat apps. Those apps (including Signal, WhatsApp, iMessage, Session, etc.) have all said they’ll pull out of the UK market if this happens.
You guys need to read the article then, you’re freaking out over nothing because those apps are not targeted in the law that’s been passed. They only left in the parts demanding social media take responsibility for what they platform.
Seriously, read the Bill.
Edit: The Verge just published a general overview.
I got that from the article though, it’s in the bit I quoted as well.
I’m not from the UK so I was using the articles.
You don’t know anything about how technology or even communication works then.
Oh, that’s rich! You guys are like Reddit Jr with your hilariously ignorant takes! I could be a leader of the tech sector for all you know about me. Please, assume more.