Child safety group Heat Initiative plans to launch a campaign pressing Apple over child sexual abuse material (CSAM) scanning and user reporting. The company issued a rare, detailed response on Thursday.
Yes, though my chief concern (others may have other concerns; this is just off the top of my head) was that it breaks a major barrier: explicitly user-hostile code would be running on the device itself, one I own. I’d say it’s more the equivalent of club employees entering your home to check your ID before, or during, your club visit, and using your restroom and eating a snack while they’re there (the scanning would use “your” device’s resources).
There’s also the trivial nature of flipping a require_iCloud_photos="true" value to "false", whether by intention or by accident. I have an open ticket with Apple support where my Apple Maps saved locations, favorites, guides, Home, reports, and reviews ALL vanished without a trace. I just got a callback today saying that engineering is aware of the problem and that it’s expected to be resolved in the next iOS update. In the meantime, I’m SOL. Accidents and problems can and do happen, and Apple isn’t the police.
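To make the worry concrete, here is a minimal sketch of the kind of gate being described. Everything in it is hypothetical: `require_iCloud_photos` is the commenter’s stand-in name, not a real Apple setting, and Apple never published how the check would actually be wired.

```swift
// Hypothetical sketch: a single boolean gating on-device scanning.
// `requireICloudPhotos` stands in for whatever internal flag would
// restrict scanning to photos queued for iCloud upload.
struct ScannerConfig {
    var requireICloudPhotos: Bool = true  // the promised safeguard
}

func shouldScan(photoQueuedForICloud: Bool, config: ScannerConfig) -> Bool {
    if config.requireICloudPhotos {
        // Honors the stated policy: only iCloud-bound photos are scanned.
        return photoQueuedForICloud
    }
    // One flipped value, by bug, bad update, or policy change, and
    // every local photo is in scope.
    return true
}
```

The point is simply that the entire safeguard hangs on one value that users can neither inspect nor pin down.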
And on top of that, there are concerns about upstream perversion of the CSAM database for other purposes. After all, who can audit it to ensure it’s used for CSAM exclusively, and who can add to it? Would images from the device and the database be pulled out for trials, or would it be a “trust the machine, the odds of false positives are x%” situation? (I believe those questions might have already been answered while the controversy was flying, but there are just a lot of cans of worms waiting to be opened with this, as well as Apple being pressured to scan for more things once the technology exists.)
The CSAM database isn’t controlled by Apple, and it’s already in use practically everywhere. Apple tried to compromise between allowing private encrypted image storage at scale and making sure they aren’t a hotbed for CSAM. Their competitors just keep the data unencrypted and scan it for content, which, last time I checked, is worse 🤷‍♂️
But Apple still fetches that list of hashes and can be made to send an alternative list to scan for.
It’s not very useful for much else. It only finds known copies of existing CSAM; it doesn’t detect new material. Governments could already force Apple to do whatever they want, so it’s a leap to say this is going to enable much more.
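A rough sketch of why that is: matching against a fixed hash list can only flag images already in the list. (This is simplified to exact SHA-256 matching for illustration; Apple’s actual design used a perceptual hash, NeuralHash, with a threshold scheme on top, but the limitation is the same.)

```swift
import CryptoKit
import Foundation

// Stand-in for the vendor-supplied database; hypothetical. In the real
// system the hash list would come signed from child-safety organizations.
func loadKnownHashes() -> Set<String> {
    return ["<hex digest of a known image>"]
}

// Simplified exact matching: an image flags only if its digest is
// already in the database. Novel images always fall through, which is
// why this mechanism cannot detect previously unseen material.
func matchesKnownDatabase(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```

Swapping in a different hash list changes what gets flagged, which is the concern raised above, but even then it only matches files that already exist in that list.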
You go way out of your way to lick Apple’s boot here. By comparing hashes to whatever Apple wants, or is told, to look for, you can profile everyone, find leaked material the government doesn’t want you to have, and so on. The fact that people just accept it, or even endorse it, is beyond me, but then again, after the last 3 years I’ve come to the conclusion that most people are scared to be free.
While scanning for leaked government documents is the first scenario I’ve heard that could be a problem for whistleblowers, I’ll point out that this scanning tech is already in use on major cloud platforms and no government has forced anyone to repurpose it. Building a database of all government documents like that wouldn’t be trivial either. It’s just not practical to use it that way.
I don’t care that it was Apple who did this; it presents a legitimate answer to E2E encryption of data while cutting many government arguments off at the knees. Without an answer, we are closer to E2E encryption being made illegal than we are to nothing happening.
Yes, that’s why I don’t use the cloud and have a de-Googled Android. The problem is that this is a slippery slope. I can say I don’t mind because it doesn’t affect me, but step by step they’ll outlaw everything else, even custom ROMs and alternative app stores. Either people push back against it, or this will get much worse down the line.
I don’t think it’s a slippery slope. That ship sailed when we started putting our data on other people’s computers. Your situation is extremely niche; not many people are going to go through that effort.