A quick update: the European Commission and EU member states have been pondering for years whether they should force WhatsApp/Apple/Signal/Telegram to scan all our private messages for suspected child sexual abuse material (CSAM). For various reasons it is a horrendous idea to break end-to-end encryption in this likely highly ineffective way. Variations of the proposal have also included a mandate to scan images using AI, and even to read our text messages to check whether we are “grooming” children.

Important excerpt from the assessment by the Dutch intelligence and security service AIVD:
“Introducing a scanning application on every mobile phone, with its associated infrastructure and management solutions, leads to an extensive and very complex system. Such a complex system grants access to a large number of mobile devices & the personal data thereon. The resulting situation is regarded by AIVD as too large a risk for our digital resilience. (…) Applying detection orders to providers of end-to-end encrypted communications entails too large a security risk for our digital resilience”.
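For context, the image scanning in these proposals generally works by comparing a perceptual fingerprint of each picture against a database of fingerprints of known abuse material, rather than by inspecting images byte for byte. The sketch below is a minimal illustration of that mechanism, using the standard dHash toy algorithm in Python; the `BLOCKLIST` contents and the distance threshold are made-up placeholders, and real systems (Microsoft’s PhotoDNA, Apple’s shelved NeuralHash) use far more robust and partly secret algorithms.

```python
# Minimal sketch of perceptual-hash scanning, for illustration only.
# dHash is a well-known toy perceptual hash; BLOCKLIST is hypothetical.
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    """Compute a 64-bit difference hash: compare adjacent pixel brightness."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical: in the proposals, an authority would ship a database of
# fingerprints of known CSAM to every device.
BLOCKLIST = {0x8F373714ACFCF4D0}

def scan(path: str, max_distance: int = 5) -> bool:
    """Flag the image if its hash is 'near' any blocklisted hash."""
    h = dhash(path)
    return any(hamming(h, bad) <= max_distance for bad in BLOCKLIST)
```

The point of the sketch is the architecture the AIVD is warning about: every device would carry an externally supplied blocklist plus a matcher that runs against private content before it is encrypted.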
How do you even do that? I can see how it would work on a stock proprietary OS, but what about open OSes? I doubt something this complex and autonomous could be hidden the way the XZ backdoor was. And if some OS complied, wouldn’t people just fork it and strip out the malware?

And then there are desktops, which are much easier and more universal to make private…