A quick update - The European Commission and EU member states have been pondering, for years now, whether they should force WhatsApp/Apple/Signal/Telegram to scan all our private messages for suspected child sexual abuse material (CSAM). For various reasons it is a horrendous idea to break end-to-end encryption in this likely highly ineffective way. Variations of the proposal have also included a mandate to scan images using AI, and even to read our text messages to check whether we are "grooming" children.
How tf do you even do that? I get how you'd do it on a stock proprietary OS, but what about open OSes? I doubt something this complex and autonomous could be hidden like the XZ backdoor. And if some OS complied, wouldn't people just fork it and remove the malware?
And then there are desktops, which are much easier and more universal to make private…