A quick update - For years now, the European Commission and EU member states have been pondering whether to force WhatsApp, Apple, Signal, and Telegram to scan all our private messages for suspected child sexual abuse material (CSAM). For various reasons it is a horrendous idea to break end-to-end encryption in this likely highly ineffective way. Variations of the proposal have also included a mandate to perform such scanning of images using AI, and even to read our text messages to check that we aren't "grooming" children.
They'll just do what they did before - "Our experts say: it's perfectly safe and secure. No, we won't tell you the names of our experts, to protect their privacy and personal safety."