“What would it mean for your business if you could target potential clients who are actively discussing their need for your services in their day-to-day conversations? No, it's not a Black Mirror episode—it's Voice Data, and CMG has the capabilities to use it to your business advantage.”
Services that “listen” for commands, like Siri and Alexa, have to be always listening by default, because otherwise they couldn’t hear the activation command in the first place. They’re supposed to dump the excess data, like anything captured before the activation command, but that’s just a promise. There are very few laws protecting you if that promise turns out to be a lie. The best you can hope for is a small payout from a class action lawsuit, assuming you didn’t waive your right to one by agreeing to the Terms of Service, which, more often than not these days, you did.
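For the curious, here’s roughly what that “dump the excess data” mechanism looks like: a rolling buffer that old audio continuously falls out of, with nothing leaving the device unless the wake word fires. A minimal Python sketch; the buffer sizes, `detect_wake_word`, and `uplink` are hypothetical stand-ins, not any vendor’s actual code.

```python
from collections import deque

SAMPLE_RATE = 16_000   # samples/sec, typical for speech audio
BUFFER_SECONDS = 2     # how much pre-wake-word context is kept
FRAME = 512            # samples per audio frame

# Rolling buffer: old frames fall off the left as new ones arrive,
# so audio from before the wake word is continuously overwritten.
ring = deque(maxlen=(SAMPLE_RATE * BUFFER_SECONDS) // FRAME)

def detect_wake_word(frame: bytes) -> bool:
    """Stand-in for a real on-device keyword-spotting model."""
    return frame.startswith(b"hey")  # toy placeholder

def handle_frame(frame: bytes, uplink) -> None:
    ring.append(frame)
    if detect_wake_word(frame):
        # Only now does audio leave the device: the buffered context
        # goes upstream (a real device would keep streaming from here).
        for buffered in ring:
            uplink(buffered)
        ring.clear()
    # If no wake word fires, frames simply age out of the deque.
    # "Dumping the excess data" is exactly this overwrite: a policy
    # the vendor promises to follow, not something you can audit.
```

The point of the sketch is that discarding pre-wake-word audio is one `if` statement of vendor policy, which is why “that’s just a promise” is the right way to think about it.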
Of fucking course they’re listening.
Where are they hiding that data locally, and how are they making it invisible in transit?
They’re not. Not yet. People are on edge and hunting for exactly this, and so far nobody has found it. Meanwhile, they’ve already built a pretty damn good profile of you from your search queries and mistyped URLs.
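And that profile doesn’t require a microphone. A toy Python sketch of the kind of keyword-to-segment bucketing ad profiles rest on; the event log, keywords, and segment names are all invented for illustration. Mistyped domains show up because the DNS lookup still happens and fails, even though no page ever loads.

```python
from collections import Counter

# Hypothetical browsing log: search queries plus mistyped domains.
events = [
    ("search",   "best divorce lawyer near me"),
    ("search",   "divorce asset split calculator"),
    ("nxdomain", "familylawyerz.com"),   # typo'd URL, still logged
    ("search",   "apartment listings"),
]

# Made-up keyword -> advertising segment mapping.
SEGMENTS = {
    "divorce":   "family-law services",
    "lawyer":    "family-law services",
    "apartment": "housing/moving",
}

profile = Counter()
for _, text in events:
    for keyword, segment in SEGMENTS.items():
        if keyword in text:
            profile[segment] += 1

# Segments like these are what advertisers actually buy against.
print(profile.most_common())
# [('family-law services', 4), ('housing/moving', 1)]
```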