Fresh off the press - this is why I don't like to give the "benefit of the doubt" to Big Tech, even though I will limit my comments to the available info:
bloomberg.com/news/articles/20

@diggity While this practice is expected (as the article notes, other companies do it as well), most users are definitely not aware of it, and I think many more people would be uncomfortable using these devices if they learned that this isn't all just being processed by computers. Some of that conversation happened during the Snowden revelations: is it okay if it's just computers "listening" rather than a human being? (Of course, it's never just computers.)

It's also another example of AI capabilities being over-sold to users.

Thanks for sharing!

@mikegerwitz Right, this will definitely strike a nerve with the public because it's real people listening. Traditionally, keyword analysis of speech has been expensive (money, storage, processing power), but that cost can be driven down by cheap labor and AI/ML techniques.
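[Editor's illustration: a toy Python sketch of what "keyword analysis" can look like once speech has already been machine-transcribed. The transcripts, keyword list, and flagging logic here are invented for illustration and are not anything a specific vendor is documented to run; the expensive step in practice is the speech-to-text itself, which is the cost ML has driven down.]

```python
import re
from collections import Counter

# Hypothetical output of an automatic speech-recognition (ASR) step.
# In practice, producing these transcripts is the costly part.
transcripts = {
    "clip-001": "did you order the new speaker yet or should I",
    "clip-002": "turn off the lights and play some music",
    "clip-003": "we should talk about the insurance claim tomorrow",
}

# Illustrative keyword list; a real pipeline might use a much larger,
# weighted vocabulary.
keywords = {"order", "insurance", "claim", "buy"}

def keyword_hits(text: str) -> Counter:
    """Count how often each keyword appears in a transcript."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(tok for tok in tokens if tok in keywords)

# Flag clips with any hits, e.g. for human review or further analysis.
for clip_id, text in transcripts.items():
    hits = keyword_hits(text)
    if hits:
        print(clip_id, dict(hits))
```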

But this is the #1 audience comment, and the reason journalists contact me: they suspect their conversations are being recorded and analyzed for their content.

When asked, I walk through other surveillance scenarios... 1/3


@mikegerwitz ...with the person who suspects their conversations are being analyzed for their content. Humans underestimate the power of metadata, social graphs, "traditional" analytics, and behavioral profiling. Why analyze the content of messages or speech when doing so is a PR nightmare and a potential legal liability?
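[Editor's illustration: a minimal sketch of the metadata point above, using invented call/message records. With nothing but who-contacted-whom metadata, you can already build a social graph and rank people by how connected they are, without ever touching the content of a single message.]

```python
from collections import defaultdict

# Hypothetical call/message metadata: (source, destination, timestamp).
# No content at all -- only who contacted whom and when.
records = [
    ("alice", "bob",   "2019-04-01T08:02"),
    ("alice", "carol", "2019-04-01T08:05"),
    ("bob",   "carol", "2019-04-01T21:40"),
    ("alice", "bob",   "2019-04-02T08:01"),
    ("dave",  "alice", "2019-04-02T23:15"),
]

# Build an undirected social graph from the metadata alone.
graph = defaultdict(set)
for src, dst, _ts in records:
    graph[src].add(dst)
    graph[dst].add(src)

# Degree (number of distinct contacts) is a crude centrality measure;
# repeat-contact timing (e.g. daily 8am calls) hints at relationships too.
degree = {person: len(contacts) for person, contacts in graph.items()}
for person, deg in sorted(degree.items(), key=lambda kv: -kv[1]):
    print(f"{person}: {deg} contacts")
```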

However, the 2015 FTC complaint against Samsung and the 2016 FTC warning re: SilverPush are the only other legal/regulatory pressure I know of, besides a pending bill in Illinois... 2/3

@mikegerwitz ...called "Keep Internet Devices Safe Act" (SB1719). Amazon and Google are opposing it with the attached talking points (via twitter.com/matthewstoller/sta).

I know of two other cases like this Alexa example that made the press: the Samsung SmartTV in 2015, with help from Nuance Communications (now a partner for Apple's Siri), and the "My Friend Cayla" dolls in 2017 (again with Nuance).

At @privacylab we think we've found an example of speech broken into syllables and disguised as a nUHF signal. Stay tuned. 3/3

@diggity There's a lot of good stuff in your replies - thank you for all of the work that you and the others at @privacylab do! I will certainly stay tuned.