If you buy one of those intrinsically insecure, always-on "smart speakers" from Google, Amazon, Apple, or any other player, you're installing a constantly listening presence in your home that, by design, hears every word you say. Such a device is very likely to suffer at least one catastrophic breach that allows hackers (possibly low-level dum-dums like the ransomware creeps who took whole hospitals hostage this year, then asked for a mere $300 to give them back because they were such penny-ante grifters) to gain access to millions of these gadgets, along with machine-learning-trained models that will help them pluck blackmail material, credit card numbers, and humiliating disclosures out of the stream of speech they're capturing.