The cost of being on Facebook, the cost of handing over your connections with the people you love, is real…. What I was taking for granted as just the way things were was actually just the way Facebook wanted things to be…. I think we can expect, if we keep trusting Facebook, to keep having our trust abused. We have no reason not to expect this, and yet we’ve been letting Facebook stay in our most intimate relationships. Facebook has so far succeeded in convincing us that we have to let it stay so that we might keep our loved ones close. It does not have to be this way…. What do the platforms we legitimize with our personal and heartfelt work have to do to earn our trust? Right now, not enough. It all feels like a shady bargain.

Here’s the Facebook spokesperson in 2011: “No information we receive when you see social plugins is used to target ads; we delete or anonymize this information within 90 days, and we never sell your information.”

Facebook in 2014: Information we receive when you see social plugins in mobile apps will be used to target ads, and the same is in the works for when you see them while browsing on your computer.

This Japanese Television Conspiracy Has Familiar Faces (Brian Ashcraft): We’ve spent 20 years now worrying about how easily people can fake themselves online, and we’ve become thoroughly accustomed to thinking that the Web is full of doppelgangers and deceptions — unlike older media, where we can easily detect what’s real. In this report of the way Japanese TV news shows stage “man on the street” interviews with actors who keep turning up, Zelig-style, it’s the old-school medium that’s confronting us with sockpuppet fakery — and ironically the only reason we know about it is that online video archives allow us to double-check.

We need machines that cop to their own vulnerabilities. In fact, robots should tell us not only that they might fail but also explain why — letting us know, for instance, that certain conditions cause their sensors to be less reliable or that certain situations cause their decision-making models to break down. In the end, establishing trust and building productive relationships with robots won’t be all that different from doing so with people. After all, a good colleague wouldn’t just bail out on a group presentation. Instead, they’d warn you that they tend to stammer and sweat when speaking in front of an audience and then offer to pick up the slack somewhere else.

It is one-way communication. No one sees the replies but the sender. This is great for avoiding trolls, not so good if you miss the days when the comments section might be as worthwhile as the original post.