They tested Buzz internally with their 20,000 employees, but no one sounded the privacy alarm, or perhaps no one sounded it loudly enough.

It's point #2 that's fascinating to me. When it came out, Buzz was so obviously broken to so many people, not just researchers and geeks, but many in the general public. How did 20,000 Google employees miss that? And what does that say about Google's internal culture?

I began to think about this more when I noticed a news story about how Google and other top Silicon Valley firms are claiming that the demographics of their workforces are trade secrets, refusing to release them. Really? Seems like kind of an obvious cover-up there. Google is an engineering culture, and engineers tend to be overwhelmingly white and male. And what does a white, male engineering culture get you? Buzz, apparently, and a ridiculous inattention to common sense privacy concerns.

I don't mean to bash on Google – they're far from the only company that's predominantly white, male, and engineering dominated. But until now I think Google and others have played that card as an asset. They're proud of the fact that they don't have any social scientists around. They think they don't need them. There are lots of computer scientists and engineers who are now creeping into social spaces, claiming they can use massive data and computing to solve the hard problems that social scientists haven't been able to solve. Well, I call BS (a thousand times BS!), and I use Buzz as Exhibit A. I'm no longer shocked that some computer scientists can be that naive and narrow-minded. But I still don't understand what's so hard about saying that we need each other. Smart at one thing != smart at everything.

So, do I think Google's internal culture will change? Not in the short term, and maybe not at all. Not unless they suddenly hire a slew of social scientists and put them in positions with real power over engineers and product direction. But I hope this Buzz experience could be the start of a slow realization that algorithms have no answers, they have no whys. They have stunningly small amounts of nuance and subtlety, which is where I'd argue real wisdom lies. And apparently they don't have much common sense either.

I've mentioned my borderline unhealthy interest in Team Fortress 2 before. I'm also interested in the genres of video that have sprung up around the game – frag videos, griefing, machinima. And now an inspired and hilarious cartoon that I think you'll appreciate even if you've never played the game, but especially if you have:

In case you haven't heard, Google's Buzz service is the latest privacy apocalypse – check out a nice short summary here, or the details here, here, or here. Now Google has responded by tweaking its service to address some but not all of the privacy concerns. And yet there are still some fairly horrifying implications of Google's move.

I'll let other more knowledgeable folks take a swing at the nature of the privacy debate. I think this whole debacle reveals a more fundamental flaw in the way web companies handle online privacy today: they treat it as a one-size-fits-all phenomenon. Google sat down to make decisions about how to share information for Buzz users. Undoubtedly they started with a list of outcomes that would be good for Google. Then they probably started to imagine the user, and they wanted to make it easy to manage the service. They saw themselves as simplifying what some think is a tedious process of finding friends, managing connections, sharing content. They thought their innovation would be to make all that happen auto-magically. And once they came up with a solution they liked, they shoved it down our collective throat. We revolted (vomited) at their presumption. So they made a few changes. But it's still pretty much like the Gap selling all its clothes in XXL.

I think the diversity of responses to Buzz and its privacy implications should encourage us to stop thinking of privacy as a unitary concept. Attitudes about privacy are personal and contextual. Some people will decide that Buzz is so brilliant, it shouldn't matter that there are some privacy hiccups. Some people are so used to transparently sharing their online lives that revealing all their contacts wouldn't make a difference to them. Others, of course, will have the opposite reaction and feel completely and utterly violated. I myself fall squarely in the middle. I won't be using Buzz, at least in the short term. And my primary reaction is to be angry at Google for having the gall to do this. They knew exactly what they were doing – this was not a privacy "accident" – but they decided it didn't matter. They decided to try and dictate the next privacy norm to us via their awesome power.

The single worst thing about the web right now is that it tries to squeeze all us irregular geometric shapes into the same round hole. There has been almost no effort to assess privacy attitudes and adapt to them. And I'm not talking about opt-in and opt-out, or the types of (seemingly but not really) fine-grained privacy and sharing choices that Facebook recently implemented. I think Google's impulse was probably right: it's a lot to ask of many users to manage all that themselves, especially as systems are so complex and the tendrils and traces of our content and behavior spread out across the web through APIs. But it wasn't right for everyone. In fact, it wasn't right for most people. The $10 billion question is: how can we tell the difference between users, and adapt the experience to what they want? The pace of innovation over the last 10 years has been accompanied by social norms that move so fast they can be easily pushed around by the behemoths of the web. I suspect that era is coming to a close, and companies like Google and Facebook will have to start responding to our attitudes about things like privacy, trust, and motivation rather than trying to dictate them to us.

(via Slashdot)
According to an article in APC, of the more than 2.8 million lines of code contributed to the Linux kernel over the last year or so, 75% were written by paid developers. Considering the business ecosystem that's grown up around Linux over the last 10 years, this should come as no surprise. But still, it's an interesting counterpoint to the notion that Linux is written by a community of dedicated volunteers. I think that characterization is probably still largely correct: volunteers do write Linux. But the kernel is a particular beast with a particular social system. What happens at the core of Linux matters so much to the IBMs of the world that it stands to reason they would get particularly involved there.

But I also think this is an interesting window into what happens to open-source systems as they grow, evolve, and become essential to the computing world. What percentage of Wikipedia is written by paid representatives? Nobody knows. Aside from some notable exceptions in which journalists, politicians, or Scientologists were caught with their hands in the cookie jar, we don't know where a lot of Wikipedia's content comes from. I think it's a fair assumption that some large percentage of it comes from paid representatives. It's probably not as high as 75%, though.

John Tierney reviews Jaron Lanier's brand new book – You Are Not a Gadget: A Manifesto – in yesterday's New York Times. I ordered it today. I have to admit I'm pretty wary of any new book with the word "Manifesto" in the title. Seems awfully cocky to me. Reading Tierney's review, I suspect Lanier's book will be provocative, but I suspect I'll disagree with most of it. Why would Lanier want to throw in with folks like Andrew Keen and Jonathan Zittrain, seemingly trying to make a buck or a headline by pointing out the horrible things that the internet will lead us to? Let's fight the technological determinist tendency to argue that the internet is a great beast that has us in its grips and is marching us back to its lair.

I'm going to reserve judgment until I read the thing, although I admit it's hard. People do not seem very tolerant of this tumultuous (but exciting!) period in which we're trying to figure out the whole always-on, massive collaboration, cloud computing, social norm changing thing that's going on. I for one am content to give it time.

In what was the least surprising and most self-serving statement of the weekend, Facebook CEO Mark Zuckerberg has proclaimed that privacy is no longer a social norm in our world. That's right. Privacy… GONE. Over. We're all now happy to put the most intimate and minute details of our lives on the internet, and we won't think twice about it. Thank goodness we have CEOs like Zuckerberg to tell us about our social norms.

RIP Privacy.

Now back to reality. Privacy is not dead. Far from it. Privacy is a bigger issue than it has ever been.

So how should we read Zuckerberg's statement? On the one hand, we can default to the most general implication of what he's saying: notions of privacy are in flux. True. But that's always been true. Is the internet changing privacy more fundamentally than radio or television did? It's an interesting question. It could be that the pace of evolving norms has accelerated of late. Or, we could remember that Zuckerberg is the mouthpiece for the internet's most prominent and, arguably, egregious privacy violator. It is squarely in his company's interest to argue that the default reaction of the Facebook-going public is to share everything with everyone. It saves him the hassle of having to deal with the violations that are increasingly occurring.

So, what can we say about privacy in the age of Facebook and Twitter? First of all, I think we should resist the urge to make blanket pronouncements. There is certainly a group of young people who have grown up with Facebook in their lives. For these people, privacy means something different than it does for many others, just as "friend" does. But that's far from a common view. Yesterday's NYTimes Week in Review has a nice article on this subject. For many, arguably most folks, privacy is still very real. And it's something that many people hold important and nuanced attitudes about.

With Coye Cheshire and Elizabeth Churchill, I have been looking into these attitudes. We've been finding that discretion – the ability or desire to suss out the nature of a specific situation and act accordingly, rather than applying a blanket attitude – is key. I suspect that many people exercise a huge amount of discretion about their online information. They differentiate between contexts, audiences, and types of information. After all, why do we assume that the same privacy attitudes would apply to information about, say, our bank accounts, our present geographical location, and our breakfast?

I think privacy is going to be the banner issue of 2010 and beyond. But the banner isn't going to read "Privacy is Dead." The challenge for sites like Facebook is going to be to build socially smart tools that don't apply blanket rules about privacy. Facebook's new privacy rules are organized around functions on the site. But I don't want to decide who can read all my status updates. I want different people to have access depending on what I'm writing about, when I'm writing, where I am, etc.

Dealing with privacy effectively will mean first doing some tough research. What aspects of individuals, of contexts, and of interactions bear on specific privacy attitudes? We need to be thinking of privacy as a whole range of attitudes, not simply a single standard. Then we need to design easy-to-use technologies that can give people the privacy they want based on what they're doing and who they're doing it with.
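To make the idea of context-dependent privacy concrete, here's a minimal sketch of what rules keyed on context rather than on site feature might look like. Everything here is hypothetical and illustrative – `PrivacyPolicy`, `Post`, and the audience labels are made-up names, not any real platform's API:

```python
from dataclasses import dataclass

# Hypothetical sketch: privacy rules keyed on what a post is about,
# not on which site feature (e.g. "status updates") it belongs to.

@dataclass
class Post:
    topic: str  # e.g. "finances", "location", "breakfast"

class PrivacyPolicy:
    """One person's policy: different audiences for different kinds of content."""

    def __init__(self):
        self.rules = {}  # topic -> set of audience groups allowed to see it

    def allow(self, topic, audiences):
        self.rules[topic] = set(audiences)

    def can_see(self, post, viewer_group):
        # Default-deny: if no rule covers the topic, no one sees it.
        return viewer_group in self.rules.get(post.topic, set())

policy = PrivacyPolicy()
policy.allow("breakfast", {"friends", "public"})
policy.allow("location", {"close_friends"})

print(policy.can_see(Post("location"), "friends"))   # False
print(policy.can_see(Post("breakfast"), "public"))   # True
```

The point of the sketch is the shape of the decision, not the code: access is a function of the post's context and the viewer's relationship to the author, so the same person can be wide open about breakfast and locked down about location.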

Facebook can't solve the privacy issue by wishing it away or declaring it gone. If Zuckerberg's comment is indicative of their stance, I'm seeing the chink in Facebook's armor. Some wily start-up is going to come along with a beautiful and flexible technology that will allow people to share the way they want to and they're going to eat Facebook's lunch.

The deluge of data and analysis on Twitter continues to roll in. By the time the 10,000 conference and peer-reviewed papers get published in the next 6-8 months, they'll all have been scooped by the folks who are doing public analysis for other audiences.

First, there's a paper by Mor Naaman, Jeffrey Boase, and Chih-Hui Lai called "Is it Really About Me? Message Content in Social Awareness Streams" (PDF). Among a variety of interesting and nuanced findings, the authors show evidence that about 80% of Twitter users are "meformers" – people whose Tweets are mostly about themselves. Only 20% were in the "informer" category – people who share information about other topics.

Surprisingly, TechCrunch has lately been the source of seemingly high quality data about Twitter. In addition to Geoff Cook's great guest post on Why Teens Don't Tweet, back in October Robert Moore posted a huge amount of data and longitudinal analysis.


And, just today, TechCrunch has news from Comscore that Twitter's growth has basically flattened out for both international and US users, this despite a recent push of new features and new languages. In September I predicted the demise of Twitter, and this seems to be the first stage. 2009 was definitely their year – arguably no technology was more popular, more widely talked about than Twitter was this year. 2010 will be the year of soul searching for Twitter, where the new-ness wears off, new features don't gain the expected traction, and the company continues to look for a reliable business model. If Twitter has a big future, it's going to be as a messaging platform that underlies more interesting services.

About TechnoTaste:

I'm Judd Antin, and this is my blog. I'm a generalist, but I spend most of my time riding the fine edge between social psychology and anthropology. As a PhD student at UC Berkeley's School of Information (iSchool), my research explores how social psychological incentives operate to facilitate or hinder online cooperation. My research topics let me work and think in social psychology, sociology, anthropology, psychology, and economics, and to do fundamentally multi-method work. It's my favorite thing. Oh, and I'm also a former chef, so that explains the food & wine stuff.