News Corp. executive chairman Rupert Murdoch doesn’t regret buying Myspace so much as everything that came after: “we just messed it up,” he told an audience at the Wall Street Journal’s WSJ.D conference in Southern California on Wednesday. Murdoch’s News Corp. acquired Myspace in July 2005 for $580 million, and ended up selling the social network for just $35 million six years later.

Murdoch said Myspace was “growing like crazy” when his company acquired the social network, which was before Facebook really took off. And Myspace had other bets that could have paid off for Murdoch: It was building a video service, which launched three months before YouTube, only to see the now Google-owned video service quickly take off and leave Myspace behind.

So what went wrong? Murdoch said that he didn’t know what to do with the company, which is why he relied on bad advice, leading to News Corp. putting a layer of bureaucracy in place that eventually led to Myspace’s demise. Instead, he should have trusted the management in place when News Corp. acquired the site, or replaced it with completely new management altogether. The way News Corp. handled the Myspace acquisition turned it into “a very expensive mistake,” Murdoch said.

Murdoch went on to talk about Hulu, another digital bet that hasn’t always been easy for News Corp. He said that his partnership with Comcast and Disney had been “difficult” in the past, but added: “Now we are all on the same page, and we are going to drive this as fast and as hard as we can.” Murdoch went on to argue that Hulu was necessary in part because the industry needed a serious competitor to Netflix and Amazon.

Of course, Hulu isn’t the only company trying to compete with Netflix for eyeballs online. HBO announced that it will take its programming to cord cutters in 2015, and Murdoch said that he would have done “exactly the same” if he had won the bid for Time Warner. However, he cautioned that it will take some time for HBO to succeed online, in part because it will have to price the service at $15 to not cannibalize its cable TV business. “I don’t think it will be sensational for quite a while,” he said.

There’s some non-peer-reviewed “research” (PDF) going around that claims Facebook will have lost 80 percent of its users in a few years’ time, based on the idea that you can draw a reasonable analogy between the social network’s trajectory and that of a contagious disease.

For my own sanity, I would like to pretend I never read about this Princeton study, but there are a lot of articles out there taking it quite seriously — the most irksome headline I’ve encountered reads: “Facebook is an ‘infectious disease’ and will lose 80% of users by 2017, say researchers.” It should go without saying that this story is nonsense, but apparently it doesn’t, so please allow me:

This study has not been peer-reviewed. I would like to think journalists can fulfil the same role with equal expertise, but LOL. (And yes that goes for me too — if someone with greater expertise can analyze the fine points of the researchers’ modeling, please do.)

There is a tenuous case for drawing an analogy between the spread of a disease and uptake of a social network (people say “viral” for a reason) but the analogy collapses once you get to the other side. Simply put, people don’t “recover” from joining a social network within a set amount of time.

The authors seem aware of this, so they modify their model by including “infectious recovery dynamics”, which relies on the idea that a “small initial recovered population” will trigger a sort-of-mirroring of the uptake trajectory. When people go “I’m done with Facebook” and leave, does that lead all their friends to leave? There’s no evidence, not even anecdotal, to suggest this.

The study is largely predicated on the rise and fall of MySpace, which is a lousy point of comparison. MySpace got people accustomed to using social networks, aiding Facebook’s subsequent rise, but never became anywhere near as widespread as Facebook has. Facebook has at least ten times the number of users MySpace ever achieved at its peak, and it’s also tapped into an older demographic that is far less likely to move onto the next big thing.

MySpace was also badly mismanaged (as Lance Ulanoff points out over on Mashable), which was a big factor in its decline. Additionally, the researchers don’t take into account the ways in which Facebook is evolving – sure, it’s a social network, but also an online identity mechanism that’s getting baked into more and more services, for example.

As Ulanoff notes, if the model is correct, Facebook will lose 200 million active users by the end of 2014. It really won’t.

And who’s to say people will leave Facebook anyway? The far more likely scenario to my mind is that they end up using it only for certain things, like networking with family members, while they go elsewhere for the fun stuff. As my colleague Lauren Hockenson reported this week, teens are picking up on Instagram and Snapchat and whatever the platform of the day is, but not actually leaving Facebook in order to do so.

The idea of an all-engaging social networking platform is becoming outdated. That’s something that is no doubt worrying Facebook – engagement is everything when you’re trying to sell ads – but it’s not a scenario you can model using a disease analogy.

How’s this for an understatement: Operational databases are important for many, if not the majority, of web applications. And if you’re doing big business on the web, finding one that can scale with your data volumes and still perform like you need it to is critical. MapReduce for batch data processing and analysis? Not so much, actually.

The latest HBase user I’ve come across is Gravity, the interest graph company that powers content recommendations for some of the biggest publishers on the web.

From big MySQL at MySpace to big data with HBase

Its co-founders were all senior executives at MySpace, including Gravity CTO Jim Benedetto, who was SVP of technology for the social networking pioneer. He was actually MySpace’s first architect and helped build the platform’s MySQL database. Although MySpace never reached Facebook’s scale, it did have 150 million users at its peak, all able to store unlimited numbers of wall posts, messages and photos. Benedetto eventually oversaw a 600-instance cluster that required about 30 database administrators to keep it up and running.

Benedetto (center) at Structure: Data 2012. (c) Pinar Ozger

So naturally, when it came time to build out the Gravity architecture, Benedetto opted for the MySQL he knew so well. Until about three years ago, he told me recently, that database held about 95 percent of the company’s data. At some point, though, Benedetto and his team realized they were spending way too much time keeping their MySQL environment up instead of building new things, so it was time for a change.

Gravity ultimately opted for HBase, but the decision wasn’t easy. “For us,” Benedetto said, “our data and algorithms are our company,” so making the move from a relational database to a column-based database that can serve MapReduce jobs was nerve-racking. After all, he explained, “You never want to migrate your data … and if you have to, you never want to migrate it more than once.” In fact, he added, “you’re not going back.”

But Benedetto says the move to HBase as Gravity’s primary data store has been “life-saving,” and it’s arguably a more important component of the company’s infrastructure than is Hadoop MapReduce. HBase handles the company’s real-time recommendation algorithms, and it does it across the entire Gravity platform rather than on a site-by-site basis. And although it’s not banking-grade when it comes to the consistency of transactions, Benedetto says it’s about 99.95 percent consistent in real time. Later on, batch MapReduce jobs swoop in and pick up whatever HBase dropped earlier, and process it all against the company’s graph algorithms.
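The architecture described above — a fast but not perfectly consistent real-time store, backed by batch jobs that later reconcile whatever was dropped — can be sketched in miniature. This is a hypothetical illustration of the general pattern, not Gravity’s actual code; all names here are invented:

```python
# Hedged sketch of the pattern described above: a real-time path that
# tolerates occasional dropped writes, plus a batch pass that replays a
# durable event log (standing in for HDFS files) to reconcile the store.

event_log = []          # stands in for a durable log of every event
realtime_store = {}     # stands in for the HBase table

def record_event(key, value, dropped=False):
    """Every event reaches the log; the real-time write may occasionally fail."""
    event_log.append((key, value))
    if not dropped:                      # simulate the ~0.05% real-time loss
        realtime_store[key] = value

def batch_reconcile():
    """The MapReduce-style pass: replay the log and fill whatever real time missed."""
    for key, value in event_log:
        realtime_store.setdefault(key, value)

record_event("user:1", "clicked-article-a")
record_event("user:2", "clicked-article-b", dropped=True)  # real-time write lost
batch_reconcile()
# After reconciliation, both events are present in the store.
```

The point of the design is that the real-time path never blocks on perfect consistency, because the batch layer guarantees the log eventually wins.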

An example of an interest graph from Gravity.

Scalable for sure, and getting easier to use

And although it took some serious engineering effort to get HBase operational when Gravity began working with it three years ago, Benedetto thinks HBase is getting to the point (as is rival NoSQL database Cassandra, he acknowledged) where one could safely call it “enterprise-ready.” Right now, he noted, “You’re not gonna see HBase in a company that just buys Oracle because Oracle is the name and Oracle has been around for 20 years,” but for web startups that hope to reach a certain scale and even for existing companies that are running into the MySQL wall, he sees a shift occurring.

“The web farm is the easiest part of your infrastructure to scale because all it does is cost more money,” Benedetto explained. Databases, on the other hand, require a lot of thinking about how to migrate data, shard the database and otherwise make a piece of software likely designed for a handful of servers, max, spread across dozens or hundreds. HBase really eases the scaling process, as well as the subsequent management, he said. Now, Gravity’s 100-node HBase cluster has only two operations engineers dedicated to it.
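The manual sharding work Benedetto alludes to can be made concrete. With a relational database spread across many servers, the application itself must route each key to a shard — and resharding later means migrating data by hand. A minimal sketch, with hypothetical server names:

```python
import hashlib

# Hedged illustration of application-level sharding for a relational
# database: the app hashes each key to pick one of N servers.
SHARDS = ["db-01", "db-02", "db-03", "db-04"]  # hypothetical server names

def shard_for(key: str) -> str:
    """Deterministically map a key to a shard.

    Note the fragility: adding a fifth server changes len(SHARDS) and
    remaps most keys, which is exactly the migration headache described.
    """
    digest = int(hashlib.md5(key.encode()).hexdigest(), 16)
    return SHARDS[digest % len(SHARDS)]

print(shard_for("user:12345"))  # the same key always routes to the same server
```

HBase moves this routing (and rebalancing) into the database layer itself, which is the scaling ease the quote is getting at.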

Aside from scale, HBase might soon start catching on because of the work companies like Gravity have been doing to make it more user-friendly. It might scale easily, but, as Benedetto noted, it’s not always easy to get started with — especially without some deep understanding of the intricacies of the underlying HDFS infrastructure. Last year, eBay VP of Experience, Search and Platforms Hugh Williams told me that although HBase is one of the big data tools the company is most excited about, it’s also the area where he’d like to see the most improvement.

To help alleviate some of the learning curve, Gravity has developed an open-source tool called HPaste that lets developers access data and run jobs on HBase data using Scala rather than the more-bloated Java programming language on which Hadoop and HBase are built. One of the biggest benefits of HPaste, Benedetto said, is that it lets new HBase developers see the data in a way that makes sense to them: HBase stores everything in byte arrays, he explained, and “when a human tries to read a byte array, it looks like ancient hieroglyphics.”
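Benedetto’s “hieroglyphics” point follows from HBase’s data model: every cell is a raw byte array, and the schema lives in application code rather than in the database. A small Python sketch (HPaste itself is Scala; this just illustrates the encode/decode problem):

```python
import struct

# Hypothetical illustration of the byte-array point above. HBase stores
# every cell as raw bytes; the application decides what they mean.
raw = struct.pack(">q", 1346025600000)   # a timestamp in ms, as a big-endian long

# Read naively, the cell is opaque bytes -- the "ancient hieroglyphics":
print(raw)

# Only a typed layer that knows the schema (the role HPaste plays for
# Scala developers) can render the value back in a human-readable form:
(millis,) = struct.unpack(">q", raw)
print(millis)
```

Tools like HPaste essentially bundle these encode/decode conventions so new developers see typed values instead of byte dumps.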

The Kiji architecture

Elsewhere, a startup called WibiData has created an open-source framework called Kiji that aims to provide a collection of high-level APIs that should make it easier to store different data types in and develop applications on HBase. The company envisions Kiji being to HBase what the Spring Framework has become to Java over the course of the past decade.

Hadoop’s weapon for the mainstream?

But user experience aside, a lot of companies already invested in Hadoop — aside from expert users such as Facebook — are starting to see the promise of HBase and are incorporating it into their architectures.

WibiData co-founder Christophe Bisciglia, who also co-founded Hadoop pioneer Cloudera in 2008, gave me his take on the state of HBase while discussing its role in the future of Hadoop earlier this year. “If you talk to anyone from Cloudera or any of the platform vendors, I think they will tell you that a large percentage of their customers use HBase. It’s something that I only expect to see increasing,” he explained. “… HBase is gonna be what takes Hadoop from an ETL and BI platform into a real-time application platform.”

The Cloudera Hadoop stack (Gravity uses Cloudera’s distro).

Benedetto appears to agree. He considers Hadoop as a whole incredibly important, almost on par with what Amazon Web Services did for computing resources, because it lets startups use commercial-grade open source software to do data storage and processing that previously was only available to massive web companies. “More and more … the shining star in that suite is HBase,” he said. “If I were Oracle, I’d be scared.”

Astute mobile application vendors are bringing to market applications that help mobile users connect and interact with people in close proximity. We expect this emerging market — what we call proximity-based mobile social networking — to grow to $1.9 billion in revenues by 2016.

The New York Television Festival always brings together a unique mix of creators and executives to discuss the evolution of the medium in the digital age. And delightfully, this Friday the entire Digital Day line-up of panels was live-streamed and archived online.

It’s four hours long, though, which is quite a few hours! There’s some good discussion overall, with folks from AOL, College Humor, YouTube, Blip, MySpace and MSN, but here are the Cliff’s Notes — some of the smarter insights and comments from the day’s panels.

Research matters — apparently, people are pitching content to the CW Digital department without actually knowing what the CW is. Don’t be that person. “Nothing is more frustrating than when someone comes in who hasn’t been to our website, doesn’t know who our audience is, but knows they have the perfect show for us. Not gonna happen,” CW Digital EVP of marketing and digital programs Rick Haskins said during the Development: Building a Foundation panel.

The great content creators, according to everyone on the Development panel, understand more than just making content — specifically, how to market that content on a social media level.

Celebrities don’t always click on the web, according to Ran Harnevo, AOL Senior Vice President of Video, but casting someone with, say, four million Twitter followers will be significantly more meaningful than casting a celebrity with a smaller following. And web celebrities like Ian and Anthony of Smosh may have a lot more draw online.

“A lot of well-known stars want to do it for the money — they don’t give a rat’s ass about digital, and to me that goes nowhere. You want to find people who understand digital and are passionate about it,” Haskins said.

My Damn Channel CEO Rob Barnett urged people to avoid exclusivity initially, and then focus on finding the company with the best deal as your development grows — especially when it comes to promotion, not just money. “You gotta look for promotion — the sea we’re all swimming in gets more and more crowded every day,” he said. “If you just upload to one of those huge places, the odds of getting the push that you need are getting scarier by the minute — you’ve got to be as judgmental of a home as we are of your content.”

Barnett also recommended people check out the YouTube Creator Handbook: “90 percent of it is awesome — 10 percent, I don’t like, but we’re using 90 percent of it and our videos are getting more views.”

During the Talent Debate panel, Innovative Artists head of digital David Tochterman revealed that breakout web series Lizzie Bennet Diaries has partnered with DECA — big news for the homegrown web series from Hank Green and Bernie Su.

Also revealed during the Talent Debate panel: Mark Malkoff’s life is now much more difficult since Facebook removed email addresses from user profiles, as he now has to work much harder to book celebrities for his Celebrity Sleepovers series.

Going back to the web celebs point: Despite Celebrity Sleepovers’s impressive roster of names (including Camryn Manheim, Steven Weber, Ed Begley Jr. and Rob Corddry), Malkoff said that the iJustine episode was by far the most viewed of the series.

Blip CEO Kelly Day laid out the company’s shift in approach over the last six months — specifically, a shift from its previous “Blip vs. YouTube” mentality, and a new focus on helping creators distribute their content through a variety of means, including YouTube as well as other partners. “I don’t think it’s a binary conversation. YouTube can be a great way to build your audience, but it’s not the only way to build your audience,” she said.

Lee urged creators to “play the whole field” when it comes to putting content out there. “Go out and experiment and see what you get out of it.”

MySpace Entertainment president Roger Mincheff stepped up big time to discuss MySpace’s recent relaunch, and defended keeping the MySpace name despite the brand’s rocky recent years, especially given the comedians and musicians who attribute their success to MySpace — making MySpace the ultimate farm system. MySpace nostalgia? That’s apparently a real thing.

This is just some of the interesting stuff discussed over the course of the day: It’s a fun mix of folks discussing how creators can use the current state of entertainment to thrust their content into the world, and the challenges they might face in the process. Go watch it yourself if you have a minute to spare.

Starting this week, voters can go to three new online platforms to watch the Obama-Romney presidential debates and to see how their views on 11 issues compare to those of other Americans. The forums will help educate voters while also providing the media companies — YouTube, AOL and Yahoo — with a chance to pump up their political offerings.

It works like this. The three companies will all host the same toolkit that lets users take short surveys on topics like immigration and health care. Their sites, which will have a countdown to the next debate, will also show how many other Americans are exploring the same issues. Here’s what it looks like:

This isn’t the first time the Commission on Presidential Debates has worked with online media companies to educate voters. In 2008, the non-partisan CPD partnered with MySpace (described then as “the world’s premier social network”) to stream the Obama-McCain debates and offer political learning tools.

This time around, the process will include data visualization graphics that depict how an individual user’s views on topics like social issues compare to others:

When filling out the surveys, voters will also be able to indicate that a given issue — say jobs or the environment — is important to them. This will produce graphics showing how voters prioritize the different issues.

For CPD, the debate sponsor, the new online partnerships are a way for it to carry out its mandate of educating voters. For the media partners, it is a way to showcase their political coverage.

According to Chris Grosso, SVP of AOL Homepages, the debate channel will let the company “showcase different brands” like AOL, Patch and the Huffington Post. As an example, Grosso said that if a topic like the auto industry bailout comes up, AOL will be able to “surface” relevant content from a Patch live blog.

As of Monday morning, the sites had yet to go live. They will soon be available at yahoo.com/thevoiceof, aol.com/thevoiceof and youtube.com/thevoiceof

The interactive toolkits were produced for the Commission as a pro bono project by New York ad agency BBH.

News Corp announced today that chief digital officer Jon Miller is leaving next month, as the giant company moves forward with its plan to separate into two distinct companies.

Miller, a former CEO of AOL, joined News Corp in 2009. He was tasked with developing a digital strategy for a diverse set of properties that included everything from MySpace to Photobucket at the time. He was also asked to make digital a priority across the company’s many news and entertainment divisions.

Miller’s parting was amiable, according to All Things Digital, which first reported the story. That sentiment was echoed in a News Corp announcement in which the company said he would stay on as an adviser and CEO Rupert Murdoch praised him as “a visionary in the digital media industry.”

The departure appears to be related to News Corp’s impending split. Lucrative entertainment assets like Fox and BSkyB will be spun off into one company and the publishing properties into another.

While presiding over News Corp’s far-flung digital assets, Miller made sensible moves such as offloading MySpace, which the company had acquired for $580 million in 2005. The divestiture was consistent with a statement in News Corp’s 2012 annual report that its digital strategy is no longer acquisition-based.

Miller’s other activities included supervising News Corp’s $45 million investment in streaming device maker Roku and working with Hulu, where he was a board member.

Miller is also an active angel investor with deep contacts in the media space. He joined us at the paidContent 2012 conference, and you can hear his thoughts on digital media here:

Remember when Friendster was the hot social network, publishers doubted that ebooks would ever sell, and Netflix thought DVDs in red envelopes was the future?

We do — that was the state of digital media when paidContent launched in 2002. Other weird things were happening back then too: People still got much of their news from television and newspapers, and they learned about major events after they had already happened.

There have been some huge shifts since 2002: Tablets and smartphones are now ubiquitous, lots of people read on their digital devices, and just about everyone is part of a social network or three. This summer is the tenth anniversary of our launch. In an effort to gain some perspective on the past decade in digital media, I’ve been reading back through paidContent’s archives — a collection of over 80,000 posts.

Since I was only a freshman in college when paidContent came to life, I often didn’t know, as I read through the stories from the early days, how things had begun or how they turned out. As I watched them unfold, I wanted to grab our readers’ arms and give them advice (“Don’t buy that Zune!” “Invest in Facebook!” “Go for the good Twitter handle now!”). But I also realized how difficult it is to predict success.

Some takeaways from my trip through the archives: Some companies — AOL and Yahoo come to mind — have been consistently bad at predicting what consumers want. And a couple of companies, namely Apple and Amazon, have been very good at it. Also, being a native digital company helps, but it’s no guarantee of success (what up, MySpace?). And after all these years, it’s still not clear what content customers will pay for, or how much they’ll pay.

Streaming and Moviebeaming

What do analysts, CEOs and bloggers have in common? None of us can predict the future. Roger Ebert joked in 2002 that “on-demand streaming movies on the Web, like HDTV, are five years in the future — and will be for at least another 10 years.”

If Disney’s Moviebeam had been the only game in town, Ebert probably would have been right. When it launched in three cities in 2003, customers paid $6.99 a month to use a device that could hold 100 movies and plugged into the back of a TV set. They also had to pay for each movie they watched; billing was done via the phone line. The company went through various unsuccessful iterations before India’s Valuable Group bought it in 2008. It was never heard from again.

Speaking of AOL: It’s something of a miracle that the company still exists. In 2000, when it merged with Time Warner, it was valued at $350 billion, and the next year, more than 24 million people in the U.S. were paying for its Internet access service. By the end of last year, that number had dwindled to just 3.3 million subscribers. Here’s a quick recap of some of AOL’s miscues over the years:

Where did these companies go wrong? In 2010, former Time Warner CEO Gerald Levin pondered that question in an interview with the New York Times. The AOL-Time Warner deal was “undone by the Internet itself,” he said. “I think it’s something that no one could have foreseen, and to this day, whether Apple is going to dominate entertainment or whether Amazon is going to dominate publishing, all the old business plans are out the window. How do you get paid for content?”

Know what’s cool? A billion dollars

In 2006, an RBC Capital analyst estimated that a certain social networking company would be worth $15 billion in a few years, based on “raw, unprecedented user/usage growth.”

Six years later, Facebook went public with a valuation of $104 billion. Too bad the analyst wasn’t talking about Facebook but about MySpace. The social networking company that Rupert Murdoch acquired for $580 million in 2005 sold for just $35 million in 2011.

Why did Facebook soar while MySpace — and other social networking services like Friendster — sank? It allowed people to build real connections using their actual personal information, and rolled out a product that was ready to scale and had good technology. Other companies realized sharing was important too — in 2005, Yahoo SVP Jeff Weiner called sharing “the next chapter of the World Wide Web” — but Facebook was able to implement it in a way that kept users coming back. The site surpassed Yahoo and AOL for “stickiness” in 2009, when Nielsen found users spending an average of four hours and thirty-nine minutes a month on Facebook.

Social has already disrupted some industries — witness the rise of Twitter and how it has changed the way news is reported, with stories like Osama Bin Laden’s death breaking there first. In a sign of the importance of these emerging platforms, newspapers like the Wall Street Journal and New York Times are launching “Everywhere” initiatives to deliver news to readers where they are already hanging out.

Fast food and music don’t mix

Hard to believe it now, but there was real skepticism that iTunes’ 99-cent songs would be able to compete with peer-to-peer file-sharing services. “According to academics who’ve studied the economics of digital music distribution,” we wrote in 2003, the year iTunes launched, “the cost still seems too high to attract users of peer-to-peer file trading services.” The piece cited an economist who believed “the appropriate price of a downloaded song is 18 cents.” In fact, Real Networks dropped its song prices to $0.49 in an attempt to compete against Apple.

The company that arguably started the digital music revolution — Napster — didn’t survive. Once it no longer offered “free,” it was done, though it tried to reincarnate itself: launching a mobile music service, “Napster To Go,” with AT&T in 2004 (the one smartphone that supported it could hold up to 6 songs), partnering with Circuit City on a digital music store, getting itself acquired by Best Buy in 2008, and then being bought back by Rhapsody in 2011. Unfortunately, Rhapsody was already losing out to newer (and free) streaming services like Pandora and Spotify.

The partnerships with Circuit City and Best Buy, though, were probably the kiss of death. One of the big trends of the past 10 years has been brick-and-mortar retail stores’ consistent failure to compete effectively against digital-native companies. Best Buy wasn’t the only retailer to try to crack the digital-content business — and fail: Target and Sears both took a shot. And McDonald’s sold digital content over its WiFi network and even tried DVD rentals in its restaurants.

Do you like the feel of paper?

Just as digital music didn’t really take off until Apple introduced the iPod, the ebook revolution didn’t take place until the arrival of the Kindle. In paidContent’s early years, ebooks were written off as a failure in part because publishers couldn’t figure out what to do with DRM. (In 2003, “temporary electronic ink” that would disappear after a few months was floated as a possible solution.) Barnes & Noble decided to stop selling ebooks in 2003, and Yahoo stopped selling them in 2004.

Meanwhile, Amazon and Google were pushing forward. Google launched Google Print — now called Google Book Search, and still besieged by lawsuits seven years later. Amazon tested two now-defunct programs: Amazon Pages, which allowed customers to buy access to digital copies of select pages from books, and Amazon Upgrade, which bundled print books with online access to the complete work.

A Forbes survey back in 2002 found that “business professionals” would be willing to pay for “news content to be delivered to their cellular devices,” and some media companies tried early mobile experiments. Verizon offered a cell phone version of the Yellow Pages — which, at $19.95 per year, gained 15,000 subscribers in three months. But starting in 2004, everyone decided the future was in ringtones. A $4 billion global business by the end of the year, one company projected.

Further complicating matters for advertisers: The smartphone market is fragmented among different brands — marketers don’t want to spend the money to create different ads for Android and iOS — and there are two mobile ad universes: mobile browser and apps.

The next opportunity is social media advertising. And once again, it will be a challenge to figure out some standardized metrics. What’s a retweet worth, anyways?

Back to where we all began

Though micropayments worked well for music when Apple launched iTunes, the path to payments for written content has been rockier. In 2004, we wrote that “micropayments today are still characterized by a large number of competing transaction types” – including direct-to-bill, merchant aggregation, prepaid accounts and direct transfer – and “each of these face the current incumbent in digital content distribution: the flat-fee subscription model.”

Eight years later, it appears that the subscription model has won out. The iPad opened the door for magazine and newspaper publishers to create new revenue selling content on that platform, but the results have been mixed. When Rupert Murdoch’s “The Daily” iPad newspaper launched in early 2011, the company called it “the model for how stories are told and consumed.” We wrote, “The bet here is that while consumers are less and less likely to reach into their pocket for a few quarters to buy a newspaper, they might not care about the 14 cents on their credit card for a copy of an e-newspaper.” A year and a half later, The Daily has over 100,000 paying subscribers — but it’s living on borrowed time and may not get through the five years its publisher has said it needs to break even.

Writing for the web, of course, has been around for awhile. At the beginning of the decade, blogging was called “nanopublishing,” and the question was how blogs could support themselves. All sorts of models have arisen. For example, Gawker tried a licensing deal with Yahoo, but that relationship ended a year later. The deal “garnered way more attention than we expected, but less traffic,” Gawker CEO Nick Denton said in 2006.

Magazine companies have grappled with whether to bundle digital editions with print subscriptions or charge for them separately. Time Inc. — which first put digital editions of its magazines behind AOL’s paywall in 2003 — started out charging separately, but today Time Inc. and Condé Nast print subscribers get the digital edition free. Hearst, meanwhile, is charging separately, and it said its digital business in the U.S. became “solidly profitable” for the first time in 2011.

Many newspaper publishers, most notably the New York Times, tried paywalls at the start of the decade and then abandoned them – only to return to the model in the past couple years. In its most recent earnings report, the NYT said it has 454,000 digital subscribers. Is that enough to sustain the newspaper in its 21st-century transition? Probably the best answer to that came from Vivian Schiller. But it was in response not to the NYT’s recent digital subscriber numbers, but to the NYT’s decision in 2007 to close the paper’s first paywall, known as TimesSelect. Schiller, then the SVP and general manager of NYTimes.com, was asked whether TimesSelect had worked. “It did work,” she said. “It’s just a matter of as compared to what.”

[W]e’ve already begun to more thoroughly enforce our Developer Rules of the Road with partners, for example with branding, and in the coming weeks, we will be introducing stricter guidelines around how the Twitter API is used.

Twitter has burned the ecosystem before

These comments set off warning bells for a number of developers, who said they were concerned that Twitter was going to crack down on any third-party app or service. One developer on Hacker News said that in his view, Twitter was trying to shut down third-party services so that they could “inflict a homogenized, boring, monoculture on their user base [that] they can monetize, which will make the experience progressively worse.” Said Turntable.fm developer Jonathan Kupferman:

(Tweet: https://twitter.com/jkupferman/status/218788665600643074)

This isn’t the first time that Twitter has upset the developer community by throwing its weight around. In 2011, the service drew widespread criticism both for the way it issued new rules around use of the Twitter API and for how it treated those who crossed the line — cutting off their access without so much as a warning, as it did in the case of entrepreneur Bill Gross and his UberMedia network. At the time, one critic accused the company of “nuking” the Twitter ecosystem.

The company also came under fire in 2010 for the way it handled relations with third-party developers after it bought an app called Tweetie. Hunch founder Chris Dixon said Twitter was “acting like a drunk guy with an Uzi” by telling developers not to bother developing Twitter apps, and a number of companies and investors that had been putting money and time into the Twitter ecosystem stopped doing so. So some of the negative reaction to Sippey’s post stems from developers having been burned twice already.

Anti-user moves torpedoed both MySpace and Digg

And there is a very real risk to this kind of aggressive focus on control and monetization, as a commenter on Hacker News pointed out: restricting the ways that users can access and display their tweets, whether through strict API rules or moves like the LinkedIn shutdown, could alienate the very user base Twitter needs to click ads and support its other monetization plans. Ultimately, the company could ruin the experience that made Twitter so compelling in the first place, in the same way that MySpace and Digg did.

There are plenty of reasons why MySpace failed, including the conflicting desires of a giant corporate owner like News Corp., but it also started to hemorrhage users because it focused more on monetization through ads and other elements than it did on maintaining a good experience for users. Digg did something similar — in an attempt to build a bigger company and leverage its user base for profit, it added a whole range of “services” and features that were designed mainly to appeal to corporate customers and advertisers. The end result was a wholesale desertion of Digg for other communities like Reddit.

Twitter has a tiger by the tail — it has an active user base in the hundreds of millions, it has become an almost indispensable tool for both news junkies and the media (although this carries risks as well) and it is starting to see some favorable responses to its ad model. But it is also a community, where the users provide the vast majority of the content that is being monetized, and while screwing around with that relationship may appear to make short-term financial sense, it could end in disaster.

Every week the media seems to offer a new account of some dumb crook who is off to the slammer because he posted about his caper on Facebook. It turns out this phenomenon may be even more widespread than we think.

A new survey reports that social media played a significant role in nearly 700 cases in the past two years alone and that most of these involved either MySpace or Facebook. LinkedIn and Twitter were the next most common social media sites to produce evidence for the justice system. Only one case mentioned Foursquare. The report doesn’t mention Google+ at all.

The trend does not appear to be abating. Here is this week’s genius who posted a Facebook snap that shows him stealing gasoline from a police car.

The findings were based on a study of legal databases and were published by X1 Discovery, a company that helps lawyers and law enforcement mine social media.

The most curious element of the findings may be the ongoing prevalence of MySpace in the criminal justice system years after most consumers have abandoned the service.

An unscientific explanation for MySpace’s ongoing presence is that most of the cases in the survey are criminal ones, and crimes typically involve people from lower socio-economic classes, who are more likely than the rest of the population to be MySpace users.

Social media has provided not only new evidence for courts but also a challenge for judges who are struggling to decide what to do with jurors who tweet or discuss cases on Facebook.

More highlights from the report can be found on the Forensic Focus blog. A spreadsheet of the findings can be found here.