“The death of display advertising has been greatly exaggerated,” Randall Rothenberg, CEO of the Interactive Advertising Bureau, said last week at the trade organization’s MIXX conference.

True, rectangular “banner” ads, in-stream video commercials and other so-called online “display” advertising accounted for more than a third of the nearly $23 billion spent on Internet advertising last year, according to David Silverman in a PricewaterhouseCoopers report prepared with the IAB.

But there was another 800-lb gorilla in the room at MIXX. Social media such as Twitter and Facebook are rapidly becoming venues where marketers connect with customers and spend dollars that previously may have gone to more traditional Internet ads.

Dick Costolo, who was just named Twitter CEO, talked about the power of the platform and its new advertising efforts, such as Promoted Tweets and Promoted Trends. Investor Yossi Vardi, who helped launch the ICQ chat standard, likened the future of social media to “the future of civilization,” quoting Amazon.com founder Jeff Bezos as saying the likelihood you’ll buy a car is 500 percent higher if a friend recommends it.

Even as Sheryl Sandberg, COO of Facebook, touted the ways that display ads can be used on Facebook, many at the conference discussed how social interactions on fan pages and through its Open Graph (formerly Facebook Connect) get consumers to buy at a much higher rate than advertising alone ever has.

So is social media killing display ads?

Display ads growing, but…

A closer look at the PricewaterhouseCoopers study shows that while display ads ticked up to a 35 percent share of ad spending in 2009 (from 33 percent in 2008), search grew by the same 2 percentage points but a higher dollar amount (from 45 percent to 47 percent). Search’s share has risen sharply over the past four years while banners have been essentially flat.

Plain old banners, too, are declining in relevance. The money spent on ads sold on an “impression” basis — that is, based purely on someone seeing them — is down sharply compared to “performance” ads that require an action, such as a click. That trend is likely to continue.

YouTube executives at the conference introduced their new “cost-per-view” advertising format, in which a viewer is given a choice of which ad to see in a video, or given a choice to skip an ad after five seconds. The marketer pays only for a video that’s been shown.

The effectiveness of these ads, said the executives, can be 10 times as high as for interruptive ads (the term for any type of ad that delays the user from accessing the content he seeks).

New startups from high-profile entrepreneurs are also offering to help advertisers get into the social stream, promising higher effectiveness than from display ads.

Two days after MIXX and a few blocks away at the Web 2.0 conference, Jonah Peretti, a cofounder of The Huffington Post, showed off new startup BuzzFeed, which in May closed an $8 million second round of venture capital financing. The company, now in beta, uses some of the same technological and editorial techniques of HuffPo to inject conversations prompted by marketers into the social stream while trying to score viral hits.

Social media as salvation, too

Whether or not BuzzFeed or YouTube’s initiatives succeed, they’re clearly part of a trend of marketing dollars moving away from typical banner ads and toward user interaction and social media.

Publishers earning revenue from display ads priced on impressions will find themselves competing over a shrinking pie. Hearst Corp. CEO Frank Bennack Jr. noted at MIXX that there is “no longer an inventory shortage” for advertising as there once was in magazines and newspapers, and that his companies’ online publications need more than just advertising revenue to survive.

There is some hope for display ads, though — and it’s based on social media and the ability to customize ads based on user interaction.

Showing off new display ad products still being tested, Barry Salzman, Google’s managing director for media and platforms for the Americas, said that by 2015, 75 percent of ads will be “socially enabled.”

His colleagues showed new “dynamic display advertising” formats that allow on-the-fly customization as a user interacts with a site, as well as the ability to change ads via the social stream. In one case, as someone types in a ZIP code on Google Maps, a rectangular display ad next to the typing is customized to that region. Other ads, some of which exist today, allow companies to put live Twitter streams and comments into a banner.

This means publishers will have the chance to make money from their visitors by enabling them to interact with ads in the same way they are now encouraged to interact with a site’s main content. (Publishers still will have to maximize revenue for each ad placement by serving the highest-value ad in each spot and by creating other revenue streams.)

As marketers become savvier and demand more proof of user engagement, publishers will likely have to offer marketing formats that show clients they have gotten users to engage not only with the site, but also with the brands advertising on it.

Here, he discusses how the travel guide industry piqued his interest and how he started Guidism.com to explore whether the industry can be re-imagined for mobile devices.

Dorian Benkoil: Could you tell me about what you’re working on?

Rafat Ali: One of the sectors that I’m deeply interested in, and very likely my next venture, is going to be in the travel guidebook sector.

And that’s born out of a few things. One is, as people who have been following me on Facebook and Twitter know, I have been traveling for the last 24 months. I have been to five different countries and all kinds of various places, and as a result I think it’s fair to say that I’ve caught more than the travel bug, and have also been using all kinds of guides, whether it’s books or research online, or a bunch of mobile apps.

I think there is an opening in the market that I can help address in the travel guidebook sector, particularly as the sector gets re-imagined in the mobile arena. If anything says mobile, travel guide says mobile …

Exactly what it means for me and what the final thing I’m going to be working on will look like, to be honest, I don’t know yet. What I have done is launch a site, a blog, which is what I know best, about the travel guide sector called Guidism.com, which essentially is the daily links that I’m posting as I learn about the sector. …

Can you tell me more about your intentions with mobile and things you want to do?

Ali: It’s obvious that the scope for reinvention of the guidebook is on the mobile platform. Clearly, online there are too many sources of information. Most people start their research on Google.

So how do you as a startup or an established brand rise above the noise? I think on the mobile platform that becomes slightly more clear, because by the time you’ve reached the mobile platform, you’ve already done pre-research of where you want to go.

At a destination … you need a guide, whether that’s a printed guide or a mobile guide. Just making an e-book out of a guidebook is not enough. Some of the guide companies have done that. That’s not even taking advantage of the medium, which is a live medium. Mobile is a connected medium, so there are a lot of things that you can do. And that’s what I’m trying to figure out.

This seems rather different than what you did before. This a vertical, but you’re not really talking about covering a vertical, and you’re not talking about doing a news media company.

Ali: Correct. Also, this is a consumer vertical, not a B2B vertical, which I’ve done previously. When I started thinking about leaving, and especially as I was traveling, I think one of the reasons I was traveling so widely was to clear my head and also to figure out what I want to do next.

One of the things I did not want was to do something in my comfort zone, which is what I’ve been doing for the last eight years of my life. The easiest thing for me would be to take that vertical-building knowledge and apply it to another B2B vertical, giving it the same treatment: fast-breaking news, bite-sized chunks, analysis, opinion, data, research. Add conferences, add classifieds, add video — all the elements that went into ContentNext and paidContent, and all the stuff that I’ve done. …

In business to business, everything is incremental, right? You get this many visitors or this much money, you hire another person. Or you do this much, and you expand. And the audience growth is also incremental — it never is exponential — which is good and steady. …

But having covered the consumer companies, I’ve always liked the high that comes with the exponential growth. If something clicks, that’s the high I want to experience at least once in my life, however naive that sounds. But of course there’s good and bad. Good is if you get that high. Bad is you can flame out so much faster. …

Isn’t everything you’ve said about news applicable to travel guides? Whatever brilliant content you get, brilliant applications you build, brilliant platforms you get them on, there is other content. Others with just as few barriers can do the same thing you’re talking about with travel.

Ali: Yes and no. I can’t explain for two reasons. One is I will disclose more than I’m willing to. And secondly, I’m still learning. The reason I say no is because for something where people have invested a bunch of money to go a certain place — especially destinations outside your own country — the planning and the guide part of it cannot be left to chance, which is left to brands that are untested and not well-known.

From a consumer point of view, if they’re investing so much money and time and effort to go, there has to be enough security in terms of the guide they’re taking, whether it’s a book or it’s mobile. It has to have reliable information. They can’t be stranded in the middle of nowhere without knowing where to go.

I’ve learned this being a traveler, and learning about the travel industry. While it seems to be the easiest thing to get content, it’s one of the hardest, most backbreaking kinds of work, and these guys have done it over the last 20, 30, 40 years, which is how long they have been in existence. … It’s not just creating something one time; it’s the updating that is also extremely difficult, especially outside the popular sectors. …

But there are four or six established brands I could name. Even in the backpack sector, the high-end sector, there are a couple or three brands in each.

Ali: If you look at the numbers in the travel guide sector, all of them, especially in the last couple of years, have declined.

Of course, those are secular trends. The print part of the guidebook sector is in decline. It’s also cyclical because travel in the last two to three years has been hit by the economy, and [that will continue] for the next couple of years, in all likelihood.

It’s also why I’m looking at it as an opportunity because now is the down cycle, and there are probably things to start and things to pick up that will be much cheaper now than they will be in a few years.

I do think brands matter in this industry. Imagine the content dropped into these books. I mean, a company like Lonely Planet, just as an example, has 800 different titles. Imagine the amount of content that is built into those books. How can a startup even begin to rival that, even if all they’re doing for the next five to 10 years is gathering content?

So what’s your opportunity?

Ali: Maybe not in the travel content industry but maybe an allied services thing that will become a hit. It could be a technology. It could be a way of presenting these books on these platforms. It could be search in the travel guide sector. So I don’t know yet, to be honest. That’s what I’m trying to figure out. Search certainly seems to be an interesting opportunity.

If you do create the brilliant application, brilliant technology, sure that gives you a head start. But somebody else could come along and create another brilliant technology that somehow undercuts or steals or improves upon what you’ve done.

Ali: Hopefully I will be that person.

Again, there are slight risks in going public for competitive reasons or even talking about it in the posts that I did. But if your competitive advantage is silence, I don’t think that’s a huge advantage.

I feel like I’ll learn more being in a public forum, as I say in my post [on the site], and also I’m getting e-mails from people in the sector, so clearly I’m already getting more opportunities than I would have just being silent.

Thursday, Sep. 09, 2010

Journalists from American Public Media, Public Radio Exchange, Public Radio International, PBS and NPR have spent months scoping out how they would create an online pipeline to share and distribute public media content on any platform.

Their goal is to create a “Public Media Platform” — an open API that would allow public media organizations across the U.S. to share content with one another, with application developers, and with independent content creators and publishers.

Along with giving people greater access to content, the Public Media Platform would make it easier to aggregate and package different news organizations’ stories on major news events such as the BP oil disaster and the earthquake in Haiti.

“If you really want to follow a story across all the public media producers, there’s no simple way to do that, and there needs to be,” Joaquin Alvarado, senior vice president for digital innovation at American Public Media, said in a phone interview.

“Folks spend a lot of overhead time going between sites, and I think we need to start producing an efficient pipeline to connect the dots between the various threads of interest.”

It’s possible to curate such coverage by hand, but an API is a technological solution. Essentially, APIs, or application programming interfaces, enable software programs to communicate with one another, allowing data to be shared and used in various ways.
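To make that abstraction concrete, here is a minimal Python sketch of what consuming such an API looks like. The JSON payload, field names and stories below are invented stand-ins for illustration, not the actual Public Media Platform format:

```python
import json

# A stand-in for the JSON an API such as the proposed Public Media
# Platform might return; the field names and stories are hypothetical.
api_response = """
{
  "stories": [
    {"title": "Gulf cleanup continues", "source": "NPR", "topic": "oil-disaster"},
    {"title": "Haiti rebuilding effort", "source": "PRI", "topic": "haiti-earthquake"},
    {"title": "Oil claims fund opens", "source": "APM", "topic": "oil-disaster"}
  ]
}
"""

data = json.loads(api_response)

# Following one story across many producers -- the use case Alvarado
# describes -- becomes a one-line filter once the content is available
# in a common, machine-readable format.
oil_coverage = [s for s in data["stories"] if s["topic"] == "oil-disaster"]
for story in oil_coverage:
    print(f'{story["source"]}: {story["title"]}')
```

The point is not the particular fields but the shape of the workflow: any developer who can parse JSON can aggregate, filter and republish the shared content.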

“Engine of innovation”

Kinsey Wilson, senior vice president and general manager of digital media at NPR, has helped lead the six-month-long planning phase, which costs $1 million and is scheduled to end in December. (The Corporation for Public Broadcasting provided the majority of the funding, while the rest came from in-kind donations.)

Wilson said in a phone interview that he hopes the API will encourage greater collaboration among public media outlets and make it easier for them to innovate and push their content in front of new audiences.

“We see this as an engine of innovation, and we’re really trying to create something that will spur others to innovate and develop compelling applications for the public,” said Wilson, a member of Poynter’s Board of Trustees.

“There’s a great amount of content for radio and the Web that resides in lots of different places,” he said, “and that’s locked in lots of different systems now.”

Throughout the planning phase of the project, Wilson has drawn on his own experiences with NPR’s API, which gives outside parties access to more than 250,000 stories dating back to 1995. Since it launched two years ago, the API has contributed to an 80 percent increase in NPR.org’s total page views, Wilson said.

Enabling collaboration

NPR’s API is a critical part of NPR’s Project Argo, a new online journalism venture aimed at producing in-depth, local coverage on topics such as politics, public safety and climate change. The 12 NPR member stations that are part of the Argo network will share stories through NPR’s API.

Joel Sucherman, program director of Project Argo, told me in an e-mail that sharing content via APIs is becoming increasingly important as public media outlets look to expand their reach.

“It’s important that public media organizations ensure that we reach audiences wherever and however they want to consume content — terrestrial radio, TV, online, mobile, wherever,” Sucherman said. “And we think through the power of public media networks, the whole will be greater than the sum of the parts.”

Robert Bole, the Corporation for Public Broadcasting’s vice president of digital media strategy, hopes to spread the word about the Public Media Platform in a South by Southwest Interactive panel he proposed last month. He said in a phone interview that ideally, the platform will encourage public media to collaborate more with developers and programmers.

“We had to manually assemble that,” said Jake Shapiro, CEO of PRX, by phone. “That’s the kind of thing that would be easy to launch in a more timely fashion once something like the platform exists.”

Shapiro said he can think of several other ways PRX could use the Public Media Platform — for instance, to build an iPhone app featuring content from many public media organizations. Currently, that would take a lot of effort.

Shapiro emphasized the importance of creating a service that involves a broad range of public media outlets.

“I really err on the side of openness and inclusiveness,” Shapiro said. Public media, he said, is uniquely suited for this work because of its public service mission — “to make sure content that reflects public dollars is accessible in the most broad and relevant way possible.”

Planning for future business models

Those involved in the planning phase of the project have talked at length about establishing a set of business rules around the distribution and use of content. The goal, Wilson said, is to find a way for news organizations to share content without dramatically undercutting their existing businesses.

“I think there’s an assumption on our part that the business models and the rules may change over time,” Wilson said, “but we need a starting point and one that will encourage people to experiment with the kind of sharing that [the API] will facilitate.”

Wilson emphasized that the API will be built incrementally so that it’s easier to assess what works and doesn’t work and then make adjustments along the way. Several people have been involved in the planning phase of the project and are working to determine the next steps.

There’s an advisory board that consists of public media journalists, as well as a technical advisory board made up of journalists from outside public media, such as ProPublica and Publish2. Each of the members is assigned to one of three committees — a leadership committee that is figuring out the business rules; a planning committee creating a document explaining how the API will come together; and a proof of concept committee that will build a live prototype of the API.

As of right now, there’s not enough money to continue beyond the planning phase. Wilson estimated that the API would cost several million dollars to build. “We don’t have a dollar figure yet,” he said, “but it will be relatively modest compared to the historic investments made in public broadcasting.”

The success of the Public Media Platform will likely depend not just on the content in it but also on whether people actually use it.

“I think I would measure success in two ways: by the number of different content producers that ultimately elect to use this and put their content in it, and by the number of institutions, organizations and individuals who make use of what’s available and put it on their sites,” Wilson said. “My hope is that this would stimulate some real creativity.”

Thursday, Aug. 26, 2010

As a reporter at The Washington Post, Sarah Cohen was frequently frustrated with the dearth of tools for working with chronological data. Now the Knight Professor of the Practice of Journalism and Public Policy at Duke University, Cohen looks for ways to help journalists be more efficient.

TimeFlow, a free and open-source data analysis tool, is the first version (still alpha) of a project that she has been working on to make it easier for reporters to look at data over periods of time. Unlike some of the alternatives, such as the SIMILE Timeline and Dipity, TimeFlow is not built to present the data online.

“I felt really strongly about the ability to look at data on a calendar or time line,” Cohen said in a phone interview. She also thought it was important for the tool to give journalists the ability to filter data, zoom in and out of it, and edit it in place without having to import or export it from somewhere else.

TimeFlow was developed as a desktop application instead of a Web app so that it could easily handle large data sets. And because security policies prevent many reporters from installing software on their work computers, the tool was designed to run off a thumb drive.

There are a few key ways journalists can use TimeFlow:

To keep notes on long-running stories — such as court cases, bankruptcies or police investigations — that require journalists to keep track of ongoing developments.

To compile material in a way that might make it easier to look at the relationship between various events and stories.

To organize information for narratives and the reconstruction of events.

TimeFlow handles chronological data in an unusually flexible way. You can use approximate dates, create entries that span a range of dates, or enter events with a start date alone. Data fields also include URL links to source materials and text descriptions.

The data is viewable in various formats: a calendar, a time line, a bar chart, a list or a table. It can be filtered in any view — by using tags and data fields, by searching for keywords, or by using regular expressions. It can also be edited within each view.

You can add data by copying and pasting from an Excel spreadsheet or an HTML table, or you can add it by importing a CSV or TSV file. There are currently no export functions.
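For a sense of what that import-and-filter workflow involves, here is a small Python sketch that reads event data shaped like a chronological CSV and filters it with a regular expression, mirroring the keyword and regex filtering described above. The column names and events are invented for illustration, not TimeFlow’s actual schema:

```python
import csv
import io
import re

# Sample chronological data in a CSV shape like what TimeFlow imports.
# Column names and events are invented for illustration.
raw = """date,description,tags,source_url
2009-03-02,Initial bankruptcy filing,court;finance,http://example.com/filing
2009-06-15,Creditors meeting held,court,http://example.com/meeting
2010-01-20,Reorganization plan approved,court;finance,http://example.com/plan
"""

# Each row becomes a dict keyed by the header names.
events = list(csv.DictReader(io.StringIO(raw)))

# Regular-expression filtering across event descriptions, the kind of
# query a reporter might run on a long-running court case.
pattern = re.compile(r"filing|plan", re.IGNORECASE)
matches = [e for e in events if pattern.search(e["description"])]
for e in matches:
    print(e["date"], e["description"])
```

Keeping a URL column pointing back to source documents, as TimeFlow’s data fields allow, means every event in the timeline stays traceable to its evidence.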

The sidebar shows the aggregated contributions from an organization to a politician (for instance, from various employees of one company). The second section, “points of influence,” shows campaign contributions received by politicians, as well as contributions made by organizations. You can click on the names of people or organizations to learn more about them, such as who their contributors are or what lobbying firms they’ve hired.

Poligraft has a handy bookmarklet so you can use the tool to analyze any story from the browser.

Anyone can use this, but it could be especially powerful in the hands of journalists, bloggers, and others reporting or analyzing the news. It would take hours to look these things up by hand, and many people don’t know how to find or use the information.

Journalists could paste in their copy to do a quick check for connections they might have missed. Bloggers could run Poligraft on a series of political stories to reveal the web of contributions leading to a bill. All this information is public record, but it’s never easy to dig through. What is possible when investigative journalism is made just a little bit easier?

I can see how news organizations could apply the Poligraft model to any type of story — crime, business, anything for which additional context could be useful. For example, a crime story could be paired with a sidebar that searches for the names of people involved, addresses and the type of crime, and displays that information alongside the story. It’s a twist on the crime map.

TechCrunch does something similar. Below each story is a widget with information about some of the businesses mentioned in the story, such as website URL, when the company was founded and a summary of what the business does. (The CrunchBase Widget, as it’s called, can be customized and added to any site.)

As simple as the Poligraft tool is, users need a certain amount of background knowledge to really benefit from it. And the sidebar could do a better job of providing more information about the politicians, lobbyists and organizations. I realize that you’re expected to read the story, remember the names, and look over in the sidebar for context, but there’s just a little too much back-and-forthing. Still, it beats looking up contributions one by one — and it may highlight a connection that would be otherwise overlooked.

Though South by Southwest Interactive is best known for highlighting emerging technologies such as Twitter, many panels this year focused on journalism. And many of the 2,344 panels proposed for the 2011 conference have a strong journalism component.

SXSW opened up the voting for the proposed panels last week, asking people to pick the sessions they’d like to see at the festival, which will be held March 11-15, 2011. Voters can search the panels by categories, including journalism, online video, social networking and user-generated content.

Several of the 49 journalism-related panel proposals revolve around how technology is changing the storytelling process and why it’s important for journalists to think like “geeks,” or at least come to a better understanding of how programmers think and work.

I looked through the panel proposals, including those that were not listed under the journalism category, and selected 20 that I think journalists would find worthwhile. Given how many panels there are, I’m sure I’ve left out some good ones. If I’ve missed any that you think should be on this list, feel free to add them to the comments section of this piece or respond to @Poynter via Twitter.

Journalism

Media columnist David Carr of The New York Times will look at how technology contributes to, and detracts from, journalists’ productivity. He raises relevant questions for journalists who want to strike a better balance between consuming media and creating it: “Is your desktop a window on the world or just a view of the prison yard?” and “What specific steps have you taken to bifurcate your world into productivity and recreation?”

“Predictions and the News: Getting the Future Right” Matt Thompson of NPR plans to look at the predictions that are a key part of news coverage. He’ll address how new approaches to journalism are making it easier to assess and follow up on predictions about what’s to come. He also plans to talk about how to help the public make better sense of claims about the future.

“Why Journalism Doesn’t Need Saving: An Optimist’s List” Dan Gillmor’s panel is focused on the future of the media. He’ll talk about innovative projects from startups and traditional media companies and will explain why they make him hopeful about the future. Given all the innovation that’s taking place, he says, there’s good reason to be optimistic about where the profession is headed.

“Information Architecture as Storytelling” Geoff Barnes of Elliance Inc. will discuss the similarities between information architecture and storytelling. This panel, which is likely to attract user experience designers, will address questions such as: “How does knowing the user’s story affect project definition, content strategy, site map development, wireframes, copywriting and visual design?”

“Crazy, Cool and Interesting Uses of Geodata” As the title of his panel suggests, Elad Gil of Twitter will discuss weird and cool applications that were built with geodata. He’ll also look at geolocation datasets that have recently become available and will address unexpected ways that geodata is being used.

“The Grand Challenges in Media” “The state of the media” is a phrase you’ve probably heard a lot. Twitter’s Robin Sloan wants to bring it up again, but in a way that’s “more focused, constructive and engaging.” He plans to describe significant, unsolved problems in media as they relate to journalism, such as those related to technology, organization and economics. He’ll include a “starter kit” for figuring out these problems and will talk about who seems best positioned to tackle them.

“Better Web Experience Through Anthropology” News sites talk about creating a better experience for their users, but their approach to doing so may not be as effective as it could be. Chris Bailey of Bailey WorkPlay will show why code isn’t the only important element of websites and Web applications. He’ll introduce tools that anthropologists use to understand their subjects and then explain how you can use them to assess how your site design impacts the user experience.

“Pulitzer 2.0: Building News Apps” Drawing on his experience as an interface engineer at The New York Times, Tyson Evans will describe how news organizations are using Web frameworks to build news apps that tackle major investigations and increase government accountability. He’ll also talk about how visualization and design can make data easier to understand, and how journalists can help the community engage more effectively with this data.

“Whiteness on the Web: Racism or Culture?” In his panel about diversity on the Web, The Root’s Joel Dreyfuss will look at how the Web creates racial separation and whether we need a campaign to desegregate it. Dreyfuss plans to address questions that are important for all journalists to consider: “Are we repeating old racial exclusivity patterns in new media?”; “Should content managers make an affirmative effort to diversify their content?” and “How should sites handle offensive and racist commentary?”

“Real-Time Streams Need Real-Time Feedback” ReadWriteWeb’s Marshall Kirkpatrick will explore the challenges with real-time information and what’s wrong with current methods of managing it. He’ll explore the future of real-time curation, filtering and feedback, and he’ll describe how consumers can customize their data streams with real-time feedback.

“Social Games: Manipulating Your Brain Chemistry, For Good” Michael Fergusson of Ayogo Games will challenge the notion that casual games are a waste of time. He’ll look at how social games can lead to significant behavioral changes and give examples of games that are doing this. One of the key questions he’ll ask: How can we design games that add value to the world at large?

Programming/Development

“Why Journalists Need to Think Like Geeks” This panel will address the fundamental differences between how programmers and journalists think and work. By thinking more like “geeks,” The New Yorker’s Blake Eskin argues, journalists can learn to communicate and collaborate more effectively with programmers — and ultimately create better digital projects.

“Girl Developers++: Getting Women Equipped to Ship” Sara Chipps of Girl Developer LLC says there’s been a lot of talk, but not a lot of action, about the gender gap in software development. She advocates for educating women to be software developers and empowering them to teach themselves, and she plans to talk about some of the roadblocks women face when learning how to code. Among the questions she’ll address: “How can I start an initiative to educate women in technology in my community?”

“Our Media: Building an API for Public Media” This panel is especially relevant to journalists working in public media. Robert Bole of the Corporation for Public Broadcasting will talk about public media’s efforts to build an open API called the Public Media Platform. (An API, in this sense, facilitates the use of a news organization’s content by third parties who want to create their own applications.) Bole will discuss how independent publishers and content creators will be able to use the platform.

“Crowd Funding Your Startup — Without Going to Jail” If you want to create a news startup, consider this panel, which will teach you how to tap into the community for funding. Fred Bryant of WealthForge will explain how to overcome challenges that stand in the way of getting people to invest in your startup. He’ll also look at what industry leaders think about the future of crowd funding and will offer thoughts on how long it’ll take for crowd funding to become a viable way of raising capital.

“Newstopia: The New Business Models for News” Mark Briggs, an author and a Ford teaching fellow at Poynter, will lead a session about how to use digital tools to launch and run a successful news business. He’ll answer some key questions about how digital news startups fit into our democracy and why sites such as The Huffington Post thrive while traditional media outlets struggle. He’ll also address what these startups mean for people who are looking for jobs in journalism.

“How Brands Form Partnerships: Headscratchers and Natural Fits.” Often, we hear about partnerships only after they’re made. USAToday.com’s Brian Dresher will describe what happens before a deal is finalized and how to evaluate potential partners. He’ll also discuss how to use internal resources to support a partnership and how to measure its success. This panel could be especially useful for news sites that want to innovate but lack the internal resources to do so, as well as for universities that want to partner with media outlets.

At 1 p.m. Monday, I will talk to Jim Brady, president of digital strategy for Allbritton, and Steve Buttry, director of community engagement, about decisions they made in building the new site. Come with questions.

Wednesday, Aug. 04, 2010

It’s often said the Web is more measurable than any other medium. That’s probably true. But trying to actually understand what’s being measured and translate the different types of measurement into a coherent whole can make your head spin.

A lot of sites fixate on what their Web analytics packages, such as Google Analytics and Omniture, tell them. They look at stats on “page views,” “visits” and “unique visitors” and measure their progress by how much traffic increases over time.

They might look at “engagement” stats like “time on site” and “page views per visit” to glean how much people are enjoying the site once they arrive.

While those stats can be a fine way to get a handle on relative growth, they’re not true measures of the number of people coming to a site. And they’re also measures that many advertisers won’t accept.

Let’s explore what Web analytics can, and can’t, tell you.

Web analytics data is based on “cookies,” small pieces of data that a website places on your computer through a browser such as Internet Explorer or Safari. If you visit a website and it sets a cookie, then when you return, the site’s Web analytics package can tell that you’ve visited before, how recent that visit was, how long you stayed, and other information about your browsing.

But because the cookie is tied to a browser on a particular computer, it doesn’t really measure a person. Say you visit a website one day using Internet Explorer and another day using Firefox. In most instances you’ll show up as two different visitors, two “unique visitors,” in the Web analytics package. If a friend uses the same browser on your computer to visit the same website, he’s a different person, but the analytics package will register him as a repeat visit from you.

In another scenario, you may use two or more computers (at home and work, for instance) and visit the same site on each of them. You’re one person, but you’ll show up as multiple visitors. And in other cases, the analytics can be skewed by people who delete or block cookies.

In other words, your Web analytics data may grossly inflate the number of users coming to your site. A 2007 study by the measurement service comScore found that cookie data can overstate the number of users to a website by a factor of 2.5.
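The mechanics are easy to see in a toy model. This Python sketch, with entirely made-up visits, counts “unique visitors” the way a cookie-based package effectively does — as distinct computer-and-browser pairs rather than distinct people:

```python
# Toy model: each visit is recorded as a (person, computer, browser) combination.
# A cookie-based analytics package sees distinct (computer, browser) cookies;
# reality is distinct people.
visits = [
    ("alice", "work-pc", "firefox"),
    ("alice", "work-pc", "ie"),       # same person, second browser: a new cookie
    ("alice", "home-pc", "firefox"),  # same person, second computer: a new cookie
    ("bob",   "home-pc", "firefox"),  # new person on a shared machine: a "repeat visit"
]

unique_cookies = {(computer, browser) for _, computer, browser in visits}
unique_people = {person for person, _, _ in visits}

print(len(unique_cookies))  # 3 "unique visitors" reported
print(len(unique_people))   # 2 actual people
```

Two people generate three “unique visitors,” a 1.5x inflation, and Bob is never counted at all because he inherits Alice’s cookie.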

Know your community

When measuring traffic to your specific site, it’s important to consider the behavior of your particular community. Sophisticated tech audiences and wealthier ones with home and work computers likely account for more cookies than people.

On the other hand, sites serving schools or less advantaged populations may underestimate how many users they have. At a school or library, for example, many people may use the same computer to visit a given website.

The blog for the social news site Reddit recently complained that experts were “misunderestimating” the site’s traffic compared to what Reddit staff saw in their Google Analytics stats. Advertisers, Reddit said, were relying on services like Compete.com and Quantcast to gauge how many people visit the site, and those services showed much lower traffic than Reddit’s internal Google Analytics numbers.

Panels vs. cookies

Compete.com, Quantcast, comScore and Nielsen all purport to do a better job of measuring the number of people who visit a site than Web analytics does, while also providing demographic data on gender, household income and the like.

These other services employ what’s called a “panel” methodology — observing the behaviors of large groups of Web users and using statistical formulae to make inferences about Internet usage, both in general and on specific sites.
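In rough terms, a panel estimate scales what metered panelists do up to the whole online population. A minimal sketch of that inference, with every number invented purely for illustration:

```python
# Hypothetical panel extrapolation; all figures are made up for illustration.
panel_size = 200_000              # recruited panelists whose browsing is metered
panelists_who_visited = 1_400     # panelists observed on the site this month
online_population = 220_000_000   # assumed total Internet users in the market

# Project the panel's visit rate onto the whole population.
visit_rate = panelists_who_visited / panel_size
estimated_visitors = visit_rate * online_population

print(round(estimated_visitors))  # 1540000 estimated unique visitors
```

The real services weight their panels demographically and correct for known biases, but the core move is this projection from a sample to the population.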

Advertisers are often more comfortable with these third-party services, which operate at arm’s length, than with internal Web analytics stats. The services also reassure advertisers by offering a more consistent “apples-to-apples” comparison among different sites.

Still, the panels are also far from perfect and can themselves diverge widely depending on the composition of users in their samples and other factors.

For all of these services, the stats become less reliable as sample sizes shrink. The smaller the site, the harder the panel measurements are to trust. Compete.com, for instance, measures only what it considers the top million sites by visitor traffic in any given month.

Nielsen and comScore tend not to register sites until they’ve gotten many thousands of visitors in a month.
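The small-site problem follows from basic sampling math: when only a handful of panelists are observed on a site, the margin of error swamps the estimate. A rough sketch using a simple binomial approximation and invented panel numbers:

```python
import math

# Rough 95% margin of error for a panel-based visit-rate estimate,
# expressed relative to the estimate itself. All numbers are illustrative.
def relative_margin(panelists_seen, panel_size=200_000):
    p = panelists_seen / panel_size
    standard_error = math.sqrt(p * (1 - p) / panel_size)
    return 1.96 * standard_error / p  # fraction of the estimate

print(f"{relative_margin(1_400):.0%}")  # big site: about 5% relative error
print(f"{relative_margin(20):.0%}")     # tiny site: about 44% relative error
```

A site seen by 1,400 panelists gets a usably tight estimate; one seen by 20 panelists could plausibly be half or one-and-a-half times its reported size, which is why the panels decline to report small sites at all.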

So which method do you use?

So what should you use, and when? It depends on whom you’re talking to, and what you’re trying to learn. Sometimes, you can use all the services and try to figure out the reasons for the differences. Even more measurement stats are available from your ad server data, which are often the only traffic numbers that are audited and verified for legal purposes.

Yes, it’s enough to make your head spin. But the more you know, the better prepared you’ll be to anticipate and answer questions and to assemble the stats that will make you look best to the audience you’re presenting to.

For example, if your site targets local schools, you may be able to make the case that your Web analytics are undercounting your users. Or you can explain why you believe, based on site surveys or social media interactions, that the demographic profile of your users differs from what the panel measurement services show.

But it’s also important to understand that advertisers, partners and others can have valid reasons for being skittish about certain types of data. You need to be able to explain to them what your stats do and don’t represent, based on the individual characteristics of your Web property.

Monday, Aug. 02, 2010

Monday and Tuesday, about 40 journalists are gathering at Poynter to learn how they can use free digital services to cover government more effectively. They’ll learn how to share and annotate documents, share data on politicians and lobbyists, understand voting patterns and create data visualizations.

Facebook, meanwhile, has released the findings of a several-month study by an internal team that examined how major news organizations such as CNN, The New York Times and Univision use the social network.

Because Facebook boasts 500 million active monthly users and an average monthly time-on-site of around seven hours, integrating Facebook into your site could translate into substantial additional traffic. Tools such as Like buttons, Activity Streams and LiveStream can keep users clicking through stories on a site. And the Insights analytics tool provides valuable demographic information.

After implementing various combinations of Facebook tools, ABC News saw a 190 percent increase in referral traffic, Life magazine saw referrals increase by 130 percent, Scribd saw user registrations go up by 50 percent, and Dailymotion saw as many as 250,000 users engage with a single video.

Facebook Developer Network engineers Justin Osofsky and Matt Kelly provided an in-depth look at their findings at a Hacks/Hackers meetup this week. Journalists can learn more about the techniques and discuss how to improve upon them at facebook.com/media.

Optimize the Like button

There’s a lot of power in those little Like buttons, both on the Facebook site and off. When a user clicks Like, that gesture is broadcast to all of his friends (130 people, on average). Depending on how a site implements the button, clicking Like may also add a link to the user’s profile page and make the liked page discoverable in Facebook’s search system.

Anything on the Web is potentially Likable: a news story, an organization, or even a reporter, Osofsky explained.

Crucially, once a user Likes a Facebook Page, the administrator of that Page gains the ability to push new content to that user’s Activity Stream. In essence, that single click is all that’s needed for users to opt in to future messages, and if they don’t like your content, to opt back out.

Like buttons are easy to implement and come in a variety of styles and sizes, from tiny rectangles to full-featured iframes that include profile pictures and comment boxes. Facebook has found that Like buttons do best when they’re close to content that is both visually engaging and emotionally resonant, such as video.

In addition, full-featured Like buttons tend to do better than smaller ones. Adding the faces of other Likers to the button and including Facebook comments raised click-through rates from near zero to 0.2 percent, comparable to the click-through rate of a banner ad.
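Even a 0.2 percent rate compounds quickly once the 130-friend broadcast is factored in. A back-of-envelope sketch, with the page-view figure invented for illustration:

```python
# Back-of-envelope reach math; page_views is a hypothetical figure.
page_views = 1_000_000
ctr = 0.002         # 0.2% click-through on a full-featured Like button
avg_friends = 130   # average friend count cited by Facebook

likes = page_views * ctr
# Each Like is broadcast to the Liker's friends via the News Feed.
potential_friend_impressions = likes * avg_friends

print(int(likes))                        # 2000 Likes
print(int(potential_friend_impressions)) # 260000 potential friend impressions
```

A million page views yields only 2,000 clicks, but those clicks fan out to roughly a quarter-million friends’ feeds, which is the real payoff of the button.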

Because Facebook delivers this content to publishers’ sites through an iframe, only a small amount of code is necessary to implement the “deluxe model” Like buttons.

Tailor content specifically for Facebook users

Content matters on Facebook. Touching, emotional stories earned 2 to 3 times as many Likes as other stories, as did provocative debates. Sports stories tend to perform particularly well, with 1.5 to 2 times more engagement than the average.

With that knowledge, news organizations can identify stories likely to perform well on Facebook and push those stories through social channels such as Facebook Pages and Twitter.

Publishers can even strategize around when they push this content. There’s a spike in Likes at 9 a.m. and 8 p.m., so having fresh content at those times is crucial.

Deploy activity plugins on every page

Increasingly, news site home pages will be customized to users’ tastes and networks. On CNN’s home page, for example, an Activity Feed plugin shows users what their friends have Liked on the site.

Osofsky recommends that publishers set aside real estate on every page on their site for the Activity Feed and Recommendations plugins, which suggest relevant content to users. “Sites that placed the Activity Feed on both the front and content pages received 2-10x more clicks per user than sites with the plugins on the front page alone,” he wrote on the Facebook Developer Blog.

He also advises that sites use Facebook’s LiveStream plugin, a real-time chat box that gathers users in a conversation about live, breaking news. The plugin could be seen as a competitor to live-tweeting and live-blogging tools like CoverItLive.

Create separate pages for major events

For major stories that break over several days, some organizations increased engagement by creating a dedicated Facebook Page for that event. “Stories published from a World Cup-focused Page of one major media company had 5x the engagement rate per user than stories from the company’s main Page,” Osofsky wrote.

Of course, that technique isn’t without some degree of risk. Publishers might worry about fragmenting their audience and losing viewers when an event is over.

For example, after a flurry of wall posts, ESPN’s World Cup Page abruptly stopped posting on July 15. The 636,000 or so fans have continued to post to the wall, but with no response from ESPN, they are likely to lose interest.

Manage your many pages

Depending on the type of item that a user Likes (a person, a show, an article, and so forth), almost every Like button generates a new Page on Facebook. As more people click “Like,” publishers will need to organize and manage an ever-growing volume of Pages — some of which aren’t even visible to most users.

Facebook Engineer Matt Kelly described how Facebook uses what he informally called “Dark Pages” to connect publishers to users. Invisible to everyone but administrators, Dark Pages represent pages on the Web that have been Liked but do not have a publicly visible Page on Facebook — for example, a single news article.

Publishers must place Open Graph and Facebook tags such as <og:type> and <fb:admins> on each page of their site to identify the content. Then, once a publisher has claimed its page (dark or otherwise), it can publish new content to the Activity Streams of its Likers and examine Insights to learn more about its users’ demographics.

Publishers could wind up with thousands of Pages to monitor. There’s not a perfect method to manage that onslaught of Likable content, Kelly said, but he expected that solutions would emerge from Facebook’s outreach to publishers.

Attendees at the Hacks/Hackers event expressed some dissatisfaction with Facebook’s Insights tool. Although visually similar to real-time traffic reporting tools like Google Analytics, Facebook’s Insights can lag up to four days behind. That may change in the future; Osofsky said the goal is for Insights to lag no more than a day behind.

Turn status updates into infographics with the streamlined API

Just as newspapers invested in printing presses, online news divisions must now invest in software development. Facebook recognized that developing social tools can be confusing and resource-intensive, so the company recently streamlined its API.

Facebook’s new API is structured around objects and connections, just like the user experience on the site itself. It can be used to generate innovative visualizations like the New York Times’ visualization of soccer players’ popularity.

In addition, Facebook has developed a more robust search tool, which can be used to find content from public status updates, not just people. Journalists could use the tool to gauge community interest in a story or to find new sources.

Facebook has also streamlined its authorization process by implementing OAuth 2.0, which offers improved scalability and ease of use. For users, authorizing applications is now a single-click process rather than a march through one dialog after another. For publishers, that translates into smoother engagement with users.

Participate in development of Facebook products

Social networks — particularly Facebook — are quickly becoming a key way to learn about breaking news, a phenomenon that Facebook is only too happy to embrace. The recently released research is just a foundation for what Osofsky hopes will be a long-term collaboration with media partners.

He encouraged anyone involved with news — journalists, editors, software developers — to visit facebook.com/media to learn about Facebook’s engagement with the news industry, to share ideas, and to contribute to the emerging practice of integrating social tools with journalism.

“We have plenty of work to do,” Osofsky said. “And the dialogue is very important.”