Which do you think gets more pageviews, sites with exaggerated titles like Upworthy or Buzzfeed, or an accurate title like this one?

We know the answer. The reason is that smart sites interested in achieving an Internet critical mass don't think about content; they think about creating "irrational herding" behaviors.

It's a smart business strategy. On YouTube alone, 100 new hours of content are uploaded every minute of every day. Believing that 'content is king' is naively simplistic. Gaming the mentality of Internet viewers leads to pageviews, and pageviews lead to advertising revenue.

Social news sites like Reddit are not much better, because they depend on user ratings or peer recommendations to sort their most interesting content for other users - and anyone who watches their Science subreddit quickly sees that it is dominated by marketing people from a few sites. There are no collective judgments when peer recommendations are biased and inconsistent.

Can positioning on a webpage make a difference?

In a PLOS ONE paper, researchers evaluated some popular peer recommendation strategies and their ability to identify interesting content. They first determined what kind of content users prefer and then evaluated how position on a webpage affects collective judgments about content.

"Psychologists have known for decades that position bias affects perception: people pay more attention to items at the top of a list than those below them," said University of Southern California computer science professor Dr. Kristina Lerman. "We were surprised, however, how strongly this affected user behavior and the outcomes of recommendation."

Figure: Distribution of the fraction of the first 20 stories shown to a user that are among the most-appealing 20% of stories. Under the popularity policy, for most users at least 40% of the initial stories are among the most-appealing stories, whereas under the activity policy, most users see fewer than 40%. These histograms exclude the first 50 users in each experiment, to avoid the initialization phase of the policies. doi:10.1371/journal.pone.0098914

They found that position bias accounts for consumers paying five times more attention to material posted near the top of a webpage. That's a potential problem for sites that rely on peer recommendation alone. For example, Reddit posts appear in order of popularity, derived from up-votes and down-votes by users, with more popular posts nearer the top of the page. Due to position bias, users are more likely to see, consume, and recommend already-popular content positioned near the top, creating a runaway loop that further amplifies its popularity at the expense of potentially more interesting content farther down the page. The New York Times must be good because the New York Times is popular.
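The runaway loop is easy to illustrate with a toy simulation (a hypothetical model, not the paper's actual experimental setup): give every story identical appeal, rank by votes, and let attention decay with position. Popularity ranking alone then concentrates votes on whichever stories happen to start near the top.

```python
import random

def simulate_popularity_ranking(n_items=20, n_users=1000, top_k=10, seed=7):
    """Toy model of position bias under popularity ranking (a sketch,
    not the paper's setup). Every story has identical intrinsic appeal,
    so any concentration of votes is pure herding."""
    rng = random.Random(seed)
    votes = [0] * n_items
    for _ in range(n_users):
        # Rank by votes, ties broken by item index (stable, Reddit-like).
        order = sorted(range(n_items), key=lambda i: (-votes[i], i))
        for pos, item in enumerate(order[:top_k]):
            # Attention decays with position: position 0 gets roughly
            # five times the clicks of position 4.
            p_click = 0.5 / (pos + 1)
            if rng.random() < p_click:
                votes[item] += 1
    return votes

votes = simulate_popularity_ranking()
# Stories that never crack the top-k are never seen, so they never
# collect a single vote, no matter how good they are.
```

Even with identical appeal everywhere, the vote distribution ends up sharply skewed toward a few early leaders, while stories outside the initial top ten stay at zero forever.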

Cancer could be cured and an Ask Me Anything by a celebrity would still dominate the top of the page.

They found that ordering content by recency of recommendation rather than by aggregate popularity (total 'likes' or recommendations) generates better estimates of what users actually find interesting and would prefer to consume.
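The two ranking policies being compared can be sketched as simple sort keys (the field names here are illustrative, not taken from the paper):

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    votes: int               # aggregate popularity
    last_recommended: float  # timestamp of the most recent recommendation

def rank_by_popularity(posts):
    # Reddit-style: highest aggregate score first; position bias then
    # keeps amplifying whatever is already on top.
    return sorted(posts, key=lambda p: p.votes, reverse=True)

def rank_by_recency(posts):
    # The alternative the researchers favor: most recently recommended
    # first, so one fresh recommendation can resurface an older post.
    return sorted(posts, key=lambda p: p.last_recommended, reverse=True)

posts = [
    Post("Old viral hit", votes=5000, last_recommended=100.0),
    Post("New, genuinely interesting", votes=12, last_recommended=900.0),
]
```

Under popularity, the old hit stays pinned to the top; under recency, the newer post leads until someone recommends the old one again.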

It's a fine system for academics, but it doesn't work in the real world. If it did, Science 2.0 would be bigger than all of those sites. Votes count, and people like titles that say something made them cry or shocked or amazed them.

Twitter's system of sharing and recommending content avoids the "winner-take-all" and "irrational herding" effects by presenting content in chronological order, based on the time of recommendation. Retweets, or recommendations, bring older posts back up to the top of a user's newsfeed, helping to reduce the herding effect. It is gamed rather easily - few people actually click on the links they retweet - but "Twitter does the right thing when it pushes newly retweeted posts to the top of the followers' screens, giving them another chance to discover interesting content," said Lerman.
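The retweet bump Lerman describes amounts to updating a post's recommendation timestamp and re-sorting the time-ordered feed. A minimal sketch with made-up data:

```python
def build_feed(recommendations):
    """recommendations: list of (post_id, timestamp) events, oldest first.
    A later recommendation of the same post overwrites its timestamp,
    bumping it back to the top of the chronological feed."""
    last = {}
    for post_id, t in recommendations:
        last[post_id] = t
    return sorted(last, key=last.get, reverse=True)

# "a" was posted first but is retweeted last, so it resurfaces on top.
events = [("a", 1.0), ("b", 2.0), ("c", 3.0), ("a", 4.0)]
feed = build_feed(events)
```

Because only the latest recommendation time matters, an older post gets "another chance to be discovered" every time someone retweets it.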

By influencing the peer recommendations that determine the ranking of content, position bias can create a cycle that can exclude quality content. By understanding and being aware of the factors that influence peer recommendation, providers can more effectively leverage collective judgments of consumers about what content is worthy of their time and attention.