
Demand for social media marketing has exploded in the past decade as brands struggle to reach audiences beyond the increasingly fractured traditional media audience. Right now, social media marketers can take advantage of the public's overwhelming ignorance about communicating via social media and get paid to navigate those spheres for their clients.

It won’t last forever. It may not even last another decade.

Think of the travel industry. Before 'teh interwebz,' information was scarce, so it made sense to pay an expert to navigate the complicated pricing schemes and the array of accommodation providers for you. Flash-forward to the year 2000, when the web came into its own in providing easier ways to book airline tickets, hotel rooms and car rentals (as well as recommendation sites chock full of free expertise and reviews). This great graphic from the Cleveland Plain Dealer says it all:

As we hurtle into the future, we're leaving a larger digital wake behind us. International Data Corporation estimates humans will produce 1.8 billion terabytes of data (1.8 zettabytes) this year alone.

Simultaneously, the power to sift through these vast stores of information is getting keener. In 2009, the team BellKor's Pragmatic Chaos won the Netflix Prize by crafting an algorithm that recommended movies with ten percent better accuracy than Netflix's own engine.

“Mashup Bombs” are what await us as these two phenomena converge. Our ability to cross-reference these growing stores of data will improve, and previously undetectable patterns will emerge. Not only that, but the power to produce revelations won't be confined to future data – we'll be able to look back through all of the petabytes of data already cached on server farms around the world.

What if the GPS records of mobile phones were matched with employee payroll records to spot when people are fudging their hours?

What if anonymous publishers could be outed through algorithms that compare writing samples?

What if aggregate market data and networks of personal connections could be filtered to show when bidders were given preferential treatment for government contracts?
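At bottom, every one of these scenarios is the same operation: join two record sets on a shared key and flag the mismatches. Here's a minimal sketch of the GPS-versus-payroll example, with every dataset, field name and threshold invented for illustration:

```python
# Hypothetical mashup: join payroll claims against GPS-observed hours
# by (employee, day) and flag entries where the claim exceeds what the
# phone's location data supports. All data below is made up.

payroll = [
    {"employee": "e1", "day": "2011-03-01", "claimed_hours": 8},
    {"employee": "e2", "day": "2011-03-01", "claimed_hours": 8},
]

# Hours the employee's phone GPS actually placed them at the work site.
gps_hours = {
    ("e1", "2011-03-01"): 7.9,
    ("e2", "2011-03-01"): 4.5,
}

def flag_discrepancies(payroll, gps_hours, tolerance=1.0):
    """Return payroll entries whose claimed hours exceed the
    GPS-observed hours by more than `tolerance` hours."""
    flagged = []
    for entry in payroll:
        observed = gps_hours.get((entry["employee"], entry["day"]), 0.0)
        if entry["claimed_hours"] - observed > tolerance:
            flagged.append(entry)
    return flagged

print(flag_discrepancies(payroll, gps_hours))
# Only e2 is flagged: 8 hours claimed, 4.5 observed.
```

The revelation isn't in either dataset alone; it only appears once the two are keyed together.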

Things are well underway:

Wikipedia + IP Address Location Database = In 2007, a Caltech student named Virgil Griffith created a tool called Wikipedia Scanner that traced the IP addresses of anonymous Wikipedia editors back to their sources and outed institutions from Diebold to the CIA as having edited their own Wikipedia entries.
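The mechanism is simple: anonymous edits are logged by IP address, and IP ranges are publicly registered to organizations, so matching one against the other attributes the edit. A toy sketch of the idea, with invented ranges and edits (the real tool drew on WHOIS registration data):

```python
# Toy version of the Wikipedia Scanner idea: match each anonymous
# edit's IP against a table of IP ranges registered to organizations.
# The ranges and edits below are fabricated for illustration.
import ipaddress

org_ranges = {
    "ExampleCorp": ipaddress.ip_network("203.0.113.0/24"),
    "ExampleAgency": ipaddress.ip_network("198.51.100.0/24"),
}

edits = [
    {"article": "ExampleCorp", "ip": "203.0.113.47"},
    {"article": "Kittens", "ip": "192.0.2.9"},
]

def attribute_edits(edits, org_ranges):
    """Tag each edit with the organization whose registered IP range
    contains the edit's IP, or None if no range matches."""
    results = []
    for edit in edits:
        addr = ipaddress.ip_address(edit["ip"])
        org = next(
            (name for name, net in org_ranges.items() if addr in net),
            None,
        )
        results.append({**edit, "org": org})
    return results

for row in attribute_edits(edits, org_ranges):
    print(row)
# The edit to the "ExampleCorp" article traces back to ExampleCorp's
# own address block; the "Kittens" edit matches no known range.
```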

Sex Offender Databases + Real Estate Listings + Google Maps = As local governments have begun to publish sex offender photos and profiles on their websites, this information has been cached and combined with real estate listings and Google's open API for its versatile maps tool. The result: you can see whether the neighborhood around a house you're interested in looks like it has chicken pox.

FEC Records + Google Maps + Facebook = Fundrace is a site that lets users map out which political campaigns their neighbors are contributing to, as well as cross-reference those same databases to find out which candidates their Facebook friends are donating to.

Just because an indiscretion has gone unnoticed so far is no guarantee that it will stay unnoticed in the future. As a PR pro, I don't look forward to responding to the indiscretions of predecessors, but that may be something we have to prepare for.