The job has already changed a lot, “some in ways I predicted, some not,” Heron wrote in a recent Digital First Media chat. (Disclosure: I’m a reporter for DFM.)

“The most significant change is that an editor has to focus a lot less on evangelizing and training — convincing people that social media is worth incorporating into the news,” Heron wrote. “We also have more people in newsrooms who can write for social media. For instance, my team is responsible for most of the tweets from @WSJ, but we also have two homepage editors who write them as well, and I see that trend increasing.”

As more people in any given newsroom are publishing to social platforms — and as more people bypass the homepage and instead use Twitter and Facebook as the entry point to any given news site — analytics companies see new opportunities to help media companies leverage real-time social data. Visual Revenue, a predictive analytics firm that focuses exclusively on media companies, is this morning rolling out a bundle of tools to help editors measure the effectiveness of social publishing in real time.

“So, if you push a story right now on Nieman Lab, 40 clicks into it you might see 17 retweets, two favorites, some manual retweets and that’s all great, actually,” Visual Revenue CEO Dennis Mortensen told me. “But how do you really add all of it up?”
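Mortensen's question — how do you add it all up — is, at its simplest, a weighted scoring problem. A minimal sketch; the weights here are illustrative assumptions, not Visual Revenue's actual model:

```python
# Hypothetical engagement score: a weighted sum of the signals Mortensen
# lists. The weights are invented for illustration.

def engagement_score(clicks, retweets, favorites,
                     w_click=1.0, w_rt=3.0, w_fav=2.0):
    """Retweets weigh heaviest here because they re-broadcast
    the story to new audiences."""
    return w_click * clicks + w_rt * retweets + w_fav * favorites

# The numbers from the quote: 40 clicks, 17 retweets, 2 favorites.
print(engagement_score(clicks=40, retweets=17, favorites=2))  # 95.0
```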

More importantly: Once news organizations can precisely measure the effectiveness of their social publishing, how should those measurements change their behavior?

Visual Revenue is already working with companies like The Atlantic, USA Today, NBC, Forbes, and dozens of others, using data about those sites’ audiences to help them determine optimal publishing frequency, how regularly to switch stories featured on the homepage, when to try a new headline for a story that isn’t doing as well as expected, and so on. There are other companies — SocialFlow and Gnip come to mind — that deal in real-time social data. Visual Revenue likes to say it’s in a class of its own because it focuses exclusively on the needs of news clients.

With its new social toolkit, the company wants to give news organizations more exact measuring tools — a way to determine the difference between tweeting something at 4:23 p.m. and 4:35 p.m., for example — and then recommend actions based on that real-time performance data. (Visual Revenue’s system takes into consideration what day of the week it is and whether it’s a notable day — Christmas Eve, Super Bowl Sunday, Election Day, and so on.)
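The kind of context the system reportedly weighs — the exact minute a tweet goes out, the day of the week, whether the day is notable — can be pictured as a small feature set. This is an illustrative sketch only; the feature names and the notable-day list are assumptions, not Visual Revenue's schema:

```python
# Illustrative timing features for a send time. The notable-day examples
# (Christmas Eve 2012, Super Bowl Sunday 2013, U.S. election day 2012)
# are assumptions for the sketch.
from datetime import date, datetime

NOTABLE_DAYS = {date(2012, 12, 24), date(2013, 2, 3), date(2012, 11, 6)}

def timing_features(ts):
    return {
        "weekday": ts.strftime("%A"),
        "minute_of_day": ts.hour * 60 + ts.minute,
        "notable_day": ts.date() in NOTABLE_DAYS,
    }

# The 4:23 p.m. vs. 4:35 p.m. example from the text: a 12-minute
# difference the system would try to price.
a = timing_features(datetime(2013, 2, 5, 16, 23))
b = timing_features(datetime(2013, 2, 5, 16, 35))
print(b["minute_of_day"] - a["minute_of_day"])  # 12
```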


“Even fantastic content can die if you don’t put it out right,” Mortensen said. “We’ve spent a long time trying to figure out the patterns on a per-news-property basis and it’s very different… The Atlantic can put out content from four o’clock in the afternoon to nine in the evening and it’s equally powerful. It is very much property-specific. I can’t take my learning from The Atlantic and copy over to the Economist.”

What works for one news organization won’t work for another, and that’s largely because audiences are as fractured as ever. And yet the behaviors of any given audience are undergoing dramatic changes.

“Fragmentation has been an issue certainly for advertisers and for publishers for the past 20 years,” said Rich Ullman, vice president of marketing for Visual Revenue. “As it progresses it’s my belief that we’ve gone through this period of fragmentation into 1,000 little bits or more and now we’re in a stage of re-aggregation: You put together your feed, I put together mine, everybody has their own and there are thousands and thousands of different media properties now. It creates a need for all of them to be distinct from one another.”

But there are some constants. Across the board, tweeting more is better than not tweeting enough, but tweeting all at once is worse than not tweeting at all.

“Most people end up being just a little bit too conservative,” Mortensen said. “I’m not sure they’re being fully rewarded for it. I’m an advocate for the data and letting that speak for itself. Putting out 20 [tweets] versus 100, you get more out of 100. Do more and you’re rarely going to be penalized.”

The exception to the more-is-more rule is when news organizations tweet or publish to Facebook in concentrated, often-automated bursts. “There tends to be some penalty for clustering together your pushes,” Mortensen said. A true breaking-news destination can get away with more rapid-fire tweeting than magazines, which do better when they let at least 20 or 30 minutes elapse before tweeting again, he said.
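The spacing rule Mortensen describes — a minimum gap between tweets, looser for breaking-news outlets than for magazines — can be sketched as a simple check. The function names and the five-minute breaking-news gap are assumptions for illustration; only the 20-to-30-minute magazine figure comes from the quote:

```python
# Minimal sketch of a per-property anti-clustering rule: enforce a
# minimum gap between tweets. The 30-minute default follows the quote;
# everything else is invented for the example.
from datetime import datetime, timedelta

def next_allowed_time(last_tweet, min_gap_minutes=30):
    """Earliest time the next tweet may go out without 'clustering'."""
    return last_tweet + timedelta(minutes=min_gap_minutes)

def can_tweet_now(now, last_tweet, min_gap_minutes=30):
    return now >= next_allowed_time(last_tweet, min_gap_minutes)

last = datetime(2013, 2, 5, 16, 23)
print(can_tweet_now(datetime(2013, 2, 5, 16, 35), last))     # False: only 12 minutes
print(can_tweet_now(datetime(2013, 2, 5, 16, 35), last, 5))  # True: breaking-news gap
```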

Where Visual Revenue believes it can add real value is in being able to recommend specific actions within an editorial framework outlined by the organization — that is, using an algorithm to tell a newsroom when it should tweet and also what it should be tweeting. Mortensen likens these computerized suggestions to the role of a deputy editor: Someone who knows the editorial values of the paper, and can determine the best publishing strategy as a result. Except, in this case, that someone is a robot.

“We set out with this idea of empowering the editor, but not to beat him to the extent where we can automate his job,” Mortensen said. “We actually sit down with the editor in chief and ask him, ‘Give me my instructions just like you tell your deputy editors what they can and cannot do.’ Then we simply adopt those, adhere to those as strictly as possible. And if I’m brutally honest with you, of all of the editors, you’ll see that we’re the only ones that only adhere to the guidelines because we’re an algorithm not a human.”
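The “deputy editor” instructions Mortensen describes amount to hard constraints the algorithm never violates. A toy sketch, with an entirely invented rule set:

```python
# Invented "deputy editor" rule set; the algorithm treats these as hard
# constraints rather than suggestions.
GUIDELINES = {
    "max_tweets_per_hour": 6,
    "quiet_hours": set(range(1, 6)),   # no tweets between 1 and 5 a.m.
    "banned_sections": {"advertorial"},
}

def allowed(hour, tweets_sent_this_hour, section, rules=GUIDELINES):
    """Return True only if a tweet at this hour would break no rule."""
    return (hour not in rules["quiet_hours"]
            and tweets_sent_this_hour < rules["max_tweets_per_hour"]
            and section not in rules["banned_sections"])

print(allowed(16, 2, "politics"))  # True
print(allowed(3, 0, "politics"))   # False: quiet hours
```

Unlike a human deputy, the check is applied to every single recommendation — which is Mortensen's point about the algorithm being the strictest adherent to the guidelines.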

Another upshot: Non-humans aren’t tethered to print-era concepts that have bled into online publishing. A robot doesn’t care about newsroom culture or tradition; it only cares about the data. Mortensen says that before the New York Daily News started working with Visual Revenue, it was updating its homepage about 80 times a day; today, it updates about 160 times a day. He expects real-time social analytics to inspire similar changes in publishing habits.

“They will end up doing more simply because they are able to and more confident to do so,” Mortensen said. “The other part is, I think we’ll be seeing them rethinking some of their strategies for some of these specific categories they put out to be more in tune with the audience. Of course, if you think about it, the one thing nobody gets — Buzzfeed included — nobody can predict which specific content is going to go viral. You can increase the probability of that happening by being even slightly more in tune. It’s not about having to conquer every single reader. But sometimes you get them to fall in love.”

Interesting, but what does the algorithm actually DO? If it boils down to measuring the interaction each tweet gets and deriving best practice rules from those results… then that is definitely not anything fancy.

A good programmer could probably write you a script like that in 24 hours. It would be uglier, but you’d get the same results, and derive the same rules.

We learn the value of the stream which individual publishers participate in, whether that be The Atlantic or the New York Daily News — and re-calibrate that value as the stream changes. Having a detailed understanding of this pre-click stream and mapping that against a post-click stream* while applying our proper Editorial Tone models against it makes for a really good input to the Social Media Editor. It is not for us to dictate what they should do, but we can certainly participate in this dialogue with some quite sophisticated and data-driven suggestions. I am sure you’d fall in love with it as well. :-)

*Given that we provide recommendations on which stories to carry on the homepage, we have tracking in place throughout the property.
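For what it’s worth, the bare-bones script the first commenter imagines — log each tweet’s interaction, then derive timing rules from the averages — might look something like this. The data and the scoring logic are invented for illustration; this is not Visual Revenue’s model:

```python
# Bucket past tweets by the hour they were sent, average their
# engagement, and surface the best-performing hours. All data invented.
from collections import defaultdict

def best_hours(history, top_n=2):
    """history: iterable of (hour_sent, engagement) pairs."""
    totals, counts = defaultdict(float), defaultdict(int)
    for hour, engagement in history:
        totals[hour] += engagement
        counts[hour] += 1
    averages = {h: totals[h] / counts[h] for h in totals}
    return sorted(averages, key=averages.get, reverse=True)[:top_n]

history = [(9, 40), (9, 60), (16, 120), (16, 90), (21, 70)]
print(best_hours(history))  # [16, 21]
```

What such a script lacks, per Mortensen's reply, is everything property-specific: the pre-click stream, the post-click tracking, and the editorial-tone modeling layered on top.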