Metrics

June 04, 2010

Can you improve your email open rates by better maintaining your subscriber list?

I do a lot of work in the area of social media - mainly twitter, facebook, linkedin and email.

Yes, I consider email to be the most basic form of social media. Anyone willing to share their email address with you wants to connect in a meaningful way. Whether a friend, colleague, jobseeker or salesperson, a shared email address is one way people confirm you as a valued connection worth building a relationship with.

There are many ways to improve open rates by improving your email content. But in relying on email software tools to manage our subscriber lists, have we added to the problems inherent in email open rate metrics?

Which brings me to my main point in this post.

I manage a respectable subscriber list for my organization's monthly email digest - around 15,000 subscribers. As we are not a membership organization, this is a sizeable list that has been growing for nearly a decade. The digest itself has taken many different forms and used many different email software systems - some more effective than others.

The open rate is one important - and foundational - measure of any email send. How do email software systems know if an email has been opened? A small 1 pixel x 1 pixel graphic (generally invisible to the average person) is inserted into the email that you send and when this image is loaded the email is registered as opened.
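To make the mechanism concrete, here is a minimal sketch of how an email system might generate such a pixel. The tracker domain and token scheme are invented for illustration; real systems use their own endpoints.

```python
# Sketch of how an email system embeds an open-tracking pixel.
# The tracker URL (tracker.example.com) and token scheme are hypothetical.
import uuid

def tracking_pixel(campaign_id: str, subscriber_email: str) -> str:
    """Return an HTML <img> tag for a 1x1 tracking pixel.

    When the recipient's mail client requests the image URL, the
    tracking server logs the token and records the email as opened.
    """
    # A deterministic per-recipient token (one of many possible schemes)
    token = uuid.uuid5(uuid.NAMESPACE_URL, f"{campaign_id}:{subscriber_email}")
    return (f'<img src="https://tracker.example.com/open/{token}.gif" '
            'width="1" height="1" alt="" />')

html_body = "<p>Monthly digest content...</p>" + tracking_pixel(
    "2010-06-digest", "reader@example.com")
```

If the client never requests the image (images blocked, plain-text view, some smartphone clients), no open is recorded, which is exactly the inaccuracy described below.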

The problem is that sometimes the image isn't loaded - this is particularly true for subscribers viewing your email via smart phones. Comm100 has a helpful post on the woes of measuring email open rates. So already you have inaccuracies in open rate metrics.

But I also think lack of proper management of email subscriber lists contributes to problems with open rate metrics. Sometimes we rely too heavily on the software to do the work for us. We need to think like gardeners tending our gardens. We need to seed, feed and weed our email lists.

Because subscriber lists are made up of individuals opting-in to your list, I felt it would be useful to get to know these folks better by seeing how well-connected they are with our organization. Basically, I've checked the subscriber list against institutional partner lists and our friends, fans, and followers via social media. The process and tools I've used to do this will be the subject of future posts.

As I've updated this list, I've noticed some issues which may impact my email open rate:

duplicate email entries from different source lists (yes, there were several). The email system we use is supposed to cross-check email addresses and send only one email but, as we found out, not all email software systems actually do this effectively. So, if an email address gets duplicate emails but only opens one, is the open counted against all emails sent or just the email that was opened?
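A basic de-duplication pass before a send is straightforward to sketch. The normalization here is deliberately minimal (trim whitespace, lowercase); real lists may need provider-specific rules, which is an assumption I'm not modeling.

```python
# Minimal sketch: de-duplicate addresses merged from several source lists.
# Normalization is simple trim + lowercase; provider-specific quirks
# (e.g. Gmail ignoring dots) are deliberately out of scope here.
def dedupe(addresses):
    seen = set()
    unique = []
    for addr in addresses:
        key = addr.strip().lower()
        if key not in seen:
            seen.add(key)
            unique.append(key)
    return unique

merged = ["Jo@example.org", "jo@example.org ", "sam@example.org"]
print(dedupe(merged))  # ['jo@example.org', 'sam@example.org']
```

Even this crude pass would have caught the duplicates I found by hand.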

single individuals receiving copies of the newsletter at multiple addresses (work and home). I've been able to match up multiple email addresses with single individuals. Unfortunately, the email systems I've worked with require separate entries for each email. So if one person receives emails at multiple email addresses but only opens one email at one address, that open is counted for that one email address only. It makes sense, but wouldn't it be great if one individual could designate two or more email addresses for one record? I'm seeing more of this capability, but we've got a long way to go. And this might add further confusion to the metrics.
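The difference between a per-address and a per-person open rate is easy to show with toy data. The person-to-address mapping below is hypothetical, standing in for the manual matching described above.

```python
# Illustrative sketch: roll per-address opens up to per-person opens,
# assuming we've manually mapped each person to their addresses.
person_addresses = {
    "Jo":  ["jo@work.example", "jo@home.example"],
    "Sam": ["sam@work.example"],
}
opened = {"jo@home.example"}  # addresses that registered an open

# Per-address open rate penalizes Jo's unopened work copy...
addresses = [a for addrs in person_addresses.values() for a in addrs]
address_rate = sum(a in opened for a in addresses) / len(addresses)

# ...while a per-person rate counts Jo as one opener.
person_rate = sum(
    any(a in opened for a in addrs) for addrs in person_addresses.values()
) / len(person_addresses)

print(address_rate, person_rate)  # per-address ~0.33, per-person 0.5
```

One person opening one of two copies drags the per-address rate down even though that subscriber is clearly engaged.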

bounced emails. Whether a hard or soft bounce, these subscribers are often kept on the send list, so as the number of bounced emails increases, the open rate may decrease. So when should you remove these individuals from the list? You should remove an address with a hard bounce sooner rather than later. But when do you remove those subscribers who have soft bounces? Dundeemail has a helpful post about bounced emails.
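One common policy - and this threshold is my assumption, not a standard - is to drop hard bounces immediately and soft bounces only after several consecutive failures.

```python
# Sketch of a bounce-removal policy. The soft-bounce threshold is an
# assumption; tune it to your own list and send frequency.
SOFT_BOUNCE_LIMIT = 3  # consecutive soft bounces before removal

def should_remove(bounce_history):
    """bounce_history: most-recent-first list of 'hard', 'soft', or 'ok'."""
    if "hard" in bounce_history:
        return True  # hard bounce = invalid address, remove promptly
    # Count the streak of soft bounces since the last success
    streak = 0
    for event in bounce_history:
        if event == "soft":
            streak += 1
        else:
            break
    return streak >= SOFT_BOUNCE_LIMIT

print(should_remove(["hard"]))                  # True
print(should_remove(["soft", "soft"]))          # False
print(should_remove(["soft", "soft", "soft"]))  # True
```

A streak that resets on any successful delivery avoids removing subscribers whose mailbox was only temporarily full.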

While many email software systems say they can maintain your subscriber base, there is nothing quite as valuable as personally reviewing and updating the list.

Though no open rate will provide truly accurate numbers (you need to look at trends over time), I'm hoping that by personally maintaining my subscriber list, I can improve the open rate by:

Removing multiple bounced emails for individuals who are not well-connected with our organization.

Removing undeliverable/invalid emails or double-checking that they are correct. In several cases I've noticed an error in the email address which I've been able to repair.

Building up a social CRM (and perhaps a survey) methodology to identify those subscribers who are heavily active with us in social media, to determine whether they are more or less likely to open our email digest.

I'm particularly interested in the social CRM side of things - the further down the social media rabbit hole I travel, the more I realize that we need better tools to help us understand our audiences and connections. I'm starting to see and use some of the free (and useful) tools available in the cloud, but I would like to hear from others.

June 02, 2009

In pulling together some data on twitter clickthroughs, I noticed a discrepancy between bit.ly and Google Analytics data - bit.ly clicks are noticeably higher than the twitter traffic source numbers in Google Analytics. Two caveats:

1. Bit.ly data collected reflects all bit.ly links including those initially posted by @worldresources using the Google Analytics Campaign Code - accounting for an average of 60% of the clicks - as well as those generated by others.

2. I pulled all information by keyword "twitter" from the Google Analytics account - which brings over all pageviews sources coming from twitter and the GA Campaign Code links posted by @worldresources.

But on closer inspection, I noticed that the bit.ly clicks and Google pageviews data lines - despite being vastly different in terms of numbers - were almost identical in terms of trends.

I did some research on this, and here is a list of potential reasons:

No two analytics tools measure data the same way - there will always be discrepancies between analytics tools. With bit.ly we are measuring clicks, but with Google Analytics we are measuring pageviews, which may be captured differently.

Google Analytics may be combining multiple clickthroughs from the same twitter user/account as one, whereas bit.ly counts clicks, not users.

One prevalent theory is that bit.ly numbers are higher because bit.ly data includes bots and automated traffic whereas Google Analytics will not capture those visits (in GA count is computed using Javascript, which bots do not execute).

And while some of the referrers may be bots, twitter feeds are also disseminated through web and desktop clients, so not all of the clicks are going through twitter.com itself. Aha! Note that as of March 2009, bit.ly screened out HEAD requests from click results.
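To illustrate the bot theory, here is a deliberately crude filter over a raw click log by user-agent substring. The log entries and marker list are invented; real bot detection is far more involved.

```python
# Illustrative only: crude filtering of likely-bot clicks from a raw
# click log by user-agent substring, to show why server-side counts
# (bit.ly) can exceed JavaScript-based counts (Google Analytics).
BOT_MARKERS = ("bot", "crawler", "spider", "slurp")

clicks = [  # hypothetical log entries
    {"url": "http://bit.ly/abc", "ua": "Mozilla/5.0 (Windows NT 6.1)"},
    {"url": "http://bit.ly/abc", "ua": "Twitterbot/1.0"},
    {"url": "http://bit.ly/abc", "ua": "Googlebot/2.1"},
]

human_clicks = [
    c for c in clicks
    if not any(marker in c["ua"].lower() for marker in BOT_MARKERS)
]
print(len(clicks), len(human_clicks))  # 3 1
```

bit.ly's server sees all three requests; Google Analytics, which requires JavaScript execution, would typically record only the one human visit.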

Another theory is that Google Analytics does not count clicks from twitter.com as uniques - and that's one reason Google counts are lower than bit.ly's. It isn't clear how traffic from twitter clients looks to GA, but this may account for overall lower counts on GA.

Not all bit.ly clicks are coming from WRI-generated links, and these charts only include bit.ly data - they don't include clicks on links using full WRI URLs OR other URL shorteners like is.gd, tr.im, cli.gs, tinyurl.com, twurl.com. I haven't determined how to capture that additional data efficiently - I'm still trying to get my mind around the bit.ly/google data.

Regardless of which numbers you ultimately decide to highlight (Google Analytics Traffic or Bit.ly Clicks), you should track twitter links in Google by including the Google UTM campaign code as part of the link - see the EpikOne post Twitter and Google Analytics: What to Track by Justin Cutroni.
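Tagging a link with campaign parameters before shortening it is simple to sketch. The example URL and campaign values below are placeholders.

```python
# Sketch: append Google Analytics campaign (UTM) parameters to a link
# before shortening it, so twitter clickthroughs are attributable in GA.
# The URL and campaign values are illustrative placeholders.
from urllib.parse import urlencode

def utm_tag(url, source, medium, campaign):
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    sep = "&" if "?" in url else "?"  # respect any existing query string
    return f"{url}{sep}{params}"

tagged = utm_tag("http://www.wri.org/report", "twitter", "social", "digest")
print(tagged)
# http://www.wri.org/report?utm_source=twitter&utm_medium=social&utm_campaign=digest
```

The tagged URL is what you then feed to bit.ly, so both tools see the same campaign-coded link.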

What we need is a metrics taxonomy that is easier to understand and explain. Perhaps simple and descriptive enough that we could skip the need for explanation altogether. I propose the following three terms:

Exposure - to what degree have we created exposure to materials and message?

Influence - the degree to which exposure has influenced perceptions and attitudes

Action - as a result of the public relations effort, what actions if any has the target taken?

The problem with trying to determine ROI for social media is you are trying to put numeric quantities around human interactions and conversations, which are not quantifiable.

In this post, Jason includes an interview with Katie Delahaye Paine which highlights the difficulty of measuring ROI in social media:

For several years, I've been using the following stairstep graphic to visualize how we are trying to engage people in the online space.

You see, World Resources Institute is all about turning ideas into action. We work at the intersection of environment and human needs to provide solutions for a sustainable world.

The basic idea of the following visualization is that we move people from satisfaction with us in general to ownership of our ideas (which then become integrated with their ideas).

I guess it gets down to value - who values us and our content. Value can be shown in many ways and the more engaged a person becomes the more valuable we and our work become.

And the quickest way to create value is to create relationship.

Some basic metrics that work for me to date include:

Who I'm talking with (and who is talking with me)

Posts by others that link to our content

Comments, forwards, retweets, embeds, bookmarks, diggs

Trends in joins, follows, fans, friends

But the more I look at the above visualization, the more I see an essential element is missing.

You see, it is one-way. What am I - what are we - doing to impact others?

This approach seems like a hammer hitting a nail until it is fully engaged in the crossbeam.

It doesn't take into account that this - all of this - is a conversation. And not just a two-way conversation.

It is a multitude of voices where anyone can contribute something meaningful at any time. Where in the level playing field any voice can suddenly offer something real, meaningful, inspiring, and motivational - sometimes in a single tweet.

In a way, it's like a lottery where the odds are in your favor. Everyone keeps contributing in the hopes that they will gain something, and everyone gains small wins on a consistent basis.

But you have to play to win. And the more people play, the more people will win because the jackpot is higher and the odds improve.

Yet that doesn't even capture it, because at some point we need to sit back, evaluate and say, "Wow, what just happened here?" Taking time to do that and capture the story is a valuable metric indeed.