How Much Content Does Your Audience Need? A B2B Case Study

Sometimes one of the hardest questions to answer for your content strategy is not figuring out what your audience needs, but how often they need it. At Pace, we are often asked how much content is optimal for engagement and relevancy with our clients’ audiences. We want to remain top-of-mind and provide the most insightful information through the most effective means, but we do not want our content to be stale or repetitive. At the same time, we do not want to produce content so infrequently that people forget about the strong point of view that our clients have to offer!

We were recently tasked by one of our largest B2B clients to answer a similar question — what is the optimal number of content pieces to provide to their readers per week? We noticed otherwise strong pieces being buried on the site once new content was deployed each day. Our client wanted to know if we could potentially reduce our publishing schedule to better allocate creative resources while maintaining or possibly increasing engagement.

We employed a six-step testing approach to find the strongest, data-driven publishing frequency.

Before I share the details, here are some important caveats to keep in mind:

What works for one audience, program or website will not necessarily work for another. Plan to conduct complete rounds of testing for any group of content you are analyzing.

In the case study presented below, we were fortunate to be dealing with equal promotion for every piece of content; therefore we did not need to consider how this affected the data. If necessary, consider controlling for increased traffic coming from email, social, etc., when conducting a similar test or analysis.

Once you determine an optimal amount of content, remember to also evaluate which types of content are the most successful to continue to elevate the goals of your site or program. It’s not just about the right amount of content, but also the right content mix.

Step 1: Determine KPIs

The first thing we needed to determine was how we would define success, so that we could call a “winner” once our test was complete. We chose two metrics, each indicating a different type of success: average page views per article (to determine the amount of content that led to the highest traffic) and average number of consecutive days that articles received views (to determine the amount of content that kept the site consistently relevant to our audience). Throughout the rest of this article, I refer to the latter as “average content shelf life.”

By choosing to measure using these two metrics, we ensured that our ideal publishing frequency would drive solid traffic volumes while keeping content relevant and regularly engaging readers.
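These two KPIs are straightforward to compute from daily view counts. Here is a minimal Python sketch; the article names and view numbers are made up for illustration and are not the client's data:

```python
# Hypothetical daily page views per article, indexed from publish day.
# All names and figures below are illustrative, not real client data.
daily_views = {
    "article-a": [120, 80, 45, 10, 0, 0, 3],
    "article-b": [200, 150, 90, 60, 20, 5, 0],
    "article-c": [90, 30, 0, 0, 0, 0, 0],
}

def shelf_life(views):
    """Consecutive days with at least one view, counted from publish day."""
    days = 0
    for v in views:
        if v == 0:
            break
        days += 1
    return days

# KPI 1: average page views per article (total views / article count).
avg_views = sum(sum(v) for v in daily_views.values()) / len(daily_views)

# KPI 2: average content shelf life in days.
avg_shelf_life = sum(shelf_life(v) for v in daily_views.values()) / len(daily_views)

print(f"Average page views per article: {avg_views:.1f}")   # 301.0
print(f"Average content shelf life: {avg_shelf_life:.1f} days")  # 4.0
```

Note that in this sketch shelf life stops at the first zero-view day, so a late trickle of views (like article-a's day 7) does not extend it; you could just as reasonably count total days with any views, as long as you apply the same definition to baseline and test data.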

Step 2: Determine Test Publishing Schedule

Entering this analysis, we were publishing five pieces a week for this particular client. This volume was both demanding for our creative resources and potentially unnecessary to facilitate high engagement, as mentioned above. We decided on a nine-week testing window: collecting three weeks of data while publishing four pieces per week, three weeks while publishing three pieces per week, and three weeks while publishing two pieces per week.

Step 3: Set Baselines

Using four months of historical data, we calculated baselines for average page views per article and average content shelf life, assuming five pieces were published per week. We compared our metrics from each round of testing to these benchmarks in order to determine success.
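The baseline comparison itself is just a percent-change calculation per KPI. A small sketch of that comparison follows; the baseline and phase figures are hypothetical, except that the 33 percent views lift for two pieces per week echoes the result reported later in this article:

```python
# Hypothetical baseline KPIs from four months of historical data
# (five pieces published per week). Figures are illustrative.
baseline = {"avg_views": 250.0, "shelf_life": 3.0}

# Hypothetical KPIs from each three-week test phase.
phases = {
    "4 per week": {"avg_views": 340.0, "shelf_life": 4.2},
    "3 per week": {"avg_views": 310.0, "shelf_life": 5.1},
    "2 per week": {"avg_views": 332.5, "shelf_life": 3.1},
}

def pct_change(new, old):
    """Percent change of a test-phase KPI relative to its baseline."""
    return (new - old) / old * 100

for phase, kpis in phases.items():
    views_lift = pct_change(kpis["avg_views"], baseline["avg_views"])
    shelf_lift = pct_change(kpis["shelf_life"], baseline["shelf_life"])
    print(f"{phase}: views {views_lift:+.0f}%, shelf life {shelf_lift:+.0f}%")
```

Running each phase's KPIs through the same benchmarks keeps the comparison apples-to-apples, which is why the baselines were set before testing began.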

Step 4: Publish and Test!

Once our creative team was briefed on how often they should produce content under the testing schedule, our testing window began. We continued to collect metrics daily for complete analysis once all three rounds were complete. Most importantly, we maintained regular contact with the client throughout the testing process, ensuring that we were still meeting the client’s needs under a reduced publishing schedule.

Step 5: Evaluate Results

Overall, we saw both increased views and increased shelf life under each of the three test schedules. The results were as follows:

The strongest increase in average views per article was observed when publishing four pieces per week, while the strongest increase in shelf life was observed when publishing three pieces per week. While this makes sense — articles remain in top-viewed positions on the site for longer when there are fewer new pieces to replace them — it was interesting to see that with only two pieces per week, shelf life barely improved. This suggested that two pieces per week were not enough to remain relevant with the audience — while visitors tended to view more content (33 percent increase in average views), they likely stopped coming back to the site regularly when new content was offered less frequently.

We concluded that our most successful publishing schedule offers three to four new pieces per week. The strong results in test phases 1 and 2 led us to settle on a range rather than a single number, which lets us accommodate the changing needs of the audience while still operating on a reduced publishing schedule that improves relevancy and makes better use of creative resources.

Step 6: Collect Data and Repeat Analysis

This step is ongoing. We are currently collecting three months of data under our new publishing schedule, which we will measure to determine whether engagement remains as strong as it was during testing, and to ensure that no anomalies affected our test results. If needed, we will revisit testing to understand whether audience needs have changed to require more or less content published per week.

As you enter 2018 with new expectations and plans, you may be tasked with answering seemingly tough questions about content volume and production schedules. Having a data-supported point of view can ensure that you are presenting only the strongest, most relevant content to your audience, as well as keeping production volumes reasonable for your editors. Happy analyzing!

Brynne McGarry – As the Statistical Specialist for the Analytics team, I tell different kinds of stories through data and insights.