While studying basic statistics of various communities for a chapter in my book, I made a discovery: healthy, active communities can show very different results on the same metrics. My conclusion so far is that you should not rely on basic activity stats alone to measure the health of an online community.
I calculated activity per active user over the last month:
Topics created per user vary from 0.07 to 0.55 – a huge difference, i.e. 7 new topics vs 55 new topics per 100 active users per …
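To make the comparison above concrete, here is a minimal sketch of that per-user arithmetic. The community names and counts are made up purely for illustration; only the 0.07 vs 0.55 ratio comes from the post.

```python
# Hypothetical monthly counts for two communities (illustrative numbers only).
communities = {
    "forum_a": {"topics_created": 7, "active_users": 100},
    "forum_b": {"topics_created": 55, "active_users": 100},
}

for name, stats in communities.items():
    topics_per_user = stats["topics_created"] / stats["active_users"]
    # 0.07 topics/user == 7 topics per 100 active users, and so on.
    print(f"{name}: {topics_per_user:.2f} topics per active user")
```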

Is there any study on the general ballpark number of users and engagement numbers that make up a healthy/active forum?
I know this is subjective, and two active people could use a forum solely for one-to-one conversations, but there must be convergence on some kinds of metrics on long-lived forums that could provide some insight.

I’m just starting to try to understand how best to go about this too.
The admin dashboard in Discourse does offer some decent data, with indicators that show whether certain metrics are going up or down compared to previous daily, weekly or monthly periods.
One of the motivations for a question I asked earlier this week about last_seen_date was that I was trying to get a better sense of how to define and measure how many active users there really are.
There seem to be a number of…

Hi @HAWK - thanks for getting back to me! The short answer is we’re still learning. However, at its core our community is one of makers - people who see problems in their work or home life and actively look to productivity tools to solve them. For our product specifically, a vibrant community would be full of people showcasing interesting use cases and hacks with our product, riffing on ideas along the way. In this sense, a smaller percentage of our overall user base would be the most likely content creators, with a bunch of folks “lurking” with the intent to pick up and run with interesting use cases they find.

We’d also like to use the community to relieve some of our support channels, but that would come hand in hand with more makers joining and contributing / supporting the users with issues.

Generally one question that comes up a lot is “what percentage of users who are given access should be active for us to feel good about launching broadly”? I personally have erred on the side of optimizing metrics we can tightly manage, including “time to first response” as an example. However I’d love to get your perspective on some successful software communities you’ve seen and what tactics / metrics they employed. Would be super helpful!

The thing that I generally recommend is that you steer clear of chasing engagement metrics for the sake of it.

e.g. the # of topics or posts is of little value if it doesn’t translate into direct ROI (i.e. reduced support load). I’d recommend identifying your main 1 or 2 goals – here is a resource for that – and then working out which associated metrics make sense.

If you share your goal I can give you relevant tactics.

Having some incidental health metrics also makes sense though. I’d go for DAU/MAU (daily active users/monthly active users). This gives you a measure of your community’s ‘stickiness’ (i.e. how frequently people revisit). The new dashboard that Jeff mentions above will graph this for you.
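For anyone computing this outside the dashboard, here is one common way to derive the ratio: average daily actives divided by unique monthly actives. The user IDs and window below are hypothetical; Discourse's own report may define the terms slightly differently.

```python
from statistics import mean

# daily_active[i] is the set of user IDs seen on day i of the window
# (illustrative data; a real window would cover ~30 days).
daily_active = [
    {"alice", "bob"},
    {"alice", "carol"},
    {"bob"},
]

# MAU: unique users active at any point in the window.
mau_users = set().union(*daily_active)
# DAU: average number of users active per day.
avg_dau = mean(len(day) for day in daily_active)

stickiness = avg_dau / len(mau_users)
print(f"DAU/MAU stickiness: {stickiness:.2f}")
```

A higher ratio means members return more days per month, i.e. the community is stickier.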

I’d also keep an eye on new signups. A sudden drop could indicate that something is broken in your signup or onboarding processes. A slow slide could indicate that you need to work on marketing.

Then look at your conversion rate (% of new visitors to your site that sign up). Aim for 10%.
The % of new members who make a post speaks to the efficacy of your onboarding process.
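The two funnel metrics above can be sketched in a few lines. All counts here are made up for illustration; only the ~10% conversion target comes from the advice itself.

```python
# Hypothetical monthly funnel counts (illustrative numbers only).
new_visitors = 1200   # unique new visitors to the site
signups = 96          # of those, how many registered
first_posters = 30    # new members who made at least one post

# Conversion rate: % of new visitors who sign up (target ~10% per the advice).
conversion_rate = signups / new_visitors
# First-post rate: % of new members who post, a signal of onboarding efficacy.
first_post_rate = first_posters / signups

print(f"visitor -> signup: {conversion_rate:.1%}")
print(f"signup -> first post: {first_post_rate:.1%}")
```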

Evan_Davies:

However I’d love to get your perspective on some successful software communities you’ve seen and what tactics / metrics they employed.

These tend to fall into two categories in my experience: the ones that don’t care about the numbers, and the ones that are very tightly managed. The former are successful because the product is a success and the community is the primary (or only) support channel, as it is with ours. Those don’t tend to employ specific tactics.

The latter will tie success tightly to call diversion and time to answer metrics.