One of the ways I tell companies they can best serve their market is to be transparent about how they build products. Doing so helps folks not only understand, but appreciate, the level of effort that goes into creating a service or product. While analysts offer guidance and advisory sessions, we're best known for the reports we create; in fact, these are key products that help decision makers be successful.

Demand for community platforms, yet too many vendors
I'm asked a few times a week which community vendor to choose. With a list of 80-120 vendors on my blog and a more refined catalog on the Forrester site, it's very confusing for brands to determine who's best. If you've been reading my blog, you'll know I've been watching the community platform space (aka white-label social networks) with great interest, even before I joined Forrester. A few weeks ago I announced my intention to start a Forrester Wave report, which will evaluate nine vendors that meet the needs of Interactive Marketers at enterprise-class companies (companies with more than 1,000 employees).

The Forrester Wave Methodology
Using the Forrester Wave methodology, refined by many analysts before me, we're nearly halfway through this three-month research project to understand and segment the community platform market. For this particular report, it doesn't make sense to use crowdsourcing methods (although I've used crowdsourcing for other reports); the Wave method is already well established.

To date, we've created a detailed scorecard that involved a feedback loop with major brands that have recently deployed community software. This particular scorecard contains over 54 criteria that were assembled through client discussions, a panel of trusted folks who have deployed communities, discussions with fellow analysts, and feedback from the vendors. Next, we collect data from the 9 vendors, each completing the scorecard for a total of 496 cells; then I create my own sheet of cells verifying what we found, for a total of 992 cells of data collection.

We've also started interviewing and recording feedback from 27 brands that have deployed community software from these vendors, in order to find out what went right, and what could be improved, with each of these nine vendors. Again, more spreadsheets and data collection.

This week, we begin a series of day-long labs with each of the vendors, where we'll look under the covers at the actual software, discuss their business strategy, and understand how their community offerings can best help marketers. We're looking at the market from a variety of angles to ensure that an accurate report is created.

Collaborative environment
At Forrester, an analyst never works in a vacuum; the process is collaborative, and I have a lot of minds to lean on. It's not just me alone: I'm getting help from analyst and editor Shar Vanboskirk; analyst Oliver Young, who knows the enterprise side of this space; analyst Suresh Vittal, who's completed many Waves; analyst Laura Ramos; and constant support from research associate Sarah Glass (my guiding light and detailed taskmaster) and research associate Zach Reiss-Davis. I'm under the guidance of my research director Christine Overby, and am in constant contact with the seasoned Josh Bernoff. The suggestion that some analyst firms lack knowledge management strategies isn't quite true. In fact, we retain the knowledge of our colleagues through tools like internal wikis and constant team communication, and most importantly, the knowledge and insights generated by reports live on for colleagues and clients on the website.

Focusing in
That's just the halfway point: next I have to analyze the data, conduct follow-ups to ensure it's all correct, and begin the scoring process. You're going to notice a decrease in my posting over the next few weeks, and my online activity will start to wane, as I work hard to deliver a quality report later this fall that will help interactive marketers make the right decision.

Very much looking forward to this Wave report. We have many clients asking about it who are in the process of evaluating community tools. We've also been researching one for internal use within our agency. Looking forward to seeing the results of your hard work! With such a disparate crowd of vendors, I think this will be a very influential report to help weed out the top innovators in this space.

Is the wiki you mentioned a Forrester-wide tool? I’m curious. Do *all* Forrester analysts consistently capture essential details about *all* vendor briefings in the wiki? Or is this something that only the team of social media analysts do?

Client inquiry is a critical source of information for advisory firms like Forrester and Gartner. When it comes to client inquiry, Forrester does a great job of capturing clients' basic questions, which is the basis for the fabulous "Forrester Advantage" data service. The initial capture is done by inquiry client service folks. Is there a formal and enforced policy at Forrester that, after the inquiry, analysts consistently expand the originally captured question with context and additional detail learned during the inquiry? Then there is the additional issue of whether the analysts' specific advice is captured as well. Does management ensure that *all* analysts are consistently putting in details about the information and advice they give clients?

I am also a little skeptical that published reports are an effective way to capture knowledge. First, published research represents only a fraction of what lies between the ears of the analyst. Second, because of space constraints, reports have to boil the subject down to its essence, eliminating much context and nuance. Even though Forrester reports tend to be longer and more detailed than other firms', this is still a constraint you face. For example, you have stated that the Wave you are working on contains only a fraction of the vendors in this market. While you have diligently created a vendor catalog, do all analysts have a similarly detailed catalog, or is the published Wave the prime repository? Third, reports are often written for a generic audience, with inquiry encouraged to apply the research report to a client's specific situation. In some (many?) cases, the applied recommendations can be subtly different from the basic recommendations in the published research. How are these subtle differences disseminated between analysts, especially if a replacement analyst comes in after his or her predecessor has already left?

Are insights from team communication and team research meetings consistently captured in the wiki? Or is this an oral tradition? How is this base of information systematically communicated to new analysts?

Thanks as always for your blog and your willingness to discuss your research methodology.

Sorry it took so long to respond; I was in a Wave lab. I can only speak to my personal workstyle within my team. As one would guess, the social computing team experiments with, as well as regularly uses, a variety of tools. We're having success with a wiki for the topics of social computing, keeping track of team efforts, and even discussions with folks outside of Forrester; collaboration tools are useful.

Regarding Forrester's overall use of these tools, I don't have broad visibility into this, but I can put you in touch with research management to learn more.

Jeremiah,
Be sure to consider both large and small brands as potential community platform customers. Start-ups are interested in launching communities just as much as the big guys. We need a solid platform with a good amount of features, but one that needs little bandwidth to maintain. Plus, it has to look just as customized and professional as what the biggies would launch.