By combining a top-down study of publicly available company information with a bottom-up survey of technology marketers, we just might have conducted the most comprehensive study, ever, of the Waterloo Region tech scene.

The insights we gained, while not always surprising, are nonetheless instructive. Some interesting take-aways:

For Region-based tech companies, marketers make up 4.2% of the tech workforce, but that average hides enormous variation: on one end of the spectrum, some relatively large companies devote as much as 17% of their headcount to marketing; on the opposite end, and somewhat alarmingly, a large number of companies seemingly don’t have any dedicated marketing resources at all

While Waterloo Region’s tech companies participate in no fewer than 66 industries, a baker’s dozen form the foundation of our economy: these 13 appear in the lists of top 20 industries both by number of companies, and by number of employees

Despite attention being lavished on start-ups, fewer than 10% of employees work for a company that hasn’t yet celebrated its fifth birthday; in contrast, more than 50% of employees work for a company founded before 2002 – and those stats ignore OpenText and BlackBerry, which would skew the results enormously in favour of older companies

Statistically speaking, Waterloo Region’s scale-ups should be larger: they seem to hit barriers at approximately the 200-, 250-, 500-, and 1000-employee marks; further investigation is needed to learn whether this observation is a statistical anomaly, or a sign of some deeper challenge

When it comes to getting organized and directing our efforts, understanding of objectives, strategies, and tactics declines significantly as we move down the organizational hierarchy: Executives should regularly circle back with their teams to make sure objectives and strategies, in particular, are well-understood

On a related note, Middle Managers and Individual Team Members reported that “Changing priorities” is the most frequent challenge they face

One major contributing factor is relatively poor bottom-of-funnel content, in general, and a significant dearth of technical content, in particular – both of which clearly stood out in the survey

Be sure to read the full report for the data that supports those assertions, and for more insights.

Methodology

This project combines a top-down study of publicly available company information with a bottom-up survey of technology marketers. Essentially, we built two large data tables: one consisting of company information, and the other consisting of survey responses.

Information was collected between January 1st and February 2nd, 2019.

Company Information

We started by building a list of technology companies active in Canada, with a primary focus on the Waterloo Region. This list was initially populated through a variety of sources and mechanisms, including examination of LinkedIn, Angel.co, job sites, news coverage, accelerator and incubator tenant lists both past and present (e.g., Communitech Rev, Accelerator Centre, Velocity, Laurier LaunchPad), our personal contacts and network, and Google Maps (yep, just scrolling around looking for companies).

In parallel, we determined what information we needed to gather about each company.

With the initial list of hundreds of companies, we populated the data table using information gathered from, primarily, LinkedIn, Crunchbase, and company websites. As research progressed, we discovered more companies and added them to the set.

Unfortunately for us, but fortunately for the study, we kept thinking of more information to incorporate, so we had to go back through the whole company list several times to populate more columns. Such is life.

Once the data table was completely populated, we filtered by active companies with headquarters in the Waterloo Region (i.e., Waterloo, Kitchener, Cambridge, and smaller surrounding towns). This smaller list – which still contained hundreds of companies – served as our final company dataset.
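In sketch form, that final filtering step amounts to something like the following (the company names, field names, and city list below are illustrative assumptions, not the study’s actual data or schema):

```python
# Illustrative company records; field names and values are made up for the example.
companies = [
    {"name": "Acme Robotics", "status": "active", "hq_city": "Waterloo"},
    {"name": "Northern AI", "status": "active", "hq_city": "Kitchener"},
    {"name": "Toronto SaaS Co", "status": "active", "hq_city": "Toronto"},
    {"name": "Dormant Inc", "status": "inactive", "hq_city": "Cambridge"},
]

# Cities treated as "Waterloo Region" for the filter (a partial, hypothetical list).
WATERLOO_REGION = {"Waterloo", "Kitchener", "Cambridge", "Elmira", "New Hamburg"}

# Keep only active companies headquartered in the Region.
final_dataset = [
    c for c in companies
    if c["status"] == "active" and c["hq_city"] in WATERLOO_REGION
]

print([c["name"] for c in final_dataset])  # ['Acme Robotics', 'Northern AI']
```

In practice a spreadsheet or a dataframe library would do the same job; the point is simply that the final dataset is the intersection of two filters – active status and Region headquarters.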

Technology Marketer Survey

From January 8th through January 25th, we conducted a 44-question survey (through SurveyMonkey) open to technology marketers in the Waterloo Region. The survey was promoted and shared (and re-shared, thanks!) through the usual channels of LinkedIn and Twitter.

Upon survey closure, we filtered out the handful of results that came from respondents outside of the Waterloo Region, which still left us with high double-digit responses totalling thousands of data-points.

Limitations and Considerations

While we did our best to ensure a comprehensive and representative dataset, there are some definite limitations and subjective considerations. In particular:

One key subjective consideration is: where do we draw the line between a technology company and a company that simply uses technology? We tried our best to apply a consistent ‘test’ that assessed whether technology was an output or major defining characteristic of a company, or merely a tool. This distinction meant that companies like Sun Life and Manulife are omitted, as are many agencies and consultants who work with the tech industry.

Our visibility into the long tail of tiny and small (say, fewer than 5 employees) companies is limited: they’re less likely to post jobs, to have dedicated office space, to be part of our contact network, to be in the news, etc. The exception is companies that attended or attend a local accelerator/incubator, as those programs keep and post detailed records. It should be noted that the main impact of this data gap is on the analysis of company counts; employee counts are only negligibly impacted, due to the relatively small contribution of small companies, even collectively.

Are Business Development Representatives (BDRs) marketers? Some companies say yes, some say no. In our manual research, we excluded BDRs from the marketer count. We can only assume that some survey respondents included them. For what it’s worth, we did include the collection of “Demand”-related roles.

LinkedIn is a useful, but imperfect, tool, only as complete and accurate as its membership allows. Consequently, wherever possible we took care to sanity-check and cross-check our research.

To verify the accuracy of our estimates regarding the number of employees in a company, and the number of marketing employees in a company, we performed at least four types of validation.

First, for a handful of randomly selected companies we manually went through each and every employee and compared against the active employee list. What we found was that the number of people who were false positives (i.e., their LinkedIn profile showed they were still at Company X even though they’d moved on) was approximately offset by the number of people who were false negatives (i.e., they worked at Company X but hadn’t yet updated their profile).

Second, wherever possible we compared the survey respondents’ estimates of employee and marketer count to those that we discovered through LinkedIn; we were pleased to find that these counts were in very close agreement.
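As a rough illustration of this kind of cross-check (the company names and counts below are invented, not from the study), one can compare the two sources’ marketer-count estimates and summarize their agreement as a mean relative difference:

```python
# Hypothetical paired estimates of marketer head-count per company:
# one from LinkedIn research, one from the company's survey respondent.
linkedin_counts = {"Company A": 12, "Company B": 5, "Company C": 30}
survey_counts = {"Company A": 11, "Company B": 5, "Company C": 28}

# Mean relative difference between the two sources, using LinkedIn as the baseline.
diffs = [
    abs(linkedin_counts[c] - survey_counts[c]) / linkedin_counts[c]
    for c in linkedin_counts
]
mean_rel_diff = sum(diffs) / len(diffs)

print(f"mean relative difference: {mean_rel_diff:.1%}")  # mean relative difference: 5.0%
```

A small mean relative difference like this is what “very close agreement” would look like numerically; the study’s actual comparison may have been done differently.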

Third, in many cases we reached out to personal contacts to verify our estimates and findings.

Fourth, because marketing titles change over time, and we didn’t want to get left behind, we populated our search parameters based upon our analysis of the job titles submitted in our survey responses.

Additionally, LinkedIn’s category list and the limitation of one category per company are, to put it mildly, dumb. Moreover, many companies have made the decision to base their category upon their technology rather than the market they target or the solution they provide. In essence, they’re choosing a category based on how they do something rather than what they do, or for whom. We disagree with this approach. To provide a more accurate assessment of the industries in which our tech companies participate, we manually reviewed and – where prudent – assigned a more suitable category (but still based on LinkedIn’s list). For companies that pursue multiple markets, we examined their own messaging and then made a judgment call about their primary market.

We also cross-referenced and verified details like company founding date, headquarter location, etc.

In summary: is the study perfect? No. Did we take care to make it as accurate and representative as possible? Yes.

We’d of course welcome suggestions for how to improve our methodology going forward.